计算机科学 ›› 2025, Vol. 52 ›› Issue (7): 127-134. doi: 10.11896/jsjkx.240600090

• 计算机软件 •

双向特征图增强的图卷积网络算法

李梦茜1, 高心丹1, 李雪2   

  1 东北林业大学计算机与控制工程学院 哈尔滨 150000
    2 哈尔滨工业大学计算学部 哈尔滨 150000
  • 收稿日期:2024-06-13 修回日期:2024-09-21 发布日期:2025-07-17
  • 通讯作者: 高心丹(gaoxd@aliyun.com)
  • 作者简介:(695875891@qq.com)
  • 基金资助:
    黑龙江省第二批“揭榜挂帅”科技攻关项目(2021ZXJ05A01-04)

Two-way Feature Augmentation Graph Convolution Networks Algorithm

LI Mengxi1, GAO Xindan1, LI Xue2   

  1 College of Computer and Control Engineering, Northeast Forestry University, Harbin 150000, China
    2 Faculty of Computing, Harbin Institute of Technology, Harbin 150000, China
  • Received:2024-06-13 Revised:2024-09-21 Published:2025-07-17
  • About author:LI Mengxi, born in 1999, postgraduate. Her main research interest is graph neural networks.
    GAO Xindan, born in 1970, Ph.D, associate professor, is a member of CCF (No.76616M). Her main research interest is remote sensing data processing and application.
  • Supported by:
    Second Batch of “Unveiling and Leading” Science and Technology Research Project of Heilongjiang Province(2021ZXJ05A01-04).

摘要: 图卷积神经网络算法在图结构数据的处理中起着至关重要的作用。现有图卷积网络的主流模式是基于拉普拉斯矩阵对节点特征进行加权求和,更侧重于对卷积聚合方式进行优化,忽略了图数据自身的先验信息。为充分挖掘图数据背后所蕴涵的丰富属性与结构信息,有效降低图数据中的噪音比例,提出双向特征图增强的图卷积网络算法,通过节点度和相似度计算增强图数据的拓扑空间特征和属性空间特征,然后将两种增强的图特征表示同时在拓扑空间和属性空间中传播,并利用注意力机制自适应融合学习到的嵌入。此外,针对深度图卷积神经网络易发生过平滑的问题,提出一种多输入残差结构,将初始残差和高阶邻域残差相结合,以实现模型在任意卷积层对初始特征和高阶邻域特征的均衡提取。在3个公共数据集上进行实验,结果表明该网络比现有网络具有更好的分类性能。

关键词: 图卷积网络, 图注意力网络, 图数据增强, 特征提取, 节点分类

Abstract: Graph convolutional neural network algorithms play a crucial role in the processing of graph-structured data. The mainstream mode of existing graph convolutional networks is based on weighted summation of node features using Laplacian matrices, with a greater emphasis on optimizing the convolutional aggregation method and model structure, while ignoring the prior information of the graph data itself. To fully explore the rich attribute and structural information hidden behind graph data, and to effectively reduce the proportion of noise in graph data, a bidirectional feature-enhanced graph convolutional network algorithm is proposed. The algorithm enhances the topological-space and attribute-space features of graph data through node degree and similarity calculations, and the two enhanced graph feature representations are then propagated simultaneously in both the topological and attribute spaces. An attention mechanism is used to adaptively fuse the learned embeddings. In addition, to address the over-smoothing problem of deep graph convolutional neural networks, a multi-input residual structure is proposed, which combines an initial residual and a high-order neighborhood residual to achieve balanced extraction of initial and high-order neighborhood features at any convolutional layer. Experiments are conducted on three public datasets, and the results show that the proposed network achieves better classification performance than existing networks.
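The attention-based fusion of the two propagation channels and the multi-input residual structure described in the abstract can be illustrated with a short sketch. The following PyTorch-style snippet is only an assumed, illustrative rendering of such components; the module names, mixing coefficients and hidden sizes are hypothetical and do not come from the authors' implementation.

# Illustrative sketch only: node embeddings are assumed to be [N, dim] tensors and
# a_hat a normalized adjacency matrix (dense or sparse). Not the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiInputResidualLayer(nn.Module):
    """One propagation step that mixes the aggregated signal with an initial
    residual (H0) and a high-order neighborhood residual (e.g. A_hat^k @ H0)."""

    def __init__(self, dim, alpha=0.1, beta=0.1):
        super().__init__()
        self.linear = nn.Linear(dim, dim)
        self.alpha, self.beta = alpha, beta  # residual mixing weights (assumed values)

    def forward(self, a_hat, h, h0, h0_high):
        agg = torch.sparse.mm(a_hat, h) if a_hat.is_sparse else a_hat @ h
        # balance the current aggregation, the initial features and the
        # high-order neighborhood features at every layer
        mixed = (1 - self.alpha - self.beta) * agg + self.alpha * h0 + self.beta * h0_high
        return F.relu(self.linear(mixed))


class AttentionFusion(nn.Module):
    """Adaptively weights the topology-space and attribute-space embeddings per node."""

    def __init__(self, dim, hidden=16):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def forward(self, z_topo, z_attr):
        # softmax over the two channels yields per-node fusion weights
        w = torch.softmax(torch.cat([self.score(z_topo), self.score(z_attr)], dim=1), dim=1)
        return w[:, 0:1] * z_topo + w[:, 1:2] * z_attr

In such a setup, z_topo would be produced by stacking residual layers over the (augmented) topology graph and z_attr over an attribute-similarity graph, with the fused embedding fed to a node classifier.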

Key words: Graph convolutional network, Graph attention network, Graph data augmentation, Feature extraction, Node classification

中图分类号: TP391
[1]KIPF T N,WELLING M.Semi-supervised classification with graph convolutional networks[J].arXiv:1609.02907,2016.
[2]CHEN B,LI J L.Attention-based network representation learning model using multi-neighboring information[J].Journal of Chinese Computer Systems,2021,42(4):761-765.
[3]HUANG J,DU L,CHEN X,et al.Robust Mid-Pass Filtering Graph Convolutional Networks[J].arXiv:2302.08048,2023.
[4]HUANG Z,TANG Y,CHEN Y.A graph neural network-based node classification model on class imbalanced graph data[J].Knowledge-Based Systems,2022,244(23):108538.1-108538.12.
[5]ZHOU L E,YOU J G.Social Recommendation with Embedding of Summarized Graphs[J].Journal of Chinese Computer Systems,2021,42(1):78-84.
[6]YANG X,LIU Y,ZHOU S,et al.Cluster-guided Contrastive Graph Clustering Network[C]//Proceedings of the AAAI Conference on Artificial Intelligence.Washington:AAAI,2023:10834-10842.
[7]ZHANG L Y,SUN H H,SUN Y,et al.Review of Node Classification Methods Based on Graph Convolutional Neural Networks[J].Computer Science,2024,51(4):95-105.
[8]LEE N,LEE J,PARK C.Augmentation-free Self-supervised Learning on Graphs[C]//Proceedings of the AAAI Conference on Artificial Intelligence.Palo Alto:AAAI,2022:7372-7380.
[9]HAN K,LAKSHMINARAYANAN B,LIU J.Reliable graph neural networks for drug discovery under distributional shift[J].arXiv:2111.12951,2021.
[10]BERAHMAND K,NASIRI E,ROSTAMI M,et al.A modified DeepWalk method for link prediction in attributed social network[J].Computing,2021,103(10):2227-2249.
[11]WANG S,PAN Y,ZHANG J,et al.Robust and label efficient bi-filtering graph convolutional networks for node classification[J].Knowledge-Based Systems,2021,224:106891.
[12]HOANG N T,MAEHARA T.Revisiting graph neural networks:All we have is low-pass filters[J].arXiv:1905.09550,2019.
[13]XIA W,WANG Q,GAO Q,et al.Self-consistent Contrastive Attributed Graph Clustering with Pseudo-label Prompt[J].IEEE Transactions on Multimedia,2023,25:6665-6677.
[14]WANG Y,WANG W,LIANG Y,et al.NodeAug:Semi-supervised node classification with data augmentation[C]//Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.New York:ACM,2020:207-217.
[15]FENG W,ZHANG J,DONG Y,et al.Graph random neural network for semi-supervised learning on graphs[C]//Proceedings of the International Conference on Neural Information Processing Systems.Vancouver:Curran Press,2020:22092-22103.
[16]PARK J,SONG J.GraphENS:Neighbor-aware ego network synthesis for class-imbalanced node classification[C]//The 9th International Conference on Learning Representations.Virtual:OpenReview.net,2021.
[17]GUO X C,ZHANG W Y,XIA Z X.Two-way data augmentation graph convolutional networks[J].Computer Engineering and Design,2023,44(8):2345-2351.
[18]JIN W,DERR T,WANG Y,et al.Node similarity preserving graph convolutional networks[C]//Proceedings of the 14th ACM International Conference on Web Search and Data Mining.Singapore:ACM,2021:148-156.
[19]HE D X,LIANG C D,LIU H X,et al.Block modeling-guided graph convolutional neural networks[C]//Proceedings of AAAI Conference on Artificial Intelligence.Palo Alto:AAAI,2022:4022-4029.
[20]WANG X,ZHU M,BO D,et al.AM-GCN:Adaptive multi-channel graph convolutional networks[C]//Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.New York:ACM,2020:1243-1253.
[21]LIU C,LIANG A T,LIU X Y.Social Network Nodes Classification Method Based on Multi-information Fusion[J].Journal of Frontiers of Computer Science and Technology,2023,17(9):2198-2208.
[22]CAO J X,XU W Z,JIN D,et al.Survey of Community Discovery in Complex Network[J].Computer Science,2023,50(S2):414-424.
[23]CAI T T,MA R.Theoretical foundations of t-SNE for visualizing high-dimensional clustered data[J].The Journal of Machine Learning Research,2022,23(1):13581-13634.
[24]MENG Z,LIANG S,BAO H,et al.Co-embedding attributed networks[C]//Proceedings of the 12th ACM International Conference on Web Search and Data Mining.Melbourne:WSDM,2019:393-401.
[25]VELICKOVIC P,CUCURULL G,CASANOVA A,et al.Graph Attention Networks[J].arXiv:1710.10903,2017.
[26]ABU-EL-HAIJA S,PEROZZI B,KAPOOR A,et al.MixHop:higher-order graph convolutional architectures via sparsified neighborhood mixing[C]//Proceedings of the International Conference on Machine Learning.Vienna:ACM,2019:21-29.