计算机科学 ›› 2025, Vol. 52 ›› Issue (6): 167-178. doi: 10.11896/jsjkx.240600032

• 数据库&大数据&数据科学 •

基于多视图表示学习的语义感知异质图注意力网络

王静红1,2,3, 吴芝冰1, 王熙照4, 李昊康5   

    1 河北师范大学计算机与网络空间安全学院 石家庄 050024
    2 河北省网络与信息安全重点实验室 石家庄 050024
    3 供应链大数据分析与数据安全河北省工程研究中心 石家庄 050024
    4 深圳大学计算机与软件学院 广东 深圳 518060
    5 河北工程技术学院人工智能与大数据学院 石家庄 050091
  • 收稿日期:2024-06-04 修回日期:2025-01-13 出版日期:2025-06-15 发布日期:2025-06-11
  • 通讯作者: 王静红(wangjinghong@126.com)
  • 基金资助:
    河北省自然科学基金(F2024205028,F2021205014);河北省高等学校科学技术研究项目(ZD2022139)

Semantic-aware Heterogeneous Graph Attention Network Based on Multi-view Representation Learning

WANG Jinghong1,2,3, WU Zhibing1, WANG Xizhao4, LI Haokang5   

    1 College of Computer and Cyber Security,Hebei Normal University,Shijiazhuang 050024,China
    2 Hebei Provincial Key Laboratory of Network and Information Security,Shijiazhuang 050024,China
    3 Hebei Provincial Engineering Research Center for Supply Chain Big Data Analytics & Data Security,Shijiazhuang 050024,China
    4 College of Computer Science and Software Engineering,Shenzhen University,Shenzhen,Guangdong 518060,China
    5 College of Artificial Intelligence and Big Data,Hebei University of Engineering and Technology,Shijiazhuang 050091,China
  • Received:2024-06-04 Revised:2025-01-13 Online:2025-06-15 Published:2025-06-11
  • About author:WANG Jinghong,born in 1967,Ph.D,professor,is a member of CCF(No.58341S).Her main research interests include artificial intelligence,pattern recognition,machine learning and data mining.
  • Supported by:
    Natural Science Foundation of Hebei Province(F2024205028,F2021205014)and Science and Technology Project of Hebei Education Department(ZD2022139).

摘要: 近年来,图神经网络因能够高效处理异质图中的复杂结构和丰富语义信息而受到了广泛的关注。学习异质图的低维节点嵌入,同时为节点分类、节点聚类等下游任务保留异质结构和语义,是一个关键且具有挑战性的问题。现有研究主要基于元路径来设计模型,但这种方法至少存在两方面的局限性:1)合适元路径的选择通常需要专家知识或额外的标注信息;2)该方法限制了模型按预定义的模式学习,从而难以充分捕获网络的复杂性。针对这些问题,提出了一种多视图和语义感知的异质图注意力网络(Multi-view and Semantic-aware Heterogeneous Graph Attention Network,MS-HGANN)。该网络无需人工设计元路径,即可融合节点和关系中的丰富语义信息。MS-HGANN主要包括3个部分:特征映射、二阶特定视图自我图融合和语义感知。特征映射将特征映射到统一的节点特征空间;二阶特定视图自我图融合设计了特定关系的编码器和节点注意力学习节点在局部结构上的表示;语义感知设计了两种相互协调的注意力机制来评估节点和关系的重要性,从而得到最终的节点表示。在3个公开数据集上进行实验,结果表明,所提模型在节点分类和聚类任务上达到了先进水平。
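
下面给出特征映射部分的一个极简示意(基于摘要描述所作的假设性实现,并非论文原始代码):按节点类型分别学习一个线性变换,将不同类型节点的特征投影到统一的节点特征空间;其中的类名、维度与激活函数均为示意性假设。

```python
# 示意代码:按节点类型的特征映射(假设实现,非论文原代码)
import torch
import torch.nn as nn


class TypeSpecificFeatureMapping(nn.Module):
    """为每种节点类型学习一个线性投影,把原始特征映射到统一的公共空间。"""

    def __init__(self, in_dims, common_dim):
        # in_dims:{节点类型: 原始特征维度};common_dim:统一特征空间维度(均为假设值)
        super().__init__()
        self.proj = nn.ModuleDict({t: nn.Linear(d, common_dim) for t, d in in_dims.items()})

    def forward(self, feats):
        # feats:{节点类型: [N_t, d_t]},返回 {节点类型: [N_t, common_dim]}
        return {t: torch.tanh(self.proj[t](x)) for t, x in feats.items()}


# 用法示例(玩具数据,节点类型与维度仅作占位)
mapper = TypeSpecificFeatureMapping({"author": 64, "paper": 128}, common_dim=32)
out = mapper({"author": torch.randn(10, 64), "paper": torch.randn(20, 128)})
```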

关键词: 图神经网络, 异质图, 图表示学习, 异质图嵌入, 异质网络

Abstract: In recent years,graph neural networks have received widespread attention for their ability to efficiently process the complex structures and rich semantic information in heterogeneous graphs.Learning low-dimensional node embeddings of heterogeneous graphs while preserving the heterogeneous structure and semantics for downstream tasks such as node classification and node clustering is a critical and challenging problem.Existing studies mainly design models based on meta-paths,but this approach has at least two limitations.1)Selecting suitable meta-paths usually requires expert knowledge or additional labelling information.2)The approach constrains the model to learn according to predefined patterns,which makes it difficult to adequately capture the complexity of the network.To address these issues,a multi-view and semantic-aware heterogeneous graph attention network(MS-HGANN) is proposed,which fuses the rich semantic information of nodes and relationships without manually designed meta-paths.MS-HGANN consists of three main components:feature mapping,second-order view-specific self-graph fusion,and semantic awareness.Feature mapping projects features into a unified node feature space.Second-order view-specific self-graph fusion designs relation-specific encoders and node attention to learn node representations of local structures.Semantic awareness designs two mutually coordinated attention mechanisms to evaluate the importance of nodes and relations and obtain the final node representations.Experimental results on three publicly available datasets show that the proposed model achieves state-of-the-art performance on node classification and clustering tasks.
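
The following is a minimal, self-contained sketch (an assumed PyTorch-style formulation inferred from the abstract, not the authors' implementation) of the two attention levels described above: a relation-specific encoder with node-level attention over one view's local structure, followed by a semantic-level attention that weighs the per-relation representations to produce the final node embeddings. All class names, dimensions, and the exact attention form are illustrative assumptions.

```python
# Illustrative sketch of view-specific node attention + semantic-level fusion
# (assumed formulation inferred from the abstract; not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ViewSpecificEncoder(nn.Module):
    """Node-level attention within one relation-specific view (assumed form)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, edge_index):
        # x: [N, in_dim]; edge_index: [2, E] edges of this relation's view
        h = self.proj(x)
        src, dst = edge_index
        # un-normalised attention score for each edge
        e = F.leaky_relu(self.attn(torch.cat([h[src], h[dst]], dim=-1))).squeeze(-1)
        # normalise scores over the incoming edges of each target node
        alpha = torch.zeros_like(e)
        for node in dst.unique():
            mask = dst == node
            alpha[mask] = F.softmax(e[mask], dim=0)
        # weighted aggregation of neighbour messages into each target node
        return torch.zeros_like(h).index_add(0, dst, alpha.unsqueeze(-1) * h[src])


class SemanticAttention(nn.Module):
    """Semantic-level attention fusing per-relation (per-view) representations (assumed form)."""

    def __init__(self, dim, hidden=128):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                   nn.Linear(hidden, 1, bias=False))

    def forward(self, view_embeds):
        # view_embeds: [R, N, dim], one embedding matrix per relation/view
        w = self.score(view_embeds).mean(dim=1)          # [R, 1] importance per view
        beta = torch.softmax(w, dim=0).unsqueeze(-1)     # [R, 1, 1] view weights
        return (beta * view_embeds).sum(dim=0)           # [N, dim] fused node embeddings


# Toy usage: two views over 5 nodes (a real model would use one encoder per relation)
x = torch.randn(5, 16)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])
encoder = ViewSpecificEncoder(16, 32)
z1 = encoder(x, edge_index)                  # view induced by one relation
z2 = encoder(x, edge_index.flip(0))          # view induced by the reverse relation
fused = SemanticAttention(32)(torch.stack([z1, z2]))    # [5, 32]
```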

Key words: Graph neural networks, Heterogeneous graphs, Graph representation learning, Heterogeneous graph embedding, Heterogeneous networks

中图分类号: TP391