Computer Science ›› 2020, Vol. 47 ›› Issue (9): 52-59. doi: 10.11896/jsjkx.190300004

Special topic: Big Data & Data Science (virtual topic)

• Database & Big Data & Data Science •


  • Corresponding author: WEI Hao (594386708@qq.com)
  • Author e-mail: yuding@live.com

Survey of Network Representation Learning

DING Yu, WEI Hao, PAN Zhi-song, LIU Xin   

  1. Institute of Command and Control Engineering,Army Engineering University of PLA,Nanjing 210000,China
  • Received:2019-03-06 Published:2020-09-10
  • About author:DING Yu,born in 1989,doctoral student.His main research interests include artificial intelligence and network security.
    WEI Hao,born in 1990,Ph.D.His main research interests include complex networks in machine learning,network embedding and online time series prediction.
  • Supported by:
    National Natural Science Foundation of China (61473149).

Abstract: A network is a collection of nodes and edges, usually represented as a graph. Many complex systems are represented in the form of networks, such as social networks, biological networks and information networks. To make the processing of network data simple and effective, representation learning for the nodes in a network has become a research hotspot in recent years. Network representation learning aims to learn a low-dimensional dense vector for each node in the network; the learned vectors can then be applied to common network analysis tasks such as node clustering, node classification and link prediction. However, the nodes of most real-world networks carry rich attribute information, such as user profiles in social networks and text content in citation networks. Attribute information plays an important role in network representation: when the network is highly sparse, attributes are important auxiliary information that helps learn better representations. The traditional adjacency matrix records only edge information and cannot incorporate node attributes, so a network representation should preserve not only the structural information of the network but also its attribute information. In addition, most real-world networks change dynamically: nodes are added and removed, and edges appear and disappear; likewise, node attributes change over time just as the structure does. With the development of machine learning techniques, research results on network representation learning keep emerging, and this paper systematically introduces and summarizes the network representation learning methods of recent years.

Key words: Machine learning, Deep learning, Networks, Network representation learning, Network embedding

Abstract: A network is a collection of nodes and edges, usually represented as a graph. Many complex systems take the form of networks, such as social networks, biological networks and information networks. In order to make network data processing simple and effective, representation learning for nodes in the network has become a research hotspot in recent years. Network representation learning is designed to learn a low-dimensional dense representation vector for each node in the network, which can advance various learning tasks in network analysis such as node classification, network clustering and link prediction. However, most previous works were designed only for plain networks and ignore node attributes. When the network is highly sparse, attributes can be very useful complementary information that helps learn better representations. Therefore, a network embedding should preserve not only the structural information but also the attribute information. In addition, in practical applications many networks are dynamic and evolve over time with the addition, change and deletion of nodes and edges. Meanwhile, similar to the network structure, node attributes also change naturally over time. With the development of machine learning, studies on network representation learning emerge one after another. This paper systematically introduces and summarizes the network representation learning methods of recent years.

Key words: Deep learning, Machine learning, Network embedding, Network representation learning, Networks
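The walk-based embedding pipeline the abstract describes (popularized by DeepWalk and node2vec) can be sketched in a few lines: sample truncated random walks from a graph, then learn vectors so that nodes co-occurring within a context window end up close. The sketch below is a minimal illustration under stated assumptions: the toy graph, the function names, and the simple co-occurrence counting (standing in here for real skip-gram training) are all illustrative, not code from any of the surveyed papers.

```python
import random
from collections import defaultdict
from math import sqrt

# Toy undirected graph as an adjacency list: two triangles joined by the edge 2-3.
graph = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
    3: [2, 4, 5], 4: [3, 5], 5: [3, 4],
}

def random_walk(graph, start, length, rng):
    """Truncated random walk from `start` (the corpus-generation step of DeepWalk)."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(rng.choice(graph[walk[-1]]))
    return walk

def cooccurrence_embeddings(graph, num_walks=200, walk_len=6, window=2, seed=0):
    """Stand-in for skip-gram training: count how often node pairs co-occur
    within a context window on the walks, then L2-normalize each count row."""
    rng = random.Random(seed)
    counts = defaultdict(lambda: defaultdict(float))
    for _ in range(num_walks):
        for start in graph:
            walk = random_walk(graph, start, walk_len, rng)
            for i, u in enumerate(walk):
                lo, hi = max(0, i - window), min(len(walk), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        counts[u][walk[j]] += 1.0
    nodes = sorted(graph)
    embeddings = {}
    for u in nodes:
        row = [counts[u][v] for v in nodes]
        norm = sqrt(sum(x * x for x in row)) or 1.0
        embeddings[u] = [x / norm for x in row]
    return embeddings

def cosine(a, b):
    # The vectors are unit-length, so the dot product is the cosine similarity.
    return sum(x * y for x, y in zip(a, b))

emb = cooccurrence_embeddings(graph)
```

Nodes inside one triangle end up with more similar vectors than nodes across the bridge, which is exactly the signal downstream tasks use: for link prediction, for instance, high-similarity pairs such as `cosine(emb[0], emb[1])` would be ranked above low-similarity pairs such as `cosine(emb[0], emb[5])`.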


  • CLC number: TP181
[1] JOLLIFFE I T.Principal Component Analysis[J].Journal of Marketing Research,2002,25(4):513.
[2] ROWEIS S T,SAUL L K.Nonlinear dimensionality reduction by locally linear embedding[J].Science,2000,290:2323-2326.
[3] SAUL L K,ROWEIS S T.An introduction to locally linear embedding[J].Journal of Machine Learning Research,2008,7.
[4] BELKIN M,NIYOGI P.Laplacian eigenmaps and spectral techniques for embedding and clustering[C]//Proceedings of the 14th International Conference on Neural Information Processing Systems:Natural and Synthetic.Vancouver,2001:585-591.
[5] TANG L,LIU H.Leveraging social media networks for classification[J].Data Min Knowl Discov,2011,23:447-478.
[6] CHEN M,YANG Q,TANG X.Directed graph embedding[C]//Proceedings of the 20th International Joint Conference on Artifical Intelligence.Hyderabad,2007:2707-2712.
[7] MIKOLOV T,SUTSKEVER I,CHEN K,et al.Distributed representations of words and phrases and their compositionality[C]//Proceedings of Advances in Neural Information Processing Systems.Lake Tahoe,2013:3111-3119.
[8] MIKOLOV T,CHEN K,CORRADO G,et al.Efficient estimation of word representations in vector space[J].arXiv:1301.3781,2013.
[9] MIKOLOV T,KARAFIAT M,BURGET L,et al.Recurrent neural network based language model[C]//Proceedings of International Speech Communication Association.2010:1045-1048.
[10] PEROZZI B,AL-RFOU R,SKIENA S.DeepWalk:online learning of social representations[C]//The ACM SIGKDD International Conference.ACM,2014:701-710.
[11] GROVER A,LESKOVEC J.node2vec:scalable feature learning for networks[C]//The ACM SIGKDD International Conference.2016.
[12] CHENG W,GREAVES C,WARREN M.From n-gram to skip-gram to concgram[J].International Journal of Corpus Linguistics,2006,11(4):411-433.
[13] TANG J,QU M,WANG M,et al.LINE:large-scale information network embedding[C]//International Conference on World Wide Web.2015:1067-1077.
[14] CAO S,LU W,XU Q.GraRep:learning graph representations with global structural information[C]//CIKM.2015.
[15] ZHANG Z,CUI P,WANG X,et al.Arbitrary-order proximity preserved network embedding[C]//Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.ACM,2018:2778-2786.
[16] OU M,CUI P,PEI J,et al.Asymmetric transitivity preserving graph embedding[C]//Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.San Francisco,2016,1:1105-1114.
[17] MA J,CUI P,WANG X,et al.Hierarchical taxonomy aware network embedding[C]//Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.ACM,2018:1920-1929.
[18] WANG X,CUI P,WANG J,et al.Community Preserving Network Embedding[C]//AAAI.2017.
[19] CAVALLARI S,ZHENG V W,CAI H,et al.Learning community embedding with community detection and node embedding on graphs[C]//Proceedings of CIKM.2017.
[20] TU C C,ZENG X K,WANG H,et al.A Unified Framework for Community Detection and Network Representation Learning[J].arXiv:1611.06645,2016.
[21] LE T M,LAUW H W.Probabilistic latent document network embedding[C]// 2014 IEEE International Conference on Data Mining (ICDM).IEEE,2014:270-279.
[22] CHANG J,BLEI D M.Relational topic models for document networks[C]//International Conference on Artificial Intelligence and Statistics.2009:81-88.
[23] YANG C,LIU Z,ZHAO D,et al.Network representation learning with rich text information[C]//International Conference on Artificial Intelligence.2015:2111-2117.
[24] NATARAJAN N,DHILLON I S.Inductive matrix completion for predicting gene-disease associations[J].Bioinformatics,2014,30(12):i60-i68.
[25] AHMED N K,ROSSI R A,ZHOU R,et al.Inductive Representation Learning in Large Attributed Graphs[J].arXiv:1710.09471v2,2017.
[26] AHMED N K,ROSSI R A,ZHOU R,et al.A Framework for Generalizing Graph-based Representation Learning Methods[J].arXiv:1709.04596,2017.
[27] NGUYEN D,MALLIAROS F D.BiasedWalk:Biased Sampling for Representation Learning on Graphs[J].arXiv:1809.02482,2018.
[28] TU C C,LIU H,LIU Z Y,et al.CANE:context-aware network embedding for relation modeling[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics.Vancouver,2017:1722-1731.
[29] PAN S,WU J,ZHU X,et al.Tri-Party Deep Network Representation[C]// The 25th International Joint Conference on Artificial Intelligence (IJCAI-2016).AAAI Press,2016.
[30] ZHU J,AHMED A,XING E P.MedLDA:maximum margin supervised topic models[J].JMLR,2012,13(1):2237-2278.
[31] TU C C,ZHANG W C,LIU Z Y,et al.Max-Margin DeepWalk:discriminative learning of network representation[C]//Proceedings of International Joint Conference on Artificial Intelligence (IJCAI).2016.
[32] HEARST M A,DUMAIS S T,OSMAN E,et al.Support vector machines[J].IEEE Intelligent Systems and their Applications,1998,13(4):18-28.
[33] HUANG X,LI J,HU X.Label Informed Attributed Network Embedding[C]// Tenth Acm International Conference on Web Search & Data Mining.ACM,2017.
[34] BELKIN M,NIYOGI P.Laplacian eigenmaps and spectral techniques for embedding and clustering[J].Advances in Neural Information Processing Systems,2001,14(6):585-591.
[35] LIU J,HE Z C,WEI L,et al.Content to Node:Self-translation Network Embedding[C]//SIGKDD.2018.
[36] HOCHREITER S,SCHMIDHUBER J.Long short-term memory[J].Neural Computation,1997,9(8):1735-1780.
[37] WANG D X,CUI P,ZHU W W.Structural deep network embedding[C]//Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.ACM,2016:1225-1234.
[38] ZHANG Z,YANG H X,BU J J,et al.ANRL:Attributed Network Representation Learning via Deep Neural Networks[C]//IJCAI.2018.
[39] KIPF T N,WELLING M.Semi-supervised classification with graph convolutional networks[J].arXiv:1609.02907,2016.
[40] KIPF T N,WELLING M.Variational graph auto-encoders[J].arXiv:1611.07308,2016.
[41] KINGMA D P,WELLING M.Auto-encoding variational bayes[C]//Proceedings of the International Conference on Learning Representations (ICLR).2014.
[42] WANG S,TANG J,AGGARWAL C,et al.Signed network embedding in social media[C]//Proceedings of the 2017 SIAM International Conference on Data Mining.SIAM,2017:327-335.
[43] TU K,CUI P,WANG X,et al.Deep recursive network embedding with regular equivalence[C]//Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.ACM,2018:2357-2366.
[44] ZHU D,CUI P,WANG D,et al.Deep variational network embedding in wasserstein space[C]//Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.ACM,2018:2827-2836.
[45] VELIČKOVIĆ P,CUCURULL G,CASANOVA A,et al.Graph attention networks[J].arXiv:1710.10903,2017.
[46] HAMILTON W,YING Z,LESKOVEC J.Inductive representation learning on large graphs[C]//Advances in Neural Information Processing Systems.2017:1024-1034.
[47] KULKARNI V,AL-RFOU R,PEROZZI B,et al.Statistically Significant Detection of Linguistic Change[C]//Proceedings of the 24th International Conference on World Wide Web.ACM,2015:625-635.
[48] HAMILTON W L,LESKOVEC J,JURAFSKY D,et al.Diachronic word embeddings reveal statistical laws of semantic change[J].Meeting of the Association for Computational Linguistics,2016,1:1489-1501.
[49] LI J,DANI H,HU X,et al.Attributed Network Embedding for Learning in a Dynamic Environment[C]// Proceedings of the 2017 ACM on Conference on Information and Knowledge Management.ACM,2017:387-396.
[50] STEWART G W.Matrix Perturbation Theory[M].1990.
[51] HARDOON D R,SZEDMAK S,SHAWE-TAYLOR J.Canonical correlation analysis:An overview with application to learning methods[J].Neural computation,2004,16(12):2639-2664.
[52] GOYAL P,KAMRA N,HE X,et al.Dyngem:Deep embedding method for dynamic graphs[J].arXiv:1805.11273,2018.
[53] CHANG S,HAN W,TANG J,et al.Heterogeneous network embedding via deep architectures[C]//Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.ACM,2015:119-128.
[54] NGUYEN G H,LEE J B,ROSSI R A,et al.Continuous-time dynamic network embeddings[C]//Companion Proceedings of The Web Conference 2018.2018:969-976.
[55] ZHOU L,YANG Y,REN X,et al.Dynamic Network Embedding by Modeling Triadic Closure Process[C]//AAAI.2018.
[56] DU L,WANG Y,SONG G,et al.Dynamic Network Embedding:An Extended Approach for Skip-gram based Network Embedding[C]//IJCAI.2018.
[57] ZHU D,CUI P,ZHANG Z,et al.High-order Proximity Preserved Embedding For Dynamic Networks[J].IEEE Transactions on Knowledge & Data Engineering,2018,30(11):2132-2144.