Computer Science, 2020, Vol. 47, Issue (9): 52-59. doi: 10.11896/jsjkx.190300004

• Database & Big Data & Data Science •

Survey of Network Representation Learning

DING Yu, WEI Hao, PAN Zhi-song, LIU Xin   

  1. Institute of Command and Control Engineering, Army Engineering University of PLA, Nanjing 210000, China
  • Received: 2019-03-06  Published: 2020-09-10
  • About author: DING Yu, born in 1989, doctoral student. His main research interests include artificial intelligence and network security.
    WEI Hao, born in 1990, Ph.D. His main research interests include complex networks in machine learning, network embedding, and online time-series prediction.
  • Supported by:
    National Natural Science Foundation of China (61473149).

Abstract: A network is a collection of nodes and edges, usually represented as a graph. Many complex systems take the form of networks, such as social networks, biological networks, and information networks. To make network data processing simple and effective, representation learning for the nodes of a network has become a research hotspot in recent years. Network representation learning aims to learn a low-dimensional dense representation vector for each node in the network, which can advance various learning tasks in network analysis such as node classification, network clustering, and link prediction. However, most previous works were designed only for plain networks and ignore node attributes. When the network is highly sparse, attributes can serve as very useful complementary information that helps learn better representations. Therefore, a network embedding should preserve not only the structural information but also the attribute information. In addition, in practical applications many networks are dynamic and evolve over time with the addition, modification, and deletion of nodes. Meanwhile, like the network structure, node attributes also change naturally over time. With the development of machine learning, studies on network representation learning have emerged one after another. This paper systematically introduces and summarizes the network representation learning methods of recent years.
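
To make the core idea concrete, the following minimal Python/NumPy sketch learns one low-dimensional vector per node from truncated random walks with a small skip-gram objective, in the spirit of random-walk-based embedding methods such as DeepWalk. It is illustrative only: the toy graph, embedding dimension, hyper-parameters, and the helper functions random_walks and train_skipgram are invented for this example and do not come from the survey.

    import random
    import numpy as np

    # Toy undirected graph as an adjacency list (hypothetical example data).
    graph = {
        0: [1, 2],
        1: [0, 2],
        2: [0, 1, 3],
        3: [2, 4, 5],
        4: [3, 5],
        5: [3, 4],
    }

    def random_walks(graph, walks_per_node=10, walk_length=8):
        # Truncated uniform random walks serve as the "corpus" of node sequences.
        walks = []
        for _ in range(walks_per_node):
            for start in graph:
                walk = [start]
                while len(walk) < walk_length:
                    walk.append(random.choice(graph[walk[-1]]))
                walks.append(walk)
        return walks

    def train_skipgram(walks, n_nodes, dim=16, window=2, neg=5, lr=0.025, epochs=5):
        # Very small skip-gram with negative sampling, optimized by plain SGD.
        rng = np.random.default_rng(0)
        emb = rng.normal(scale=0.1, size=(n_nodes, dim))  # node ("input") vectors
        ctx = np.zeros((n_nodes, dim))                     # context ("output") vectors
        for _ in range(epochs):
            for walk in walks:
                for i, u in enumerate(walk):
                    lo, hi = max(0, i - window), min(len(walk), i + window + 1)
                    for v in walk[lo:i] + walk[i + 1:hi]:
                        # one positive context pair (u, v) plus `neg` random negatives
                        pairs = [(v, 1.0)] + [(int(rng.integers(n_nodes)), 0.0) for _ in range(neg)]
                        for t, label in pairs:
                            score = 1.0 / (1.0 + np.exp(-emb[u] @ ctx[t]))
                            grad = score - label
                            ctx_t = ctx[t].copy()
                            ctx[t] -= lr * grad * emb[u]
                            emb[u] -= lr * grad * ctx_t
        return emb

    walks = random_walks(graph)
    embeddings = train_skipgram(walks, n_nodes=len(graph))
    print(embeddings.shape)  # (6, 16): one 16-dimensional vector per node

The learned vectors can then be fed to standard classifiers, clustering algorithms, or similarity measures for the node classification, network clustering, and link prediction tasks mentioned above.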

Key words: Networks, Network representation learning, Machine learning, Network embedding, Deep learning

CLC Number: TP181