Computer Science ›› 2022, Vol. 49 ›› Issue (4): 43-48.doi: 10.11896/jsjkx.210800276

• Special Issue of Social Computing Based Interdisciplinary Integration •

Link Prediction for Node Featureless Networks Based on Faster Attention Mechanism

LI Yong, WU Jing-peng, ZHANG Zhong-ying, ZHANG Qiang   

  1. College of Computer Science and Engineering, Northwest Normal University, Lanzhou 730070, China
  • Received: 2021-08-31 Revised: 2021-12-12 Published: 2022-04-01
  • About author: LI Yong, born in 1979, Ph.D, associate professor, is a member of China Computer Federation. His main research interests include social computing and data analytics. WU Jing-peng, born in 1998, postgraduate. His main research interests include machine learning, link prediction in complex networks and graph neural networks.
  • Supported by:
    This work was supported by the National Natural Science Foundation of China (72161034, 61863032), Major Scientific Research Projects of Northwest Normal University (NWNU-LKZD2021-06), Research Project on Association of Fundamental Computing Education in Chinese Universities (2020-AFCEC-355) and Research Project on Educational Science Planning of Gansu (GS[2018]GHBBKZ021).

Abstract: Link prediction is an important task in network science: it aims to estimate the probability that a link exists between two nodes. Many relationships between real-world entities can be described as networks, and many problems in daily life can be transformed into link prediction tasks. Link prediction algorithms for networks without node features also transfer conveniently to directed networks, weighted networks, temporal networks, and so on. However, traditional link prediction algorithms face several problems: they do not mine network structural information deeply enough, their feature extraction depends on subjective judgment, they lack generality, and their time and space complexity are flawed, all of which make them difficult to apply to real industrial networks. To avoid these problems, this paper builds on the basic structure of the graph attention network: graph embedding is used to collect node features; by analogy with the memory addressing strategy of the Neural Turing Machine, and drawing on related work on important-node discovery in complex networks, a fast and efficient attention computation method is designed, and FALP, a link prediction algorithm for node-featureless networks that integrates this fast attention mechanism, is proposed. Experiments on three public datasets and one private dataset show that FALP effectively avoids the above problems and achieves excellent predictive performance.
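Only the abstract is available here, so the following Python sketch is not the authors' FALP implementation. It merely illustrates the idea the abstract describes: embeddings obtained from a graph embedding step stand in for missing node features, and an NTM-style content-based addressing score (cosine similarity, as in the Neural Turing Machine's memory addressing) replaces a learned pairwise attention network, giving a cheap attention weight per neighbour; a sigmoid of the embedding dot product then scores candidate links. All function names and the choice of cosine similarity are assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    """Content-based similarity, analogous to NTM memory addressing."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

def fast_attention_aggregate(h, adj):
    """Aggregate each node's neighbours, weighting them by a
    similarity-based attention score instead of a learned pairwise MLP.

    h   : (n, d) node embeddings (e.g. from DeepWalk/node2vec)
    adj : (n, n) 0/1 adjacency matrix
    """
    n = h.shape[0]
    out = np.zeros_like(h)
    for i in range(n):
        nbrs = np.nonzero(adj[i])[0]
        if len(nbrs) == 0:
            out[i] = h[i]          # isolated node: keep its own embedding
            continue
        scores = np.array([cosine_similarity(h[i], h[j]) for j in nbrs])
        weights = np.exp(scores) / np.exp(scores).sum()  # softmax over neighbours
        out[i] = weights @ h[nbrs]
    return out

def link_score(h, i, j):
    """Predicted probability that a link exists between nodes i and j."""
    return 1.0 / (1.0 + np.exp(-h[i] @ h[j]))
```

Because the attention weight is a fixed similarity of precomputed embeddings rather than the output of a trained scoring network, each aggregation step costs only one similarity evaluation per edge, which is the kind of complexity saving the abstract attributes to the fast attention mechanism.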

Key words: Attention mechanism, Graph embedding, Graph neural networks (GNNs), Link prediction, Network science

CLC Number: TP312