Computer Science ›› 2023, Vol. 50 ›› Issue (1): 52-58.doi: 10.11896/jsjkx.220900032

• Database & Big Data & Data Science •

Fast Computation Graph Simplification via Influence-based Pruning for Graph Neural Network

GU Xizhi, SHAO Yingxia   

  1. School of Computer Science, Beijing University of Posts and Telecommunications, Beijing 100876, China
  • Received: 2022-09-05  Revised: 2022-10-28  Online: 2023-01-15  Published: 2023-01-09
  • About author: GU Xizhi, born in 1998, postgraduate, is a student member of China Computer Federation. His main research interest is large-scale graph neural networks.
    SHAO Yingxia, born in 1988, Ph.D, associate professor, is a senior member of China Computer Federation. His main research interests include large-scale graph analysis, parallel computing frameworks and graph learning.
  • Supported by:
    National Natural Science Foundation of China (62272054, U1936104, 62192784).

Abstract: Computation graph simplification is an optimization technique for accelerating the training of graph neural network models. It exploits common neighbors among nodes to eliminate redundant computation in the aggregation stage, thereby speeding up training. However, when dealing with large-scale graph data, existing computation graph simplification techniques suffer from low computational efficiency, which limits their application to large-scale graph neural networks. This paper analyzes current computation graph simplification techniques in detail by measuring the overhead of their two phases, searching and reconstruction, and summarizes the shortcomings of existing techniques. On this basis, it proposes a fast computation graph simplification algorithm via influence-based pruning for graph neural networks. The algorithm applies an influence model to describe the contribution of each node to computation graph simplification and prunes the common-neighbor search space according to influence, which greatly improves the efficiency of the searching phase. In addition, this paper analyzes the algorithm's complexity and theoretically establishes its expected acceleration effect. Finally, to verify the effectiveness of the new algorithm, it is applied to two mainstream computation graph simplification techniques and evaluated with common graph neural network models on several datasets. Experimental results demonstrate that the new algorithm significantly improves the efficiency of computation graph simplification while still removing a substantial amount of redundant computation. Compared with the computation graph simplification baselines, the proposed technique achieves speedups of up to 3.4 times in the searching phase and up to 1.6 times on the whole process on the PPI dataset, and up to 5.1 times in the searching phase and up to 3.2 times on the whole process on the Reddit dataset.
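To make the aggregation-redundancy idea concrete, below is a minimal, self-contained Python sketch of common-neighbor based simplification with an influence-style pruning step. It is an illustration only: the influence model used here (counting how many aggregation lists a source node appears in, as a simple proxy for its potential contribution to redundancy elimination), the threshold parameter, and all function and variable names are assumptions of this sketch, not the authors' actual formulation.

# Illustrative sketch of influence-based pruning for the common-neighbor
# searching phase of computation graph simplification. The influence model
# (participation count) and the threshold are assumptions for illustration.
from collections import defaultdict
from itertools import combinations

def simplify_computation_graph(adj, influence_threshold=2, max_merges=1000):
    """adj maps each target node to the set of its source (in-)neighbors.

    Returns (u, v, targets) merge candidates: source pairs (u, v) shared by
    the neighbor sets of several targets, so the partial aggregation
    h_u + h_v can be computed once and reused by all of them.
    """
    # 1. Influence model: count how many aggregation lists each source
    #    node participates in. Sources appearing in fewer lists than the
    #    threshold cannot yield shared computation, so they are pruned
    #    from the common-neighbor search space.
    influence = defaultdict(int)
    for neighbors in adj.values():
        for u in neighbors:
            influence[u] += 1
    candidates = {u for u, c in influence.items() if c >= influence_threshold}

    # 2. Searching phase, restricted to high-influence sources: for every
    #    pair of surviving sources, record the targets containing both.
    pair_targets = defaultdict(list)
    for t, neighbors in adj.items():
        pruned = sorted(n for n in neighbors if n in candidates)
        for u, v in combinations(pruned, 2):
            pair_targets[(u, v)].append(t)

    # 3. Keep pairs shared by at least two targets; each extra target saves
    #    one redundant addition in the aggregation stage.
    merges = [(u, v, ts) for (u, v), ts in pair_targets.items() if len(ts) >= 2]
    merges.sort(key=lambda m: len(m[2]), reverse=True)
    return merges[:max_merges]

if __name__ == "__main__":
    # Targets 0 and 1 share sources {2, 3}: h_2 + h_3 is computed once.
    adj = {0: {2, 3, 4}, 1: {2, 3, 5}, 6: {4, 5}}
    print(simplify_computation_graph(adj))  # -> [(2, 3, [0, 1])]

In this toy example, pruning shrinks the pair-enumeration space before the quadratic common-neighbor search runs, which mirrors where the reported searching-phase speedups come from; the subsequent reconstruction phase, which rewrites the computation graph around the chosen merges, is not modeled here.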

Key words: Graph neural network, Computation graph simplification, Common neighbors, Redundant computation, Pruning

CLC Number: TP391
[1] HAO Jingyu, WEN Jingxuan, LIU Huafeng, JING Liping, YU Jian. Deep Disentangled Collaborative Filtering with Graph Global Information [J]. Computer Science, 2023, 50(1): 41-51.
[2] PU Jinyao, BU Lingmei, LU Yongmei, YE Ziming, CHEN Li, YU Zhonghua. Utilizing Heterogeneous Graph Neural Network to Extract Emotion-Cause Pairs Effectively [J]. Computer Science, 2023, 50(1): 205-212.
[3] ZHOU Fang-quan, CHENG Wei-qing. Sequence Recommendation Based on Global Enhanced Graph Neural Network [J]. Computer Science, 2022, 49(9): 55-63.
[4] YAN Jia-dan, JIA Cai-yan. Text Classification Method Based on Information Fusion of Dual-graph Neural Network [J]. Computer Science, 2022, 49(8): 230-236.
[5] QI Xiu-xiu, WANG Jia-hao, LI Wen-xiong, ZHOU Fan. Fusion Algorithm for Matrix Completion Prediction Based on Probabilistic Meta-learning [J]. Computer Science, 2022, 49(7): 18-24.
[6] YANG Bing-xin, GUO Yan-rong, HAO Shi-jie, HONG Ri-chang. Application of Graph Neural Network Based on Data Augmentation and Model Ensemble in Depression Recognition [J]. Computer Science, 2022, 49(7): 57-63.
[7] DENG Zhao-yang, ZHONG Guo-qiang, WANG Dong. Text Classification Based on Attention Gated Graph Neural Network [J]. Computer Science, 2022, 49(6): 326-334.
[8] XIONG Zhong-min, SHU Gui-wen, GUO Huai-yu. Graph Neural Network Recommendation Model Integrating User Preferences [J]. Computer Science, 2022, 49(6): 165-171.
[9] YU Ai-xin, FENG Xiu-fang, SUN Jing-yu. Social Trust Recommendation Algorithm Combining Item Similarity [J]. Computer Science, 2022, 49(5): 144-151.
[10] LI Yong, WU Jing-peng, ZHANG Zhong-ying, ZHANG Qiang. Link Prediction for Node Featureless Networks Based on Faster Attention Mechanism [J]. Computer Science, 2022, 49(4): 43-48.
[11] CAO He-xin, ZHAO Liang, LI Xue-feng. Technical Research of Graph Neural Network for Text-to-SQL Parsing [J]. Computer Science, 2022, 49(4): 110-115.
[12] MIAO Xu-peng, ZHOU Yue, SHAO Ying-xia, CUI Bin. GSO:A GNN-based Deep Learning Computation Graph Substitutions Optimization Framework [J]. Computer Science, 2022, 49(3): 86-91.
[13] YANG Xu-hua, JIN Xin, TAO Jin, MAO Jian-fei. Text Classification Based on Graph Neural Networks and Dependency Parsing [J]. Computer Science, 2022, 49(12): 293-300.
[14] SUN Kai-wei, LIU Song, DU Yu-lu. Movie Recommendation Model Based on Attribute Graph Attention Network [J]. Computer Science, 2022, 49(11A): 211100106-8.
[15] LI Bao-zhen, ZHANG Jin, WANG Bao-lu, YU Ping. Human-Object Interaction Recognition Integrating Multi-level Visual Features [J]. Computer Science, 2022, 49(11A): 220700012-8.