Computer Science ›› 2024, Vol. 51 ›› Issue (6A): 230600209-8. doi: 10.11896/jsjkx.230600209

• Artificial Intelligence •

Knowledge Reasoning Model Combining HousE with Attention Mechanism

ZHU Yuliang, LIU Juntao, RAO Ziyun, ZHANG Yi, CAO Wanhua   

  1. Wuhan Digital Engineering Institute,Wuhan 430205,China
  • Published:2024-06-06
  • About author:ZHU Yuliang,born in 1999,postgraduate.His main research interests include knowledge reasoning and recommender systems.
    LIU Juntao,born in 1979,Ph.D,professor.His main research interests include recommender systems,knowledge computing,and decision support.
  • Supported by:
    14th Five-Year Equipment Pre-research Project(50902010503).

Abstract: Knowledge reasoning is a technique developed to address the incompleteness of knowledge graphs, and it has advanced continuously in recent years. To address the low accuracy, poor interpretability, and weak applicability of existing knowledge reasoning methods, this paper proposes Att-HousE, a knowledge reasoning model that combines HousE with an attention mechanism. The model consists of a rule generator equipped with attention and a rule predictor built on HousE. The generator produces the rules required for reasoning and passes them to the predictor, which updates its parameters and assigns scores to the different rules; the generator and predictor are then trained and optimized alternately with the EM algorithm. Specifically, the model builds on RNNLogic and improves it in two respects: the attention mechanism selects the more noteworthy relations as rules, which improves accuracy, while HousE handles complex relations more flexibly and is well suited to modeling multilateral relations. Experimental results on public datasets show that the MRR of Att-HousE is 6.3% higher than that of RNNLogic on the FB15K-237 reasoning task, and its Hits@10 is 2.7% higher than that of RNNLogic on the sparse dataset WN18RR. These results demonstrate that introducing HousE and the attention mechanism captures and forms multilateral relations more comprehensively, thereby improving the accuracy of knowledge reasoning.
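To make the abstract's two key components more concrete, the following Python/NumPy sketch illustrates (a) a HousE-style relation operator built by composing Householder reflections and used in a distance-style triple score, and (b) a softmax attention over candidate relations, standing in for how the rule generator might weight "more noteworthy" relations. All function names, dimensions, and the exact scoring form are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def householder_reflection(v):
        # Reflection matrix H = I - 2*v*v^T/||v||^2 for a nonzero vector v.
        v = v / np.linalg.norm(v)
        return np.eye(v.size) - 2.0 * np.outer(v, v)

    def house_operator(reflection_vectors):
        # Compose k Householder reflections into one orthogonal operator; an even k
        # gives a proper rotation, the kind of relation operator that HousE's
        # Householder parameterization describes.
        operator = np.eye(reflection_vectors.shape[1])
        for v in reflection_vectors:
            operator = householder_reflection(v) @ operator
        return operator

    def triple_score(head, reflection_vectors, tail):
        # Distance-style plausibility: rotate the head embedding with the relation
        # operator and measure closeness to the tail (higher is better).
        rotation = house_operator(reflection_vectors)
        return -np.linalg.norm(rotation @ head - tail)

    def relation_attention(query, relation_embeddings):
        # Softmax attention over candidate relations (an illustrative stand-in for
        # the attention-based rule generator).
        logits = relation_embeddings @ query
        logits = logits - logits.max()            # numerical stability
        weights = np.exp(logits)
        return weights / weights.sum()

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        d, k, num_relations = 8, 2, 5             # embedding size, reflections per relation, candidate relations
        head, tail = rng.normal(size=d), rng.normal(size=d)
        relation = rng.normal(size=(k, d))        # k Householder vectors define one relation operator
        print("triple score:", triple_score(head, relation, tail))
        print("attention weights:",
              relation_attention(rng.normal(size=d), rng.normal(size=(num_relations, d))))

In the full model described above, the EM-style training would alternate between updating the generator's distribution over rules and the HousE-based predictor's parameters; that training loop is omitted from this sketch.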

Key words: Knowledge graph completion, Knowledge reasoning, Attention mechanism, Knowledge representation, EM algorithm

CLC Number: TP391

References:
[1]HUANG Q H,YU J,LIAO X,et al.Review of Knowledge Graph Research[J].Computer Systems & Applications,2019,28(6):1-12.
[2]BOLLACKER K,EVANS C,PARITOSH P,et al.Freebase:A collaboratively created graph database for structuring human knowledge[C]//Proceedings of 2008 ACM SIGMOD International Conference on Management of Data.2008:1247-1250.
[3]MILLER G A.WordNet:A lexical database for English[J].Communications of the ACM,1995,38(11):39-41.
[4]CARLSON A,BETTERIDGE J,KISIEL B,et al.Toward an architecture for never-ending language learning[C]//Proceedings of the 24th AAAI Conference on Artificial Intelligence.Menlo Park:AAAI,2010:1306-1313.
[5]XU Z L,SHENG Y P,HE L R,et al.Overview of Knowledge Graph Technology[J].Journal of University of Electronic Science and Technology of China,2016,45(4):589-606.
[6]TIAN L,ZHANG J C,ZHANG J H,et al.Overview of Knowledge Graph:Representation,Construction,Reasoning and Knowledge Hypergraph Theory[J].Computer Systems & Applications,2021,41(8):2161-2186.
[7]LAO N,COHEN W W.Relational retrieval using a combination of path-constrained random walks[J].Machine Learning,2010,81(1):53-67.
[8]SUN S,CHEN J,LIU D,et al.A Posterior-Based Method for Markov Logic Networks Parameters Learning[C]//2006 5th IEEE International Conference on Cognitive Informatics.Beijing,China,2006:529-534.
[9]YANG F,YANG Z L,COHEN W W.Differentiable Learning of Logical Rules for Knowledge Base Reasoning[J].arXiv:1702.08367,2017.
[10]ROCKTÄSCHEL T,RIEDEL S.End-to-end differentiable proving[C]//Advances in Neural Information Processing Systems.2017:3789-3801.
[11]BORDES A,USUNIER N,GARCIA-DURAN A,et al.Translating embeddings for modeling multi-relational data[C]//Advances in Neural Information Processing Systems.2013:2787-2795.
[12]SUN Z,DENG Z H,NIE J Y,et al.RotatE:Knowledge graph embedding by relational rotation in complex space[C]//7th International Conference on Learning Representations.2019.
[13]LI R,et al.HousE:Knowledge Graph Embedding with Householder Parameterization[C]//International Conference on Machine Learning.2022.
[14]KHOT T,NATARAJAN S,KERSTING K,et al.Learning Markov logic networks via functional gradient boosting[C]//ICDM.2011.
[15]MEILICKE C,CHEKOL M W,RUFFINELLI D,et al.Anytime bottom-up rule learning for knowledge graph completion[C]//IJCAI.2019.
[16]MINERVINI P,BOŠNJAK M,ROCKTÄSCHEL T,et al.Differentiable Reasoning on Large Knowledge Bases and Natural Language[J].arXiv:1912.0824,2019.
[17]COLLOBERT R,WESTON J.A unified architecture for natural language processing:deep neural networks with multitask learning[C]//International Conference on Machine Learning.2008.
[18]TROUILLON T,WELBL J,RIEDEL S,et al.Complex embeddings for simple link prediction[C]//ICML.2016.
[19]ZHANG S,TAY Y,YAO L N,et al.Quaternion knowledge graph embeddings[C]//Proceedings of the 33rd Conference on Neural Information Processing Systems.2019:2735.
[20]XIONG W,HOANG T,WANG W Y.DeepPath:A reinforcement learning method for knowledge graph reasoning[J].arXiv:1707.06690,2017.
[21]LIN Y,LIU Z,LUAN H,et al.Modeling relation paths for representation learning of knowledge bases[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing.2015:705-714.
[22]LI N,SHEN Q,SONG R,et al.MEduKG:A Deep-Learning-Based Approach for Multi-Modal Educational Knowledge Graph Construction[J].Information,2022,13(2):91.
[23]QU M,CHEN J,XHONNEUX L,et al.RNNLogic:Learning Logic Rules for Reasoning on Knowledge Graphs[J].arXiv:2010.04029,2020.
[24]TOUTANOVA K,CHEN D Q.Observed versus latent features for knowledge base and text inference[C]//Workshop on Continuous Vector Space Models and their Compositionality.2015.
[25]DETTMERS T,MINERVINI P,STENETORP P,et al.Convolutional 2d knowledge graph embeddings[C]//AAAI.2018.