计算机科学, 2025, Vol. 52, Issue (11A): 241200012-7. doi: 10.11896/jsjkx.241200012
李鹏彦, 王宝会
LI Pengyan, WANG Baohui
Abstract: In the field of knowledge graph completion, the rich multi-semantic information carried by entities and relations is important for improving the accuracy of completion tasks. However, existing models often struggle to fully capture and integrate these multi-semantic features, which limits completion performance. To address this, a knowledge graph completion model based on multi-semantic extraction (MSE) is proposed. First, a multi-semantic aggregation encoder is designed, which splits entity and relation embeddings along the dimension axis and integrates the multi-semantic information of neighboring entities and relations. Second, a decoder based on multi-scale convolution is designed, which uses convolution kernels of different sizes to extract deep semantic features of the entities themselves. Finally, a loss function incorporating an independence constraint is designed, which introduces a regularization term based on the Pearson correlation coefficient to enhance the model's multi-semantic expressiveness. Experimental results show that, on the FB15k-237 and WN18RR datasets, the MSE model improves the MRR (Mean Reciprocal Rank) by 1.7% and 2.3%, respectively, over the best-performing baseline, verifying its effectiveness on knowledge graph completion tasks.
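A minimal PyTorch sketch of a multi-scale convolutional decoder in the spirit of the one described in the abstract: several 1D convolutions with different kernel sizes are applied to the concatenated head-entity and relation embeddings, and their pooled feature maps are fused to score candidate tail entities. The class name MultiScaleConvDecoder, the kernel sizes, channel count, pooling scheme, and scoring by dot product are all illustrative assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn


class MultiScaleConvDecoder(nn.Module):
    """Illustrative multi-scale convolution decoder (assumed design, not the paper's)."""

    def __init__(self, emb_dim: int, num_entities: int,
                 kernel_sizes=(1, 3, 5), channels: int = 32):
        super().__init__()
        # One 1D convolution branch per kernel size; padding keeps the length fixed
        # for odd kernel sizes.
        self.branches = nn.ModuleList(
            [nn.Conv1d(1, channels, k, padding=k // 2) for k in kernel_sizes]
        )
        # Project the pooled multi-scale features back to the embedding space.
        self.proj = nn.Linear(channels * len(kernel_sizes), emb_dim)
        self.entity_emb = nn.Embedding(num_entities, emb_dim)

    def forward(self, head_emb: torch.Tensor, rel_emb: torch.Tensor) -> torch.Tensor:
        # head_emb, rel_emb: (batch, emb_dim) -> x: (batch, 1, 2 * emb_dim)
        x = torch.cat([head_emb, rel_emb], dim=-1).unsqueeze(1)
        feats = []
        for conv in self.branches:
            h = torch.relu(conv(x))             # (batch, channels, 2 * emb_dim)
            feats.append(h.max(dim=-1).values)  # global max pooling per channel
        fused = self.proj(torch.cat(feats, dim=-1))   # (batch, emb_dim)
        # Score every candidate tail entity by a dot product with its embedding.
        return fused @ self.entity_emb.weight.t()     # (batch, num_entities)
```

Each branch sees the same input at a different receptive field, so small kernels pick up local dimension-level patterns while larger kernels capture broader interactions, which is the intuition behind using multiple kernel sizes.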
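A minimal sketch of an independence constraint based on the Pearson correlation coefficient, in the spirit of the regularization term mentioned in the abstract: the embedding is split into several semantic chunks, and the mean absolute pairwise correlation between chunk summaries is penalized so that different chunks encode distinct semantics. The chunking scheme, the per-chunk mean summary, and the function name are assumptions for illustration only.

```python
import torch


def pearson_independence_penalty(emb: torch.Tensor, num_chunks: int = 4) -> torch.Tensor:
    """emb: (batch, dim) embeddings; returns a scalar independence penalty (assumed form)."""
    batch, dim = emb.shape
    assert dim % num_chunks == 0, "embedding dimension must split evenly into chunks"
    chunks = emb.view(batch, num_chunks, dim // num_chunks)
    # Summarize each chunk by its mean, giving one value per chunk per example.
    summary = chunks.mean(dim=-1)                              # (batch, num_chunks)
    centered = summary - summary.mean(dim=0, keepdim=True)
    std = centered.std(dim=0, unbiased=False) + 1e-8
    cov = centered.t() @ centered / batch                      # (num_chunks, num_chunks)
    corr = cov / (std.unsqueeze(1) * std.unsqueeze(0))         # Pearson correlation matrix
    # Penalize off-diagonal correlations only (self-correlation is always 1).
    off_diag = corr - torch.diag(torch.diagonal(corr))
    return off_diag.abs().sum() / (num_chunks * (num_chunks - 1))
```

In training, a term like this would typically be added to the main completion loss with a small weight, e.g. `loss = ce_loss + lambda_ind * pearson_independence_penalty(entity_emb)`, where `lambda_ind` is a hypothetical hyperparameter.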