Computer Science ›› 2026, Vol. 53 ›› Issue (3): 331-340. doi: 10.11896/jsjkx.250200101

• Artificial Intelligence •


Embedding Model of Knowledge Graph via Jointly Modeling Ontology and Instances

QIN Jing, LI Guanfeng, CHEN Yuyin, XIAO Yuhang   

  1. School of Information Engineering, Ningxia University, Yinchuan 750021, China
    Ningxia Key Laboratory of Artificial Intelligence and Information Security for Channeling Computing Resources from the East to the West, Yinchuan 750021, China
  • Received:2025-02-25 Revised:2025-04-28 Online:2026-03-12
  • Corresponding author: LI Guanfeng(ligf@nxu.edu.cn)
  • About author:QIN Jing,born in 2000,postgraduate(2720753621@qq.com),is a member of CCF(No.Y8902G).Her main research interests include knowledge graph embedding and reasoning.
    LI Guanfeng,born in 1979,Ph.D,associate professor,is a member of CCF(No.A0519M).His main research interests include knowledge engineering and intelligent computing.
  • Supported by:
    Full-time Introduction of High-level Talent Research Start-up Project Foundation of Ningxia(2023BSB03066),Natural Science Foundation of Ningxia Province(2024AAC03098),National Natural Science Foundation of China(62066038) and 2025 Graduate Innovation Project of Ningxia University(CXXM2025-038).


Abstract: Knowledge graph embedding provides machine learning models with a more powerful knowledge representation by projecting entities and relations into a continuous low-dimensional vector space, thereby supporting a wider range of knowledge graph applications. In recent years, researchers have tried to exploit the latent semantic information between the ontology and the instances of a knowledge graph to enhance its embedding. However, existing methods fail to effectively integrate the hierarchical structure of concepts with the specific information of instances, and they ignore the transitivity of isA relations, which limits their performance and generalization ability on long-tail entities in the knowledge graph. To address these shortcomings, this paper proposes JMOI (Representation Learning of Knowledge Graph via Jointly Modeling Ontology and Instances), a knowledge graph embedding model that jointly models ontology and instances. The model introduces a self-attention mechanism to capture the complex semantic relationships between concepts and instances, and adds a learnable parameter that adjusts the neighborhood range of each concept embedding, so as to distinguish the hierarchical information of different concepts and thereby model the transitivity of the isA relation. Experimental results on the YAGO26K-906 and DB111K-174 datasets show that JMOI achieves the best performance in most cases compared with existing methods, improving Hits@1 in link prediction by up to 6.5% and recall in triple classification by up to 6.9% over the second-best model.
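The "neighborhood range" of a concept embedding mentioned in the abstract can be pictured geometrically: a concept as a ball with a center and a learnable radius, an instance as a point. The sketch below is an illustrative toy under that assumption — it is not the JMOI implementation, and all names and vectors here are made up — showing why nested balls make isA relations transitive by construction.

```python
import math

# Hedged sketch (NOT the authors' code) of a ball-style concept embedding:
# each concept has a centre vector and a learnable radius ("neighbourhood
# range"); instances are plain vectors.  instanceOf holds when the instance
# lies inside the concept's ball; subClassOf holds when one ball is nested
# inside another.

def instance_of(inst, centre, radius):
    """instanceOf(i, c): the instance vector lies inside the concept ball."""
    return math.dist(inst, centre) <= radius

def sub_class_of(c1, r1, c2, r2):
    """subClassOf(c1, c2): the ball of c1 is contained in the ball of c2."""
    return math.dist(c1, c2) + r1 <= r2

# Toy 2-D example: dog ⊂ mammal ⊂ animal, plus one dog instance.
animal_c, animal_r = (0.0, 0.0), 1.0
mammal_c, mammal_r = (0.2, 0.0), 0.5
dog_c, dog_r = (0.3, 0.1), 0.2
rex = (0.35, 0.05)

assert instance_of(rex, dog_c, dog_r)
assert sub_class_of(dog_c, dog_r, mammal_c, mammal_r)
assert sub_class_of(mammal_c, mammal_r, animal_c, animal_r)
# Transitivity follows from the nesting: dog ⊂ animal and rex ∈ animal.
assert sub_class_of(dog_c, dog_r, animal_c, animal_r)
assert instance_of(rex, animal_c, animal_r)
```

Because ball containment composes, any chain of subClassOf facts automatically entails the transitive closure, which is the property the abstract says plain translation-style embeddings miss.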

Key words: Knowledge graph, Knowledge graph embedding, Concepts and instances, Link prediction, Triple classification
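The two evaluation metrics reported above have standard definitions. The following sketch computes them generically (these are the textbook formulas, not the paper's exact evaluation protocol; all function names and sample numbers are illustrative):

```python
# Hits@k for link prediction: the fraction of test queries whose gold
# entity was ranked within the top k candidates.
def hits_at_k(ranks, k):
    return sum(1 for r in ranks if r <= k) / len(ranks)

# Recall for triple classification: true positives / actual positives.
def recall(true_labels, predicted_labels):
    tp = sum(1 for t, p in zip(true_labels, predicted_labels) if t and p)
    return tp / sum(true_labels)

ranks = [1, 3, 1, 7, 2]  # rank of the gold entity for each test query
assert hits_at_k(ranks, 1) == 0.4          # 2 of 5 queries ranked first
assert abs(recall([1, 1, 0, 1], [1, 0, 0, 1]) - 2 / 3) < 1e-9
```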

CLC number: TP391.1