Computer Science ›› 2023, Vol. 50 ›› Issue (11): 201-209. doi: 10.11896/jsjkx.221100217

• Artificial Intelligence •

  • Corresponding author: E Haihong (ehaihong@bupt.edu.cn)
  • First author: LIN Xueyuan (linxy59@bupt.edu.cn)

QubitE: Qubit Embedding for Knowledge Graph Completion

LIN Xueyuan, E Haihong, SONG Wenyu, LUO Haoran, SONG Meina

  1. School of Computer Science, Beijing University of Posts and Telecommunications, Beijing 100876, China
  • Received: 2022-11-25 Revised: 2023-02-09 Online: 2023-11-15 Published: 2023-11-06
  • About author: LIN Xueyuan, born in 1998, postgraduate. His main research interests include deep learning, knowledge graph, natural language processing, big data and artificial intelligence. E Haihong, born in 1982, Ph.D, associate professor, is a member of China Computer Federation. Her main research interests include deep learning, knowledge graph, natural language processing, big data and artificial intelligence.
  • Supported by:
    National Natural Science Foundation of China (62176026, 61902034) and Natural Science Foundation of Beijing, China (M22009).


Abstract: The knowledge graph completion task completes a knowledge graph by predicting its missing facts. Quantum-based knowledge graph embedding (KGE) models use variational quantum circuits to score triples by measuring the probability distribution of qubit states, and triples with high scores are taken as the missing facts. However, current quantum-based KGE models either lose the quantum advantage during optimization because matrix unitarity is destroyed, or require a large number of parameters to store quantum states, resulting in overfitting and low performance. Furthermore, these methods ignore the theoretical analysis that is essential for understanding model performance. To solve the performance problem and bridge the theoretical gap, we propose QubitE: entities are embedded as qubits (unit complex vectors), relations are embedded as quantum gates (unitary complex matrices), the scoring process is complex matrix multiplication, and kernel methods are used for optimization. This parameterization preserves the quantum advantage during optimization, has linear space and time complexity, and can even further support semantics-based quantum logic computation. In addition, the model can be proved theoretically to be fully expressive and capable of relation pattern reasoning and subsumption, which helps to explain its performance. Experiments show that QubitE achieves results comparable to state-of-the-art classical models on several benchmark knowledge graphs.

Key words: Knowledge graph, Knowledge graph completion, Knowledge graph embedding, Representation learning, Qubit
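The embedding scheme summarized in the abstract (entities as unit complex vectors, relations as unitary matrices, scoring via complex matrix multiplication) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the function names and the three-angle gate parameterization are assumptions:

```python
import numpy as np

def make_qubit(theta, phi):
    # An entity embedded as a qubit: the unit complex 2-vector
    # cos(theta/2)|0> + e^{i*phi} sin(theta/2)|1>.
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

def make_unitary(alpha, beta, gamma):
    # A relation embedded as a 2x2 quantum gate, unitary by
    # construction, so unitarity survives gradient updates on the
    # angles (no projection step needed).
    a = np.exp(1j * beta) * np.cos(alpha)
    b = np.exp(1j * gamma) * np.sin(alpha)
    return np.array([[a, -np.conj(b)],
                     [b,  np.conj(a)]])

def score(head, relation, tail):
    # Apply the relation gate to the head qubit (complex matrix
    # multiplication), then take the squared overlap with the tail
    # qubit; a higher score means a more plausible triple.
    transformed = relation @ head
    return np.abs(np.vdot(tail, transformed)) ** 2

h = make_qubit(0.3, 1.2)
t = make_qubit(0.3, 1.2)
R = make_unitary(0.0, 0.0, 0.0)  # identity gate
print(round(score(h, R, t), 6))  # identical head and tail score 1.0
```

Because the gate is unitary for any choice of angles, optimizing the angles directly cannot break unitarity, which is the property the abstract contrasts against earlier quantum-based KGE models.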

CLC number: TP391

References
[1]MA Y P,TRESP V,ZHAO L M,et al.Variational Quantum Circuit Model for Knowledge Graph Embedding[J].Advanced Quantum Technologies,2019,2(7/8):1-13.
[2]BORDES A,USUNIER N,GARCIA-DURAN A,et al.Translating Embeddings for Modeling Multi-Relational Data[C]//NIPS.2013.
[3]LIN Y K,LIU Z Y,SUN M S,et al.Learning Entity and Relation Embeddings for Knowledge Graph Completion[C]//AAAI.2015.
[4]SUN Z Q,DENG Z H,NIE J Y,et al.RotatE:Knowledge Graph Embedding by Relational Rotation in Complex Space [C]//International Conference on Learning Representations.2019.
[5]ZHANG S,TAY Y,YAO L N,et al.Quaternion Knowledge Graph Embeddings[J].arXiv:1904.10281,2019.
[6]NAYYERI M,VAHDATI S, AYKUL C,et al.5* Knowledge Graph Embeddings with Projective Transformations [J].arXiv:2006.04986,2020.
[7]BALAZEVIC I,ALLEN C,HOSPEDALES T M.Multi-Relational Poincaré Graph Embeddings[C]//Advances in Neural Information Processing Systems 32:Annual Conference on Neural Information Processing Systems 2019.Vancouver,BC,Canada:NeurIPS 2019,2019:4465-4475.
[8]CHAMI I,WOLF A,JUAN D C,et al.Low-Dimensional Hyperbolic Knowledge Graph Embeddings[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics.Online:ACL 2020,2020:6901-6914.
[9]YANG B S,YIH W T,HE X D,et al.Embedding Entities and Relations for Learning and Inference in Knowledge Bases [C]//3rd International Conference on Learning Representations,ICLR 2015.San Diego:Conference Track Proceedings,2015.
[10]TROUILLON T,WELBL J,RIEDEL S,et al.Complex Embeddings for Simple Link Prediction[C]//Proceedings of the 33rd International Conference on Machine Learning,ICML 2016.New York:JMLR.org,2016:2071-2080.
[11]KAZEMI S M,POOLE D.SimplE Embedding for Link Prediction in Knowledge Graphs[C]//Advances in Neural Information Processing Systems 31:Annual Conference on Neural Information Processing Systems 2018.Canada:NeurIPS 2018,2018:4289-4300.
[12]BALAZEVIC I,ALLEN C,HOSPEDALES T M.Hypernetwork Knowledge Graph Embeddings[C]//Artificial Neural Networks and Machine Learning-ICANN 2019-28th International Conference on Artificial Neural Networks.Germany:Springer,2019:553-565.
[13]BALAZEVIC I,ALLEN C,HOSPEDALES T M.TuckER:Tensor Factorization for Knowledge Graph Completion[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing.Hong Kong:EMNLP-IJCNLP,2019:5184-5193.
[14]LLOYD S,SCHULD M,IJAZ A,et al.Quantum Embeddings for Machine Learning[J].arXiv:2001.03622,2020.
[15]GLOROT X,BENGIO Y.Understanding the Difficulty of Training Deep Feedforward Neural Networks[C]//Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics.JMLR Workshop,2010:249-256.
[16]WANG Y J,GEMULLA R,LI H.On Multi-Relational Link Prediction with Bilinear Models[C]//AAAI.2018.
[17]BOLLACKER K D,EVANS C,PARITOSH P,et al.Freebase:A Collaboratively Created Graph Database for Structuring Human Knowledge [C]//Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data.New York:SIGMOD 2008:1247-1250.
[18]TOUTANOVA K,CHEN D Q.Observed Versus Latent Features for Knowledge Base and Text Inference[C]//The 3rd Workshop on Continuous Vector Space Models and their Compositionality.2015.
[19]MILLER G A.WordNet:A Lexical Database for English[J].Communications of the ACM,1995,38(11):39-41.
[20]DETTMERS T,MINERVINI P,STENETORP P,et al.Convolutional 2d Knowledge Graph Embeddings[C]//Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence(AAAI-18),the 30th Innovative Applications of Artificial Intelligence(IAAI-18),and the 8th AAAI Symposium on Educational Advances in Artificial Intelligence(EAAI-18).New Orleans,Louisiana,USA:AAAI Press,2018:1811-1818.
[21]PASZKE A,GROSS S,CHINTALA S,et al.Automatic Differentiation in PyTorch[C]//NIPS.2017.