Computer Science ›› 2023, Vol. 50 ›› Issue (3): 12-22. doi: 10.11896/jsjkx.220700111

• Special Issue of Knowledge Engineering Enabled by Knowledge Graph: Theory, Technology and System •

Knowledge Graph-to-Text Model Based on Dynamic Memory and Two-layer Reconstruction Reinforcement

MA Tinghuai1, SUN Shengjie1, RONG Huan2, QIAN Minfeng1   

  1 School of Computer and Software, Nanjing University of Information Science and Technology, Nanjing 210044, China
    2 School of Artificial Intelligence (School of Future Technology), Nanjing University of Information Science and Technology, Nanjing 210044, China
  • Received: 2022-07-10  Revised: 2022-12-06  Online: 2023-03-15  Published: 2023-03-15
  • About author: MA Tinghuai, born in 1974, Ph.D, professor. His main research interests include social network privacy protection, big data mining, text emotion computing, etc.
    RONG Huan, born in 1990, Ph.D, lecturer. His main research interests include social media mining, content security on social network, knowledge engineering, etc.
  • Supported by:
    National Natural Science Foundation of China (62102187), Natural Science Foundation of Jiangsu Province (Basic Research Program) (BK20210639) and National Key Research and Development Program of China (2021YFE0104400).

Abstract: Knowledge Graph-to-Text is a new task in the field of knowledge graphs that aims to transform a knowledge graph into readable text describing the knowledge it contains. As research has deepened in recent years, Graph-to-Text generation techniques have been applied to product review generation, recommendation explanation generation, paper abstract generation, and other fields. Existing methods treat the task as translation and adopt a first-plan-then-realize strategy, which can neither adjust the plan dynamically according to the text generated so far nor track the static content plan during realization, so the generated text tends to be semantically incoherent. To improve the semantic coherence of generated text, this paper proposes a Graph-to-Text model based on dynamic memory and two-layer reconstruction reinforcement. Through three stages (static content planning, dynamic content planning, and a two-layer reconstruction mechanism), the model bridges the structural difference between knowledge graph and text while keeping its focus on the content of each triple during generation. Compared with existing generation models, the proposed model not only compensates for the structural differences between knowledge graphs and texts but also locates key entities more accurately, so the generated texts show stronger factual consistency and semantic coherence. Experiments are conducted on the WebNLG dataset. The results show that, compared with existing models for the Graph-to-Text task, the proposed model produces more accurate content plans, and the logic between sentences of the generated text is more reasonable and their correlation is stronger. The proposed model outperforms existing methods on metrics such as BLEU, METEOR, ROUGE and CHRF++.
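To make the dynamic content-planning stage concrete, the sketch below illustrates the general idea in PyTorch: each encoded triple occupies one memory slot, the decoder attends over the slots to choose the content to realize next, and the attended slots are then gate-updated so the plan can shift as text is generated. The module name, dimensions, and GRU-based update rule are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a dynamic-memory content planner (illustrative only;
# names, dimensions, and the update rule are assumptions, not the paper's code).
import torch
import torch.nn as nn

class DynamicMemoryPlanner(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.attn = nn.Linear(2 * dim, 1)   # scores a (memory slot, decoder state) pair
        self.update = nn.GRUCell(dim, dim)  # gated rewrite of the memory slots

    def forward(self, memory: torch.Tensor, dec_state: torch.Tensor):
        # memory: (num_triples, dim), one slot per encoded triple
        # dec_state: (dim,), the current decoder hidden state
        expanded = dec_state.unsqueeze(0).expand_as(memory)
        scores = self.attn(torch.cat([memory, expanded], dim=-1)).squeeze(-1)
        weights = torch.softmax(scores, dim=0)   # which triple to realize next
        context = weights @ memory               # planned content for this step
        # Rewrite each slot in proportion to how strongly it was attended,
        # so triples that have already been realized fade from the plan.
        rewritten = self.update(expanded, memory)
        w = weights.unsqueeze(-1)
        memory = w * rewritten + (1 - w) * memory
        return context, memory

planner = DynamicMemoryPlanner(dim=256)
mem = torch.randn(7, 256)            # e.g., 7 encoded triples from the graph encoder
state = torch.randn(256)
context, mem = planner(mem, state)   # context conditions the text decoder at this step
```

Because the context vector is recomputed from the updated memory at every decoding step, the plan adapts to the text generated so far instead of being fixed in advance, which is the contrast with first-plan-then-realize pipelines drawn in the abstract.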

Key words: Knowledge Graph-to-Text, Natural language generation, Memory network, Reconstruction mechanism, Structured data

CLC Number: TP319.1
[1] JI S, PAN S, CAMBRIA E, et al. A survey on knowledge graphs: Representation, acquisition, and applications[J]. IEEE Transactions on Neural Networks and Learning Systems, 2021, 33(2): 494-514.
[2] LIU Q, LI Y, DUAN H, et al. Knowledge Graph Construction Techniques[J]. Journal of Computer Research and Development, 2016, 53(3): 582-600.
[3] WYLOT M, HAUSWIRTH M, CUDRÉ-MAUROUX P, et al. RDF data storage and query processing schemes: A survey[J]. ACM Computing Surveys (CSUR), 2018, 51(4): 1-36.
[4] PUDUPPULLY R, DONG L, LAPATA M. Data-to-text generation with content selection and planning[C]//Proceedings of the AAAI Conference on Artificial Intelligence. 2019: 6908-6915.
[5] ZHANG X B, GONG H G, YANG F, et al. Chinese sentence-level lip reading based on end-to-end model[J]. Journal of Software, 2020, 31(6): 1747-1760.
[6] XU F, LUO J, WANG M, et al. Speech-driven end-to-end language discrimination toward Chinese dialects[J]. ACM Transactions on Asian and Low-Resource Language Information Processing (TALLIP), 2020, 19(5): 1-24.
[7] GARDENT C, SHIMORINA A, NARAYAN S, et al. The WebNLG challenge: Generating text from RDF data[C]//Proceedings of the 10th International Conference on Natural Language Generation. 2017: 124-133.
[8] WISEMAN S, SHIEBER S M, RUSH A M. Challenges in Data-to-Document Generation[C]//Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2017). Association for Computational Linguistics, 2017.
[9] KUKICH K. Design of a knowledge-based report generator[C]//21st Annual Meeting of the Association for Computational Linguistics. 1983: 145-150.
[10] BAO J, TANG D, DUAN N, et al. Table-to-text: Describing table region with natural language[C]//Proceedings of the AAAI Conference on Artificial Intelligence. 2018.
[11] BANARESCU L, BONIAL C, CAI S, et al. Abstract meaning representation for sembanking[C]//Proceedings of the 7th Linguistic Annotation Workshop and Interoperability with Discourse. 2013: 178-186.
[12] DISTIAWAN B, QI J, ZHANG R, et al. GTR-LSTM: A triple encoder for sentence generation from RDF data[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2018: 1627-1637.
[13] LI W, PENG R, WANG Y, et al. Knowledge graph based natural language generation with adapted pointer-generator networks[J]. Neurocomputing, 2020, 382(1): 174-187.
[14] CHEN W, SU Y, YAN X, et al. KGPT: Knowledge-Grounded Pre-Training for Data-to-Text Generation[C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2020: 8635-8648.
[15] GAO H, WU L, HU P, et al. RDF-to-text generation with graph-augmented structural neural encoders[C]//Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence. 2021: 3030-3036.
[16] XU B B, CEN K T, HUANG J J, et al. A survey on Graph Convolution Neural Network[J]. Chinese Journal of Computers, 2020, 43(5): 755-780.
[17] GUO Z, ZHANG Y, TENG Z, et al. Densely connected graph convolutional networks for graph-to-sequence learning[J]. Transactions of the Association for Computational Linguistics, 2019, 7(1): 297-312.
[18] BECK D, HAFFARI G, COHN T. Graph-to-sequence learning using Gated Graph Neural Networks[C]//Annual Meeting of the Association for Computational Linguistics 2018. Association for Computational Linguistics (ACL), 2018: 273-283.
[19] KONCEL-KEDZIORSKI R, BEKAL D, LUAN Y, et al. Text Generation from Knowledge Graphs with Graph Transformers[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). 2019: 2284-2293.
[20] SHI Y, LUO Z, ZHU P, et al. G2T: Generating Fluent Descriptions for Knowledge Graph[C]//Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval. 2020: 1861-1864.
[21] GAO H, WU L, HU P, et al. RDF-to-text generation with graph-augmented structural neural encoders[C]//Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence. 2021: 3030-3036.
[22] RIBEIRO L F R, ZHANG Y, GARDENT C, et al. Modeling global and local node contexts for text generation from knowledge graphs[J]. Transactions of the Association for Computational Linguistics, 2020, 8(1): 589-604.
[23] XU F, DAN Y, YAN K, et al. Low-Resource Language Discrimination toward Chinese Dialects with Transfer Learning and Data Augmentation[J]. Transactions on Asian and Low-Resource Language Information Processing, 2021, 21(2): 1-21.
[24] RIBEIRO L F R, SCHMITT M, SCHÜTZE H, et al. Investigating Pretrained Language Models for Graph-to-Text Generation[C]//Proceedings of the 3rd Workshop on Natural Language Processing for Conversational AI. 2021: 211-227.
[25] KALE M, RASTOGI A. Text-to-Text Pre-Training for Data-to-Text Tasks[C]//Proceedings of the 13th International Conference on Natural Language Generation. 2020: 97-102.
[26] LI J, TANG T, ZHAO W X, et al. Few-shot Knowledge Graph-to-Text Generation with Pretrained Language Models[C]//Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. 2021: 1558-1568.
[27] KURITA K, MICHEL P, NEUBIG G. Weight Poisoning Attacks on Pretrained Models[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020: 2793-2806.
[28] MIAO N, SONG Y, ZHOU H, et al. Do you have the right scissors? Tailoring Pre-trained Language Models via Monte-Carlo Methods[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020: 3436-3441.
[29] FERREIRA T C, VAN DER LEE C, VAN MILTENBURG E, et al. Neural data-to-text generation: A comparison between pipeline and end-to-end architectures[C]//2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Association for Computational Linguistics, 2019: 552-562.
[30] CHU X M, ZHU Q M, ZHOU G D. Discourse Primary-Secondary Relationships in Natural Language Processing[J]. Chinese Journal of Computers, 2017, 40(4): 842-860.
[31] MORYOSSEF A, GOLDBERG Y, DAGAN I. Step-by-Step: Separating Planning from Realization in Neural Data-to-Text Generation[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). 2019: 2267-2277.
[32] ZHAO C, WALKER M, CHATURVEDI S. Bridging the structural gap between encoding and decoding for data-to-text generation[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020: 2481-2491.
[33] PUDUPPULLY R, DONG L, LAPATA M. Data-to-text generation with content selection and planning[C]//Proceedings of the AAAI Conference on Artificial Intelligence. 2019: 6908-6915.
[34] PUDUPPULLY R, LAPATA M. Data-to-text Generation with Macro Planning[J]. Transactions of the Association for Computational Linguistics, 2021, 9(1): 510-527.
[35] SHAO Z, HUANG M, WEN J, et al. Long and Diverse Text Generation with Planning-based Hierarchical Variational Model[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). 2019: 3257-3268.
[36] SU Y, VANDYKE D, WANG S, et al. Plan-then-Generate: Controlled Data-to-Text Generation via Planning[C]//Findings of the Association for Computational Linguistics: EMNLP 2021. 2021: 895-909.
[37] LEWIS M, LIU Y, GOYAL N, et al. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020: 7871-7880.
[38] CHEN K, LI F, HU B, et al. Neural Data-to-Text Generation with Dynamic Content Planning[J]. arXiv:2004.07426, 2020.
[39] ISO H, UEHARA Y, ISHIGAKI T, et al. Learning to select, track, and generate for data-to-text[J]. Journal of Natural Language Processing, 2020, 27(3): 599-626.
[40] PUDUPPULLY R, DONG L, LAPATA M. Data-to-text Generation with Entity Modeling[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. 2019: 2023-2035.
[41] SHEN X, CHANG E, SU H, et al. Neural Data-to-Text Generation via Jointly Learning the Segmentation and Correspondence[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020: 7155-7165.
[42] GEHRMANN S, ADEWUMI T, AGGARWAL K, et al. The GEM Benchmark: Natural Language Generation, its Evaluation and Metrics[C]//1st Workshop on Natural Language Generation, Evaluation, and Metrics 2021. Association for Computational Linguistics, 2021: 96-120.