Computer Science ›› 2023, Vol. 50 ›› Issue (8): 170-176.doi: 10.11896/jsjkx.220600070

• Artificial Intelligence •

Answer Extraction Method for Reading Comprehension Based on Frame Semantics and Graph Structure

YANG Zhizhuo1, XU Lingling1, ZHANG Hu1, LI Ru1,2   

  1. 1 School of Computer and Information Technology,Shanxi University,Taiyuan 030006,China
    2 Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education,Shanxi University,Taiyuan 030006,China
  • Received:2022-06-08 Revised:2022-10-10 Online:2023-08-15 Published:2023-08-02
  • About author:YANG Zhizhuo,born in 1983,Ph.D,associate professor,is a member of China Computer Federation.His main research interests include natural language processing and reading comprehension Q&A.
  • Supported by:
    National Natural Science Foundation of China(61936012,62176145) and Shanxi Province Basic Research Program General Fund Project(20210302123469).

Abstract: Machine reading comprehension (MRC) is one of the most challenging tasks in natural language processing. With the continuous development of deep learning and the release of large-scale MRC datasets, the performance of MRC models keeps breaking records. However, previous models still fall short in logical reasoning and deep semantic understanding. To address these problems, this paper proposes a reading comprehension answer extraction method based on frame semantics and graph structure. The method first uses Chinese FrameNet to match candidate sentences semantically related to the question. Next, the entities in the question and candidate sentences are extracted, and an entity relationship graph is constructed from the dependency-syntactic and semantic relations among the entities in the sentences. Finally, the entity relationship graph is fed into a graph attention network for logical reasoning, thereby extracting the reading comprehension answer. Experimental results on the DuReader-robust dataset show that the proposed method achieves better results than the baseline models.
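The reasoning step described in the abstract applies graph attention over the entity relationship graph. As a minimal, hypothetical sketch (not the authors' implementation), a single dense GAT layer in the style of Veličković et al. can be written in NumPy; all names, shapes, and the dense-adjacency simplification are illustrative assumptions:

```python
import numpy as np

def gat_layer(H, A, W, a, alpha=0.2):
    """One graph-attention layer (dense sketch).

    H: (N, F) node features; A: (N, N) adjacency (1 = edge);
    W: (F, F') weight matrix; a: (2*F',) attention vector.
    Self-loops are added so every node attends to itself.
    """
    N = H.shape[0]
    Z = H @ W                               # (N, F') transformed features
    Fp = Z.shape[1]
    # attention logits e_ij = LeakyReLU(a^T [z_i || z_j]), decomposed
    src = Z @ a[:Fp]                        # (N,) source-side contribution
    dst = Z @ a[Fp:]                        # (N,) target-side contribution
    e = src[:, None] + dst[None, :]         # (N, N) pairwise logits
    e = np.where(e > 0, e, alpha * e)       # LeakyReLU
    # mask non-edges, then softmax over each node's neighbourhood
    mask = A + np.eye(N)
    e = np.where(mask > 0, e, -1e9)
    e = e - e.max(axis=1, keepdims=True)    # numerical stability
    att = np.exp(e) / np.exp(e).sum(axis=1, keepdims=True)
    return att @ Z                          # (N, F') aggregated features
```

With the attention vector set to zero, the layer reduces to uniform neighbourhood averaging, which is a convenient sanity check; in practice W and a are learned jointly with the rest of the MRC model.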

Key words: Machine reading comprehension, CFN, Entity relationship graph, Graph attention network, Syntactic relations

CLC Number: TP391
[1]HERMANN K M,KOČISKÝ T,GREFENSTETTE E,et al.Teaching Machines to Read and Comprehend[C]//Proceedings of the 29th International Conference on Neural Information Processing Systems.New York:Curran Associates Inc.,2015:1693-1701.
[2]KADLEC R,SCHMID M,BAJGAR O,et al.Text Understanding with the Attention Sum Reader Network[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics.USA:Association for Computational Linguistics,2016:908-918.
[3]DHINGRA B,LIU H,YANG Z,et al.Gated-attention readers for text comprehension[J].arXiv:1606.01549,2016.
[4]SEO M,KEMBHAVI A,FARHADI A,et al.Bidirectional Attention Flow for Machine Comprehension[J].arXiv:1611.01603,2016.
[5]WANG W,YANG N,WEI F,et al.Gated self-matching net-works for reading comprehension and question answering[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1:Long Papers).2017:189-198.
[6]YU A W,DOHAN D,LUONG M T,et al.QANet:Combining Local Convolution with Global Self-Attention for Reading Comprehension[C]//Proceedings of International Conference on Learning Representations(ICLR 2018).USA:ICLR Organizing Committee,2018:1-8.
[7]RAJPURKAR P,ZHANG J,LOPYREV K,et al.SQuAD:100,000+ Questions for Machine Comprehension of Text[C]//Proceedings of the Conference on Empirical Methods in Natural Language Processing.USA:Association for Computational Linguistics,2016:2383-2392.
[8]DUNN M,SAGUN L,HIGGINS M,et al.SearchQA:A New Q&A Dataset Augmented with Context from a Search Engine[J].arXiv:1704.05179,2017.
[9]TANG H,LI H,LIU J,et al.DuReader_robust:A Chinese Dataset Towards Evaluating Robustness and Generalization of Machine Reading Comprehension in Real-World Applications[J].arXiv:2004.11142,2020.
[10]VELIČKOVIĆ P,CUCURULL G,CASANOVA A,et al.Graph Attention Networks[C]//Proceedings of International Conference on Learning Representations(ICLR 2018).USA:ICLR Organizing Committee,2018:1-11.
[11]PENG H,LI J,HE Y,et al.Large-scale hierarchical text classification with recursively regularized deep graph-cnn[C]//Proceedings of the 2018 World Wide Web Conference.2018:1063-1072.
[12]SONG L,WANG Z,YU M,et al.Exploring graph-structured passage representation for multi-hop reading comprehension with graph neural networks[J].arXiv:1809.02040,2018.
[13]DE CAO N,AZIZ W,TITOV I.Question answering by reasoning across documents with graph convolutional networks[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies,Volume 1.2019:2306-2317.
[14]FANG Y,SUN S,GAN Z,et al.Hierarchical Graph Network for Multi-hop Question Answering[J].arXiv:1911.03631,2019.
[15]XIAO Y,QU Y,QIU L,et al.Dynamically Fused Graph Network for Multi-hop Reasoning[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.2019:6140-6150.
[16]LI R.Research on Frame Semantic Structure Analysis Technology for Chinese Sentences[D].Taiyuan:Shanxi University,2012.
[17]LIU M.Sentence similarity calculation based on word vector and its application in case-based machine translation[D].Beijing:Beijing Institute of Technology,2015.
[18]FILLMORE C J.Frame semantics and the nature of language[J].Annals of the New York Academy of Sciences:Conference on the Origin and Development of Language and Speech,1976,280(1):20-32.
[19]BAKER C F,FILLMORE C J,LOWE J B.The Berkeley FrameNet Project[C]//Proceedings of the 17th International Conference on Computational Linguistics-Volume 1.Association for Computational Linguistics,1998:86-90.
[20]MANNING C D,SURDEANU M,BAUER J,et al.The Stanford CoreNLP Natural Language Processing Toolkit[C]//Proceedings of 52nd Annual Meeting of the Association for Computational Linguistics:System Demonstrations.2014:55-60.
[21]LIU Q,LI S J.Vocabulary semantic similarity calculation based on HowNet[J].Chinese Computational Linguistics,2002,7(2):59-76.
[22]DEVLIN J,CHANG M W,LEE K,et al.BERT:Pre-training of Deep Bidirectional Transformers for Language Understanding[C]//Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.USA:Association for Computational Linguistics,2019:4171-4186.
[23]HOCHREITER S,SCHMIDHUBER J.Long Short-Term Memory[J].Neural Computation,1997,9(8):1735-1780.
[24]LAN Z,CHEN M,GOODMAN S,et al.ALBERT:a lite BERT for self-supervised learning of language representations [EB/OL].(2020-02-09) [2021-06-07].https://arxiv.org/abs/1909.11942v6.
[25]SUN Y,WANG S H,LI Y K,et al.ERNIE:enhanced representation through knowledge integration[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.USA:Association for Computational Linguistics,2019:1441-1451.
[26]CUI Y M,CHE W X,LIU T,et al.Pre-training with whole word masking for Chinese BERT [EB/OL].(2019-10-29) [2021-06-07].https://arxiv.org/abs/1906.08101v2.
[27]LIU Y,OTT M,GOYAL N,et al.RoBERTa:a robustly optimized BERT pretraining approach [EB/OL].(2019-07-26) [2021-06-07].https://arxiv.org/abs/1907.11692v1.
[28]YANG Z L,DAI Z H,YANG Y M,et al.XLNet:generalized autoregressive pretraining for language understanding[C]//Proceedings of 33rd Conference on Neural Information Processing Systems(NeurIPS 2019).Canada:Neural Information Processing Systems Foundation,Inc,2019:5754-5764.