Computer Science ›› 2023, Vol. 50 ›› Issue (12): 229-235. doi: 10.11896/jsjkx.230500010

• Artificial Intelligence •

Method of Document Level Relation Extraction Based on Fusion of Relational Transfer Information Using Double Graph

KOU Jiaying, ZHAO Weidong, LIU Xianhui

  1. College of Electronics and Information Engineering, Tongji University, Shanghai 201804, China
  • Received: 2023-05-04  Revised: 2023-08-30  Online: 2023-12-15  Published: 2023-12-07
  • Corresponding author: LIU Xianhui (xianhui_l@163.com)
  • About author: KOU Jiaying (jiayingk@tongji.edu.cn)
  • Supported by:
    National Key Research and Development Program of China (2020YFB1709303)

Method of Document Level Relation Extraction Based on Fusion of Relational Transfer Information Using Double Graph

KOU Jiaying, ZHAO Weidong, LIU Xianhui   

  1. College of Electronics and Information Engineering,Tongji University,Shanghai 201804,China
  • Received:2023-05-04 Revised:2023-08-30 Online:2023-12-15 Published:2023-12-07
  • About author: KOU Jiaying, born in 1999, postgraduate. Her main research interests include relation extraction and information extraction.
    LIU Xianhui, born in 1979, Ph.D, associate professor. His main research interests include machine learning, data mining and big data, and networked manufacturing.
  • Supported by:
    National Key Research and Development Program of China(2020YFB1709303).

摘要 (Abstract): Document-level relation extraction refers to extracting entities, and the relations between them, from long passages of unstructured text. Compared with traditional sentence-level relation extraction, document-level relation extraction has to fuse contextual information from multiple sentences and apply logical reasoning before relation triples can be extracted. To address the problems of existing document-level relation extraction methods, such as incomplete modeling of document semantic information and limited extraction performance, a double-graph document-level relation extraction method that fuses relational transfer information is proposed. The transitivity of relational information is exploited to introduce the interaction information between mentions in different sentences into path construction; combined with the interaction information between mentions in the same sentence and the coreference information among mentions, a path set between mention nodes is built, which improves the completeness of document modeling. The path set and the mention nodes are then used to construct a mention-level graph aggregation network that models the semantic information of the document. After information iteration through a graph convolutional network, the information of the different mention nodes belonging to the same entity is fused into entity nodes, which form an entity-level graph reasoning network. Finally, logical reasoning is carried out over the path information between entity-graph nodes to extract the relations between entities. Experimental results on the public dataset DocRED show that the proposed model improves F1 by 1.2 over the baseline model, demonstrating the effectiveness of the method.
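
To make the path-construction step concrete, the sketch below shows one way the three kinds of mention-to-mention connections described above (same-sentence interaction, coreference, and cross-sentence relation transfer) could be collected into an adjacency matrix for the mention-level graph. It is a minimal illustration only; the Mention structure, the build_mention_adjacency function and the related_entity_pairs input are hypothetical names, not the paper's implementation.

# Illustrative sketch (not the paper's code): build a mention-level adjacency
# matrix from the three kinds of mention-to-mention paths.
from dataclasses import dataclass
from itertools import combinations
import torch

@dataclass
class Mention:
    entity_id: int   # which entity this mention refers to
    sent_id: int     # index of the sentence containing the mention

def build_mention_adjacency(mentions: list[Mention],
                            related_entity_pairs: set[tuple[int, int]]) -> torch.Tensor:
    """Connect mentions that (1) co-occur in the same sentence, (2) corefer to the
    same entity, or (3) belong to entities already linked by a candidate relation,
    which approximates the relation-transfer paths across sentences."""
    n = len(mentions)
    adj = torch.eye(n)  # self-loops keep each node's own features during GCN updates
    for i, j in combinations(range(n), 2):
        mi, mj = mentions[i], mentions[j]
        same_sentence = mi.sent_id == mj.sent_id
        coreferent = mi.entity_id == mj.entity_id
        relation_transfer = ((mi.entity_id, mj.entity_id) in related_entity_pairs
                             or (mj.entity_id, mi.entity_id) in related_entity_pairs)
        if same_sentence or coreferent or relation_transfer:
            adj[i, j] = adj[j, i] = 1.0
    return adj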

关键词 (Key words): Document-level relation extraction, Relational transfer, Graph convolutional neural network, Logical reasoning

Abstract: Document-level relation extraction refers to the extraction of entities and their relations from long paragraphs of unstructured text. Compared with traditional sentence-level relation extraction, document-level relation extraction requires integrating contextual information from multiple sentences and performing logical reasoning to extract relation triples. To address the limitations of current document-level relation extraction methods, such as incomplete modeling of document semantic information and limited extraction performance, a double-graph document-level relation extraction method that integrates relational transfer information is proposed. The interaction information between mentions in different sentences is introduced into path construction through the transitivity of relational information, and it is combined with the interaction information between mentions in the same sentence and the coreference information between mentions to construct the path set between mention nodes, which improves the completeness of document modeling. A mention-level graph aggregation network is built from the path set and the mention nodes to establish a model of the document's semantic information. After information iteration with a graph convolutional network (GCN), the information of different mention nodes of the same entity is fused to form entity nodes, which constitute an entity-level graph reasoning network. Finally, logical inference is performed over the path information between entity-graph nodes to extract the relations between entities. The proposed model is evaluated on the public dataset DocRED (document-level relation extraction dataset), and the experimental results show an improvement of 1.2 in F1 compared with the baseline model, which proves the effectiveness of the proposed method.
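
The reasoning stages described in the abstract (graph-convolutional updates over the mention graph, fusion of the mentions of each entity into an entity node, and relation prediction for entity pairs) can be outlined as in the following PyTorch-style sketch. It is written under simplifying assumptions: MentionGCNLayer, fuse_mentions_to_entities and RelationScorer are illustrative names, and the bilinear scorer merely stands in for the paper's path-based inference over the entity graph.

# Illustrative sketch (not the paper's code): GCN update, mention-to-entity fusion,
# and relation scoring for an entity pair.
import torch
import torch.nn as nn

class MentionGCNLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Symmetrically normalize the adjacency so high-degree nodes do not dominate.
        deg_inv_sqrt = adj.sum(-1).clamp(min=1e-6).pow(-0.5)
        norm_adj = deg_inv_sqrt.unsqueeze(-1) * adj * deg_inv_sqrt.unsqueeze(0)
        return torch.relu(self.linear(norm_adj @ h))

def fuse_mentions_to_entities(h_mentions: torch.Tensor,
                              entity_ids: torch.Tensor,
                              num_entities: int) -> torch.Tensor:
    """Average the updated representations of all mentions that refer to the same
    entity, giving one node per entity for the entity-level reasoning graph."""
    h_entities = torch.zeros(num_entities, h_mentions.size(-1))
    for e in range(num_entities):
        h_entities[e] = h_mentions[entity_ids == e].mean(dim=0)
    return h_entities

class RelationScorer(nn.Module):
    """Bilinear scorer over a (head, tail) entity pair; a simplified stand-in for
    path-based logical inference over the entity graph."""
    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.bilinear = nn.Bilinear(dim, dim, num_relations)

    def forward(self, h_head: torch.Tensor, h_tail: torch.Tensor) -> torch.Tensor:
        return self.bilinear(h_head, h_tail)  # logits over relation types

A forward pass would chain these pieces: apply one or more MentionGCNLayer updates using the mention adjacency, pool the mention states with fuse_mentions_to_entities, and score every candidate entity pair with RelationScorer.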

Key words: Document-level relation extraction, Relational transfer, Graph convolutional network, Logical reasoning

CLC Number: 

  • TP391
[1]HAN X,GAO T,LIN Y,et al.More data,more relations,more context and more openness:A review and outlook for relation extraction[J].arXiv:2004.03186,2020.
[2]NAYAK T,MAJUMDER N,GOYAL P,et al.Deep neural approaches to relation triplets extraction:A comprehensive survey[J].Cognitive Computation,2021,13:1215-1232.
[3]MIWA M,BANSAL M.End-to-end relation extraction using LSTMs on sequences and tree structures[J].arXiv:1601.00770,2016.
[4]YAN Y,SUN H,LIU J.A Review and Outlook for Relation Extraction[C]//Proceedings of the 5th International Conference on Computer Science and Application Engineering.2021.
[5]WANG X,WANG Z,SUN W,et al.Enhancing Document-Level Relation Extraction by Entity Knowledge Injection[C]//21st International Semantic Web Conference,Virtual Event.Cham:Springer International Publishing,2022:39-56.
[6]ZHANG N,CHEN X,XIE X,et al.Document-level relation extraction as semantic segmentation[J].arXiv:2106.03618,2021.
[7]XU B,WANG Q,LYU Y,et al.Entity structure within and throughout:Modeling mention dependencies for document-level relation extraction[C]//Proceedings of the AAAI Conference on Artificial Intelligence.2021:14149-14157.
[8]TAN Q,HE R,BING L,et al.Document-level relation extraction with adaptive focal loss and knowledge distillation[J].arXiv:2203.10900,2022.
[9]GIORGI J,BADER G D,WANG B.A sequence-to-sequence approach for document-level relation extraction[J].arXiv:2204.01098,2022.
[10]PENG X,ZHANG C,XU K.Document-level Relation Extraction via Subgraph Reasoning[C]//Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence(IJCAI).2022:4331-4337.
[11]WU T,KONG F.Document-Level Relation Extraction Based on Graph Attention Convolutional Neural Network[J].Journal of Chinese Information Processing,2021,35(10):73-80.
[12]CHEN H,HONG P,HAN W,et al.Dialogue relation extraction with document-level heterogeneous graph attention networks[J].Cognitive Computation,2023,15(2):793-802.
[13]SAHU S K,CHRISTOPOULOU F,MIWA M,et al.Inter-sentence relation extraction with document-level graph convolutional neural network[J].arXiv:1906.04684,2019.
[14]CHRISTOPOULOU F,MIWA M,ANANIADOU S.Connecting the dots:Document-level neural relation extraction with edge-oriented graphs[J].arXiv:1909.00228,2019.
[15]NAN G,GUO Z,SEKULIĆ I,et al.Reasoning with latent structure refinement for document-level relation extraction[J].arXiv:2005.06312,2020.
[16]ZENG S,XU R,CHANG B,et al.Double graph based reasoning for document-level relation extraction[J].arXiv:2009.13752,2020.
[17]LAMPLE G,BALLESTEROS M,SUBRAMANIAN S,et al.Neural architectures for named entity recognition[J].arXiv:1603.01360,2016.
[18]ZHOU J,CUI G,HU S,et al.Graph neural networks:A review of methods and applications[J].AI Open,2020,1:57-81.
[19]XU W,CHEN K,ZHAO T.Discriminative reasoning for document-level relation extraction[J].arXiv:2106.01562,2021.
[20]VASWANI A,SHAZEER N,PARMAR N,et al.Attention is all you need[J].Advances in Neural Information Processing Systems,2017,30:5998-6008.
[21]VERGA P,STRUBELL E,MCCALLUM A.Simultaneously self-attending to all mentions for full-abstract biological relation extraction[J].arXiv:1802.10569,2018.
[22]YAO Y,YE D,LI P,et al.DocRED:A large-scale document-level relation extraction dataset[J].arXiv:1906.06127,2019.
[23]ZHU H,LIN Y,LIU Z,et al.Graph neural networks with generated parameters for relation extraction[J].arXiv:1902.00756,2019.
[24]IMAMBI S,PRAKASH K B,KANAGACHIDAMBARESAN G.PyTorch[M]//Programming with TensorFlow.Springer,2021:87-104.
[25]LOSHCHILOV I,HUTTER F.Decoupled weight decay regularization[J].arXiv:1711.05101,2017.
[26]PENNINGTON J,SOCHER R,MANNING C D.Glove:Global vectors for word representation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing(EMNLP).2014:1532-1543.
[27]DEVLIN J,CHANG M W,LEE K,et al.Bert:Pre-training of deep bidirectional transformers for language understanding[J].arXiv:1810.04805,2018.
[28]WANG L,CAO Z,DE MELO G,et al.Relation classification via multi-level attention cnns[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics(Volume 1:Long Papers).2016:1298-1307.
[29]HOCHREITER S,SCHMIDHUBER J.Long short-term memory[J].Neural Computation,1997,9(8):1735-1780.
[30]YE D,LIN Y,DU J,et al.Coreferential reasoning learning for language representation[J].arXiv:2004.06870,2020.
[31]YU Y,SI X,HU C,et al.A review of recurrent neural networks:LSTM cells and network architectures[J].Neural Computation,2019,31(7):1235-1270.
[32]GUO Z,ZHANG Y,LU W.Attention guided graph convolu-tional networks for relation extraction[J].arXiv:1906.07510,2019.
[33]TANG H,CAO Y,ZHANG Z,et al.HIN:Hierarchical inference network for document-level relation extraction[C]//Advances in Knowledge Discovery and Data Mining:24th Pacific-Asia Conference(PAKDD 2020).Singapore:Springer International Publishing,2020:197-209.