Computer Science (计算机科学), 2023, Vol. 50, Issue 12: 229-235. doi: 10.11896/jsjkx.230500010
寇嘉颖, 赵卫东, 柳先辉
KOU Jiaying, ZHAO Weidong, LIU Xianhui
Abstract: Document-level relation extraction aims to extract entities, and the relations between them, from long passages of unstructured text. Compared with traditional sentence-level relation extraction, it must fuse contextual information across multiple sentences and apply logical reasoning in order to extract relation triples. To address the incomplete modeling of document semantics and the limited extraction performance of existing document-level methods, this paper proposes a dual-graph document-level relation extraction method that incorporates relation-transfer information. Exploiting the transitivity of relational information, interaction information between mentions in different sentences is introduced into path construction; combined with intra-sentence mention interactions and coreference information between mentions, this yields a path set between mention nodes that improves the completeness of document modeling. The path set and mention nodes are then used to build a mention-level graph aggregation network that models the document's semantic information. After information is propagated through a graph convolutional network, the representations of different mentions of the same entity are fused into entity nodes, forming an entity-level graph reasoning network. Finally, logical reasoning over the path information between entity-graph nodes extracts the relations between entities. Experiments on the public DocRED dataset show a 1.2-point F1 improvement over the baseline model, demonstrating the effectiveness of the method.
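The mention-to-entity pipeline described above can be sketched in a few lines: one graph-convolution step over a mention graph, followed by fusing mentions of the same entity into entity nodes. This is a minimal illustrative sketch, not the paper's exact formulation; the mean-pooling fusion, the toy adjacency matrix, and the dot-product pair score are all assumptions made for the example.

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution step: degree-normalized neighbor aggregation."""
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))   # inverse degree matrix
    return np.tanh(D_inv @ A_hat @ H @ W)

def fuse_mentions(H, mention2entity, n_entities):
    """Average the embeddings of all mentions that refer to the same entity."""
    E = np.zeros((n_entities, H.shape[1]))
    for e in range(n_entities):
        idx = [m for m, ent in enumerate(mention2entity) if ent == e]
        E[e] = H[idx].mean(axis=0)
    return E

rng = np.random.default_rng(0)
d = 8
H = rng.normal(size=(4, d))          # embeddings of 4 mention nodes
# Mention graph: edges stand in for the paper's intra-sentence,
# coreference, and relation-transfer paths (toy example).
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
W = rng.normal(size=(d, d))

H1 = gcn_layer(H, A, W)              # mention-level information passing
mention2entity = [0, 0, 1, 1]        # mentions 0,1 -> entity 0; 2,3 -> entity 1
E = fuse_mentions(H1, mention2entity, n_entities=2)
score = float(E[0] @ E[1])           # toy score for the entity pair (0, 1)
```

In the full model the pair score would come from a learned classifier over entity-graph paths rather than a raw dot product; the sketch only shows how mention-level message passing feeds entity-level reasoning.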