Computer Science ›› 2022, Vol. 49 ›› Issue (6A): 119-124. doi: 10.11896/jsjkx.210600150

• Intelligent Computing •

Relation Classification Method Based on Cross-sentence Contextual Information for Neural Network

HUANG Shao-bin, SUN Xue-wei, LI Rong-sheng

  1. College of Computer Science and Technology, Harbin Engineering University, Harbin 150001, China
  • Online: 2022-06-10  Published: 2022-06-08
  • Corresponding author: LI Rong-sheng (dasheng@hrbeu.edu.cn)
  • About author: HUANG Shao-bin (huangshaobin@hrbeu.edu.cn), born in 1965, Ph.D, professor, Ph.D supervisor. His main research interests include natural language processing and deep learning.
    LI Rong-sheng, born in 1991, Ph.D. His main research interests include natural language processing and deep learning.


Abstract: Information extraction is a technique for extracting specific information from textual data. It has been widely used in knowledge graphs, information retrieval, question answering systems, sentiment analysis and text mining. As the core task and an important part of information extraction, relation classification identifies the semantic relations between entity pairs. In recent years, deep learning has made remarkable achievements in relation extraction tasks. So far, researchers have focused their efforts on improving neural network models, but there is still a lack of effective methods for obtaining cross-sentence semantic information from paragraph- or discourse-level texts in which the semantic relationships between sentences are close. However, such inter-sentence semantic relationships are of great use for relation extraction. For paragraph- or discourse-level relation extraction datasets, this paper proposes a method that combines each sentence with its cross-sentence contextual information as the input of a neural network model, so that the model can learn more semantic information from paragraph- or discourse-level texts. Cross-sentence contextual information is introduced into different neural network models, and experiments are carried out on two relation classification datasets from different fields, the San Wen dataset and the Policy dataset, comparing model accuracy with and without cross-sentence contextual information. Experiments show that the proposed method effectively improves the relation classification performance of several models: a convolutional neural network, a bidirectional long short-term memory network, an attention-based bidirectional long short-term memory network and a convolutional recurrent neural network. In addition, this paper proposes Policy, a relation classification dataset built from policy and regulation texts in the field of four social insurance and one housing fund, which is used to verify the necessity of introducing cross-sentence contextual information into relation classification tasks in some practical fields.
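The abstract's core idea, pairing each sentence with cross-sentence context chosen by sentence similarity before classification, can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the bag-of-words encoder, the cosine measure, and the function names are hypothetical stand-ins for whatever sentence representation and similarity the paper actually uses.

```python
# Sketch: select the k most similar sentences from the same paragraph
# and concatenate them with the target sentence, forming an input that
# carries cross-sentence contextual information for a relation classifier.
import math
from collections import Counter

def bow_vector(sentence):
    """Hypothetical stand-in for a real sentence encoder: a word-count vector."""
    return Counter(sentence.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def build_input_with_context(paragraph, target_idx, k=2):
    """Return the target sentence preceded by its k most similar
    paragraph sentences, kept in document order."""
    target = paragraph[target_idx]
    tv = bow_vector(target)
    scored = [(cosine_similarity(tv, bow_vector(s)), i)
              for i, s in enumerate(paragraph) if i != target_idx]
    # Keep the top-k context sentences, then restore document order.
    context_idx = sorted(i for _, i in sorted(scored, reverse=True)[:k])
    context = " ".join(paragraph[i] for i in context_idx)
    return (context + " " + target) if context else target
```

The concatenated string would then be tokenized and fed to any of the sentence-level models named in the abstract (CNN, BiLSTM, attention-based BiLSTM, CRNN) in place of the bare target sentence.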

Key words: Cross-sentence contextual information, Four social insurance and one housing fund, Neural network, Relation classification, Sentence similarity

CLC number: TP391.1