Computer Science ›› 2020, Vol. 47 ›› Issue (11): 244-249.doi: 10.11896/jsjkx.190900056

• Artificial Intelligence •

Event Temporal Relation Classification Method Based on Information Interaction Enhancement

ZHOU Xin-yu, LI Pei-feng   

  1. School of Computer Science and Technology,Soochow University,Suzhou,Jiangsu 215006,China
  2. Provincial Key Laboratory for Computer Information Processing Technology,Suzhou,Jiangsu 215006,China
  • Received:2019-09-09 Revised:2019-11-25 Online:2020-11-15 Published:2020-11-05
  • About author:ZHOU Xin-yu,born in 1996,postgraduate,is a member of China Computer Federation.His main research interests include natural language processing.
    LI Pei-feng,born in 1971,Ph.D,professor,Ph.D supervisor,is a member of China Computer Federation.His main research interests include natural language processing and machine learning.
  • Supported by:
    This work was supported by the National Natural Science Foundation of China (61836007,61772354,61773276).

Abstract: As a branch of information extraction, event temporal relation classification has attracted increasing attention in recent years because it benefits many natural language processing tasks. However, existing neural network approaches give little consideration to the information interaction between events. To address this issue, this paper proposes an event temporal relation classification method based on parameter sharing that enhances the information interaction between events. The method first learns the semantic and contextual information of sentences through gated convolutional neural networks (GCNN) and incorporates it into the shortest dependency path sequence used as input. It then encodes the input with a bidirectional long short-term memory network (Bi-LSTM) to capture its semantic representation, and enhances the information interaction between the event pair by sharing parameters between the two encoders. Finally, the obtained semantic representation is fed into a fully connected layer, and the softmax function is used for classification. Experimental results on TimeBank-Dense show that the proposed method outperforms most existing neural network methods in classification accuracy.
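As a rough illustration of the gated convolution step described above, the following sketch implements the gated linear unit of Dauphin et al. [13] over a token sequence. This is not the authors' released code; the function and variable names here are hypothetical, and the dimensions are arbitrary.

```python
import numpy as np

def glu_conv1d(x, w_a, w_b, b_a, b_b):
    """Gated 1-D convolution over a token sequence.

    x:        (seq_len, d_in)  word embeddings of one sentence
    w_a, w_b: (k, d_in, d_out) filters for the linear path and the
              gate path (k = convolution window size)
    Returns   (seq_len - k + 1, d_out) features A * sigmoid(B),
              so the learned gate controls what information flows on.
    """
    k, _, d_out = w_a.shape
    steps = x.shape[0] - k + 1
    out = np.empty((steps, d_out))
    for t in range(steps):
        window = x[t:t + k]                               # (k, d_in)
        a = np.tensordot(window, w_a, axes=([0, 1], [0, 1])) + b_a
        b = np.tensordot(window, w_b, axes=([0, 1], [0, 1])) + b_b
        out[t] = a * (1.0 / (1.0 + np.exp(-b)))           # A ⊙ σ(B)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((10, 8))                          # 10 tokens, dim 8
w_a = rng.standard_normal((3, 8, 16)) * 0.1
w_b = rng.standard_normal((3, 8, 16)) * 0.1
feats = glu_conv1d(x, w_a, w_b, np.zeros(16), np.zeros(16))
print(feats.shape)                                        # (8, 16)
```

In the method described above, features of this kind would be concatenated with the shortest-dependency-path token representations before the Bi-LSTM encoding; the parameter-sharing step then amounts to encoding both events' paths with the same Bi-LSTM weights.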

Key words: Bidirectional long short-term memory network, Information interaction, Sentence representation, Shortest dependency path, Temporal relation classification

CLC Number: 

  • TP391.1

References
[1] MANI I,VERHAGEN M,WELLNER B,et al.Machine learning of temporal relations[C]//Proceedings of the Association for Computational Linguistics.Association for Computational Linguistics,2006:753-760.
[2] CHAMBERS N,WANG S,JURAFSKY D.Classifying temporal relations between events[C]//Proceedings of the ACL on Interactive Poster and Demonstration Sessions.Association for Computational Linguistics,2007:173-176.
[3] CHAMBERS N,JURAFSKY D.Jointly combining implicit constraints improves temporal ordering[C]//Proceedings of the Conference on Empirical Methods in Natural Language Processing.Association for Computational Linguistics,2008:698-706.
[4] DO Q,LU W,ROTH D.Joint inference for event timeline construction[C]//Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning.Association for Computational Linguistics,2012:677-687.
[5] D'SOUZA J,NG V.Classifying temporal relations with rich linguistic knowledge[C]//Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics.Human Language Technologies,2013:918-927.
[6] CHENG F,MIYAO Y.Classifying temporal relations by bidirectional LSTM over dependency paths[C]//Proceedings of the Association for Computational Linguistics.Association for Computational Linguistics,2017:1-6.
[7] YAN X,MOU L,LI G,et al.Classifying relations via long short term memory networks along shortest dependency path[J].Computer Science,2015,42(1):56-61.
[8] CHOUBEY P K,HUANG R H.A sequential model for classifying temporal relations between intra-sentence events[C]//Proceedings of the Conference on Empirical Methods in Natural Language Processing.Association for Computational Linguistics,2017:1796-1802.
[9] YAO W,HUANG R.Temporal event knowledge acquisition via identifying narratives[C]//Proceedings of the Association for Computational Linguistics.Association for Computational Linguistics,2018:537-547.
[10] TOURILLE J,FERRET O,TANNIER X,et al.Neural architecture for temporal relation extraction:A Bi-LSTM approach for detecting narrative containers[C]//Proceedings of the Association for Computational Linguistics.Association for Computational Linguistics,2017:224-230.
[11] MENG Y L,RUMSHISKY A.Context-Aware neural model for temporal information extraction[C]//Proceedings of the Association for Computational Linguistics.Association for Computational Linguistics,2018:527-536.
[12] ZHANG Y J,LI P F,ZHOU G D.Classifying temporal relations between events by deep BiLSTM[C]//Proceedings of International Conference on Asian Language Processing.Institute of Electrical and Electronics Engineers,2018:267-272.
[13] DAUPHIN Y N,FAN A,AULI M,et al.Language modeling with gated convolutional networks[C]//Proceedings of the International Conference on Machine Learning.International Machine Learning Society,2017:933-941.
[14] MIRZA P,TONELLI S.On the contribution of word embeddings to temporal relation classification[C]//Proceedings of the International Conference on Computational Linguistics.2016:2818-2828.