Computer Science ›› 2020, Vol. 47 ›› Issue (6A): 6-11.doi: 10.11896/JsJkx.191000007

• Artificial Intelligence •

Automatic Summarization Method Based on Primary and Secondary Relation Feature

ZHANG Ying, ZHANG Yi-fei, WANG Zhong-qing and WANG Hong-ling   

  1. School of Computer Science and Technology,Soochow University,Suzhou,Jiangsu 215006,China
  • Published:2020-07-07
  • About author:ZHANG Ying, born in 1994, postgraduate, is a member of China Computer Federation. Her main research interests include natural language processing.
    WANG Hong-ling, born in 1975, Ph.D, associate professor, is a member of China Computer Federation. Her main research interests include natural language processing.
  • Supported by:
    This work was supported by the National Natural Science Foundation of China (61806137,61976146) and Natural Science Foundation of the Jiangsu Higher Education Institutions of China (18KJB520043).

Abstract: Automatic summarization technology provides users with a concise text description by compressing and refining the original text while retaining the core idea of the document. Traditional methods usually consider only shallow textual semantic information and neglect the guiding role of structural information, such as primary and secondary relations, in core-sentence extraction. Therefore, this paper proposes an automatic summarization method based on the primary and secondary relation feature. The method uses a neural network to construct a single-document extractive summarization model based on this feature: a Bi-LSTM neural network encodes the sentence information together with the primary and secondary relation information, and an LSTM neural network summarizes the encoded information. Experimental results show that, compared with current mainstream single-document extractive summarization methods, the proposed method achieves significant improvements in accuracy, stability, and the ROUGE evaluation metric.
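The architecture described in the abstract can be illustrated with a minimal sketch, assuming PyTorch. A Bi-LSTM encodes each sentence, the encoding is concatenated with a primary/secondary relation feature vector, and a document-level LSTM scores each sentence for extraction. All dimensions, layer choices, and the exact form of the relation feature are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ExtractiveSummarizer(nn.Module):
    """Illustrative sketch: Bi-LSTM sentence encoder + relation feature,
    followed by an LSTM that scores sentences for extraction."""

    def __init__(self, emb_dim=100, rel_dim=16, hidden=128):
        super().__init__()
        # Sentence-level encoder: Bi-LSTM over word embeddings
        self.sent_enc = nn.LSTM(emb_dim, hidden,
                                bidirectional=True, batch_first=True)
        # Document-level LSTM over [sentence encoding; relation feature]
        self.doc_enc = nn.LSTM(2 * hidden + rel_dim, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)  # per-sentence extraction score

    def forward(self, sent_words, rel_feats):
        # sent_words: (num_sents, max_words, emb_dim) word embeddings
        # rel_feats:  (num_sents, rel_dim) primary/secondary relation features
        _, (h, _) = self.sent_enc(sent_words)
        # Concatenate forward and backward final hidden states
        sent_vecs = torch.cat([h[0], h[1]], dim=-1)          # (num_sents, 2*hidden)
        doc_in = torch.cat([sent_vecs, rel_feats], dim=-1).unsqueeze(0)
        out, _ = self.doc_enc(doc_in)                        # (1, num_sents, hidden)
        return torch.sigmoid(self.score(out)).squeeze(-1).squeeze(0)

model = ExtractiveSummarizer()
# A toy document of 5 sentences, 12 words each
scores = model(torch.randn(5, 12, 100), torch.randn(5, 16))
print(scores.shape)  # torch.Size([5]) — one extraction score per sentence
```

At inference time the top-k scoring sentences would be selected as the summary; training would typically minimize binary cross-entropy against sentence-level extraction labels.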

Key words: Extractive summarization, LSTM, Natural language processing, Neural network, Primary and secondary relation

CLC Number: TP391
[1] LUHN H P.The automatic creation of literature abstracts.IBM Journal of Research and Development,1958,2(2):159-165.
[2] CHU X M,ZHU Q M,ZHOU G D.Discourse Primary-Secondary Relationships in Natural Language Processing.Journal of Computer Science,2017,40(4):842-860.
[3] MIHALCEA R,TARAU P.TextRank:Bringing Order into Texts//Proceedings of the Conference on Empirical Methods in Natural Language Processing.2004:404-411.
[4] ERKAN G,RADEV D R.LexPageRank:Prestige in Multi-Document Text Summarization//Conference on Empirical Methods in Natural Language Processing(EMNLP 2004).ACL,2004:365-371.
[5] KUPIEC J,PEDERSEN J,CHEN F.A trainable document sum-marizer//Proceedings of the 18th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval.ACM,1995:68-73.
[6] CONROY J M,SCHLESINGER J D,O'LEARY D P,et al.Using HMM and Logistic Regression to Generate Extract Summaries for DUC//Proceedings of the Document Understanding Conference (DUC 2001).2001.
[7] KIM Y.Convolutional neural networks for sentence classification.arXiv:1408.5882,2014.
[8] LIU P,QIU X,HUANG X.Recurrent neural network for text classification with multi-task learning.arXiv:1605.05101,2016.
[9] ZHOU C,SUN C,LIU Z,et al.A C-LSTM neural network for text classification.arXiv:1511.08630,2015.
[10] CHOPRA S,AULI M,RUSH A M.Abstractive sentence summarization with attentive recurrent neural networks//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.2016:93-98.
[11] RUSH A M,CHOPRA S,WESTON J.A neural attention model for abstractive sentence summarization.arXiv:1509.00685,2015.
[12] SEE A,LIU P J,MANNING C D.Get To The Point:Summarization with Pointer-Generator Networks//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics.2017:1073-1083.
[13] CARLSON L,MARCU D,OKUROWSKI M E.Building a discourse-tagged corpus in the framework of Rhetorical Structure Theory//Current and New Directions in Discourse and Dialogue.Netherlands:Springer,2003:2655-2661.
[14] ZHOU Y,XUE N.The Chinese Discourse TreeBank:A Chinese corpus annotated with discourse relations.Language Resources and Evaluation,2015,49(2):397-431.
[15] HERNAULT H,PRENDINGER H,ISHIZUKA M.HILDA:A discourse parser using support vector machine classification.Dialogue & Discourse,2010,1(3).
[16] FENG V W,HIRST G.Text-level discourse parsing with rich linguistic features//Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics.2012:60-68.
[17] CHU X,WANG Z,ZHU Q,et al.Recognizing nuclearity between Chinese Discourse units//2015 International Conference on Asian Language Processing (IALP).IEEE,2015:197-200.
[18] SUN J,LI Y C,ZHOU G D,et al.Research of Chinese Implicit Discourse Relation Recognition.Journal of Peking University (Natural Science Edition),2014,50(1):111-117.
[19] SUN C,KONG F.A Transition-based Framework for Chinese Discourse Structure Parsing.Journal of Chinese Information Processing,2018,32(12):48-56.
[20] LI Y C.Research of Chinese Discourse Structure Representation and Resource Construction.Suzhou:Soochow University,2015.
[21] LIN C Y,HOVY E.Automatic Evaluation of Summaries Using n-gram Co-occurrence Statistics//Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology.Association for Computational Linguistics,2003.
[22] HUA L,WAN X,LI L.Overview of the NLPCC 2017 Shared Task:Single Document Summarization//National CCF Conference on Natural Language Processing and Chinese Computing.Cham:Springer,2017:942-947.
[23] WAN X,YANG J.Multi-document summarization using cluster-based link analysis//Proceedings of the 31st annual international ACM SIGIR Conference on Research and Development in Information Retrieval.ACM,2008:299-306.