Computer Science ›› 2020, Vol. 47 ›› Issue (10): 222-227. doi: 10.11896/jsjkx.190900173

• Artificial Intelligence •

Comment Sentiment Classification Using Cross-attention Mechanism and News Content

WANG Qi-fa, WANG Zhong-qing, LI Shou-shan, ZHOU Guo-dong   

  1. School of Computer Science and Technology, Soochow University, Suzhou, Jiangsu 215006, China
  • Received: 2019-09-25  Revised: 2019-12-06  Online: 2020-10-15  Published: 2020-10-16
  • About author: WANG Qi-fa, born in 1994, postgraduate, is a member of China Computer Federation. His main research interests include natural language processing and emotion analysis.
    WANG Zhong-qing, born in 1987, Ph.D., postgraduate supervisor, is a member of China Computer Federation. His main research interests include natural language understanding and information extraction.
  • Supported by:
    Young Scientists Fund of the National Natural Science Foundation of China (61806137, 61702518) and Natural Science Foundation of the Jiangsu Higher Education Institutions of China (18KJB520043)

Abstract: News comments have become an important form of data derived from news, in which commentators express their views, positions and personal feelings about news events. Analyzing the sentiment orientation of news comments helps to understand public opinion and its trends, so sentiment analysis of news comments has attracted wide attention from researchers. Conventional approaches consider only the information in the comment text itself, yet a comment is usually closely related to the content of the news it responds to. This paper therefore proposes a comment sentiment classification method that uses a cross-attention mechanism to combine the comment with the news content. First, bi-directional long short-term memory (BiLSTM) networks are used to encode the news content and the comment text separately. Then, a cross-attention mechanism captures the information in each text that is important to the other, producing updated vector representations of both the news content and the comment. The two representations are concatenated into a joint semantic representation, fed into a fully connected layer, and classified with a sigmoid activation function, thereby realizing sentiment classification of news comments. Experimental results show that the proposed model effectively improves the accuracy of sentiment classification for news comments, outperforming three baseline models by 1.72%, 3.24% and 6.21% in F1 score, respectively.
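A minimal sketch of the pipeline described above, written in PyTorch for concreteness: two BiLSTM encoders (news content and comment), dot-product cross-attention applied in both directions, mean pooling, concatenation, and a sigmoid classifier. The module names, dimensions, pooling step and the dot-product form of the attention are illustrative assumptions, not the authors' implementation.

# Illustrative sketch only; hyperparameters and attention form are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossAttentionCommentClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Separate BiLSTM encoders for the news content and the comment text
        self.news_encoder = nn.LSTM(embed_dim, hidden_dim,
                                    batch_first=True, bidirectional=True)
        self.comment_encoder = nn.LSTM(embed_dim, hidden_dim,
                                       batch_first=True, bidirectional=True)
        # Concatenated [news; comment] vectors -> single sentiment logit
        self.classifier = nn.Linear(4 * hidden_dim, 1)

    @staticmethod
    def cross_attend(queries, keys):
        # Each query position attends over all key positions (dot-product attention)
        scores = torch.bmm(queries, keys.transpose(1, 2))   # (B, Lq, Lk)
        weights = F.softmax(scores, dim=-1)
        return torch.bmm(weights, keys)                      # (B, Lq, 2H)

    def forward(self, news_ids, comment_ids):
        news_h, _ = self.news_encoder(self.embedding(news_ids))           # (B, Ln, 2H)
        comment_h, _ = self.comment_encoder(self.embedding(comment_ids))  # (B, Lc, 2H)
        # Cross-attention in both directions, then mean-pool to fixed-size vectors
        news_vec = self.cross_attend(news_h, comment_h).mean(dim=1)       # (B, 2H)
        comment_vec = self.cross_attend(comment_h, news_h).mean(dim=1)    # (B, 2H)
        features = torch.cat([news_vec, comment_vec], dim=-1)             # (B, 4H)
        return torch.sigmoid(self.classifier(features)).squeeze(-1)       # positive-class probability

# Example usage with random token ids; a real run would train with binary cross-entropy.
model = CrossAttentionCommentClassifier(vocab_size=30000)
news = torch.randint(1, 30000, (8, 120))      # batch of 8 news texts, 120 tokens each
comments = torch.randint(1, 30000, (8, 40))   # the matching comments, 40 tokens each
probs = model(news, comments)                 # tensor of shape (8,)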

Key words: Attention mechanism, Bi-directional long short-term memory network, Sentiment classification

CLC Number: TP391.1