Computer Science ›› 2021, Vol. 48 ›› Issue (8): 226-233. doi: 10.11896/jsjkx.200700058

Special Issue: Natural Language Processing

• Artificial Intelligence •

Fine-grained Sentiment Analysis Based on Combination of Attention and Gated Mechanism

ZHANG Jin, DUAN Li-guo, LI Ai-ping, HAO Xiao-yan   

  1. College of Information and Computer,Taiyuan University of Technology,Taiyuan 030024,China
  • Received:2020-07-09 Revised:2020-08-14 Published:2021-08-10
  • About author:ZHANG Jin,born in 1996,postgraduate.Her main research interests include sentiment analysis and natural language processing.(295511703@qq.com) DUAN Li-guo,born in 1970,Ph.D,associate professor,is a senior member of China Computer Federation.His main research interests include automatic question answering system,text sentiment analysis,entity relationship extraction and knowledge mapping.
  • Supported by:
    Basic Research Project of Shanxi Province(201801D121137).

Abstract: Fine-grained sentiment analysis is one of the key problems in natural language processing. By learning the contextual information of a text to conduct sentiment analysis on specific aspects, it helps users and businesses better understand the sentiment expressed toward specific aspects in user comments. Aiming at the task of fine-grained sentiment analysis of user comments, a text sentiment classification model combining BiGRU-attention and a gated mechanism is proposed. Integrating existing sentiment resources, the HowNet evaluation sentiment dictionary is taken as the seed sentiment dictionary and expanded into a user-comment sentiment dictionary through the SO-PMI algorithm; combined with a negation dictionary and part-of-speech information, this expanded sentiment knowledge serves as the sentiment feature information of user comments. Word, character and sentiment features are introduced as the input of the model, and BiGRU is used to extract deep text features. The gated mechanism is then combined with the attention mechanism to further extract, according to the given aspect-word information, the contextual sentiment features related to the aspect words, and the final sentiment polarity is obtained by a softmax classifier. Experimental results show that the proposed model achieves better results on the AI Challenger 2018 fine-grained sentiment analysis Chinese dataset, reaching a Macro_F1 score of 0.7218 and exceeding the baseline system.
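Two steps of the pipeline described in the abstract lend themselves to a compact illustration: the SO-PMI expansion of the seed dictionary, where SO-PMI(w) = sum_p PMI(w, p) - sum_n PMI(w, n) over the positive and negative seed words, and the BiGRU-attention-gate classifier itself. The sketch below is a minimal PyTorch rendering under stated assumptions; the co-occurrence inputs, layer sizes, additive attention scoring and GTRU-style gate are illustrative choices, not the exact configuration reported in the paper.

```python
# Minimal sketch, NOT the authors' exact architecture or statistics pipeline.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

def so_pmi(word, pos_seeds, neg_seeds, joint, count, total):
    """SO-PMI (Turney & Littman [16]): SO-PMI(w) = sum_p PMI(w, p) - sum_n PMI(w, n).
    `joint[(w, s)]`, `count[w]` and `total` are co-occurrence statistics the caller
    is assumed to have collected from the comment corpus (hypothetical inputs)."""
    def pmi(w, s):
        c_ws, c_w, c_s = joint.get((w, s), 0), count.get(w, 0), count.get(s, 0)
        if c_ws == 0 or c_w == 0 or c_s == 0:
            return 0.0
        return math.log2(c_ws * total / (c_w * c_s))
    return sum(pmi(word, p) for p in pos_seeds) - sum(pmi(word, n) for n in neg_seeds)

class BiGRUAttnGate(nn.Module):
    """BiGRU features + aspect-conditioned attention + a gate before softmax;
    all dimensions and the gate formulation are illustrative assumptions."""
    def __init__(self, vocab_size, emb_dim=300, hid_dim=128, num_classes=4):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.bigru = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hid_dim + emb_dim, 1)    # aspect-conditioned attention score
        self.gate_h = nn.Linear(2 * hid_dim, 2 * hid_dim)  # tanh branch over the context
        self.gate_a = nn.Linear(emb_dim, 2 * hid_dim)      # sigmoid branch driven by the aspect
        self.fc = nn.Linear(2 * hid_dim, num_classes)

    def forward(self, tokens, aspect):
        # tokens: (B, T) ids of word/character/sentiment features; aspect: (B,) aspect-word ids
        h, _ = self.bigru(self.emb(tokens))                 # (B, T, 2H) deep contextual features
        a = self.emb(aspect)                                # (B, E) aspect representation
        a_rep = a.unsqueeze(1).expand(-1, h.size(1), -1)    # broadcast aspect over time steps
        alpha = F.softmax(self.attn(torch.cat([h, a_rep], dim=-1)), dim=1)  # (B, T, 1)
        ctx = (alpha * h).sum(dim=1)                        # attention-weighted context, (B, 2H)
        gated = torch.tanh(self.gate_h(ctx)) * torch.sigmoid(self.gate_a(a))
        return F.log_softmax(self.fc(gated), dim=-1)        # sentiment polarity distribution
```

In such a sketch, a candidate word with a positive SO-PMI score would be added to the positive side of the expanded dictionary; the actual thresholds, feature fusion and training details are those specified in the paper.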

Key words: Attention mechanism, BiGRU, Deep learning, Gated mechanism, Sentiment analysis

CLC Number: TP391
[1]LI Y,LI Z X,TENG L,et al.Comment Sentiment Analysis and Sentiment Words Detection Based on Attention Mechanism [J].Computer Science,2020,47(1):186-192.
[2]ZHAO Y Y,QIN B,LIU T.Sentiment Analysis[J].Journal of Software,2010,21(8):1834-1848.
[3]TANG X B,LIU G C.A Review of Studies on Fine-grained Sentiment Analysis[J].Library and Information Service,2017(5):132-140.
[4]DOHAIHA H H,PRASAD P,MAAG A,et al.Deep Learning for Aspect-Based Sentiment Analysis:A Comparative Review[J].Expert Systems with Applications,2019,118(1):272-299.
[5]LI Y H,XIE M,YI Y.Fine-grained Sentiment Analysis for Social Network Platform based on Deep-learning model [J].Application Research of Computers,2017,34(3):743-747.
[6]MALHOTRA P,VIG L,SHROFF G,et al.Long Short Term Memory Networks for Anomaly Detection in Time Series[C]//European Symposium on Artificial Neural Networks.2015.
[7]CHUNG J,GULCEHRE C,CHO K H,et al.Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling[J].arXiv:1412.3555,2014.
[8]ZHANG Z,ROBINSON D,TEPPER J.Detecting hate speech on Twitter using a convolution-GRU based deep neural network[C]//ESWC 2018.Springer,Cham,2018.
[9]RAFFEL C,ELLIS D P W.Feed-Forward Networks with Attention Can Solve Some Long-Term Memory Problems[J].arXiv:1512.08756,2015.
[10]TANG D,QIN B,FENG X,et al.Effective LSTMs for Target-Dependent Sentiment Classification[J].arXiv:1512.01100,2015.
[11]WANG Y,HUANG M,ZHU X,et al.Attention-based LSTM for Aspect-level Sentiment Classification[C]//Conference on Empirical Methods in Natural Language Processing.2016.
[12]HUANG B,OU Y,CARLEY K M.Aspect Level Sentiment Classification with Attention-over-Attention Neural Networks[J].arXiv:1804.06536,2018.
[13]CHENG J,ZHAO S,ZHANG J,et al.Aspect-level Sentiment Classification with HEAT (HiErarchical ATtention) Network[C]//ACM.2017:97-106.
[14]XUE W,LI T.Aspect Based Sentiment Analysis with Gated Convolutional Networks[J].arXiv:1805.07043,2018.
[15]SHUANG K,REN X,YANG Q,et al.AELA-DLSTMs:Attention-Enabled and Location-Aware Double LSTMs for aspect-level sentiment classification[J].Neurocomputing,2019,334:25-34.
[16]TURNEY P D,LITTMAN M L.Measuring praise and criticism:inference of semantic orientation from association[J].ACM Transactions on Information Systems,2003,21(4):315-346.
[17]CHUNG J,GULCEHRE C,CHO K H,et al.Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling[J].arXiv:1412.3555,2014.
[18]JOZEFOWICZ R,ZAREMBA W,SUTSKEVER I.An Empirical Exploration of Recurrent Network Architectures[C]//Proceedings of the 32nd International Conference on Machine Learning.2015:2342-2350.
[19]DAUPHIN Y N,FAN A,AULI M,et al.Language Modeling with Gated Convolutional Networks[J].arXiv:1612.08083,2016.