Computer Science ›› 2019, Vol. 46 ›› Issue (10): 258-264. doi: 10.11896/jsjkx.180901687

• Artificial Intelligence •

Sentiment Analysis Based on Word Embedding Auxiliary Mechanism

HAN Xu-li1, ZENG Bi-qing2, ZENG Feng1, ZHANG Min1, SHANG Qi1   

  1. School of Computer Science, South China Normal University, Guangzhou 510631, China
  2. School of Software, South China Normal University, Foshan, Guangdong 528225, China
  • Received: 2018-09-09  Revised: 2018-12-22  Online: 2019-10-15  Published: 2019-10-21

Abstract: Text sentiment analysis is an important research direction in natural language processing, and analyzing the sentiment polarity of long text is a current research hotspot. Most existing studies apply word embeddings in neural network models for sentiment analysis. Although word embeddings represent word features well, they have weaknesses for long text: extremely long text imposes a heavy burden on the model and makes training consume more time and resources. In light of this, this paper proposes an attention neural network model based on a word embedding auxiliary mechanism, named WEAN, and applies it to sentiment analysis of long text. The model alleviates part of the training burden that long text places on neural network models through the word embedding auxiliary mechanism, and uses a bidirectional recurrent neural network with an attention mechanism to obtain context information while capturing information of differing importance within sequences, thereby improving sentiment classification performance. Experiments were conducted on the IMDB, Yelp 2013 and Yelp 2014 datasets. The results show that the sentiment analysis accuracy of the proposed model increases by 1.1%, 2.0% and 2.6% respectively on the three datasets compared with the NSC+LA model.
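To make the described architecture concrete, the following is a minimal PyTorch sketch of the bidirectional-recurrent-network-with-attention backbone outlined above. The abstract does not specify the internals of the word embedding auxiliary mechanism, so the sketch covers only the BiLSTM encoder and the position-level attention; all class names, dimensions and hyperparameters are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch of the BiLSTM + attention backbone described in the
    # abstract. The paper's word-embedding auxiliary mechanism is not detailed
    # there, so it is omitted; every name and size below is an assumption.
    import torch
    import torch.nn as nn

    class BiRNNAttentionSketch(nn.Module):
        def __init__(self, vocab_size, embed_dim=200, hidden_dim=100, num_classes=10):
            super().__init__()
            # Pretrained word embeddings (e.g. word2vec) could be loaded into this table.
            self.embedding = nn.Embedding(vocab_size, embed_dim)
            self.birnn = nn.LSTM(embed_dim, hidden_dim,
                                 batch_first=True, bidirectional=True)
            self.attn = nn.Linear(2 * hidden_dim, 1)          # scores each time step
            self.fc = nn.Linear(2 * hidden_dim, num_classes)  # sentiment classes

        def forward(self, token_ids):                      # token_ids: (batch, seq_len)
            h, _ = self.birnn(self.embedding(token_ids))   # h: (batch, seq_len, 2*hidden)
            weights = torch.softmax(self.attn(h), dim=1)   # attention over positions
            context = (weights * h).sum(dim=1)             # weighted sum -> document vector
            return self.fc(context)                        # class logits

    # Usage: a batch of 8 documents, 400 tokens each (illustrative sizes).
    model = BiRNNAttentionSketch(vocab_size=50000)
    logits = model(torch.randint(0, 50000, (8, 400)))

The attention weights assign each position a learned importance score, which corresponds to the "information of differing importance within sequences" that the abstract refers to; a full reproduction would additionally implement the word embedding auxiliary mechanism that reduces the training cost of long documents.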

Key words: Attention mechanism, Natural language processing, Neural network, Sentiment analysis, Word embedding

CLC Number: TP391