Computer Science ›› 2021, Vol. 48 ›› Issue (6A): 349-356. doi: 10.11896/jsjkx.200800004

• Intelligent Computing •

Research on Sentiment Analysis Based on Transformer and Multi-channel Convolutional Neural Network

HUO Shuai1,2, PANG Chun-jiang1   

  1. North China Electric Power University (Baoding), Baoding, Hebei 071003, China
    2. Graduate Workstation of Yunnan Power Grid Co., Ltd. Electric Power Research Institute, Kunming 650217, China
  • Online: 2021-06-10  Published: 2021-06-17
  • About author: HUO Shuai, born in 1994, postgraduate. His main research interests include sentiment analysis and deep learning.
    PANG Chun-jiang, born in 1965, associate professor. His main research interests include artificial intelligence and the internet of things.
  • Supported by:
    Yunnan Science and Technology Project (YNKJXM20180019, YNKJXM20191572).

Abstract: Text sentiment analysis is one of the classic tasks of natural language processing. This paper proposes a text sentiment analysis model that combines a Transformer feature extractor with a multi-channel convolutional neural network. Starting from static word vectors trained with traditional methods such as Word2Vec and GloVe, the model uses the Transformer feature extractor to build layered, dynamic word representations and fine-tunes them on the target dataset, which effectively improves the representational ability of the word vectors. The multi-channel convolutional neural network models the dependencies between word sequences within windows of different sizes, extracts features effectively while reducing dimensionality, and captures the contextual semantic information of sentences, allowing the model to encode more semantic and emotional information and improving the semantic expressiveness of the text; the sentiment polarity is finally obtained through a Softmax classifier. The model is evaluated on the IMDb and SST-2 movie review datasets, reaching accuracies of 90.4% and 90.2% on the respective test sets, which shows that the proposed model achieves better classification accuracy than traditional word embeddings combined with a CNN or an RNN.
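
The following is a minimal sketch of the pipeline described above, written in PyTorch for illustration only (it is not the authors' code): a Transformer encoder refines static word embeddings into dynamic, context-dependent representations, several parallel convolutions with different window sizes form the multi-channel CNN, max-pooling performs the dimensionality reduction, and a Softmax layer produces the sentiment class. All hyperparameters here (kernel sizes 3/4/5, 100 filters per channel, two encoder layers) are illustrative assumptions rather than the settings reported in the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TransformerMultiChannelCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, num_heads=6, num_layers=2,
                 kernel_sizes=(3, 4, 5), num_filters=100, num_classes=2):
        super().__init__()
        # Static word vectors (e.g. Word2Vec / GloVe) would be loaded into this table.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Transformer feature extractor that produces dynamic, contextual representations.
        encoder_layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=num_heads,
                                                   batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers)
        # One convolution per channel, each with a different window (n-gram) size.
        self.convs = nn.ModuleList([nn.Conv1d(embed_dim, num_filters, kernel_size=k)
                                    for k in kernel_sizes])
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):                  # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)              # (batch, seq_len, embed_dim)
        x = self.encoder(x)                        # contextualized by the Transformer
        x = x.transpose(1, 2)                      # (batch, embed_dim, seq_len) for Conv1d
        feats = []
        for conv in self.convs:
            c = F.relu(conv(x))                    # (batch, num_filters, L_out)
            feats.append(F.max_pool1d(c, c.size(2)).squeeze(2))  # max over time
        out = torch.cat(feats, dim=1)              # concatenated multi-channel features
        return F.log_softmax(self.classifier(out), dim=1)  # class log-probabilities

In practice the pre-trained Transformer layers would be fine-tuned together with the convolutional channels and the classifier on the target sentiment dataset, as described in the abstract.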

Key words: Feature extractor, Multi-channel convolutional neural network, Sentiment classification, Transformer

CLC Number: 

  • TP391.1
[1] HU M Q,LIU B.Mining and Summarizing Customer Reviews[C]//Proc of the 10th ACM SIGKDD Int Conf on Knowledge Discovery and Data Mining.New York:ACM,2004:168-177.
[2] PANG B,LEE L,VAITHYANATHAN S.Thumbs up? Sentiment Classification using Machine Learning Techniques[C]//Proc of Empirical methods in Natural Language Processing.Cambridge,MA:MIT Press,2002:79-86.
[3] SALEH M R,MARTÍN-VALDIVIA M T,MONTEJO-RÁEZ A,et al.Experiments with SVM to classify opinions in different domains[J].Expert Systems with Applications,2011,38(12):14799-14804.
[4] KIM Y.Convolutional neural networks for sentence classification[C]//Proceedings of 2014 Conference on Empirical Methods in Natural Language Processing.Doha,Qatar,2014:1746-1751.
[5] ZHU X D,SOBHANI P.Long short-term memory over recursive structures[C]//Proc.of Int.Conf.on Machine Learning.New York:ACM,2015:1604-1612.
[6] YUAN H J,ZHANG X,NIU W H,et al.Research on text sen-timent analysis of multi-channel convolution and two-way GRU model with attention mechanism[J].Journal of Chinese Information Processing,2019,33(10):109-118.
[7] ZHAO Y O,ZHANG J Z,LI Y B,et al.Sentiment analysis combining word embedding based on language model and multi-scale convolutional neural network[J].Journal of Computer Applications,2020,40(3):651-657.
[8] MIKOLOV T,SUTSKEVER I,CHEN K,et al.Distributed representations of words and phrases and their compositionality[C]//NeurIPS.2013.
[9] PENNINGTON J,SOCHER R,MANNING C D.GloVe:Global vectors for word representation[C]//EMNLP.2014.
[10] PETERS M E,NEUMANN M,et al.Deep contextualized word representations[C]//Proceedings of North American Chapter of the Association for Computational Linguistics.ACL,2018:1-9.
[11] VASWANI A,SHAZEER N,PARMAR N,et al.Attention is all you need[C]//Proceedings of Annual Conference on Neural Information Processing Systems.Long Beach,USA,2017:1-5.
[12] YOSINSKI J,CLUNE J,BENGIO Y,et al.How transferable are features in deep neural networks?[C]//Advances in Neural Information Processing Systems.2014:3320-3328.
[13] QIAN Q,TIAN B,HUANG M,et al.Learning tag embeddings and tag-specific composition functions in recursive neural network[C]//Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing.Stroudsburg:Association for Computational Linguistics,2015:1365-1374.
[14] TAI K S,SOCHER R,MANNING C D.Improved semantic representations from tree-structured long short term memory networks [C]//Proceedings of Annual Meeting of the Association for Computational Linguistics.ACL,2015:1556-1566.
[15] RADFORD A,NARASIMHAN K,SALIMANS T,et al.Improving language understanding by generative pre-training[OL].https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf.
[16] DEVLIN J,CHANG M W,LEE K,et al.BERT:pre-training of deep bidirectional transformers for language understanding[C]//NAACL-HLT.2019.
[17] ZOPH B,GHIASI G,et al.Rethinking Pre-training and Self-training[OL].(2020).https://arxiv.org/abs/2006.06882.
[18] ZHANG Y,WALLACE B.A Sensitivity Analysis of (and Practitioners' Guide to) Convolutional Neural Networks for Sentence Classification[OL].https://arxiv.org/abs/1510.03820.