Computer Science, 2020, Vol. 47, Issue (8): 164-170. doi: 10.11896/jsjkx.190600153


Opinion Word-pairs Collaborative Extraction Based on Dependency Relation Analysis

ZHAO Wei^{1,2}, LIN Yu-ming^1, WANG Chao-qiang^1, CAI Guo-yong^1

  1. Guangxi Key Laboratory of Trusted Software, Guilin University of Electronic Technology, Guilin, Guangxi 541004, China
    2. School of Data Science & Engineering, East China Normal University, Shanghai 200062, China
  • Online: 2020-08-15  Published: 2020-08-10
  • About author: ZHAO Wei, born in 1995, postgraduate. His main research interests include opinion mining.
    LIN Yu-ming, born in 1978, Ph.D, professor. His main research interests include opinion mining, knowledge graph, and massive data management.
  • Supported by:
    This work was supported by the Guangxi Natural Science Foundation (2018GXNSFDA281049), the National Natural Science Foundation of China (61662015, U1711263), the Science and Technology Major Project of Guangxi Province (AA19046004), the Innovation Project of GUET Graduate Education (2018YJCX48) and the Project of Guangxi Key Laboratory of Trusted Software (kx201916).

Abstract: Within the same category of commodities, the opinion target and the opinion word that make up an opinion word-pair usually exhibit a strong opinion dependence relation on each other. Opinion word-pairs can therefore be extracted by analyzing the opinion dependence relations among the words of a review sentence. Firstly, a dependency relation analysis model is constructed to obtain the dependency relation information of each word in a review sentence, with an LSTM neural network as the basic model. Secondly, one item of the opinion word-pair contained in the review sentence is assumed to be known and is used as the model's attention information, so that the model focuses on extracting, from the review sentence, the words or phrases with the strongest opinion dependence on the known item as the other, unknown item of the pair. Finally, the word-pair with the highest opinion dependence score is output as the opinion word-pair. On this basis, a compound model is designed that combines the two models conditioned on different known items, so that opinion word-pairs can be mined without either item being given in advance.
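The following is a minimal, illustrative PyTorch sketch of the kind of attention-based LSTM extractor the abstract describes: a bidirectional LSTM encodes the review sentence, the known item of the pair (either the opinion target or the opinion word) serves as an attention query, and each word is scored by its opinion dependence on that known item. The class name, dimensions, and the mean-pooled query are assumptions made for illustration, not the authors' released implementation.

import torch
import torch.nn as nn

class KnownItemAttentionExtractor(nn.Module):
    """Scores each word of a review sentence by its opinion dependence
    on a known item (illustrative sketch, not the paper's model)."""
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        self.score = nn.Linear(2 * hidden_dim, 1)

    def forward(self, sentence_ids, known_item_ids):
        # Encode the review sentence with a bidirectional LSTM.
        h, _ = self.lstm(self.embed(sentence_ids))            # (B, T, 2H)
        # Encode the known item and mean-pool it into an attention query.
        k, _ = self.lstm(self.embed(known_item_ids))
        query = k.mean(dim=1, keepdim=True)                   # (B, 1, 2H)
        # Attention weights: how strongly each word relates to the known item.
        weights = torch.softmax((self.attn(h) * query).sum(-1), dim=-1)  # (B, T)
        # Per-word opinion-dependence scores; the argmax over the sentence
        # gives the predicted unknown half of the opinion word-pair.
        return self.score(h).squeeze(-1) * weights            # (B, T)

A compound extractor in the spirit of the abstract would run two such models, one treating the opinion target as the known item and one treating the opinion word as the known item, and keep the word-pair whose combined dependence score is highest, so that neither item needs to be supplied in advance.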

Key words: Attention mechanism, Neural network, Opinion dependency relation analysis, Opinion pair

CLC Number: TP391