Computer Science ›› 2025, Vol. 52 ›› Issue (10): 258-265. doi: 10.11896/jsjkx.250100114

• Artificial Intelligence •

Text Simplification for Aspect-based Sentiment Analysis Based on Large Language Model

WANG Ye, WANG Zhongqing   

  1. School of Computer Science and Technology, Soochow University, Suzhou, Jiangsu 215006, China
  • Received: 2025-01-17  Revised: 2025-05-20  Online: 2025-10-15  Published: 2025-10-14
  • About author: WANG Ye, born in 2002, postgraduate. Her main research interests include natural language processing and sentiment analysis.
    WANG Zhongqing, born in 1987, Ph.D., associate professor. His main research interests include natural language processing and sentiment analysis.
  • Supported by:
    National Natural Science Foundation of China (62076175, 61976146).

Abstract: Aspect-based sentiment analysis aims to identify the sentiment polarity of each aspect in a sentence. However, most existing approaches overlook the redundant and irrelevant information often present in review texts, which not only complicates model processing but also hinders accurate extraction of sentiment elements. To address this issue, this paper proposes a model that transforms the original text into simplified clauses that express the same sentiment in a more concise form. The key idea is to leverage a large language model to pre-identify aspect and opinion terms in the text, and then generate simplified clauses based on these identified sentiment elements. A self-verification mechanism ensures that each generated clause satisfies three criteria: sentiment consistency, relevance, and conciseness. Furthermore, the model jointly uses both the original text and the simplified clauses to generate sentiment elements. Experimental results on the public Restaurant, Laptop, and Phone datasets show that the model outperforms existing baselines, highlighting the value of simplified clauses for aspect-based sentiment analysis.
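The paper provides no code on this page; the Python sketch below only illustrates the generate-then-verify loop described in the abstract (pre-identify sentiment elements, generate a simplified clause, self-verify against the three criteria). The `complete` callable, the `simplify_review` function name, the prompt wording, and the `max_rounds` retry limit are all illustrative assumptions, not the authors' implementation.

```python
from typing import Callable

CRITERIA = ("sentiment consistency", "relevance", "conciseness")


def simplify_review(review: str, complete: Callable[[str], str], max_rounds: int = 3) -> str:
    """Generate and self-verify a simplified clause for one review.

    `complete` stands in for any prompt-in/text-out LLM call; everything
    below is a sketch of the pipeline described in the abstract.
    """
    # Step 1: pre-identify aspect and opinion terms with the LLM.
    elements = complete(
        "List the aspect terms and opinion terms in this review, "
        f"one aspect-opinion pair per line:\n{review}"
    )

    # Step 2: generate a simplified clause from the identified elements.
    clause = complete(
        "Rewrite the review as one short clause that keeps only these "
        f"sentiment elements:\n{elements}\nReview: {review}"
    )

    # Step 3: self-verification against the three criteria; regenerate on failure.
    for _ in range(max_rounds):
        verdict = complete(
            f"Original: {review}\nSimplified: {clause}\n"
            f"Does the simplified clause satisfy {', '.join(CRITERIA)}? "
            "Answer PASS or FAIL, then give a reason."
        )
        if verdict.strip().upper().startswith("PASS"):
            break
        clause = complete(
            "Revise the simplified clause so it passes the check.\n"
            f"Checker feedback: {verdict}\nSentiment elements: {elements}\n"
            f"Review: {review}"
        )
    return clause
```

In this sketch, the downstream extractor would then receive both the original review and the returned clause as joint input, mirroring the abstract's joint use of the original text and the simplified clauses when generating sentiment elements.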

Key words: Aspect-based sentiment analysis, Text simplification, Large language model, Self-validation, Natural language processing

CLC Number: TP391