Computer Science ›› 2025, Vol. 52 ›› Issue (5): 122-127. doi: 10.11896/jsjkx.240200039
ZHANG Jiaxiang, PAN Min, ZHANG Rui
Abstract: EEG-based emotion recognition refers to the technique of identifying emotional states by analyzing human electroencephalogram (EEG) signals, and it has broad applications in fields such as healthcare and human-computer interaction. At present, EEG emotion recognition typically relies on machine learning or deep learning methods that are trained extensively on labeled EEG data in order to distinguish different emotional states. However, previous methods depend heavily on large amounts of labeled data, data annotation is time-consuming and labor-intensive, and individual differences in EEG signals cause traditional methods to perform poorly. Meanwhile, studies have shown that the spatial structure information of EEG signals reflects the interactions among brain regions under different emotional states and helps improve the discriminability of emotions. To this end, an EEG emotion recognition method based on a self-supervised graph network is proposed. First, the EEG signals are preprocessed with a meiosis-based method; second, a graph convolutional network is used to extract the spatial structure information of the EEG signals, and a self-supervised auxiliary task is designed to train the graph convolutional network; finally, the feasibility and effectiveness of the proposed method are validated on the public SEED and SEED-IV datasets, achieving emotion recognition accuracies of 95.16% and 80.23%, respectively, outperforming existing methods.
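To make the pipeline described above concrete, the following is a minimal sketch, assuming a standard Kipf-Welling style graph convolution over EEG channels (e.g., the 62 SEED electrodes with 5-band differential-entropy features) and an NT-Xent style contrastive objective as a stand-in for the self-supervised auxiliary task. All names (EEGGraphConv, nt_xent_loss), dimensions, and hyperparameters are illustrative assumptions; this is not the authors' implementation and does not reproduce their meiosis-based preprocessing or training procedure.

```python
# Hypothetical sketch (not the authors' code): one graph-convolution layer over
# EEG channels plus an NT-Xent style self-supervised contrastive loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EEGGraphConv(nn.Module):
    """Kipf-Welling style graph convolution over EEG channels.

    x:   (batch, n_channels, n_features), e.g. 62 electrodes x 5 DE band features
    adj: (n_channels, n_channels) symmetric adjacency over electrodes
    """
    def __init__(self, in_feats: int, out_feats: int, n_channels: int):
        super().__init__()
        self.lin = nn.Linear(in_feats, out_feats)
        self.n_channels = n_channels

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Add self-loops and symmetrically normalize: D^{-1/2}(A+I)D^{-1/2}
        a_hat = adj + torch.eye(self.n_channels, device=adj.device)
        d_inv_sqrt = torch.diag(a_hat.sum(dim=1).pow(-0.5))
        a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt
        # Aggregate neighboring channel features, then project
        return F.relu(self.lin(a_norm @ x))

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """NT-Xent contrastive loss between two augmented views z1, z2: (batch, dim)."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2B, dim)
    sim = z @ z.t() / tau                                 # cosine similarity logits
    b = z1.size(0)
    mask = torch.eye(2 * b, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float("-inf"))            # exclude self-similarity
    targets = torch.cat([torch.arange(b, 2 * b), torch.arange(0, b)]).to(z.device)
    return F.cross_entropy(sim, targets)                  # positive pair = the other view

if __name__ == "__main__":
    batch, channels, feats = 8, 62, 5                     # assumed SEED-like dimensions
    x = torch.randn(batch, channels, feats)
    adj = (torch.rand(channels, channels) > 0.8).float()
    adj = ((adj + adj.t()) > 0).float()                   # make adjacency symmetric
    gcn = EEGGraphConv(feats, 16, channels)
    h1 = gcn(x, adj).mean(dim=1)                          # pooled embedding of view 1
    h2 = gcn(x + 0.05 * torch.randn_like(x), adj).mean(dim=1)  # perturbed view 2
    print(h1.shape, nt_xent_loss(h1, h2).item())
```

In this sketch the two "views" are produced by simple Gaussian perturbation purely for illustration; in the paper's setting the augmentation and auxiliary task are defined by the meiosis-based scheme described in the abstract.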