Computer Science (计算机科学) ›› 2021, Vol. 48 ›› Issue (12): 312-318. doi: 10.11896/jsjkx.201000141

• Artificial Intelligence •

  • Corresponding author: GAO Da-qi (gaodaqi@ecust.edu.cn)
  • Author email: 2271987116@qq.com

EEG Emotion Recognition Based on Frequency and Channel Convolutional Attention

CHAI Bing1,2, LI Dong-dong1,2, WANG Zhe1, GAO Da-qi1   

  1. School of Information Science and Engineering, East China University of Science and Technology, Shanghai 200237, China
    2. Provincial Key Laboratory of Computer Information Processing Technology, Soochow University, Suzhou, Jiangsu 215006, China
  • Received:2020-10-24 Revised:2021-03-20 Online:2021-12-15 Published:2021-11-26
  • About author: CHAI Bing, born in 1996, postgraduate, is a member of China Computer Federation. Her main research interests include deep learning and emotion recognition.
    GAO Da-qi, born in 1957, Ph.D, professor. His main research interests include machine learning and pattern recognition.
  • Supported by:
    National Natural Science Foundation of China (61806078, 62076094, 61976091), "Shuguang Program" Supported by Shanghai Education Development Foundation and Shanghai Municipal Education Commission (61725301), National Major Scientific and Technological Special Project (2019ZX09201004) and Shanghai Science and Technology Program (20511100600).



Abstract: Existing EEG emotion recognition studies generally use neural networks with a single attention mechanism to learn emotional features, which yields a relatively limited feature representation. Moreover, neuroscience studies have shown that EEG signals of different frequencies and electrode channels respond to emotion to different degrees. This paper therefore proposes a method that fuses frequency and electrode channel convolutional attention for EEG emotion recognition. Specifically, EEG signals are first decomposed into different frequency bands and the corresponding frame-level features are extracted. A pre-activated residual network is then employed to learn deep emotion-relevant features, and a frequency and electrode channel convolutional attention module is integrated into each pre-activated residual unit of the network to model the frequency and channel information of the EEG signals, generating the final attentive representation of the EEG features. Experimental results on the DEAP and DREAMER datasets in the subject-independent setting show that, compared with a single attention mechanism, the proposed convolutional attention better imports emotion-salient information from EEG signals and produces better emotion recognition results.

Key words: EEG emotion recognition, Feature representation, Residual network, Pre-activated residual unit, Frequency and electrode channel convolutional attention
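The abstract describes a two-part gating scheme: features are reweighted first per frequency band, then per electrode channel. As a rough, hypothetical sketch of that arithmetic only (not the authors' implementation — the pooling, bottleneck widths, and random weights below are invented for illustration, and in the paper the learned module sits inside every pre-activated residual unit), frequency-then-channel attention over a (band, channel, frame) feature tensor might look like:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def frequency_attention(x, w1, w2):
    """Gate each frequency band of x with shape (bands, channels, frames)."""
    s = x.mean(axis=(1, 2))            # squeeze: one scalar per band
    a = sigmoid(w2 @ np.tanh(w1 @ s))  # small bottleneck MLP -> (bands,) in (0, 1)
    return x * a[:, None, None]        # reweight bands, shape unchanged

def channel_attention(x, w1, w2):
    """Gate each electrode channel of x with shape (bands, channels, frames)."""
    s = x.mean(axis=(0, 2))            # squeeze: one scalar per channel
    a = sigmoid(w2 @ np.tanh(w1 @ s))  # -> (channels,) in (0, 1)
    return x * a[None, :, None]        # reweight channels, shape unchanged

rng = np.random.default_rng(0)
bands, channels, frames = 5, 32, 10    # e.g. delta..gamma bands, 32 electrodes
x = rng.standard_normal((bands, channels, frames))

# bottleneck widths 4 and 8 are arbitrary choices for the sketch
wf1, wf2 = rng.standard_normal((4, bands)), rng.standard_normal((bands, 4))
wc1, wc2 = rng.standard_normal((8, channels)), rng.standard_normal((channels, 8))

y = channel_attention(frequency_attention(x, wf1, wf2), wc1, wc2)
print(y.shape)  # attention only rescales features: (5, 32, 10)
```

In the method itself these weights are learned end-to-end with the residual network; the sketch shows only how the two attention maps multiply into the feature tensor without changing its shape.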

CLC Number: TP301