Computer Science ›› 2020, Vol. 47 ›› Issue (4): 112-118.doi: 10.11896/jsjkx.190200342

Special Issue: Medical Imaging

• Computer Graphics & Multimedia •

Approach to Classification of Eye Movement Directions Based on EEG Signal

CHENG Shi-wei1, CHEN Yi-jian1, XU Jing-ru1, ZHANG Liu-xin2, WU Jian-feng3, SUN Ling-yun4   

  1. 1 School of Computer Science and Technology,Zhejiang University of Technology,Hangzhou 310023,China;
    2 Lenovo Research,Beijing 100085,China;
    3 Institute of Industrial Design,Zhejiang University of Technology,Hangzhou 310023,China;
    4 State Key Lab of CAD&CG,Zhejiang University,Hangzhou 310027,China
  • Received:2019-04-26 Online:2020-04-15 Published:2020-04-15
  • Contact: CHENG Shi-wei,born in 1981,Ph.D,professor,Ph.D supervisor,is a senior member of China Computer Federation.His main research interests include human-computer interaction.
  • Supported by:
    This work was supported by the National Key Research & Development Program of China (2016YFB1001403) and National Natural Science Foundation of China (61772468,61672451).

Abstract: In order to improve the accuracy of eye movement direction identification based on electro-oculogram (EOG) signals,this paper utilizes electroencephalogram (EEG) signals containing EOG artifacts and proposes a new approach to classifying eye movement directions.Firstly,EEG signals from 8 channels over the frontal lobe of the human brain are collected,and the EEG data are pre-processed,including data normalization and least-squares based denoising.Then a support vector machine based method is applied to perform multiple binary classifications,and finally a voting strategy is used to solve the four-class classification problem,thus achieving eye movement direction identification.The experimental results show that,when using the proposed approach to classify eye movement directions,the classification accuracy rates in the upper,lower,left and right directions are 78.47%,72.22%,84.03% and 79.86% respectively,and the average classification accuracy rate reaches 78.65%.In addition,compared with existing classification methods,the proposed approach achieves a higher classification accuracy rate with a simpler classification algorithm.This validates the feasibility and effectiveness of using EEG signals to identify eye movement directions.
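The pipeline described in the abstract (per-channel normalization, least-squares based denoising, and pairwise SVM classification combined by voting) can be outlined as follows. This is a minimal sketch rather than the authors' implementation: the epoch layout, the min-max normalization, the polynomial order of the least-squares fit, and the SVM kernel and parameters are all assumptions.

```python
# Sketch of the abstract's pipeline: 8-channel frontal EEG epochs are
# normalized, denoised with a least-squares fit, and classified with
# pairwise SVMs whose outputs are combined by voting.
import numpy as np
from sklearn.svm import SVC
from sklearn.multiclass import OneVsOneClassifier


def preprocess(epochs):
    """epochs: (n_trials, n_channels, n_samples) raw EEG array."""
    # Per-channel min-max normalization to [0, 1] (assumed normalization).
    mins = epochs.min(axis=2, keepdims=True)
    maxs = epochs.max(axis=2, keepdims=True)
    norm = (epochs - mins) / (maxs - mins + 1e-12)

    # One plausible reading of "least-squares based denoising": fit a
    # low-order polynomial to each channel by least squares and keep the
    # smooth fit, discarding high-frequency noise (order 4 is an assumption).
    t = np.arange(norm.shape[2])
    denoised = np.empty_like(norm)
    for i in range(norm.shape[0]):
        for c in range(norm.shape[1]):
            coeffs = np.polyfit(t, norm[i, c], deg=4)
            denoised[i, c] = np.polyval(coeffs, t)

    # Flatten each trial into one feature vector for the SVM.
    return denoised.reshape(denoised.shape[0], -1)


def train(X_train, y_train):
    """X_train: raw epochs; y_train: labels in {up, down, left, right}."""
    # OneVsOneClassifier trains one binary SVM per direction pair and
    # resolves the four-class decision by majority voting.
    clf = OneVsOneClassifier(SVC(kernel="rbf", C=1.0))
    clf.fit(preprocess(X_train), y_train)
    return clf
```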

Key words: Brain-computer interface, Electroencephalogram, Electro-oculogram, Eye tracking, Human-computer interaction

CLC Number: 

  • TP311