Computer Science ›› 2019, Vol. 46 ›› Issue (8): 282-291.doi: 10.11896/j.issn.1002-137X.2019.08.047

• Artificial Intelligence •

Dynamic Trajectory Based Implicit Calibration Method for Eye Tracking

CHENG Shi-wei, QI Wen-jie   

(School of Computer Science and Technology, Zhejiang University of Technology, Hangzhou 310023, China)
Received: 2019-04-28  Online: 2019-08-15  Published: 2019-08-15

Abstract: Aiming at the limitations of existing multi-point calibration schemes, such as long calibration time and the poor gaze accuracy of simplified calibration schemes, this paper proposes an implicit calibration method for eye tracking that allows the eye tracking system to establish an accurate mapping relationship from only a few samples. The method has three steps. First, it collects calibration data: as the user's gaze follows a dynamic trajectory, it records the mapping point pairs between the image of the user's eyes and the calibration points. Second, a rationalized outlier removal method is proposed to automatically eliminate sample noise and select the best point pairs for establishing the mapping model. The acquisition of eye movement data is delayed, which reduces the error caused by the dynamic trajectory. In addition, when removing noisy samples, a method for eliminating erroneous pupil data is proposed, and the sample data are further filtered by the Random Sample Consensus (RANSAC) algorithm. Finally, calibration-free and single-point calibration are combined to simplify the subsequent implicit calibration process. Experimental results show that the average calibration time is 8 s and the average accuracy is 2.0° of visual angle at a viewing distance of 60 cm. In the simplified implicit calibration prototype system, for a user who has already been calibrated, the fixation coordinates are obtained by the calibration-free method, with an average calibration time of 2 s and an average accuracy of 2.47° of visual angle. For a new user, an individual difference compensation model is computed by the single-point calibration method to obtain the fixation coordinates, with an average calibration time of 3 s and an average accuracy of 2.49° of visual angle, which further improves the practicality of the implicit calibration method.
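The core of the second step — filtering noisy pupil-to-screen point pairs with RANSAC and fitting a mapping model on the surviving inliers — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the second-order polynomial feature set, the iteration count, and the inlier threshold (in pixels) are all assumptions chosen for the example.

```python
import numpy as np

def poly_features(pupil):
    # Second-order polynomial terms [1, x, y, xy, x^2, y^2] per sample.
    x, y = pupil[:, 0], pupil[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_mapping(pupil, screen):
    # Least-squares fit of the pupil->screen mapping,
    # one column of coefficients per screen axis.
    A = poly_features(pupil)
    coeffs, *_ = np.linalg.lstsq(A, screen, rcond=None)
    return coeffs  # shape (6, 2)

def apply_mapping(coeffs, pupil):
    return poly_features(pupil) @ coeffs

def ransac_calibrate(pupil, screen, n_iter=200, thresh=15.0, seed=0):
    # RANSAC: repeatedly fit on a minimal random subset (6 pairs, the
    # minimum for a 6-term polynomial), keep the model that explains the
    # most point pairs, then refit on all of its inliers.
    rng = np.random.default_rng(seed)
    n = len(pupil)
    best_inliers = None
    for _ in range(n_iter):
        idx = rng.choice(n, size=6, replace=False)
        coeffs = fit_mapping(pupil[idx], screen[idx])
        err = np.linalg.norm(apply_mapping(coeffs, pupil) - screen, axis=1)
        inliers = err < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_mapping(pupil[best_inliers], screen[best_inliers]), best_inliers
```

Point pairs corrupted by blinks or trajectory lag produce large residuals under any model fitted to the clean majority, so they never enter the final least-squares fit; this is why the paper's delayed sampling plus RANSAC can tolerate noisy implicit calibration data.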

Key words: Calibration free, Eye tracking, Fixation, Human-computer interaction, Implicit calibration

CLC Number: TP391