Computer Science ›› 2021, Vol. 48 ›› Issue (2): 231-237. doi: 10.11896/jsjkx.191200143

• Artificial Intelligence •

Deaf Sign Language Recognition Based on Inertial Sensor Fusion Control Algorithm

RAN Meng-yuan1, LIU Li1, LI Yan-de2, WANG Shan-shan1   

  1 School of Big Data & Software Engineering, Chongqing University, Chongqing 400044, China
    2 School of Information Science & Engineering, Lanzhou University, Lanzhou 730000, China
  • Received: 2019-12-24  Revised: 2020-05-11  Online: 2021-02-15  Published: 2021-02-04
  • About author: RAN Meng-yuan, born in 1995, postgraduate, is a student member of China Computer Federation. His main research interests include artificial intelligence and pattern recognition.
    LIU Li, born in 1981, Ph.D, associate professor, is a member of China Computer Federation. His main research interests include big data analysis, artificial intelligence and smart wearable technology.

Abstract: How deaf people can communicate effectively with the outside world has long been a difficult issue that attracts much attention. This paper proposes a sign language recognition scheme based on an inertial sensor fusion control algorithm, which aims to achieve efficient and accurate real-time sign language recognition. The fusion control algorithm uses feedback control ideas to fuse two traditional attitude calculation methods, reduces the impact of the environment on the sensors, and can accurately obtain the attitude of the measured object even in transient states. The scheme performs data fusion, data preprocessing and feature extraction on the collected deaf-mute sign language data, and uses an adaptive model integration method composed of support vector machines (SVM), K-nearest neighbors (KNN) and feedforward neural networks (FNN) for classification. The results show that the proposed sensor fusion control algorithm effectively obtains real-time poses, and the sign language recognition scheme accurately recognizes 30 deaf-mute pinyin sign languages with a recognition accuracy of 96.5%. The work of this paper lays a solid foundation for sign language recognition for deaf people and provides a reference for related research on sensor fusion control.
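The following sketches make the two core steps of the scheme concrete. The first is a minimal complementary-filter-style fusion of gyroscope and accelerometer attitude estimates, one common way to realize the feedback-based fusion of two attitude calculation methods described above; the sample period, gain and function names are illustrative assumptions, not the authors' parameters.

import numpy as np

# Gyroscope integration gives a fast but drifting attitude; the accelerometer
# gives a drift-free but noisy gravity reference. A proportional feedback blend
# pulls the integrated estimate back toward the accelerometer reading.

DT = 0.01    # sample period in seconds (100 Hz, assumed)
K_P = 0.98   # weight on the gyro-integrated estimate (assumed)

def acc_to_pitch_roll(acc):
    # Pitch/roll (rad) from a 3-axis accelerometer reading [ax, ay, az].
    ax, ay, az = acc
    pitch = np.arctan2(-ax, np.sqrt(ay ** 2 + az ** 2))
    roll = np.arctan2(ay, az)
    return np.array([pitch, roll])

def fuse_step(angle, gyro_rate, acc):
    # One fusion step: integrate the gyro rate, then correct toward the accelerometer.
    gyro_angle = angle + gyro_rate * DT             # fast, drift-prone path
    acc_angle = acc_to_pitch_roll(acc)              # slow, noise-prone reference
    return K_P * gyro_angle + (1.0 - K_P) * acc_angle

angle = np.zeros(2)
for gyro_rate, acc in [(np.array([0.1, 0.0]), np.array([0.0, 0.05, 0.98]))] * 5:
    angle = fuse_step(angle, gyro_rate, acc)
print(angle)   # fused pitch/roll estimate in radians

The second sketch shows an SVM/KNN/FNN ensemble built from scikit-learn stand-ins; plain soft voting is used here, which is not necessarily the paper's adaptive integration rule, and the hyperparameters are assumed.

from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import VotingClassifier

# Three base classifiers combined by averaging their predicted class probabilities.
ensemble = VotingClassifier(
    estimators=[
        ("svm", SVC(kernel="rbf", probability=True)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("fnn", MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)),
    ],
    voting="soft",
)
# Usage (with an extracted feature matrix X and sign labels y):
# ensemble.fit(X_train, y_train); predictions = ensemble.predict(X_test)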

Key words: Attitude and heading reference system, Deaf sign language recognition, Inertial sensor, Motion capture, Sensor fusion

CLC Number: TP399