Computer Science ›› 2021, Vol. 48 ›› Issue (8): 328-333. doi: 10.11896/jsjkx.210300079

• Human-Machine Interaction •

Real-time LSTM-based Multi-dimensional Features Gesture Recognition

LIU Liang, PU Hao-yang   

  1. School of Cyber Science and Engineering, Sichuan University, Chengdu 610000, China
  • Received: 2021-03-05  Revised: 2021-05-31  Published: 2021-08-10
  • About author: LIU Liang, born in 1982, Ph.D, is a member of China Computer Federation. His main research interests include vulnerability exploitation, malicious code analysis and hand gesture recognition. (liangzhai118@scu.edu.cn) PU Hao-yang, born in 2000, postgraduate. His main research interests include vulnerability analysis and hand gesture recognition.
  • Supported by: Sichuan Science and Technology Program (2021YFG0159).

Abstract: Gesture recognition is widely used in the sensing field. Existing approaches fall into three categories: those based on computer vision, on depth sensors, and on motion sensors. Recognition based on motion sensors requires less input data, runs fast, and acquires three-dimensional hand information directly, so it has gradually become a research hotspot. Traditional motion-sensor-based gesture recognition is essentially a pattern recognition problem, and its accuracy depends heavily on feature sets extracted from prior experience. Unlike traditional pattern recognition methods, deep learning can greatly reduce the workload of hand-crafted heuristic feature extraction. To address this limitation, this paper proposes a real-time multi-dimensional feature recognition method based on Long Short-Term Memory (LSTM), and its performance is verified by extensive experiments. The method first defines a gesture library consisting of five basic gestures and seven complex gestures. Based on the kinematic characteristics of hand posture, angle features and displacement features are extracted, and the frequency-domain features of the sensor data are extracted by the short-time Fourier transform (STFT). The three kinds of features are then fed into an LSTM deep neural network for training, so that the collected gestures can be classified and recognized. To verify the effectiveness of the proposed method, gesture data from six volunteers are collected with a self-designed hand-held experience stick as the experimental data set. The experimental results show that the proposed method achieves an accuracy of 94.38% on basic and complex gestures, nearly 2% higher than the traditional support vector machine, K-nearest neighbor and fully connected neural network methods.
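
The processing pipeline summarized above (angle and displacement features derived from hand posture, plus STFT frequency-domain features of the inertial signals, all fed to an LSTM classifier) can be sketched roughly as follows in Python. This is an illustrative reconstruction, not the authors' implementation: the sampling rate, STFT window length, hidden size, the 12-way output (five basic plus seven complex gestures) and the placeholder feature arrays are assumptions.

import numpy as np
import torch
import torch.nn as nn
from scipy.signal import stft

def frequency_features(axis_signal, fs=100, nperseg=32):
    # STFT magnitude of one sensor axis; rows = time frames, columns = frequency bins.
    _, _, Zxx = stft(axis_signal, fs=fs, nperseg=nperseg)
    return np.abs(Zxx).T                                    # (frames, nperseg // 2 + 1)

class GestureLSTM(nn.Module):
    # LSTM over per-frame feature vectors (angle + displacement + frequency features).
    def __init__(self, feat_dim, hidden=64, n_classes=12):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                                   # x: (batch, frames, feat_dim)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])                        # class logits from the last frame

# Toy usage: a 3-axis inertial segment -> per-axis STFT features -> LSTM classifier.
segment = np.random.randn(3, 200)                           # placeholder for one recorded gesture
freq = np.concatenate([frequency_features(a) for a in segment], axis=1)
angles = np.zeros((freq.shape[0], 3), dtype=np.float32)     # placeholder angle features
disp = np.zeros((freq.shape[0], 3), dtype=np.float32)       # placeholder displacement features
features = np.concatenate([angles, disp, freq], axis=1).astype(np.float32)
model = GestureLSTM(feat_dim=features.shape[1])
logits = model(torch.from_numpy(features).unsqueeze(0))     # shape (1, 12): one score per gesture
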

Key words: Gesture recognition, Human-computer interaction, Inertial sensor, Motion capture

CLC Number: TP391.41