Computer Science, 2025, Vol. 52, Issue (1): 221-231. doi: 10.11896/jsjkx.240400108

• Computer Graphics & Multimedia •


Contact-free IR-UWB Human Motion Recognition Based on Dual-stream Fusion Network

ZHANG Chuanzong, WANG Dongzi, GUO Zhengxin, GUI Linqing, XIAO Fu   

  1. School of Computer Science, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
    Jiangsu High Technology Research Key Laboratory for Wireless Sensor Networks, Nanjing 210023, China
  • Received: 2024-04-15  Revised: 2024-07-02  Online: 2025-01-15  Published: 2025-01-09
  • Corresponding author: GUO Zhengxin (guozx@njupt.edu.cn)
  • About author: ZHANG Chuanzong, born in 1997, postgraduate (zhangcz@njupt.edu.cn). His main research interests include IR-UWB and mobile computing.
    GUO Zhengxin, born in 1993, Ph.D, lecturer. His main research interests include wireless sensing, mobile computing, deep learning and Internet of Things.
  • Supported by:
    National Science Fund for Distinguished Young Scholars of China (62125203), National Natural Science Foundation of China (61932013), Key Research and Development Program of Jiangsu Province (BE2022798) and Natural Science Research Start-up Foundation of Recruiting Talents of Nanjing University of Posts and Telecommunications (NY223041).


Abstract: With the rapid development of intelligent sensing technology, the field of human computer interaction (HCI) has entered a new era. Traditional HCI methods, predominantly reliant on wearable devices or cameras to collect user behavior data, have significant limitations despite their precise recognition capabilities. Wearable devices, for instance, impose an additional burden on users, whereas camera-based solutions are susceptible to ambient lighting conditions and pose significant privacy concerns. These challenges considerably restrict their applicability in daily life. To address these challenges, we exploit the exceptional sensitivity and fine spatial resolution of impulse radio ultra-wideband (IR-UWB) in the radio frequency (RF) domain and propose a novel, contact-free human motion recognition method based on a dual-stream fusion network. The method captures the temporal signal variations caused by target movements and extracts the corresponding frequency-domain features by analyzing Doppler frequency shift (DFS) changes in the time-domain signals. On this basis, a dual-stream network model integrating multi-dimensional convolutional neural networks (CNNs) and GoogLeNet modules is built to achieve precise motion recognition. Extensive experimental results show that the proposed method achieves an average accuracy of 94.89% for eight common daily human actions and maintains an accuracy of over 90% under varying test conditions, thereby validating its robustness.
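The abstract does not detail how the two feature streams are obtained. As a rough illustration only (not the authors' exact pipeline), one common way to turn an IR-UWB slow-time/fast-time frame matrix into a time-domain variation map and a Doppler spectrogram is background subtraction followed by a short-time Fourier transform along slow time; the frame shape, pulse repetition frequency, and STFT parameters below are assumptions chosen for the demonstration.

# Illustrative sketch: deriving time-domain variation and Doppler features
# from an IR-UWB frame matrix. All parameters are assumptions for demo use,
# not the settings reported in the paper.
import numpy as np
from scipy.signal import stft

def extract_features(frames: np.ndarray, prf: float = 200.0):
    """frames: (num_slow_time, num_fast_time) IR-UWB frame matrix.
    Returns (time_domain_map, doppler_spectrogram)."""
    # 1) Suppress the static background (clutter) so that only variations
    #    caused by the moving target remain in the time-domain map.
    background = frames.mean(axis=0, keepdims=True)
    motion = frames - background                      # time-domain variation map

    # 2) Pick the range bin with the strongest motion energy and apply a
    #    short-time Fourier transform along slow time to expose the
    #    Doppler frequency shifts produced by the action.
    bin_idx = np.argmax(np.var(motion, axis=0))
    slow_time = motion[:, bin_idx]
    f, t, Z = stft(slow_time, fs=prf, nperseg=64, noverlap=48)
    doppler_spectrogram = np.abs(Z)                   # frequency-domain feature

    return motion, doppler_spectrogram

# Example usage with synthetic data (500 slow-time frames x 256 range bins).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = rng.normal(size=(500, 256))
    td_map, dfs_map = extract_features(demo)
    print(td_map.shape, dfs_map.shape)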

Key words: Human computer interaction, Wireless sensing, Impulse radio ultra-wideband(IR-UWB), Motion recognition
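Likewise, the abstract only names the building blocks of the dual-stream model (multi-dimensional CNNs plus GoogLeNet modules). The PyTorch sketch below shows one plausible late-fusion arrangement of such branches for an eight-class action recognizer; the framework choice, class names, layer sizes, input resolutions, and fusion strategy are hypothetical and do not reproduce the authors' reported architecture.

# Illustrative sketch: a minimal dual-stream fusion classifier in the spirit of
# the abstract (one convolutional branch for the time-domain map, one
# Inception-style branch for the Doppler spectrogram). All sizes are assumed.
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    """Simplified GoogLeNet-style module: parallel 1x1 / 3x3 / 5x5 convolutions."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.b1 = nn.Conv2d(in_ch, out_ch, kernel_size=1)
        self.b3 = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.b5 = nn.Conv2d(in_ch, out_ch, kernel_size=5, padding=2)

    def forward(self, x):
        return torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1)

class DualStreamNet(nn.Module):
    def __init__(self, num_classes: int = 8):
        super().__init__()
        # Stream 1: CNN branch over the time-domain variation map.
        self.time_branch = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Stream 2: Inception-style branch over the Doppler spectrogram.
        self.freq_branch = nn.Sequential(
            InceptionBlock(1, 8), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Late fusion by feature concatenation, then an 8-way classifier.
        self.classifier = nn.Linear(32 + 3 * 8, num_classes)

    def forward(self, x_time, x_freq):
        ft = self.time_branch(x_time).flatten(1)
        ff = self.freq_branch(x_freq).flatten(1)
        return self.classifier(torch.cat([ft, ff], dim=1))

# Example forward pass with dummy tensors (batch of 4).
if __name__ == "__main__":
    model = DualStreamNet()
    logits = model(torch.randn(4, 1, 128, 128), torch.randn(4, 1, 33, 64))
    print(logits.shape)  # torch.Size([4, 8])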

CLC Number: TP391