Computer Science, 2023, Vol. 50, Issue (12): 192-202. doi: 10.11896/jsjkx.221000188

• Computer Graphics & Multimedia •

  • Corresponding author: WU Wei (weiwu@jiangnan.edu.cn)
  • About author: (772111651@qq.com)

Following Method of Mobile Robot Based on Fusion of Stereo Camera and UWB

FU Yong, WU Wei, WAN Zeqing   

  1. School of Internet of Things Engineering, Jiangnan University, Wuxi, Jiangsu 214122, China
  • Received: 2022-10-23 Revised: 2023-03-19 Online: 2023-12-15 Published: 2023-12-07
  • About author: FU Yong, born in 1996, postgraduate. His main research interests include robot systems and path planning.
    WU Wei, born in 1985, associate professor. His main research interests include distributed parameter systems, computational intelligence and robot systems.
  • Supported by:
    Natural Science Foundation of Jiangsu Province (BK20201340).


Abstract: This paper studies an autonomous following robot in a human-robot integration environment. In particular, a stable and effective method is presented for the robot to determine the desired following target and to re-identify the target after it is lost. Visual tracking and localization of pedestrians is first achieved from the images and point-cloud data of a stereo camera; then, ultra-wide-band (UWB) localization information is introduced to determine the target pedestrian, and a filtering algorithm fuses the sensor data to obtain the target's coordinates in the camera coordinate system; finally, a coordinate transformation converts this position into the robot coordinate system. A modified dynamic window algorithm (MDWA) is also proposed and serves as the robot's following control method. In addition, based on sensor data, a behaviour decision module comprising following, recovery and transition behaviours is proposed; by switching between behaviours, the robot can retrieve the target even when it is lost because the target turns or because a change in ambient lighting renders the camera ineffective. Experimental results show that the proposed following system automatically determines the desired following target at start-up, and that the robot achieves good obstacle-avoiding following both in scenes with static obstacles and in dynamic scenes with other, non-target pedestrians in view. In particular, the robot can autonomously retrieve the followed target in turning scenes and under varying lighting conditions, and its following success rate in turning scenes reaches 81%.
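The fuse-then-transform step described in the abstract can be sketched as follows. This is a minimal illustration with hypothetical function names; a fixed-weight blend stands in for the paper's unspecified filtering algorithm (a Kalman filter would be the usual choice), and the extrinsic rotation/translation values are assumed for the example.

```python
import numpy as np

def fuse_positions(cam_xy, uwb_xy, alpha=0.8):
    """Blend the camera and UWB estimates of the target position,
    both expressed in the camera frame. alpha weights the (typically
    more precise) camera measurement; a Kalman filter could replace
    this fixed-weight blend."""
    return alpha * np.asarray(cam_xy, dtype=float) \
        + (1 - alpha) * np.asarray(uwb_xy, dtype=float)

def camera_to_robot(p_cam, R_rc, t_rc):
    """Rigid transform of a 2D point from the camera frame to the
    robot frame: p_robot = R_rc @ p_cam + t_rc, where R_rc and t_rc
    come from extrinsic calibration."""
    return R_rc @ p_cam + t_rc

# Camera estimate (2.0, 0.5) m, UWB estimate (2.2, 0.4) m; camera
# assumed mounted 0.1 m ahead of the robot center with no rotation.
p_cam = fuse_positions((2.0, 0.5), (2.2, 0.4))
p_robot = camera_to_robot(p_cam, np.eye(2), np.array([0.1, 0.0]))
```

The fused estimate is computed once per frame and handed to the controller in the robot frame.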

Key words: Human-robot integration, Human following, Dynamic window, Navigation, Obstacle avoidance
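The dynamic window approach underlying the proposed MDWA samples velocity pairs reachable within one control step, rolls each out over a short horizon, and picks the best-scoring admissible pair. The sketch below is a generic, simplified DWA under assumed limits and scoring weights, not the paper's MDWA; all names and parameters are illustrative.

```python
import math

def rollout(x, y, th, v, w, dt, steps):
    """Integrate a unicycle model forward and return the end pose."""
    for _ in range(steps):
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += w * dt
    return x, y, th

def dwa_choose(pose, vel, goal, obstacles,
               a_max=0.5, aw_max=1.5, dt=0.1, steps=10,
               w_goal=1.0, w_clear=0.5, robot_radius=0.3):
    """Sample (v, w) pairs inside the dynamic window around the current
    velocity, score each rollout by goal heading and obstacle clearance,
    and return the best admissible pair."""
    v0, w0 = vel
    best, best_score = (0.0, 0.0), -math.inf
    for i in range(7):
        v = max(0.0, v0 - a_max * dt) + i * (2 * a_max * dt) / 6
        for j in range(7):
            w = (w0 - aw_max * dt) + j * (2 * aw_max * dt) / 6
            x, y, th = rollout(*pose, v, w, dt, steps)
            clear = min((math.hypot(x - ox, y - oy) for ox, oy in obstacles),
                        default=float("inf"))
            if clear < robot_radius:      # rollout ends in collision
                continue
            head = -abs(math.atan2(goal[1] - y, goal[0] - x) - th)
            score = w_goal * head + w_clear * min(clear, 2.0)
            if score > best_score:
                best, best_score = (v, w), score
    return best

# Robot at the origin moving at 0.5 m/s toward a goal 3 m ahead,
# with one obstacle slightly off the direct path.
v, w = dwa_choose((0.0, 0.0, 0.0), (0.5, 0.0),
                  goal=(3.0, 0.0), obstacles=[(1.5, 0.05)])
```

A full planner would also reject pairs that cannot stop before the nearest obstacle and would score clearance along the whole trajectory rather than only at its end point.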

CLC number: TP242
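The behaviour decision module switches among following, transition and recovery behaviours based on sensor data. The paper's exact switching conditions are not given here, so the state machine below is a hypothetical sketch: the camera-tracking flag, UWB-range flag and time-out threshold are all assumptions.

```python
from enum import Enum, auto

class Behavior(Enum):
    FOLLOWING = auto()   # camera tracks the target; normal following
    TRANSITION = auto()  # camera lost the target; steer by UWB only
    RECOVERY = auto()    # target lost too long; actively search for it

def next_behavior(camera_sees_target, uwb_in_range, lost_time, t_max=2.0):
    """Hypothetical switching rules: stay in FOLLOWING while the camera
    tracks the target; fall back to TRANSITION when only UWB data is
    available; enter RECOVERY once the target has been lost for more
    than t_max seconds or UWB contact is also gone."""
    if camera_sees_target:
        return Behavior.FOLLOWING
    if uwb_in_range and lost_time <= t_max:
        return Behavior.TRANSITION
    return Behavior.RECOVERY
```

In this scheme the transition behaviour is what lets the robot re-acquire a target lost to a turn or a lighting change: UWB keeps a coarse bearing alive until the camera re-identifies the pedestrian.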