Computer Science ›› 2023, Vol. 50 ›› Issue (12): 185-191. doi: 10.11896/jsjkx.230300116

• Computer Graphics & Multimedia •


Stereo Visual Localization and Mapping for Mobile Robot in Agricultural Environments

YU Tao1, XIONG Shengwu1,2   

  1 School of Computer Science and Artificial Intelligence, Wuhan University of Technology, Wuhan 430070, China
    2 Sanya Science and Education Innovation Park of Wuhan University of Technology, Sanya, Hainan 572000, China
  • Received: 2023-03-14 Revised: 2023-07-07 Online: 2023-12-15 Published: 2023-12-07
  • Corresponding author: XIONG Shengwu (xiongsw@whut.edu.cn)
  • About author: YU Tao (tyu@whut.edu.cn), born in 1994, Ph.D. candidate, is a member of China Computer Federation. His main research interests include visual SLAM and computer vision.
    XIONG Shengwu, born in 1967, Ph.D., professor, Ph.D. supervisor, is a member of China Computer Federation. His main research interests include artificial intelligence, machine learning and evolutionary algorithms.
  • Supported by:
    National Natural Science Foundation of China (62176194, 62101393), Major Project of Internet of Vehicles (2020AAA001), Project of Sanya Yazhou Bay Science and Technology City (SCKJ-JYRC-2022-76), Sanya Science and Education Innovation Park of Wuhan University of Technology (2021KF0031) and Natural Science Foundation of Chongqing, China (cstc2021jcyj-msxmX1148).

Abstract: Visual localization and mapping is a key technology for the autonomous navigation of mobile robots, and in agricultural environments it faces added challenges: few distinguishable landmarks for feature tracking, large-scale scenes, and unstable motion, all of which degrade accuracy and robustness. To address these problems, a stereo visual localization and mapping method for agricultural scenes is proposed. First, static stereo matching points are used to increase the number and coverage of map points in the tracking stage, which improves the accuracy of depth estimation; a point selection algorithm is further proposed to sample the dense map points and remove outliers, improving both accuracy and runtime efficiency. Then, explicit scale estimation is introduced to reduce the scale error of localization and mapping in large-scale scenes, and the keyframe selection criteria are adapted to the characteristics of the scene, which avoids the sparse keyframe distribution caused by large far-away objects. Finally, a new motion assumption is proposed to build a recovery strategy for pose-estimation failures, improving robustness under bumpy motion. Evaluation on an agricultural-scene dataset shows that, compared with state-of-the-art visual localization and mapping systems, the proposed method reduces trajectory error on the difficult sequences by more than 50%, with the scale error on three sequences reduced by an order of magnitude; it thus achieves higher accuracy and robustness and effectively handles the challenges of visual localization and mapping in agricultural environments.
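
The abstract gives no implementation details, but its point-selection step (sampling dense stereo map points and removing outliers) is concrete enough to sketch. The Python snippet below is a minimal illustration, not the paper's code: every name and threshold in it (select_points, grid, mad_k) is a hypothetical choice made here. It rejects depth outliers with a median-absolute-deviation test, then keeps one point per image grid cell so the retained points cover the frame uniformly.

    import numpy as np

    def select_points(us, vs, depths, grid=32, mad_k=3.0):
        """Hypothetical sketch (not the paper's algorithm): subsample dense
        stereo map points on a regular image grid and reject depth outliers.

        us, vs : pixel coordinates of candidate points (1-D arrays)
        depths : stereo depth of each candidate point
        grid   : grid cell size in pixels; at most one point kept per cell
        mad_k  : outlier threshold in scaled median absolute deviations
        """
        us, vs, depths = map(np.asarray, (us, vs, depths))

        # Robust outlier test: drop points whose depth deviates from the
        # median by more than mad_k scaled MADs, suppressing gross stereo
        # mismatches before any point enters the map.
        med = np.median(depths)
        mad = 1.4826 * np.median(np.abs(depths - med)) + 1e-9
        inliers = np.abs(depths - med) < mad_k * mad

        # Keep one point per grid cell so the selection covers the whole
        # image instead of clustering on a few highly textured regions.
        best = {}
        for i in np.flatnonzero(inliers):
            cell = (int(us[i]) // grid, int(vs[i]) // grid)
            # Prefer the nearest point in each cell: stereo depth error
            # grows with distance, so close points are more reliable.
            if cell not in best or depths[i] < depths[best[cell]]:
                best[cell] = i
        return np.array(sorted(best.values()), dtype=int)

    # Toy usage: 5000 random candidates in a 640x480 image.
    rng = np.random.default_rng(0)
    n = 5000
    sel = select_points(rng.uniform(0, 640, n), rng.uniform(0, 480, n),
                        rng.gamma(2.0, 5.0, n))
    print(f"kept {sel.size} of {n} candidate points")

Uniform spatial coverage is the goal the abstract states for this step; the sampling pattern and outlier test actually used by the authors may well differ.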

Key words: Agricultural environments, Visual localization and mapping, Direct method, Stereo vision, Scale estimation

CLC Number: TP391