Computer Science, 2023, Vol. 50, Issue (12): 185-191. DOI: 10.11896/jsjkx.230300116

• Computer Graphics & Multimedia •

Stereo Visual Localization and Mapping for Mobile Robot in Agricultural Environments

YU Tao 1, XIONG Shengwu 1,2

  1 School of Computer Science and Artificial Intelligence, Wuhan University of Technology, Wuhan 430070, China
  2 Sanya Science and Education Innovation Park of Wuhan University of Technology, Sanya, Hainan 572000, China
  • Received: 2023-03-14  Revised: 2023-07-07  Online: 2023-12-15  Published: 2023-12-07
  • About author: YU Tao, born in 1994, Ph.D. candidate, is a member of China Computer Federation. His main research interests include visual SLAM and computer vision.
    XIONG Shengwu, born in 1967, Ph.D., professor, Ph.D. supervisor, is a member of China Computer Federation. His main research interests include artificial intelligence, machine learning and evolutionary algorithms.
  • Supported by:
    National Natural Science Foundation of China (62176194, 62101393), Major Project of IoV (2020AAA001), Project of Sanya Yazhou Bay Science and Technology City (SCKJ-JYRC-2022-76), Sanya Science and Education Innovation Park of Wuhan University of Technology (2021KF0031) and Natural Science Foundation of Chongqing, China (cstc2021jcyj-msxmX1148).

Abstract: Vision-based localization and mapping is a key technology for autonomous robots. Visual localization and mapping in agricultural environments faces additional challenges, including few distinguishable landmarks for tracking, large-scale scenes, and unstable movement. To address these problems, a stereo visual localization and mapping method is proposed. Static stereo matching points are used to increase the number and coverage of map points, which improves the accuracy of depth calculation. A point selection method is proposed to further improve accuracy and efficiency by sampling the dense map points and removing outliers. Scale estimation is then introduced to reduce the scale error of localization and mapping in large-scale agricultural scenes. The keyframe criteria are adapted to avoid the impact of large far-away objects that could cause an abnormal keyframe distribution. Finally, a new motion assumption is proposed to recover the system from tracking failure, which improves the system's robustness under unstable movement. Experimental results show that the proposed method achieves better performance than other state-of-the-art visual localization and mapping systems. By addressing these challenges individually, the proposed visual localization and mapping system is more accurate and robust in agricultural environments.
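The point-selection step summarized above (sampling the dense stereo-matched points and removing outliers) can be illustrated with a short sketch. The Python code below is a hypothetical illustration under assumed parameters (focal length, baseline, grid size, outlier threshold), not the authors' implementation: it grid-samples a dense disparity map, converts disparity to depth, and rejects points whose depth deviates strongly from the local statistics of their cell.

```python
import numpy as np

def select_stereo_points(disparity, fx=700.0, baseline=0.12,
                         grid=8, min_disp=1.0, z_thresh=2.5):
    """Grid-sample a dense disparity map and drop depth outliers.

    A minimal sketch with assumed parameter values (fx, baseline, grid,
    thresholds); it is not the paper's actual selection rule.
    Returns an (N, 3) array of selected points as (u, v, depth).
    """
    h, w = disparity.shape
    selected = []
    for v0 in range(0, h - grid + 1, grid):
        for u0 in range(0, w - grid + 1, grid):
            cell = disparity[v0:v0 + grid, u0:u0 + grid]
            valid = cell > min_disp
            if valid.sum() < 4:                      # too few stereo matches in this cell
                continue
            depth = fx * baseline / cell[valid]      # stereo depth from disparity
            mu, sigma = depth.mean(), depth.std() + 1e-6
            keep = np.abs(depth - mu) < z_thresh * sigma   # reject depth outliers
            if not keep.any():
                continue
            vs, us = np.nonzero(valid)               # pixel coordinates of valid matches
            # keep one point per cell: the inlier with the largest disparity,
            # i.e. the closest and usually most reliable measurement
            idx = np.argmax(cell[vs[keep], us[keep]])
            vi, ui = vs[keep][idx], us[keep][idx]
            selected.append((u0 + ui, v0 + vi, fx * baseline / cell[vi, ui]))
    return np.asarray(selected)

# Example on a synthetic disparity map
disp = np.random.uniform(0.0, 40.0, size=(480, 640)).astype(np.float32)
points = select_stereo_points(disp)
```

Sampling one point per grid cell keeps the selected map points spread evenly across the image, which matters in agricultural scenes that offer few distinctive landmarks; the per-cell depth statistics are a simple stand-in for whatever outlier test the system actually applies.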

Key words: Agricultural environments, Visual localization and mapping, Direct method, Stereo vision, Scale estimation

CLC Number: TP391