Computer Science, 2022, Vol. 49, Issue (11A): 211200088-5. doi: 10.11896/jsjkx.211200088
LYU Run1,2, LI Guan-yu3, QI Pei3, QIAN Wei-xing3, WANG Lan-ze3, FENG Tai-ping1,2
Abstract: To address the problem that low-precision inertial sensors degrade the performance of simultaneous localization and mapping (SLAM) based on LiDAR/inertial information fusion, a multi-line LiDAR SLAM optimization scheme assisted by a rotation-modulated strapdown inertial navigation system is proposed. The scheme investigates a rotation-modulated strapdown inertial alignment method based on fuzzy adaptive Kalman filtering, which corrects the vehicle attitude and inertial sensor errors in real time while the vehicle is in motion. On this basis, the corrected inertial sensor data and the LiDAR point cloud data are fused in a tightly coupled manner to improve the accuracy and real-time performance of localization and mapping when the vehicle moves through complex scenes. Experimental results show that the SLAM scheme based on fusion of rotation-modulated inertial navigation and multi-line LiDAR information effectively improves the localization performance of the LiDAR/inertial odometry as well as the accuracy of the point cloud map, while maintaining real-time computation.
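The fuzzy adaptive Kalman filtering mentioned above can be illustrated with a minimal sketch: the filter monitors the innovation sequence and rescales the measurement-noise covariance R by an adaptation factor when the observed innovation power deviates from its theoretical value. This is a hypothetical illustration under simplified assumptions (a scalar scale factor in place of a full fuzzy inference system), not the paper's exact formulation; the function name and variables are assumptions.

```python
import numpy as np

def fuzzy_adaptive_kf_step(x, P, z, F, H, Q, R):
    """One Kalman filter step in which the measurement-noise covariance R
    is rescaled by an adaptation factor derived from the innovation.
    Hypothetical sketch of fuzzy adaptive filtering, not the paper's method."""
    # Predict state and covariance
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q

    # Innovation and its theoretical covariance
    nu = z - H @ x_pred
    S = H @ P_pred @ H.T + R

    # Compare actual vs. theoretical innovation power; a ratio above 1
    # suggests the filter is underestimating the measurement noise.
    ratio = float(nu @ nu) / float(np.trace(S))
    # Placeholder for a fuzzy inference system: clamp the ratio into a
    # bounded scale factor applied to R.
    alpha = np.clip(ratio, 0.5, 5.0)
    S_adapt = H @ P_pred @ H.T + alpha * R

    # Update using the adapted innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S_adapt)
    x_new = x_pred + K @ nu
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

In a rotation-modulated alignment context, the state x would typically hold attitude and sensor-bias errors, and the adaptation keeps the filter consistent as vibration or maneuver conditions change.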
CLC Number: