
Research On Lidar-camera Fusion SLAM Algorithm For Autonomous Driving Systems

Posted on: 2022-07-27 | Degree: Master | Type: Thesis
Country: China | Candidate: C G Zhang | Full Text: PDF
GTID: 2492306563475214 | Subject: Mechanical and electrical engineering
Abstract/Summary:
With the development of computer technology, the performance of each module of autonomous driving systems has improved accordingly. Among these modules, localization is the most fundamental, and its accuracy and stability play a vital role in the safety of the overall system. Simultaneous localization and mapping (SLAM) has been increasingly used in autonomous driving localization modules in recent years, commonly combining inertial navigation with lidar or a monocular camera. Since laser-based methods depend on the geometric features of the scene and vision-based methods are easily affected by illumination, this paper designs a lidar-visual fusion SLAM algorithm that improves the overall performance of the localization pipeline by exploiting the complementary strengths of the two sensors.

To address the instability of SLAM that relies on a single data source, vision or laser, in changing scenes, the front end of the algorithm extracts and matches laser and visual features separately and proposes using the lidar point cloud to recover depth for ORB feature points, yielding SPACE-ORB features. To reduce the algorithm's dependence on either data source, these features are fed into the pose estimation module together with the lidar feature points in a loosely coupled manner.

For the constraints inherent in optimizing pose matrices directly (such as the orthogonality of rotation matrices), Lie algebra is introduced to simplify the solution process and reduce the computational burden. In addition, to cope with excessive outliers in the data, a Cauchy-Huber kernel function is proposed to improve the pose estimation module's tolerance to outliers, which improves the robustness of the algorithm as a whole.

To limit the computation caused by large-scale joint optimization of poses and feature points, the back end adopts a balanced selection strategy based on keyframes and a sliding window, together with a classified optimization strategy that handles the Lie-algebra parameterizations of feature points and poses separately, reducing the computational load. To address the high false-detection rate of loop closure detection based on laser point clouds alone, a visual bag-of-words model built on ORB feature points is adopted to improve the detection mechanism and raise detection accuracy. For map construction, since point cloud stitching at keyframes strongly affects the overall accuracy of the map, visual pixels are treated as fixed constraints in the lidar point cloud optimization at each keyframe, that is, in the point cloud stitching over a local range, which improves stitching accuracy and the quality of the constructed map.
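For reference, the two classical robust kernels that the proposed Cauchy-Huber function presumably builds on are the Huber and Cauchy loss functions. Their standard definitions are given below only as background; the abstract does not state the exact combined form used in the thesis:

\[
\rho_{\mathrm{Huber}}(r) =
\begin{cases}
\tfrac{1}{2}r^{2}, & |r| \le \delta \\
\delta\bigl(|r| - \tfrac{\delta}{2}\bigr), & |r| > \delta
\end{cases}
\qquad
\rho_{\mathrm{Cauchy}}(r) = \frac{c^{2}}{2}\,\ln\!\Bigl(1 + \frac{r^{2}}{c^{2}}\Bigr)
\]

Here r is a residual and δ, c are scale parameters; both kernels down-weight large residuals so that outlying feature matches contribute less to the pose estimate.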
To verify the effectiveness of the algorithm, experiments were carried out on the KITTI dataset and in campus scenes. The KITTI sequences cover diverse environments such as towns, suburbs, and highways; the evaluation metrics are mainly ATE, RTE, APE, and RPE (reported as STD, RMSE, and Mean), together with the average CPU and memory usage while the algorithm runs. The results show that the average relative positioning error of the proposed algorithm is 0.11 m in translation and 0.002 rad in rotation, which is more accurate and robust than existing methods using only lidar (A-LOAM) or only a camera (ORB-SLAM2), so the algorithm can be effectively applied to vehicle positioning for autonomous driving. In addition, field experiments were carried out in a campus environment, successfully achieving 3D reconstruction of both large and small scenes as well as path positioning of the vehicle within the reconstructed scenes.
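As context for the trajectory metrics cited above, the absolute trajectory error (ATE) is conventionally reported as the RMSE of the translational differences between the estimated poses P_i and the ground-truth poses Q_i after aligning the two trajectories with a rigid-body transform S. This is the common definition used by public evaluation tools and is given here only as a reference, not as the exact formulation in the thesis:

\[
\mathrm{ATE}_{\mathrm{RMSE}} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\bigl\lVert \operatorname{trans}\!\bigl(Q_{i}^{-1}\, S\, P_{i}\bigr)\bigr\rVert^{2}}
\]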
Keywords/Search Tags:Simultaneous localization and mapping (SLAM), Sensor fusion, Nonlinear optimization, Autonomous vehicle positioning