
Research On Localization And Mapping Of Mobile Robot Based On Vision/Laser

Posted on: 2020-08-29
Degree: Master
Type: Thesis
Country: China
Candidate: L Kang
Full Text: PDF
GTID: 2428330626450479
Subject: Instrumentation engineering

Abstract/Summary:
With the development of the information industry, mobile robots are becoming increasingly intelligent and autonomous and are widely used in many fields. Simultaneous localization and mapping (SLAM) has long been a hot and difficult topic in robotics research, and vision-based and laser-based mapping are two of the most commonly used approaches. In this thesis, a wheeled mobile robot serves as the experimental platform, and the fusion of visual and laser information is studied on this platform to improve the accuracy of localization and mapping and to support autonomous navigation. The main contents are as follows:

(1) 3D visual SLAM based on an RGB-D sensor is studied. For feature extraction and matching, the ORB algorithm is compared with SIFT and SURF and shown to be more efficient in real time. To address the low accuracy and poor robustness of automatic 3D point cloud registration, a registration algorithm based on an improved ICP method is proposed. Experiments show that the improved scheme is more robust, converges faster, and yields better registration results, meeting the requirements of point cloud registration (a sketch of standard ICP follows this abstract). A binocular camera is also used to map and localize the surrounding environment, with good results.

(2) A 2D lidar SLAM method based on an improved particle filter is proposed to address the high computational cost of the traditional particle filter algorithm. To mitigate the particle depletion caused by resampling, the robot's odometry and the range data from the laser sensor are fused to optimize the proposal distribution, and an adaptive resampling mechanism is introduced (see the adaptive-resampling sketch below). Verification experiments on the mobile robot platform show that the improved RBPF-SLAM method can build a grid map in real time, with clearly improved efficiency and accuracy.

(3) The fusion of visual and laser SLAM is studied. Considering the complementary strengths and weaknesses of the binocular camera and the 2D lidar, a Kalman filter is used for loosely coupled fusion of their pose estimates (see the pose-fusion sketch below). Experiments in two indoor scenes, a hall and a corridor, show that the fusion improves positioning accuracy. Finally, fused vision-laser mapping and autonomous navigation experiments are carried out; in complex environments, the map built with fused visual information is closer to the real environment than the map built from laser data alone and better reflects the obstacles in the scene.
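The thesis's improved ICP variant is not detailed in this abstract; the following is only a minimal Python/NumPy sketch of standard SVD-based ICP on synthetic data, with brute-force nearest-neighbour matching. The function names (best_fit_transform, icp) and the test cloud are illustrative assumptions, not the thesis's implementation.

import numpy as np

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B (both Nx3)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

def icp(src, dst, max_iter=50, tol=1e-6):
    """Align src to dst by alternating nearest-neighbour matching and rigid fitting."""
    cur = src.copy()
    prev_err = np.inf
    for _ in range(max_iter):
        # brute-force nearest neighbours (a k-d tree would be used in practice)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        idx = d2.argmin(axis=1)
        err = np.sqrt(d2[np.arange(len(cur)), idx]).mean()
        R, t = best_fit_transform(cur, dst[idx])
        cur = cur @ R.T + t
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    # cumulative transform from the original src to the aligned cloud
    return best_fit_transform(src, cur)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.uniform(-1.0, 1.0, (200, 3))
    c, s = np.cos(0.2), np.sin(0.2)
    R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    dst = src @ R_true.T + np.array([0.1, -0.05, 0.05])
    R_est, t_est = icp(src, dst)
    print("estimated rotation:\n", R_est.round(3))
    print("estimated translation:", t_est.round(3))

Improved ICP variants generally differ in how correspondences are selected, weighted, or rejected; the SVD-based rigid-fit core above is usually unchanged.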
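The improved proposal distribution is likewise not specified in the abstract; the sketch below only illustrates the adaptive resampling mechanism on a toy 1D localization problem: weights are updated from odometry and a single range measurement, and resampling is triggered only when the effective sample size N_eff = 1 / Σ w_i² drops below half the particle count. The particle count, noise levels, and landmark position are made-up values for illustration.

import numpy as np

rng = np.random.default_rng(1)
N = 500                      # number of particles (illustrative)
landmark = 10.0              # known landmark position in a toy 1D world

def neff(weights):
    """Effective sample size; resample only when it falls below a threshold."""
    return 1.0 / np.sum(weights ** 2)

def systematic_resample(particles, weights):
    positions = (rng.random() + np.arange(N)) / N
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), N - 1)
    return particles[idx], np.full(N, 1.0 / N)

particles = rng.normal(0.0, 1.0, N)   # initial belief about the robot position
weights = np.full(N, 1.0 / N)
true_pos = 0.0

for step in range(20):
    # motion update: odometry reports a 1 m move, corrupted by noise
    true_pos += 1.0
    particles += 1.0 + rng.normal(0.0, 0.1, N)

    # measurement update: noisy range to the landmark reweights the particles
    z = (landmark - true_pos) + rng.normal(0.0, 0.2)
    expected = landmark - particles
    weights *= np.exp(-0.5 * ((z - expected) / 0.2) ** 2)
    weights += 1e-300                 # avoid an all-zero weight vector
    weights /= weights.sum()

    # adaptive resampling: only when the weights have degenerated
    if neff(weights) < N / 2:
        particles, weights = systematic_resample(particles, weights)

    est = np.average(particles, weights=weights)
    print(f"step {step:2d}  estimate {est:6.2f}  true {true_pos:6.2f}")

In a full RBPF-SLAM system each particle also carries its own grid map, and the proposal is typically refined with laser scan matching in addition to odometry, which is the part the thesis improves.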
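A minimal sketch of the loosely coupled pose fusion idea with a linear Kalman filter: the state is the planar pose (x, y, yaw), odometry increments drive the prediction, and the visual and laser pose estimates enter as two independent full-pose measurements with H = I. The class name PoseFusionKF and the covariances Q, R_vis, R_laser are assumptions for illustration, yaw wrap-around is ignored, and this is not claimed to be the thesis's exact formulation.

import numpy as np

class PoseFusionKF:
    """Fuses full-pose measurements (x, y, yaw) from two SLAM front ends."""
    def __init__(self):
        self.x = np.zeros(3)                 # state: [x, y, yaw]
        self.P = np.eye(3)                   # state covariance

    def predict(self, d_odom, Q):
        """Propagate the pose with an odometry increment and process noise Q."""
        self.x = self.x + d_odom
        self.P = self.P + Q

    def update(self, z_pose, R):
        """Correct with a full-pose measurement (H = I) and measurement noise R."""
        S = self.P + R
        K = self.P @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ (z_pose - self.x)
        self.P = (np.eye(3) - K) @ self.P

if __name__ == "__main__":
    kf = PoseFusionKF()
    Q = np.diag([0.02, 0.02, 0.01])          # odometry process noise (assumed)
    R_vis = np.diag([0.05, 0.05, 0.03])      # binocular-vision pose noise (assumed)
    R_laser = np.diag([0.02, 0.02, 0.01])    # 2D-lidar pose noise (assumed)

    rng = np.random.default_rng(2)
    true_pose = np.zeros(3)
    for k in range(10):
        d = np.array([0.5, 0.0, 0.05])       # commanded motion per step
        true_pose = true_pose + d
        kf.predict(d + rng.normal(0.0, 0.05, 3), Q)
        kf.update(true_pose + rng.normal(0.0, 0.1, 3), R_vis)     # visual pose
        kf.update(true_pose + rng.normal(0.0, 0.05, 3), R_laser)  # laser pose
        print(f"step {k}: fused {kf.x.round(3)}  true {true_pose.round(3)}")

Loose coupling here means each SLAM front end produces its own pose estimate and only those poses are fused, as opposed to tightly coupling raw features or scans.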
Keywords/Search Tags:Mobile robot, 2D lidar, Binocular camera, SLAM, Kalman filter