
3D Calibration Of Fish Eye Lens And Lidar Based On SIFT Algorithm

Posted on: 2021-02-27 | Degree: Master | Type: Thesis
Country: China | Candidate: Z Q Lai | Full Text: PDF
GTID: 2518306557999369 | Subject: Instrument Science and Technology
Abstract/Summary:
For the three-dimensional (3D) reconstruction of large objects and indoor scenes, depth sensors represented by lidar are increasingly widely used, and the technical means of obtaining 3D information have matured accordingly. To obtain colored 3D information, the 3D data acquired by the depth sensor must be registered with the color images acquired by an ordinary Charge-Coupled Device (CCD) camera. This requires obtaining the relative position and pose between the two sensors in advance, as a prerequisite for data fusion and 3D scene reconstruction. Current research on pose calibration focuses mainly on the data fusion of RGB-Depth (RGB-D) sensors, which integrate a depth sensor and an ordinary camera in a single device, while research on pose calibration between a lidar and a fisheye camera remains limited. Because the fisheye camera has a wide imaging angle, with non-similar imaging and barrel distortion, the pose matrix obtained by existing calibration techniques deviates considerably from the true one. This thesis therefore studies the key technology of data fusion between a lidar and a fisheye camera. Through data preprocessing, the problem of determining the pose relationship between the fisheye camera and the lidar is transformed into the problem of calibrating the rotation matrix between the two sensors' panoramas using the SIFT algorithm. The main work of the thesis is as follows.

Firstly, to address the distortion and lens eccentricity of the fisheye lens, the intrinsic parameters are calibrated, and the fisheye images collected by the equipment are stitched into a cylindrically unwrapped color panorama. The point cloud data are then processed and expanded according to their (u, v) values to obtain a depth-map panorama.

Secondly, by analyzing the imaging model, the coordinates of the same point in the point cloud data and in the color panorama can be obtained by matching features between the fisheye camera's color panorama and the lidar's depth panorama. The feature points extracted by the SIFT algorithm are combined into a feature plane, and the normal vector of this plane is used as a constraint to compute the rotation-translation matrix. This matrix is taken as the result of the pose calibration and unifies the coordinate systems of the fisheye camera and the lidar.

Finally, the rotation-translation matrix obtained from the calibration is applied to the fisheye-lens and lidar data, and a 360° full-scene fusion of point cloud and color data is completed for visual point-cloud display. This verifies the accuracy of the pose calibration between the lidar and the fisheye camera and provides a feasible technical solution for applying lidar and fisheye cameras to the 3D reconstruction of large scenes.
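The "expansion of the point cloud according to (u, v) values" described above can be illustrated with a cylindrical (azimuth-elevation) projection. This is a minimal sketch, not the thesis's exact unwrapping: the function name, the panorama resolution, and the choice of range as the pixel value are all assumptions.

```python
import numpy as np

def point_cloud_to_depth_panorama(points, width=1024, height=256):
    """Project lidar points (N, 3) onto a depth panorama.

    Hypothetical layout: u is the azimuth angle, v is the elevation angle,
    and each pixel stores the range (distance) of the nearest point.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.sqrt(x**2 + y**2 + z**2)
    azimuth = np.arctan2(y, x)                                  # [-pi, pi)
    elevation = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1.0, 1.0))
    u = ((azimuth + np.pi) / (2 * np.pi) * (width - 1)).astype(int)
    v = ((elevation + np.pi / 2) / np.pi * (height - 1)).astype(int)
    pano = np.zeros((height, width), dtype=np.float64)
    # keep the nearest return when several points fall into one pixel
    for ui, vi, ri in zip(u, v, r):
        if pano[vi, ui] == 0 or ri < pano[vi, ui]:
            pano[vi, ui] = ri
    return pano
```

The resulting single-channel panorama has the same unwrapped geometry as the cylindrical color panorama, which is what makes cross-modal SIFT matching between the two images possible.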
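Once SIFT correspondences between the two panoramas yield matched 3D points, a rotation-translation matrix must be estimated. The thesis constrains this estimate with the normal vector of a feature plane; as a stand-in, the sketch below uses the standard closed-form least-squares rigid alignment (Kabsch/SVD), which is an assumption, not the thesis's exact method.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) with dst ~ R @ src + t.

    src, dst: matched 3D point sets of shape (N, 3).
    Closed-form Kabsch solution via SVD of the cross-covariance.
    """
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # reflection guard: force det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```

Applying this rotation-translation matrix to the lidar points expresses them in the fisheye camera's coordinate system, which is the stated goal of the pose calibration.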
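The final fusion step above, attaching color to each lidar point, can be sketched as: transform the points with the calibrated (R, t), re-project them into the same cylindrical layout as the color panorama, and sample the color at each pixel. The function name and the nearest-pixel sampling are illustrative assumptions.

```python
import numpy as np

def colorize_points(points, R, t, color_pano):
    """Sample panorama colors for lidar points (N, 3) -> (N, C).

    points are mapped into the camera frame with the calibrated (R, t),
    then projected with the same azimuth/elevation layout assumed for
    the cylindrical panorama (nearest-pixel sampling).
    """
    h, w = color_pano.shape[:2]
    cam = points @ R.T + t                       # lidar frame -> camera frame
    x, y, z = cam[:, 0], cam[:, 1], cam[:, 2]
    r = np.linalg.norm(cam, axis=1)
    azimuth = np.arctan2(y, x)
    elevation = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1.0, 1.0))
    u = ((azimuth + np.pi) / (2 * np.pi) * (w - 1)).astype(int)
    v = ((elevation + np.pi / 2) / np.pi * (h - 1)).astype(int)
    return color_pano[v, u]
```

The colored point cloud produced this way is what the thesis displays to visually verify the 360° fusion of point cloud and color data.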
Keywords/Search Tags: laser radar, fisheye lens, pose calibration, data fusion