Intelligent vehicle technology and energy consumption control technology are important measures for addressing energy shortages, traffic congestion, and frequent road accidents, as they can improve traffic efficiency and reduce congestion. Multi-sensor information fusion is a key challenge in intelligent vehicle environment perception. How to use multi-sensor information fusion to synthesize the traffic status of urban roads and share this information through a vehicle-road-cloud collaborative system is therefore a question that must be considered in research on energy-saving, safe, and efficient intelligent transportation strategies. Multi-sensor fusion is the primary prerequisite for realizing driverless vehicles, and multi-sensor joint calibration is the basis of multi-sensor data fusion. This paper therefore takes multi-sensor joint calibration as its research object and studies two problems: the extrinsic calibration between the LiDAR and the Global Navigation Satellite System (GNSS)/Inertial Measurement Unit (IMU) used for high-precision mapping, and the extrinsic calibration between the LiDAR and the camera used for object detection. The specific contents are as follows:

Firstly, the imaging principles of conventional and fisheye lenses are analyzed and their mathematical imaging models are established, and the measurement principle of LiDAR is described. The coordinate transformations involved in multi-sensor spatial synchronization are then analyzed, and the mathematical formulations of three-dimensional rotation by Euler angles, Rodrigues' rotation formula, quaternions, and Lie groups and Lie algebras are derived. Two time synchronization methods, hardware synchronization and software synchronization, are introduced. The coordinate transformation relationships between the millimeter-wave radar and the camera, and between the LiDAR and the camera, are deduced, and the projection relationship between the 3D world coordinate system and the pixel coordinate system is established.

Secondly, combining the characteristics of the mine environment and the motion of the vehicles, a joint LiDAR-GNSS/IMU calibration strategy based on a calibration pattern is developed, and the transformation between 3D points in the LiDAR coordinate system and the Universal Transverse Mercator (UTM) grid system is established. The normal vector of the ground point cloud is used to obtain an initial value of the extrinsic parameters. To avoid large pitch and roll motions of the unmanned mining vehicle, a scheme is proposed that calibrates the rotation parameters under a constant attitude and the translation parameters through circular motion. The effectiveness of the proposed method is verified by theoretical error analysis and real-vehicle experiments, which show that it overcomes the failure of LiDAR odometry for the unmanned mining vehicle in an open field while maintaining a certain calibration accuracy.

Thirdly, the LiDAR-GNSS/IMU calibration model is constructed based on the hand-eye calibration method. After the point cloud distortion is corrected with GNSS/IMU data, the LiDAR odometry is computed by LeGO-LOAM, and the joint calibration algorithm is implemented in C++ with the Ceres optimization library. The extrinsic parameters are calibrated on urban road data collected by a real vehicle, and a point cloud map is built from the optimal extrinsic parameters. The experiments show that the hand-eye-based joint calibration of LiDAR-GNSS/IMU achieves good results in an urban environment with multiple constraints.
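For reference, the standard hand-eye constraint that underlies this kind of LiDAR-GNSS/IMU extrinsic calibration can be sketched as follows; the notation below, with X denoting the LiDAR-to-IMU extrinsic and A_k, B_k the relative motions between consecutive poses, is an illustrative assumption rather than the thesis's exact parameterization:

\[
\mathbf{A}_k\,\mathbf{X} = \mathbf{X}\,\mathbf{B}_k, \qquad
\mathbf{A}_k = \left(\mathbf{T}_I^{k}\right)^{-1}\mathbf{T}_I^{k+1}, \qquad
\mathbf{B}_k = \left(\mathbf{T}_L^{k}\right)^{-1}\mathbf{T}_L^{k+1},
\]
where \(\mathbf{T}_I^{k}\) is the GNSS/IMU pose and \(\mathbf{T}_L^{k}\) the LiDAR odometry pose at time k. Splitting this into rotation and translation components gives
\[
\mathbf{R}_{A_k}\mathbf{R}_X = \mathbf{R}_X\mathbf{R}_{B_k}, \qquad
\left(\mathbf{R}_{A_k} - \mathbf{I}\right)\mathbf{t}_X = \mathbf{R}_X\,\mathbf{t}_{B_k} - \mathbf{t}_{A_k},
\]
and the extrinsic parameters can be refined by minimizing the residuals of these constraints over all pose pairs, for example with a nonlinear least-squares solver such as Ceres.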
Finally, a universal joint calibration scheme for multiple LiDARs and multiple cameras is constructed. Methods for detecting and refining calibration pattern corners in the point cloud and in the image are proposed, the correspondence between 3D points and 2D points is established, and a minimum reprojection error function is constructed through point-to-point constraints. To verify the calibration accuracy, the optimized extrinsic parameters and the camera intrinsic parameters are used to project the point cloud onto the image. The results show that the proposed method can effectively calibrate the extrinsic parameters between the LiDAR and the camera.
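As a rough illustration of the point-to-point reprojection constraint, a minimal Ceres residual for LiDAR-camera extrinsic calibration might be sketched as below. The pinhole intrinsics (fx, fy, cx, cy), the angle-axis plus translation parameterization of the extrinsic, and all identifiers are assumptions made for illustration, not the thesis's actual implementation.

// Minimal sketch (illustrative only): a point-to-point reprojection residual
// for LiDAR-camera extrinsic calibration with the Ceres solver. The pinhole
// intrinsics and the angle-axis + translation extrinsic parameterization are
// assumptions, not necessarily the model used in the thesis.
#include <ceres/ceres.h>
#include <ceres/rotation.h>

struct ReprojectionResidual {
  ReprojectionResidual(double X, double Y, double Z,   // 3D corner in LiDAR frame
                       double u, double v,             // detected 2D corner in pixels
                       double fx, double fy, double cx, double cy)
      : X_(X), Y_(Y), Z_(Z), u_(u), v_(v), fx_(fx), fy_(fy), cx_(cx), cy_(cy) {}

  template <typename T>
  bool operator()(const T* const angle_axis,  // 3 params: rotation LiDAR -> camera
                  const T* const t,           // 3 params: translation LiDAR -> camera
                  T* residuals) const {
    // Transform the LiDAR point into the camera frame.
    const T p_l[3] = {T(X_), T(Y_), T(Z_)};
    T p_c[3];
    ceres::AngleAxisRotatePoint(angle_axis, p_l, p_c);
    p_c[0] += t[0];  p_c[1] += t[1];  p_c[2] += t[2];

    // Pinhole projection into pixel coordinates.
    const T u_proj = T(fx_) * p_c[0] / p_c[2] + T(cx_);
    const T v_proj = T(fy_) * p_c[1] / p_c[2] + T(cy_);

    // Point-to-point reprojection error.
    residuals[0] = u_proj - T(u_);
    residuals[1] = v_proj - T(v_);
    return true;
  }

  static ceres::CostFunction* Create(double X, double Y, double Z,
                                     double u, double v,
                                     double fx, double fy, double cx, double cy) {
    return new ceres::AutoDiffCostFunction<ReprojectionResidual, 2, 3, 3>(
        new ReprojectionResidual(X, Y, Z, u, v, fx, fy, cx, cy));
  }

  double X_, Y_, Z_, u_, v_, fx_, fy_, cx_, cy_;
};

// Usage sketch: add one residual block per 3D-2D corner correspondence and
// minimize the total reprojection error to refine the extrinsic parameters.
// ceres::Problem problem;
// problem.AddResidualBlock(
//     ReprojectionResidual::Create(X, Y, Z, u, v, fx, fy, cx, cy),
//     nullptr, angle_axis, translation);
// ceres::Solver::Options options;
// ceres::Solver::Summary summary;
// ceres::Solve(options, &problem, &summary);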