
Research on an Inertial-Vision Fusion Positioning Method Based on FAST-KNN Point Feature Detection and Matching

Posted on: 2023-10-06
Degree: Master
Type: Thesis
Country: China
Candidate: X J Sun
Full Text: PDF
GTID: 2532306905485324
Subject: Electronic and communication engineering
Abstract/Summary:
The most critical link in autonomous driving and Internet of Vehicles systems is achieving high-precision autonomous navigation and positioning of the vehicle. The accuracy of the positioning information received by the top-level decision-making system is closely related to the correctness of the judgments made by the perception module. The vehicle itself should not have to rely on external navigation equipment such as smartphones, which also helps ensure driver safety. The inertial positioning system is adopted as the main system on many navigation and positioning platforms because of its high short-term accuracy and its iterative acquisition of carrier motion information. However, because cost and performance constrain each other, low-cost inertial devices alone cannot satisfy systems with high accuracy requirements. Vision camera sensors, in contrast, are inexpensive and help the carrier capture multi-dimensional information about its environment; fusing this visual information to assist positioning effectively improves the overall localization accuracy of the system.

In this paper, the inertial navigation system is taken as the main positioning system, vehicle autonomous positioning technology based on the integration of an inertial measurement unit (IMU) and a binocular vision sensor is studied, and an inertial-vision fusion positioning method based on the FAST-KNN algorithm is realized. To address the problems encountered in implementing the fusion system, this thesis carries out the following work. First, the error parameter model of the IMU is derived, the random error characteristics of the IMU are analyzed with the Allan variance method, and the inertial navigation solution process is verified through simulation and real measurements. Second, the imaging model and distortion model of the binocular camera are derived, and the camera is calibrated to obtain the intrinsic parameters of the system. Third, three image feature detection and matching algorithms are compared and analyzed; to cope with poor matching performance and frequent mismatches in environments with monotonous texture, a point-feature-based FAST-KNN method is adopted, the matching results produced by different thresholds are compared, and the effectiveness of the feature detection and matching method is verified. Finally, building on the feature detection and matching method, the visual information is solved, the binocular camera and the IMU are jointly calibrated to obtain the spatio-temporal synchronization parameters, the inertial-vision fusion filtering model is derived, and the inertial navigation and visual solutions are fused with an error-state Kalman filter. Tests on a public dataset and in real scenes confirm that the inertial-vision fusion positioning method realized in this paper can effectively improve the overall positioning accuracy of the system and has practical feasibility and value.
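The Allan variance analysis of IMU random errors mentioned above can be reproduced with a few lines of NumPy. The sketch below is a minimal, assumed implementation of the overlapped Allan deviation applied to a single gyroscope channel; the 100 Hz sampling rate, one-hour record length, and noise level are illustrative values, not parameters taken from the thesis.

```python
import numpy as np

def allan_deviation(rate, fs, m_list):
    """Overlapped Allan deviation of a rate signal (e.g. gyro output).

    rate   : 1-D array of sensor samples
    fs     : sampling frequency in Hz
    m_list : iterable of cluster sizes (in samples)
    Returns (taus, adev) arrays.
    """
    tau0 = 1.0 / fs
    theta = np.cumsum(rate) * tau0          # integrate rate -> angle
    n = theta.size
    taus, adev = [], []
    for m in m_list:
        if 2 * m >= n:
            break
        tau = m * tau0
        # second difference of the integrated signal over clusters of size m
        d = theta[2 * m:] - 2.0 * theta[m:n - m] + theta[:n - 2 * m]
        avar = np.sum(d ** 2) / (2.0 * tau ** 2 * (n - 2 * m))
        taus.append(tau)
        adev.append(np.sqrt(avar))
    return np.array(taus), np.array(adev)

if __name__ == "__main__":
    fs = 100.0                                       # assumed 100 Hz IMU
    rate = 0.01 * np.random.randn(int(3600 * fs))    # 1 h of synthetic gyro noise
    m_list = np.unique(np.logspace(0, 5, 60).astype(int))
    taus, adev = allan_deviation(rate, fs, m_list)
```

On a log-log plot of deviation versus averaging time, the characteristic slopes (-1/2 for angle random walk, 0 for bias instability, +1/2 for rate random walk) are used to read off the IMU random error coefficients.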
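Binocular camera calibration against a pinhole-plus-distortion model is commonly carried out with a chessboard target and OpenCV; the abstract does not say which tool the author used, so the following is only a hedged sketch under that assumption. The file paths, the 9x6 inner-corner pattern, and the 25 mm square size are illustrative, not values from the thesis.

```python
import glob
import cv2
import numpy as np

# Chessboard geometry (assumed: 9x6 inner corners, 25 mm squares)
pattern = (9, 6)
square = 0.025
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts_l, img_pts_r = [], [], []
for fl, fr in zip(sorted(glob.glob("left/*.png")), sorted(glob.glob("right/*.png"))):
    gl = cv2.imread(fl, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(fr, cv2.IMREAD_GRAYSCALE)
    okl, cl = cv2.findChessboardCorners(gl, pattern)
    okr, cr = cv2.findChessboardCorners(gr, pattern)
    if okl and okr:
        obj_pts.append(objp)
        img_pts_l.append(cl)
        img_pts_r.append(cr)

# Per-camera intrinsics (focal lengths, principal point) and distortion coefficients
_, K1, D1, _, _ = cv2.calibrateCamera(obj_pts, img_pts_l, gl.shape[::-1], None, None)
_, K2, D2, _, _ = cv2.calibrateCamera(obj_pts, img_pts_r, gr.shape[::-1], None, None)

# Stereo extrinsics: rotation R and translation T from left to right camera
_, K1, D1, K2, D2, R, T, _, _ = cv2.stereoCalibrate(
    obj_pts, img_pts_l, img_pts_r, K1, D1, K2, D2, gl.shape[::-1],
    flags=cv2.CALIB_FIX_INTRINSIC)
```

K1/K2 hold the intrinsic parameters, D1/D2 the distortion coefficients, and R, T the relative pose (baseline) between the two cameras, which is what the stereo solution and the later joint IMU-camera calibration build on.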
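The core of the FAST-KNN step is FAST corner detection followed by k-nearest-neighbour descriptor matching with a ratio test. The abstract does not name the descriptor computed at the FAST keypoints, so the sketch below assumes ORB (BRIEF-style) descriptors; the detection threshold, ratio value, and file names are likewise illustrative.

```python
import cv2

def fast_knn_match(img1, img2, fast_threshold=20, ratio=0.75):
    """Detect FAST corners, describe them with ORB descriptors (an assumption,
    since the abstract names no descriptor), and match with k-NN + ratio test."""
    fast = cv2.FastFeatureDetector_create(threshold=fast_threshold)
    orb = cv2.ORB_create()

    kp1 = fast.detect(img1, None)
    kp2 = fast.detect(img2, None)
    kp1, des1 = orb.compute(img1, kp1)
    kp2, des2 = orb.compute(img2, kp2)

    # Brute-force Hamming matcher, k = 2 nearest neighbours per descriptor
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    knn = matcher.knnMatch(des1, des2, k=2)

    # Lowe ratio test rejects ambiguous matches (helps in low-texture scenes)
    good = [p[0] for p in knn if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return kp1, kp2, good

if __name__ == "__main__":
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical images
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
    kp1, kp2, good = fast_knn_match(left, right)
    print(f"{len(good)} matches kept after the ratio test")
```

Lowering the FAST threshold yields more corners in weakly textured scenes at the cost of more candidate mismatches; this is the threshold trade-off whose matching results the thesis compares.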
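The final fusion step uses an error-state Kalman filter (ESKF). The thesis's full filter state (attitude, velocity, position, and IMU bias errors, plus the spatio-temporal calibration) is not spelled out in the abstract, so the sketch below is deliberately reduced to position and velocity errors with a loosely coupled visual position measurement, purely to illustrate the predict / update / feedback-and-reset pattern; all noise values are assumptions.

```python
import numpy as np

class SimpleESKF:
    """Minimal error-state KF sketch for loosely coupled IMU/vision fusion.
    Error state: [dp(3), dv(3)]; attitude and bias errors are omitted here."""

    def __init__(self, dt):
        self.x = np.zeros(6)                  # error-state mean
        self.P = np.eye(6) * 0.1              # error covariance
        self.F = np.eye(6)                    # discrete error dynamics: dp' = dv
        self.F[0:3, 3:6] = np.eye(3) * dt
        self.Q = np.diag([1e-4] * 3 + [1e-2] * 3) * dt   # assumed process noise
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))]) # vision observes dp
        self.R = np.eye(3) * 0.05 ** 2        # assumed 5 cm visual position noise

    def predict(self):
        # Between updates the error mean stays zero; only covariance propagates
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, ins_pos, vis_pos):
        # Innovation: inertial minus visual position estimate ~ position error
        z = ins_pos - vis_pos
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = K @ z
        self.P = (np.eye(6) - K @ self.H) @ self.P
        # Feed the estimated error back into the inertial solution, then reset
        correction = self.x[0:3]
        self.x[:] = 0.0
        return ins_pos - correction
```

In the full system the estimated errors are fed back into the inertial mechanization (attitude, velocity, position, and biases) after every visual update and the error state is reset to zero, which keeps the linearization of the error dynamics valid.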
Keywords/Search Tags: fusion localization, feature matching, stereo vision, Kalman filter