As one of the most effective assisted localization methods in the cruise and landing phases of UAVs, visual localization can provide absolute position information for a UAV when GPS fails. However, because visual imaging is affected by the complex environment and the flight state, current visual navigation systems are mostly designed for specific scenes and perform positioning and navigation from a single viewpoint. In view of this, this thesis studies the visual localization system and its key technologies from the UAV's two conventional viewpoints, the downward-looking (top) view and the forward-looking (front) view, and carries out a modular, systematic design. According to the flight phase, the system is divided into a cruise positioning system and a visually guided landing system, as follows:

(1) For downward-looking visual positioning and navigation during the cruise stage, a scene matching algorithm based on SURF_LATCH and a reference image generation and sequence-image motion estimation algorithm based on a keyframe strategy are proposed. First, to address the adaptability of the reference map in scene matching, the reference map is generated by dividing image windows and selecting keyframes, which solves the coarse localization problem. Then, features are extracted and described with SURF_LATCH to achieve image matching and precise localization (an illustrative matching sketch is given after the abstract). Finally, feature tracking is used to estimate the motion of the sequence images within the window. Experimental results show that the proposed method localizes the UAV quickly and accurately: matching is 1.5 times faster than SURF matching, and the positioning and navigation error is within 10 meters at a flight altitude of 500 meters.

(2) For forward-looking visual positioning and navigation during the cruise stage, a virtual perspective imaging system is designed, and images from the corresponding viewpoints are generated for forward-view matching and positioning. First, a 3D planar scene model based on a satellite map is constructed under reasonable assumptions. Then, the virtual viewpoint and camera parameters are introduced. Finally, images of the corresponding viewpoints are generated and matched according to the camera imaging principle (a rendering sketch is given after the abstract). The virtual perspective imaging system designed in this thesis can generate images at the corresponding viewpoints once the route and positions are defined, and its reliability is verified through similarity-detection experiments against real images. Matching the virtual viewpoint images with real flight images taken at an altitude of 2100 meters shows that the positioning accuracy meets the navigation requirement.

(3) For visual positioning and navigation during the landing stage, a visual guidance landing system is constructed with separated front end and back end. In the front end, deep learning is used to recognize the airport, and the boundary lines of the airport runway are then extracted (a sideline-extraction sketch is given after the abstract). In the back end, the recognized sidelines are used as constraints and optimized over continuous multi-frame observations to reduce the positioning error. The system is verified on virtual landing data produced by the virtual perspective imaging system and on real UAV landing data, and it performs well. On simulated data covering a 3.3 km flight distance, the system converges quickly and the positioning accuracy is on the order of ten meters. On real landing data covering a 5.5 km flight distance, the system recognizes the airport stably, the longitude and latitude errors gradually converge to the order of 10⁻⁴, and the positioning accuracy meets the requirements of the landing stage.
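For the SURF_LATCH matching step in contribution (1), the abstract gives no implementation details; the following is a minimal sketch only, assuming OpenCV's contrib modules: SURF keypoints described with LATCH binary descriptors, Hamming-distance matching with a ratio test, and a RANSAC homography that maps the current frame into the reference-map window for localization. The function choices, thresholds, and interface are illustrative assumptions, not the thesis's actual implementation.

```python
# Minimal SURF_LATCH scene-matching sketch (requires opencv-contrib-python
# built with the non-free SURF module). Parameter values are illustrative.
import cv2
import numpy as np

def surf_latch_match(frame, reference_window, ratio=0.75):
    """Match a UAV frame against a reference-map window; return a homography."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)   # SURF keypoints
    latch = cv2.xfeatures2d.LATCH_create(bytes=32)             # LATCH binary descriptors

    kp_f = surf.detect(frame, None)
    kp_r = surf.detect(reference_window, None)
    kp_f, des_f = latch.compute(frame, kp_f)
    kp_r, des_r = latch.compute(reference_window, kp_r)
    if des_f is None or des_r is None:
        return None   # no usable features in one of the images

    # Hamming distance suits binary descriptors; keep matches passing a ratio test.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    good = []
    for pair in matcher.knnMatch(des_f, des_r, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    if len(good) < 4:
        return None   # not enough correspondences for a homography

    src = np.float32([kp_f[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_r[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H   # maps frame pixels into reference-map coordinates
```

Because LATCH produces binary descriptors, Hamming matching is considerably cheaper than the floating-point L2 matching used with SURF's own descriptors, which is consistent with the speed-up over plain SURF matching reported in (1).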
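Contribution (2) renders the satellite map into a virtual camera by modelling the scene as a plane. Below is a minimal sketch of that idea, assuming a pinhole camera with intrinsics K, rotation R, and camera centre C, and a geo-referenced map lying in the world plane z = 0, so the map-to-image mapping reduces to the 3x3 homography K·[r1 r2 t]. The interface, geo-referencing convention, and parameters are assumptions for illustration, not the thesis's actual system.

```python
# Minimal planar virtual-view rendering sketch: the satellite map is modelled
# as the world plane z = 0 and projected into a virtual pinhole camera.
# K, R, cam_center, and the map geo-referencing values are illustrative inputs.
import cv2
import numpy as np

def render_virtual_view(map_img, K, R, cam_center, map_origin, gsd, out_size):
    """Warp a geo-referenced satellite map into the image of a virtual camera.

    map_origin: world (X, Y) coordinates of the map's top-left pixel
    gsd:        ground sampling distance (metres per map pixel)
    out_size:   (width, height) of the rendered virtual image
    """
    t = -R @ cam_center                         # world-to-camera translation
    # For the plane z = 0, projection reduces to the 3x3 homography K [r1 r2 t].
    H_plane = K @ np.column_stack((R[:, 0], R[:, 1], t))
    # Map-pixel -> world-plane coordinates (image rows increase "southward").
    S = np.array([[gsd, 0.0, map_origin[0]],
                  [0.0, -gsd, map_origin[1]],
                  [0.0, 0.0, 1.0]])
    H = H_plane @ S                             # map pixel -> virtual image pixel
    return cv2.warpPerspective(map_img, H, out_size)
```

cv2.warpPerspective resamples the map into the virtual view; in the thesis, the rendered viewpoint images are then matched against the real forward-looking frames for positioning.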
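The abstract states that deep learning recognizes the airport in (3) but does not say how the runway boundary lines are extracted. The sketch below shows one conventional possibility only, edge detection plus a probabilistic Hough transform inside the detected airport region, and is not presented as the thesis's method; the ROI format and thresholds are assumptions.

```python
# Rough sketch of a runway-sideline extraction step: classical edge detection
# plus a probabilistic Hough transform inside the airport region returned by
# the (not shown) deep-learning detector. Thresholds and ROI format are
# illustrative assumptions.
import cv2
import numpy as np

def extract_runway_sidelines(frame, airport_roi, min_len=80):
    """Return candidate sideline segments (x1, y1, x2, y2) in full-frame coordinates."""
    x, y, w, h = airport_roi                      # ROI from the airport detector
    patch = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(patch, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                            minLineLength=min_len, maxLineGap=10)
    if lines is None:
        return []
    # Shift segments back into full-frame coordinates for the back end,
    # which constrains and refines them over consecutive frames.
    return [(x1 + x, y1 + y, x2 + x, y2 + y) for x1, y1, x2, y2 in lines[:, 0]]
```

The extracted segments would then feed the back end, which the abstract describes as constraining and optimizing the recognized sidelines over continuous multi-frame observations to reduce the positioning error.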