This thesis proposes solutions for scene perception, understanding, and visualization in both simulation mode and application mode, addressing the problems of limited visibility and perception degradation that UAVs face in degraded visual environments. Dynamic perception of the scene is achieved through research on multi-source heterogeneous sensor fusion technology; understanding of the scene is achieved through research on obstacle highlighting and target detection technology; and the visualization process is realized by combining 3D rendering engine technology. Finally, a range of key supporting technologies is studied so that flight missions can be carried out unimpeded under all-weather, day-and-night, and all-terrain conditions. The main work completed in this thesis is as follows:

1. The experimental environment needed for the research is established using 3D visual simulation modeling technology. First, based on the complex scenes a UAV may face during flight, a series of terrain data and feature data with complex environmental characteristics is built. Degraded visual environments with rain, snow, fog, sand and dust, and day-and-night characteristics are then designed for all terrain conditions, and shader rendering technology is used to simulate the imaging characteristics of forward-looking infrared, lidar, and millimeter-wave radar, achieving multi-sensor perception of degraded visual scenes. Finally, virtual instrument modeling software is used to create virtual flight instruments and semi-parametric control panels, providing visual display of flight parameters and functional control of the environment.

2. Research on imaging target detection for the airborne visible-light sensor is completed. First, in view of the small size of ground targets imaged by the UAV, the key issues of small-target detection are studied, with a focus on the characteristics of small-target samples in the military domain, and a sample-generation program is designed using 3D rendering engine technology. Then, a variety of data augmentation techniques is applied to the original samples to improve the generalization ability of the network model. Finally, several technical methods are used to control and optimize the training process so that the model can be trained efficiently.

3. Research on 3D perception, understanding, and visualization in simulation mode is completed. First, exploiting the different sensing characteristics of different sensors, a multi-sensor fusion experiment is designed to perceive complex degraded visual environments. Then, terrain prompting and warning technology is studied: dangerous terrain is highlighted through different terrain shades, and obstacle highlighting techniques represented by box highlighting, color highlighting, and distance marking are investigated to realize understanding of the three-dimensional scene (a simplified sketch of the distance-graded shading idea is given below). Finally, assisted piloting technologies such as two-dimensional electronic maps, path planning, safe flight paths, landing landmarks, and automatic cruise are studied. Through the deep integration of this information, semi-autonomous flight of the UAV is achieved, reducing the operator's workload.
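As an illustration of the distance-graded terrain shading mentioned in item 3, the following minimal Python sketch colour-codes an elevation grid by its vertical clearance below the UAV. The thresholds, colours, grid format, and function names are illustrative assumptions and are not taken from the thesis.

```python
import numpy as np

# Hypothetical warning thresholds (metres of clearance below the aircraft);
# the thesis's actual values are not reproduced here.
DANGER_CLEARANCE = 30.0
CAUTION_CLEARANCE = 100.0

# Illustrative RGB shades: red = dangerous, yellow = caution, green = safe.
COLOURS = {
    "danger": (255, 0, 0),
    "caution": (255, 255, 0),
    "safe": (0, 180, 0),
}

def shade_terrain(elevation_grid: np.ndarray, uav_altitude: float) -> np.ndarray:
    """Return an RGB image that colour-codes each terrain cell by its
    vertical clearance relative to the UAV's current altitude."""
    clearance = uav_altitude - elevation_grid
    shaded = np.empty(elevation_grid.shape + (3,), dtype=np.uint8)
    shaded[:] = COLOURS["safe"]
    shaded[clearance < CAUTION_CLEARANCE] = COLOURS["caution"]
    shaded[clearance < DANGER_CLEARANCE] = COLOURS["danger"]
    return shaded

if __name__ == "__main__":
    # Example: a 4x4 elevation grid (metres) viewed from a UAV flying at 120 m.
    terrain = np.array([[10, 20, 95, 130],
                        [15, 40, 100, 140],
                        [12, 35, 60, 90],
                        [8, 25, 45, 70]], dtype=float)
    print(shade_terrain(terrain, uav_altitude=120.0)[..., 0])  # red channel only
```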
4. Research on 3D perception, understanding, and visualization in application mode is completed. First, based on the characteristics of the sensor sources in application mode, an image fusion method based on inverse coordinate reasoning is designed to realize fused perception of the environment (a sketch of the projection step appears after this item). Then, an integrated neural network detection framework is used to detect, highlight, and annotate visible-light images, achieving a preliminary understanding of the scene. Next, multi-level semantic understanding of the scene is realized through geometric reasoning. Finally, a three-dimensional visualization of the scene understanding results is designed and implemented.
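To illustrate the kind of inverse coordinate reasoning described in item 4, the following minimal Python sketch projects range-sensor returns into the visible-light image plane with a pinhole camera model and attaches a distance annotation to each detection box. The camera intrinsics, extrinsics, box format, and helper names are illustrative assumptions rather than the thesis's actual implementation.

```python
import numpy as np

def project_points(points_world: np.ndarray,
                   R: np.ndarray, t: np.ndarray, K: np.ndarray):
    """Project 3-D points (world frame) into pixel coordinates using the
    pinhole model x = K [R | t] X. Returns pixel coordinates and depths."""
    cam = R @ points_world.T + t.reshape(3, 1)   # world frame -> camera frame
    depth = cam[2]
    pix = (K @ cam)[:2] / depth                  # perspective division
    return pix.T, depth

def annotate_detections(boxes, pix, depth):
    """Attach to each detection box the range of the nearest projected
    sensor return that falls inside it (None if no return falls inside)."""
    annotated = []
    for (x0, y0, x1, y1) in boxes:
        inside = (pix[:, 0] >= x0) & (pix[:, 0] <= x1) & \
                 (pix[:, 1] >= y0) & (pix[:, 1] <= y1)
        dist = float(depth[inside].min()) if inside.any() else None
        annotated.append(((x0, y0, x1, y1), dist))
    return annotated

if __name__ == "__main__":
    K = np.array([[800., 0., 320.],              # illustrative intrinsics
                  [0., 800., 240.],
                  [0., 0., 1.]])
    R, t = np.eye(3), np.zeros(3)                # camera placed at world origin
    pts = np.array([[0.5, 0.2, 40.0],            # two range-sensor returns (m)
                    [3.0, 1.0, 80.0]])
    boxes = [(300, 220, 360, 260)]               # one detection box in pixels
    pix, depth = project_points(pts, R, t, K)
    print(annotate_detections(boxes, pix, depth))
```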