The foundation for intelligent agricultural machinery to operate autonomously is environmental perception. Accurate perception of the field-road environment enables intelligent agricultural machinery to navigate automatically and avoid obstacles correctly. The main environmental perception sensors carried by intelligent agricultural machinery are millimeter-wave radar and machine vision. Millimeter-wave radar excels at detecting target motion information and is little affected by weather and lighting conditions; however, it can hardly recognize target types and contours, and its output often contains a large amount of interference data. Machine vision can capture rich scene information and excels at detecting target types and contours; however, it struggles to obtain target motion information and is strongly affected by weather and lighting conditions. To address the problem that, in the complex environments and among the diverse targets of hilly areas, a single sensor cannot comprehensively and accurately obtain information such as target category, contour, and motion state, this paper proposes a method for target detection on field roads based on information fusion between millimeter-wave radar and machine vision. The method exploits the complementary detection strengths of each sensor, improves the perception system's ability to detect target features, and helps intelligent agricultural machinery perceive its surroundings comprehensively and accurately. The main research contents and conclusions of this thesis are as follows:

(1) Building an environmental perception system. On the experimental platform of a tracked self-driving transport vehicle, an SR73-F millimeter-wave radar and an undistorted USB industrial camera are installed as the perception hardware. A high-performance PC serves as the data processing system for the radar and camera, and the hardware and software components are integrated into an environmental perception system for the tracked self-driving transport vehicle.

(2) Processing of millimeter-wave radar data for field roads. To deal with noise in the target data returned by the millimeter-wave radar, a method is presented that filters out interference using distance, velocity, angle, and RCS thresholds. The DBSCAN clustering algorithm is studied for clustering targets with similar relative lateral and longitudinal distances and relative longitudinal velocities. A multi-object tracking algorithm further filters the radar data and tracks moving targets. Finally, the effectiveness of the radar data processing method is verified through simulation and offline experimental data; it provides stable and accurate target data for subsequent data fusion.

(3) Image semantic segmentation for field roads. To overcome the limitations of traditional machine vision algorithms in accurately detecting and segmenting targets on field roads, deep learning algorithms are adopted. Based on an analysis of the characteristics of field-road images, the targets are classified into 10 categories, including "background, road, pedestrian, fence, sky, signboard, pond, obstacle, building", and a field-road environment dataset is established. An improved semantic segmentation network based on DeepLabv3+ is proposed, mainly by replacing the Xception backbone with G_RegNetX8 and adding an ECA module to the ASPP structure, to improve the inference speed and segmentation accuracy of the model. Finally, the improved DeepLabv3+ network is validated on the field-road environment dataset: the PA, MPA, and MIoU of the improved model are 97.64%, 91.59%, and 85.81%, respectively, and when inferring images at resolutions of 480×480 and 520×520, the network runs at 29.08 FPS and 26.76 FPS, respectively. The results show that the improved network combines relatively high segmentation accuracy with a relatively fast inference speed.

(4) Information fusion of millimeter-wave radar and vision. Radar target data and visual target data are synchronized in time through timestamp alignment and in space through a coordinate transformation calibrated with the least-squares method. Radar targets and camera targets are then matched through a matching strategy to fuse their feature information. Finally, the effectiveness of the fusion strategy is verified through offline experimental data: compared with the radar-only algorithm, the pedestrian detection accuracy of the fusion strategy increases by 4.17%; compared with the camera-only algorithm, it increases by 19.53%. The fusion also makes the perception system more robust for object detection under poor lighting conditions.

(5) Visual guide line extraction based on information fusion. To address internal defects in the road area and protrusions or discrete points on the road boundary in the semantic segmentation predictions, morphological filtering is applied to obtain a more complete road area and a smoother road boundary. The centroids of the segmented road blocks are obtained through connected-domain processing, and the least-squares method is used to fit these centroids to obtain the mathematical expression of the visual guide line. The proposed extraction method is tested experimentally. In scenarios without dynamic obstacles, the average error of the extracted visual guide line relative to the manually measured road centerline is 1.32% to 5.66%; in scenarios with dynamic obstacles, the method successfully extracts a guide line that enables obstacle avoidance in advance. The visual guide line extraction method meets the obstacle avoidance requirements for autonomous driving of tracked field-road transport vehicles.
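The radar pipeline in (2) can be sketched as threshold filtering followed by DBSCAN clustering. This is a minimal illustration only: the threshold values, target tuple layout, and DBSCAN parameters below are assumptions for the example, not the thesis's calibrated settings, and DBSCAN is hand-rolled here to keep the sketch self-contained.

```python
import math

def filter_radar_targets(targets, max_range=50.0, max_speed=10.0,
                         max_angle=60.0, min_rcs=-10.0):
    """Threshold filter for raw radar detections.

    Each target: (x_lat_m, y_long_m, v_long_mps, azimuth_deg, rcs_dbsm).
    Threshold values are illustrative placeholders.
    """
    kept = []
    for x, y, v, ang, rcs in targets:
        if (math.hypot(x, y) <= max_range and abs(v) <= max_speed
                and abs(ang) <= max_angle and rcs >= min_rcs):
            kept.append((x, y, v, ang, rcs))
    return kept

def dbscan(points, eps=1.5, min_pts=2):
    """Minimal DBSCAN over (x, y, v) feature vectors.

    Groups detections with similar lateral/longitudinal distance and
    longitudinal velocity; returns a label per point (-1 = noise).
    """
    n = len(points)
    dist = lambda i, j: math.dist(points[i], points[j])
    labels = [-1] * n
    visited = [False] * n
    cid = 0
    for i in range(n):
        if visited[i]:
            continue
        visited[i] = True
        nb = [j for j in range(n) if dist(i, j) <= eps]
        if len(nb) < min_pts:
            continue  # noise unless later absorbed as a border point
        labels[i] = cid
        seeds = list(nb)
        k = 0
        while k < len(seeds):
            j = seeds[k]
            k += 1
            if not visited[j]:
                visited[j] = True
                nb2 = [m for m in range(n) if dist(j, m) <= eps]
                if len(nb2) >= min_pts:
                    seeds.extend(nb2)
            if labels[j] == -1:
                labels[j] = cid
        cid += 1
    return labels
```

For example, two detections at nearly the same (x, y, v) fall into one cluster, while detections beyond the range, speed, angle, or RCS thresholds are dropped before clustering.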
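The PA, MPA, and MIoU figures reported in (3) follow the standard definitions computed from a per-class pixel confusion matrix. A minimal sketch, assuming `conf[i][j]` counts pixels of true class `i` predicted as class `j`:

```python
def seg_metrics(conf):
    """Pixel Accuracy, Mean Pixel Accuracy, and Mean IoU from a confusion matrix.

    PA   = sum_i conf[i][i] / total pixels
    MPA  = mean over classes of conf[i][i] / (row sum i)
    MIoU = mean over classes of conf[i][i] / (row_i + col_i - conf[i][i])
    Classes absent from the ground truth are skipped in the means.
    """
    n = len(conf)
    total = sum(sum(row) for row in conf)
    pa = sum(conf[i][i] for i in range(n)) / total
    cpa, iou = [], []
    for i in range(n):
        row = sum(conf[i])
        col = sum(conf[j][i] for j in range(n))
        if row:
            cpa.append(conf[i][i] / row)
            iou.append(conf[i][i] / (row + col - conf[i][i]))
    return pa, sum(cpa) / len(cpa), sum(iou) / len(iou)
```

On a toy two-class matrix `[[8, 2], [1, 9]]` this yields PA = 0.85, MPA = 0.85, and MIoU = (8/11 + 9/12) / 2.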
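The temporal half of the synchronization in (4) amounts to pairing each camera frame with the radar frame nearest in time. A minimal sketch; the tolerance `max_gap` is an assumed value for illustration, not the thesis's setting:

```python
def align_by_timestamp(radar_stamps, camera_stamps, max_gap=0.05):
    """Pair each camera frame with the nearest radar frame in time.

    Returns (camera_idx, radar_idx) pairs whose time gap is within
    max_gap seconds; camera frames with no close radar frame are dropped.
    """
    pairs = []
    for ci, ct in enumerate(camera_stamps):
        ri = min(range(len(radar_stamps)),
                 key=lambda i: abs(radar_stamps[i] - ct))
        if abs(radar_stamps[ri] - ct) <= max_gap:
            pairs.append((ci, ri))
    return pairs
```

The spatial half (the least-squares coordinate transformation) is calibrated offline from corresponding radar/image points and is omitted here.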
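The final fitting step in (5) can be sketched as a closed-form least-squares line through the road-block centroids. The morphological filtering and connected-domain processing are assumed to have already produced the centroid list (e.g. via an image library such as OpenCV); the first-order model `x = a*y + b`, with `y` as the longitudinal coordinate, is an illustrative choice, not necessarily the thesis's exact expression.

```python
def fit_guide_line(centroids):
    """Least-squares fit of x = a*y + b through centroids (x, y).

    Uses the closed-form normal equations for a single independent
    variable y (the longitudinal/forward coordinate).
    """
    n = len(centroids)
    sx = sum(x for x, _ in centroids)
    sy = sum(y for _, y in centroids)
    sxy = sum(x * y for x, y in centroids)
    syy = sum(y * y for _, y in centroids)
    a = (n * sxy - sx * sy) / (n * syy - sy * sy)
    b = (sx - a * sy) / n
    return a, b
```

Centroids lying exactly on `x = 0.5*y + 2` recover `a = 0.5, b = 2`; with noisy centroids the fit gives the guide line that the vehicle can track.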