With the rapid growth in the number of motor vehicles, urban traffic congestion, pollution, traffic accidents, and other problems occur frequently. Under increasingly constrained traffic conditions, intelligent transportation technology must be used to improve system efficiency and stability. Traffic monitoring is the foundation for studying all kinds of traffic problems, yet traditional monitoring technologies such as induction coils, geomagnetic sensors, and roadside cameras suffer from small detection ranges, single scenes, low precision, and poor maneuverability. In recent years, the low-altitude unmanned aerial vehicle (UAV) platform has attracted researchers because of its diverse collection scenes and wide viewing angles. However, dynamic, complex, low-altitude, high-resolution UAV video poses great challenges to traditional traffic information processing methods, especially video detection. Traditional vehicle detection is not only susceptible to lighting changes and complex backgrounds, but also unable to extract traffic parameters automatically and accurately. Convolutional neural networks, by contrast, have advanced rapidly in object detection: they integrate autonomous feature learning with fast target localization and offer strong robustness and high precision. Based on this, this paper proposes a method for traffic parameter extraction and application based on UAV video and deep neural networks.

The specific research contents are as follows. First, according to the characteristics of UAV video, an automatic image correction method is proposed that reduces the impact of UAV jitter on video acquisition, keeps the image coordinate error at around one pixel, and ensures the stability and reliability of the traffic video data. The video data are then labeled to generate a training set, and a deep neural network object detection algorithm is used to train a vehicle detection model whose recognition accuracy reaches up to 99%. Because video scenes are complex and variable, the detection model cannot guarantee 100% accuracy; this paper therefore designs a self-optimizing trajectory matching algorithm based on the detection results, which resolves missed detections and classification errors and realizes automated, accurate extraction of vehicle trajectories from UAV video. Building on these results, a bidirectional LSTM is used to learn and model the feature sequences of vehicle trajectories, recognizing 16 kinds of vehicle behavior with an average accuracy of 94%, and vehicle braking-rate parameters are used to visualize and quantify vehicle conflict behavior, enabling automatic traffic safety assessment of road intersections; the effectiveness of the method is verified against expert evaluation results.

The results show that deep neural network algorithms can automatically and accurately extract traffic parameters from UAV traffic video. The method can provide information support for traffic research such as vehicle driving behavior analysis and traffic safety assessment, and has broad application prospects.
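To illustrate the kind of image correction used to suppress UAV jitter, the sketch below shows a generic feature-based stabilization step, assuming ORB keypoint matching and a RANSAC-estimated homography in OpenCV; the function name `stabilize_frame` and the parameter values are illustrative assumptions, not the exact method reported in the paper.

```python
import cv2
import numpy as np

def stabilize_frame(ref_gray, frame):
    """Warp `frame` onto a reference frame using ORB feature matching
    and a RANSAC homography (a generic stabilization approach, assumed here)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    orb = cv2.ORB_create(2000)
    kp_ref, des_ref = orb.detectAndCompute(ref_gray, None)
    kp_cur, des_cur = orb.detectAndCompute(gray, None)

    # Match descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_ref, des_cur), key=lambda m: m.distance)[:500]

    src = np.float32([kp_cur[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Robustly estimate the homography mapping the current frame to the reference,
    # then warp so that the road scene stays fixed in image coordinates.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    h, w = ref_gray.shape
    return cv2.warpPerspective(frame, H, (w, h))
```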
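Similarly, a minimal sketch of a trajectory-based behavior classifier is given below, assuming a PyTorch bidirectional LSTM over per-frame kinematic features; the feature dimension, hidden size, and layer count are assumptions for illustration, not the configuration reported in the paper.

```python
import torch
import torch.nn as nn

class TrajectoryBiLSTM(nn.Module):
    """Bidirectional LSTM over per-frame trajectory features
    (e.g. position, speed, heading) with a linear classifier head.
    Feature dimension and class count below are illustrative assumptions."""
    def __init__(self, feat_dim=6, hidden=128, num_classes=16):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, num_layers=2,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):                  # x: (batch, time, feat_dim)
        out, _ = self.lstm(x)              # (batch, time, 2 * hidden)
        return self.head(out[:, -1, :])    # classify from the final time step

# Example: a batch of 8 trajectories, 50 frames each, 6 features per frame.
logits = TrajectoryBiLSTM()(torch.randn(8, 50, 6))   # -> shape (8, 16)
```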