In recent years, the outstanding performance of deep learning in fields such as autonomous driving and smart cities has made it the mainstream direction of artificial intelligence, with convolutional neural networks as an important branch. Although the theory and technology of convolutional neural networks have made great progress and increasingly advanced models have emerged in industry, these models also consume more storage and computing resources. On devices with limited computing resources, such as mobile phones and IoT devices, which impose strict requirements on storage consumption and inference speed, there is still room for improvement and exploration in deploying models stably, efficiently, and quickly. This paper applies quantization algorithms to further compress convolutional neural networks and deploy them on ARM CPUs, studies the impact of different quantization algorithms on model performance, and designs and develops an intelligent recognition application system for mobile phones. The main research contents are as follows. First, Gaussian distributions are simulated and quantization is applied to the lightweight network MobileNet, minimizing the KL divergence of the data distribution from a probabilistic perspective and the mean squared error of the data distribution from an optimization perspective; the two quantization algorithms are compared in terms of quantization parameters, quantization speed, and model accuracy. Second, for CPUs with the mobile ARM architecture, ARM NEON instructions and the underlying hardware resources are fully exploited to build a forward-inference framework for convolutional neural networks. Third, an Android intelligent recognition application is designed and developed, which calls the inference framework through the Java Native Interface (JNI). Using MobileNet-SSD (Single Shot MultiBox Detector) as the object detection model, real-device tests measure the inference accuracy, latency, and storage footprint of the quantized model. Finally, with the Kirin 980 chip as the mobile hardware platform and MobileNet as the backbone of the classification and object detection models, quantization experiments show that the probability-based quantization algorithm runs more than twice as fast as the optimization-based one.
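The first research item, choosing an int8 clipping threshold for simulated Gaussian activations by minimizing either KL divergence (probabilistic view) or mean squared error (optimization view), can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the candidate-threshold grid, bin counts, and the histogram re-binning used for the KL criterion (in the style of entropy calibration as popularized by TensorRT) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 100_000)  # simulated Gaussian activations

def quantize(x, threshold, bits=8):
    """Symmetric uniform quantization: clip to [-t, t], round to the int grid, dequantize."""
    levels = 2 ** (bits - 1) - 1              # 127 for int8
    scale = threshold / levels
    q = np.clip(np.round(x / scale), -levels, levels)
    return q * scale

def mse_threshold(x, candidates):
    """Optimization view: pick the threshold minimizing mean squared quantization error."""
    errs = [np.mean((x - quantize(x, t)) ** 2) for t in candidates]
    return candidates[int(np.argmin(errs))]

def kl_threshold(x, candidates, nbins=2048):
    """Probabilistic view: pick the threshold whose 128-level re-binned histogram
    has minimal KL divergence from the fine-grained clipped histogram.
    Simplification: samples beyond each candidate threshold are dropped rather
    than accumulated into the last bin."""
    best_t, best_kl = candidates[0], np.inf
    group = nbins // 128
    for t in candidates:
        p, _ = np.histogram(np.abs(x), bins=nbins, range=(0.0, t))
        p = p.astype(float)
        q = p.reshape(128, group).sum(axis=1)         # coarse 128-level histogram
        q_expanded = np.repeat(q / group, group)      # spread counts back over fine bins
        p_n = p / p.sum()
        q_n = q_expanded / q_expanded.sum()
        mask = p_n > 0
        kl = np.sum(p_n[mask] * np.log(p_n[mask] / np.maximum(q_n[mask], 1e-12)))
        if kl < best_kl:
            best_t, best_kl = t, kl
    return best_t

candidates = np.linspace(1.0, 5.0, 17)
t_mse = mse_threshold(data, candidates)
t_kl = kl_threshold(data, candidates)
```

The MSE criterion requires quantizing the full tensor once per candidate, whereas the KL criterion works entirely on a precomputed histogram, which is one plausible source of the speed gap the abstract reports between the two calibration approaches.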