With the growth of large, universally labeled datasets and improvements in the architecture of multi-layer intelligent recognition models, such models have achieved excellent results in practical projects across many application fields. Despite this widespread adoption, model training still requires accurately labeled datasets of a certain order of magnitude, together with matching large-scale computing power, and this often becomes a technical obstacle when promoting practical projects. At the same time, the accuracy of a multi-layer intelligent recognition model cannot be guaranteed by rigorous mathematical derivation, which raises the problem of interpretability in engineering experiments. To break through these problems, model distillation has been proposed. Unlike approaches that improve the network structure or manipulate the dataset, a model distillation algorithm transfers the hidden knowledge of a trained model into a target network, changing the loss function and training procedure accordingly. Based on the idea of model distillation, this paper studies how distillation can improve target models for few-shot recognition and their interpretability, along the following three dimensions.

(1) To address the shortage of training data in image recognition with multi-layer intelligent recognition networks, as well as the loss of detailed features and the heavy computational cost of multi-model distillation, a model distillation algorithm that trains weak teacher networks under few-shot recognition is proposed. Based on the optimized weak teacher networks, the feature maps of the multiple teacher networks are combined, and a meta-network is then used for extrapolation, yielding a model distillation algorithm with high accuracy and fast training speed.

(2) Exploiting the interpretability of soft decision trees in decision-making, a model distillation scheme for soft decision trees is given. Numerical experiments show that the algorithm improves both the accuracy and the training time of the original algorithm.

(3) Based on the interpretability that local distillation provides for a target black-box model, an improvement of the local distillation process is given, and the way the algorithm explains the reliability of the target network's classifications is verified by graphical and numerical experiments. Furthermore, exploiting the interpretability of the soft decision tree, the local surrogate of the target black-box model is replaced by a soft decision tree, and the effect of the algorithm on the black-box model is verified by image experiments.

Focusing on model distillation algorithms for multi-layer intelligent recognition networks and starting from few-shot recognition, this paper discusses the three most representative algorithms and improves them in terms of model accuracy, training time, training computation, and model similarity.
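The modified loss function that distillation introduces can be illustrated concretely. The sketch below is a minimal NumPy illustration of the standard (Hinton-style) distillation loss — a weighted sum of a temperature-softened KL term against the teacher and ordinary cross-entropy against the hard labels; the function names, temperature `T`, and weight `alpha` are illustrative assumptions, not the specific formulation used in this thesis.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax: a higher T yields a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Illustrative distillation loss (assumed Hinton-style formulation):
    alpha * T^2 * KL(teacher_soft || student_soft)
      + (1 - alpha) * cross_entropy(student, hard labels).
    """
    p_t = softmax(teacher_logits, T)   # softened teacher targets
    p_s = softmax(student_logits, T)   # softened student predictions
    # KL(p_t || p_s); the T^2 factor keeps gradient magnitudes comparable
    # to the hard-label term as T varies.
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    soft_term = (T ** 2) * kl.mean()
    # Ordinary cross-entropy on the hard labels (temperature 1).
    q_s = softmax(student_logits, 1.0)
    hard_term = -np.log(q_s[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft_term + (1.0 - alpha) * hard_term
```

When the student's logits match the teacher's, the KL term vanishes and only the hard-label term remains, which is why distillation can act as a soft regularizer when labeled data is scarce, as in the few-shot setting discussed above.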