The advent of the information era has been accompanied by exponential growth in image data, and incremental learning techniques have important application value for deeply mining massive streams of new images. In practice, however, when new image datasets are added, deep neural network models trained incrementally tend to forget previously learned knowledge, causing a sharp decline in overall network performance. Taking transmission and distribution line defect classification as an example, retraining the network model whenever a new defect dataset arrives leads to low training efficiency and a continual decrease in defect recognition accuracy. To address these problems, this paper investigates both the network model architecture and the differences between old and new training datasets, and proposes an incremental learning network based on parallel squeeze-and-excitation together with an incremental learning algorithm based on open-set learning:

(1) Incremental learning network based on parallel squeeze-and-excitation. A new network architecture is proposed for the problem that models are vulnerable to catastrophic forgetting during incremental learning, i.e., the loss of learned knowledge about old categories. By analyzing the causes of forgetting of old categories in incremental learning models, a multi-branch architecture is constructed to automatically mine richer feature information from the images, and a squeeze-and-excitation module is adopted so that the model can autonomously learn the key information in both old and new categories. Together, these improve the model's classification accuracy on old and new categories and thereby mitigate catastrophic forgetting. Experimental results on the CIFAR-100 and ImageNet datasets and on a transmission and distribution line defect dataset show that the model substantially alleviates forgetting of old knowledge and effectively improves incremental classification accuracy without enlarging the network structure.

(2) Incremental learning algorithm based on open-set learning. During incremental learning, because few or no old-category training samples participate in incremental model training, the training results tend to be biased toward the newly added categories. To address this problem, this paper proposes an open-set incremental learning algorithm based on discrimination augmentation, approached from the perspective of the difference between old and new knowledge. An open-set recognition algorithm is used to learn this difference, and a discrimination enhancement strategy is added on top of it to overcome the limitations of the incremental learning classifier, making full use of the difference between old and new knowledge without additional memory cost and helping the incremental model better overcome catastrophic forgetting. The proposed algorithm can be applied to most class-incremental learning algorithms. Experimental results on public datasets and on the transmission and distribution line defect dataset show that the algorithm improves the classification ability of deep incremental learning and has considerable potential for further improvement.
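The squeeze-and-excitation module mentioned in contribution (1) is, in the standard formulation, a channel-gating block: global average pooling squeezes each channel to a scalar, a small bottleneck of two fully connected layers produces per-channel weights, and the feature map is rescaled by those weights. The abstract does not give the exact variant used, so the following is a minimal numpy sketch of the standard block with randomly initialized placeholder weights (`w1`, `b1`, `w2`, `b2` and the reduction ratio `r` are illustrative, not taken from the paper):

```python
import numpy as np

def squeeze_excite(feature_map, w1, b1, w2, b2):
    """Squeeze-and-excitation gating on a (C, H, W) feature map.

    Squeeze: global average pooling over spatial dims -> (C,).
    Excite:  two fully connected layers with a bottleneck, then a
             sigmoid, giving per-channel importance weights in (0, 1).
    Scale:   each channel is reweighted by its learned importance.
    """
    squeezed = feature_map.mean(axis=(1, 2))            # (C,)
    hidden = np.maximum(0.0, w1 @ squeezed + b1)        # ReLU bottleneck, (C/r,)
    gates = 1.0 / (1.0 + np.exp(-(w2 @ hidden + b2)))   # sigmoid, (C,)
    return feature_map * gates[:, None, None]           # channel-wise rescale

# Toy example: 8 channels, reduction ratio r = 4, random placeholder weights
rng = np.random.default_rng(0)
C, r = 8, 4
x = rng.standard_normal((C, 16, 16))
w1, b1 = 0.1 * rng.standard_normal((C // r, C)), np.zeros(C // r)
w2, b2 = 0.1 * rng.standard_normal((C, C // r)), np.zeros(C)
y = squeeze_excite(x, w1, b1, w2, b2)
```

In a multi-branch incremental network, one such block per branch would let the model emphasize channels carrying key old-category or new-category information; here the gating itself is the point of the sketch.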
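The abstract does not specify which open-set recognition algorithm underlies contribution (2). A common baseline that conveys the idea of separating "known" (old) from "unknown" (potentially new) samples is thresholding the maximum softmax probability: a confident prediction keeps its class label, while a near-uniform prediction is flagged as unknown. The sketch below uses this baseline purely for illustration; the `threshold` value and `unknown_label` sentinel are assumptions, not the paper's method:

```python
import numpy as np

def softmax(logits):
    z = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def open_set_predict(logits, threshold=0.5, unknown_label=-1):
    """Maximum-softmax-probability open-set baseline.

    Samples whose top softmax probability falls below `threshold`
    are flagged as `unknown_label` (a candidate new category);
    otherwise the ordinary argmax class is returned.
    """
    probs = softmax(logits)
    top = probs.max(axis=-1)
    pred = probs.argmax(axis=-1)
    return np.where(top >= threshold, pred, unknown_label)

# One confident "old class" sample and one ambiguous sample
logits = np.array([[6.0, 0.0, 0.0],    # clearly class 0
                   [0.1, 0.0, 0.2]])   # near-uniform -> unknown
print(open_set_predict(logits))        # -> [ 0 -1]
```

A discrimination-enhancement strategy, as described in the abstract, would then exploit these known/unknown decisions to counteract the classifier's bias toward new categories without storing old-category exemplars.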