Humans and animals can continuously acquire and refine knowledge throughout their lives. Likewise, the ability to learn from a continuous stream of information is critical for computational learning systems and autonomous agents operating in the real world. For machine learning and neural network models, however, continual lifelong learning remains a long-standing challenge. Current state-of-the-art deep neural networks are typically trained on fixed batches of data and do not account for a growing sequence of tasks; as a result, they either lack the capacity to accommodate new training or overwrite what was learned from earlier data, a failure known as catastrophic forgetting. To address catastrophic forgetting when neural networks are used for continual learning, this paper combines a gate function with an attention-vector mechanism on top of conventional gradient descent, and uses an evolution strategy to optimize the resulting algorithm. This not only alleviates catastrophic forgetting during continual learning but also mitigates the local optima and redundant network structure that affect traditional methods. Building on this, the paper further proposes using a generative adversarial network (GAN) to synthesize adversarial samples that resemble the current positive samples, compensating for the lack of negative samples during training and thereby improving the classification accuracy of the current task model. Together, these strategies address catastrophic forgetting on the classic randomly ordered task benchmarks. Compared with recent methods from the domestic and international literature, the proposed approach achieves the best results on the standard evaluation metrics of forgetting rate and recognition rate.
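
The abstract does not specify the exact form of the gate-plus-attention mechanism or the evolution strategy, so the following is only a minimal sketch under assumed design choices: a per-task sigmoid gate (attention vector) over hidden units, trained by an OpenAI-style evolution strategy while the weights themselves remain amenable to gradient descent. All names (`TaskGatedNet`, `es_step`, `grad_mask`) and hyperparameters are illustrative, not the paper's actual implementation.

```python
# Hypothetical sketch: task-gated hidden layer + evolution-strategy search
# over the gate parameters. Not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TaskGatedNet:
    """Two-layer net; each task owns a gate (attention) vector over hidden units."""
    def __init__(self, n_in, n_hidden, n_out, n_tasks, scale=5.0):
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.W2 = rng.normal(0, 0.1, (n_hidden, n_out))
        # One learnable gate embedding per task; sigmoid(scale * e) gives a soft mask.
        self.gate_emb = rng.normal(0, 0.1, (n_tasks, n_hidden))
        self.scale = scale

    def forward(self, x, task):
        g = sigmoid(self.scale * self.gate_emb[task])  # attention vector in (0, 1)
        h = np.maximum(0.0, x @ self.W1) * g           # gate the hidden units
        return h @ self.W2, g

def loss(net, x, y, task):
    """Softmax cross-entropy for integer labels y."""
    logits, _ = net.forward(x, task)
    z = logits - logits.max(axis=1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(len(y)), y] + 1e-12).mean()

def es_step(net, x, y, task, pop=20, sigma=0.05, lr=0.1):
    """One evolution-strategy update on the current task's gate embedding only."""
    base = net.gate_emb[task].copy()
    eps = rng.normal(0, 1, (pop, base.size))
    rewards = np.empty(pop)
    for i in range(pop):
        net.gate_emb[task] = base + sigma * eps[i]
        rewards[i] = -loss(net, x, y, task)        # higher reward = lower loss
    adv = (rewards - rewards.mean()) / (rewards.std() + 1e-8)
    net.gate_emb[task] = base + lr / (pop * sigma) * (adv @ eps)

def grad_mask(net, old_tasks):
    """Units strongly gated by any earlier task shield their weights from updates."""
    if not old_tasks:
        return np.ones(net.W1.shape[1])
    g_old = np.max([sigmoid(net.scale * net.gate_emb[t]) for t in old_tasks], axis=0)
    return 1.0 - g_old  # multiply into gradients for W1 columns / W2 rows
```

In this framing, gradient descent trains `W1`/`W2` on the current task while `es_step` refines the gate embedding outside the gradient path (which is how an evolution strategy can sidestep local optima of the gate parameters), and `grad_mask` shows one plausible way that units claimed by earlier tasks could be protected, which is the forgetting-alleviation effect the abstract claims.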
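For the negative-sample generation step, the abstract only states that a GAN produces adversarial samples resembling the current positives. A minimal sketch of that idea follows, assuming a feature-level GAN whose generated near-positives are relabeled as negatives for the current task's classifier; the network sizes and training constants are assumptions.

```python
# Hypothetical sketch: a small GAN synthesizes "hard negatives" that resemble
# the current task's positive samples, so the classifier sees both classes
# even when real negatives are unavailable.
import torch
import torch.nn as nn

latent_dim, feat_dim = 16, 64

G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, feat_dim))
D = nn.Sequential(nn.Linear(feat_dim, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def gan_step(pos_batch):
    """One adversarial update; G learns to mimic the positive distribution."""
    n = pos_batch.size(0)
    z = torch.randn(n, latent_dim)
    fake = G(z)
    # Discriminator: real positives -> 1, generated samples -> 0.
    d_loss = (bce(D(pos_batch), torch.ones(n, 1))
              + bce(D(fake.detach()), torch.zeros(n, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator: try to fool the discriminator.
    g_loss = bce(D(G(z)), torch.ones(n, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return fake.detach()  # near-positive samples, usable as synthetic negatives
```

The current task's classifier would then be trained on real positives (label 1) together with these generated near-positives (label 0), which is one way to realize the abstract's claim of filling in the missing negative class during training.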