Knowledge Base Question Answering (KBQA) is a question-answering method that, given a natural language question, semantically understands and parses the question, then queries and reasons over a knowledge base to obtain an answer. Compared to search engines, KBQA is more precise and provides more natural human-machine interaction, greatly enhancing the user experience. Dynamic KBQA is the task of performing KBQA over a constantly changing knowledge base, and research on this task helps improve the deployability of KBQA in changing scenarios. However, previous research has focused mainly on learning new knowledge from the current knowledge base, either neglecting the memorization of historical knowledge in the old knowledge base or retaining only a small amount of content as a substitute for all historical knowledge. As a result, historical knowledge is missing from the model's question-answering process, reducing its ability to answer historical questions. To address this issue, this paper builds a KBQA model based on incremental learning that dynamically retrieves historical knowledge, according to the question, as supplementary information for the missing knowledge, and uses knowledge distillation to strengthen the model's learning of this knowledge, thereby solving the problem of missing historical knowledge and improving the model's reasoning ability. The specific research includes the following aspects:

1) A Historical Knowledge Aware question answering Model (HKAM) is proposed. This model solves the problem of missing historical knowledge by dynamically acquiring the historical knowledge needed to answer historical questions while still completing the current question-answering task. Specifically, HKAM includes a backward reasoning module that, before training, supplements the missing historical knowledge by reasoning backward from historical questions and their answers, and then trains the model to memorize
this historical knowledge. In addition, two optimization methods are designed for HKAM: one continues to use historical knowledge to optimize the reasoning process for historical questions, and the other alleviates the impact of missing historical knowledge on model training through knowledge distillation. Experimental results show that HKAM achieves average accuracies of 89.82%, 69.65%, and 69.82% on three standard datasets, about 6% higher in average precision than existing models such as BERT.

2) An Assistant Historical Knowledge Aware question answering Model (AHKAM) is proposed. While addressing the issue of missing historical knowledge, it also resolves the performance loss in HKAM caused by the student model being able to learn only from the output of a single teacher. AHKAM introduces a multi-teacher-assistant knowledge selection framework that aggregates the outputs of multiple different teacher models, further improving the effect of knowledge distillation on the reasoning process. Experimental results show that AHKAM's average accuracies on the same three datasets exceed those of HKAM, reaching 91.02%, 72.65%, and 73.82%.

3) An incremental-learning KBQA system is designed and implemented, which eliminates the repeated retraining required by existing question-answering systems and supports continuous responses to user queries. The system is built on the HKAM and AHKAM models; it maintains memory of historical questions while answering current questions, achieving a balance between answering historical and current questions and enabling continuous question answering in the KBQA system.
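To make the multi-teacher distillation idea behind AHKAM concrete, the following is a minimal sketch of how several teachers' soft targets can be aggregated into a single distillation target for a student. The function names, the uniform-weight aggregation, and the temperature value are illustrative assumptions, not the thesis's actual knowledge-selection mechanism:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature yields softer targets."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def aggregate_teachers(teacher_logits, weights=None, temperature=2.0):
    """Combine several teachers' soft targets into one target distribution.

    teacher_logits: list of (num_classes,) logit arrays, one per teacher.
    weights: per-teacher weights (uniform if None); a placeholder for a
    knowledge-selection step, whose exact form is not specified here.
    """
    probs = np.stack([softmax(l, temperature) for l in teacher_logits])
    if weights is None:
        weights = np.full(len(teacher_logits), 1.0 / len(teacher_logits))
    return np.average(probs, axis=0, weights=weights)

def kd_loss(student_logits, target_probs, temperature=2.0, eps=1e-12):
    """KL divergence from the aggregated teacher targets to the student."""
    student_probs = softmax(student_logits, temperature)
    return float(np.sum(target_probs *
                        np.log((target_probs + eps) / (student_probs + eps))))

# Toy example: two teachers distill into one student over 3 answer candidates.
teachers = [np.array([2.0, 1.0, 0.1]), np.array([1.5, 1.2, 0.3])]
target = aggregate_teachers(teachers)
loss = kd_loss(np.array([1.8, 1.1, 0.2]), target)
```

A student whose logits track the aggregated target incurs a small KL loss, while one that disagrees with the teachers incurs a larger one; minimizing this loss is what lets the student retain the teachers' (historical) knowledge during incremental training.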