
Research On Consistency Of Semantic Parsing Based On Knowledge Distillation

Posted on: 2024-04-09 | Degree: Master | Type: Thesis
Country: China | Candidate: J Zou | Full Text: PDF
GTID: 2568307091965369 | Subject: Computer Science and Technology
Abstract/Summary:
Knowledge Base Question Answering (KBQA) is an important component of question answering systems. KBQA methods based on Semantic Parsing (SP) are increasingly popular because of their strong reasoning ability and interpretable reasoning process, and they are currently a hot research topic. Semantic parsing can target multiple logical forms, such as KoPL, SPARQL, and Lambda-DCS; each has its own advantages, and there are semantic inconsistencies between them. To address this problem, this paper introduces knowledge distillation and, following the idea of model enhancement, exploits its knowledge-transfer ability to improve semantic consistency between different logical forms and to fully exploit the model's learning capacity, thereby improving the performance of the semantic parsing model. The main work is as follows:

(1) To exploit the complementary advantages of different logical forms and address their partial semantic inconsistencies, we propose a single-teacher dynamic knowledge distillation framework for semantic parsing (DKD-SP). A parsing model for one logical form serves as the teacher and guides the training of a parsing model for another logical form as the student, using the teacher's generated sample labels (output-feature knowledge) and its encoder hidden vectors (intermediate-feature knowledge), thus realizing knowledge transfer between different logical forms. To let the student learn effectively from the teacher, a dynamic distillation learning module is proposed in which the student learns in three ways: by aligning output distributions, by aligning hidden-layer representations, or by combining the two (a combined objective of this kind is sketched below for illustration). Finally, a dynamic weight assignment module is proposed to allocate weights to the supervision signals and safeguard the student's learning performance.

(2) To address the fact that different logical forms handle different types of questions with varying ability, we propose a multi-teacher knowledge distillation framework for semantic parsing (MKD-SP). Following the idea of cycle generation, the entities and relations in the student model's output are concatenated to the original natural-language question and fed into the teacher model, and the teacher model's resulting logical form then guides the student's learning, so that the teacher teaches according to the student's current ability. In multi-teacher teaching, some teachers may "teach wrongly"; this paper therefore combines self-distillation with multi-teacher knowledge distillation and uses hidden-layer representation alignment for teaching to further alleviate this problem. Finally, a confidence-aware weighting module is proposed to allocate weights to the supervision signals (also sketched below).

(3) We conduct extensive experiments on the KQA Pro dataset, currently the largest corpus of NLQ-to-SPARQL and NLQ-to-KoPL pairs. The results show that the proposed DKD-SP method improves accuracy by 0.57% on average over the strongest BART baseline and by 4.02% in the compositional generalization scenario, achieving state-of-the-art (SOTA) performance. In addition, the proposed MKD-SP method improves accuracy by 0.75% on average over the same BART baseline. This demonstrates that using knowledge distillation for mutual learning between different logical forms is more effective than semantic parsing with a single logical form.
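The following is a minimal, hypothetical sketch (not code from the thesis) of the kind of combined distillation objective described in (1): a cross-entropy term on the gold logical form, a KL term aligning the student's output distribution with the teacher's, and an MSE term aligning hidden representations. The function name distillation_loss, the fixed weights alpha/beta, and the temperature are illustrative assumptions; the thesis' dynamic weight assignment module is not reproduced here.

import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits,
                      student_hidden, teacher_hidden,
                      gold_labels, alpha=0.5, beta=0.3, temperature=2.0):
    """Combine supervised, output-alignment, and hidden-alignment signals.

    student_logits / teacher_logits: (batch, seq_len, vocab_size)
    student_hidden / teacher_hidden: (batch, seq_len, hidden_size)
    gold_labels: (batch, seq_len) token ids of the gold logical form
    alpha, beta: illustrative mixing weights (dynamic in the thesis)
    """
    # Supervised loss against the gold logical-form tokens.
    ce = F.cross_entropy(student_logits.transpose(1, 2), gold_labels)

    # Output-distribution alignment: KL between temperature-softened distributions.
    t = temperature
    kl = F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)

    # Hidden-layer alignment: MSE between encoder representations
    # (assumes matching hidden sizes; otherwise a learned projection is needed).
    hid = F.mse_loss(student_hidden, teacher_hidden)

    return (1 - alpha - beta) * ce + alpha * kl + beta * hid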
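Likewise, a hedged sketch of the confidence-aware weighting mentioned in (2), assuming a simple sequence-level confidence score (each teacher's average log-probability of its own predicted tokens) normalized with a softmax across teachers; the exact scoring and normalization in the thesis may differ.

import torch


def confidence_weights(teacher_logits_list, teacher_pred_ids_list, tau=1.0):
    """Turn per-teacher confidences into normalized distillation weights.

    teacher_logits_list: list of (batch, seq_len, vocab_size) tensors
    teacher_pred_ids_list: list of (batch, seq_len) predicted token ids
    Returns a (num_teachers, batch) tensor of weights summing to 1 per example.
    """
    scores = []
    for logits, pred_ids in zip(teacher_logits_list, teacher_pred_ids_list):
        log_probs = torch.log_softmax(logits, dim=-1)
        # Average log-probability the teacher assigns to its own output tokens.
        token_lp = log_probs.gather(-1, pred_ids.unsqueeze(-1)).squeeze(-1)
        scores.append(token_lp.mean(dim=-1))          # (batch,)
    scores = torch.stack(scores, dim=0)               # (num_teachers, batch)
    return torch.softmax(scores / tau, dim=0)

The resulting per-teacher weights would scale each teacher's distillation loss, so that a teacher that "teaches wrongly" on a given question contributes less to the student's update.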
Keywords/Search Tags:knowledge graph, knowledge base question answering, natural language processing, semantic parsing, knowledge distillation, consistency