
Research On Deep Learning-based Knowledge Tracing Models Based On Self-attention Mechanism

Posted on: 2023-05-29
Degree: Master
Type: Thesis
Country: China
Candidate: W Chen
Full Text: PDF
GTID: 2557306836964309
Subject: Computer technology
Abstract/Summary:
Knowledge tracing is an important research direction in educational data mining. Its task is to mine latent learning patterns from learners' historical learning trajectories and then predict their future performance. Through knowledge tracing, online platforms can track learners' knowledge states in real time and make personalized learning resource recommendations. Further analysis of knowledge states can also be used to build a knowledge map, helping the platform formulate more reasonable teaching plans.

Deep learning has attracted much attention from researchers because of its powerful feature extraction ability. Current knowledge tracing methods are mainly based on recurrent neural networks, but the long-term dependency problem limits their performance. Poor interpretability and a lack of learning features are also major problems in deep knowledge tracing, and existing models mainly target these three problems. Among the improved models, the self-attention-based model (Transformer) has notable advantages: it has no long-term dependency problem, and the attention mechanism is naturally more interpretable. However, the Transformer has so far seen few applications in knowledge tracing, and limitations remain when applying it to the knowledge tracing task. First, the relative position encoding used in the Transformer cannot reflect forgetting behavior, which has an important effect on learning; without modeling forgetting behavior, the model's predictions cannot accurately reflect learners' knowledge states. Second, most existing models use only question-answer interaction pairs as input and do not make good use of other feature information provided by the dataset (such as question type, answer time, and number of attempts), which limits model performance to some extent. In view of these two problems, this paper proposes a
self-attention knowledge tracing model based on time enhancement and feature fusion. The specific research content is as follows:

(1) To address the problem that relative position encoding cannot reflect forgetting behavior, this paper proposes to use time interval information in place of relative position encoding. Considering that time intervals may overlap, the difference between the end times of adjacent questions is used to compute the intervals. The interval information is quantified into forgetting factors that decrease as the time interval grows, and the weight of the forgetting factors is balanced by learnable parameters. Combined with the computation of the self-attention mechanism, this makes the attention value decrease as the time interval increases, thereby modeling forgetting behavior and improving interpretability. Experimental results on four public datasets show that this method effectively improves model performance, and performs better as the data volume increases.

(2) To address the lack of learning features, this paper proposes an interaction embedding method that integrates question features. Through manual feature engineering, question features are extracted and vectorized from the rich learning information provided by the dataset, and then added to the interaction modeling based on skill embedding.

Applying the proposed time enhancement and feature fusion algorithms to the knowledge tracing task, the model outperforms four open-source classical models on the four public datasets, with performance on some datasets improved by about 10%. In addition, ablation studies show that each algorithm effectively improves model performance even when used independently.
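The time-enhancement idea in (1) can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: the exponential decay form, the scalar weight `theta` (standing in for the learnable balancing parameters), and the use of numpy instead of a deep learning framework are all assumptions made for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def time_decayed_attention(Q, K, V, end_times, theta=0.1):
    """Self-attention whose scores are damped by a forgetting factor
    that shrinks as the time interval between interactions grows.

    Q, K, V: (seq_len, d) arrays; end_times: (seq_len,) question end times.
    theta: weight balancing the forgetting factor (a learnable
           parameter in the model; a fixed scalar here for illustration).
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                       # raw attention scores
    # time interval computed from the end times of each pair of questions
    intervals = np.abs(end_times[:, None] - end_times[None, :])
    forget = np.exp(-theta * intervals)                 # decays with interval
    scores = scores + np.log(forget + 1e-9)             # damp distant interactions
    # causal mask: a step may only attend to earlier interactions
    mask = np.tril(np.ones_like(scores, dtype=bool))
    scores = np.where(mask, scores, -np.inf)
    return softmax(scores) @ V
```

Because the forgetting factor enters the scores before the softmax, attention weights on long-past interactions decay smoothly toward zero, which is the behavior the abstract attributes to forgetting.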
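The feature-fusion embedding in (2) can likewise be sketched. All names, dimensions, and the additive fusion below are illustrative assumptions; random matrices stand in for embeddings that would be learned during training, and the three-element feature vector stands in for hand-engineered question features (e.g. question type, answer time, attempt count, normalized to [0, 1]).

```python
import numpy as np

rng = np.random.default_rng(42)
NUM_SKILLS, NUM_FEATURES, D = 100, 3, 32

skill_emb = rng.normal(size=(NUM_SKILLS, D))   # learned skill embeddings (stand-in)
answer_emb = rng.normal(size=(2, D))           # incorrect / correct embeddings
feat_proj = rng.normal(size=(NUM_FEATURES, D)) # projects hand-crafted features to D

def interaction_embedding(skill_id, correct, features):
    """Fuse vectorized hand-engineered question features into the
    standard skill + answer interaction embedding by addition."""
    feat_vec = np.asarray(features) @ feat_proj  # vectorize question features
    return skill_emb[skill_id] + answer_emb[int(correct)] + feat_vec
```

A usage example: `interaction_embedding(3, True, [0.5, 0.2, 0.1])` yields one D-dimensional interaction vector that a downstream self-attention layer would consume in place of the plain skill-answer embedding.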
Keywords/Search Tags:knowledge tracing, self-attention mechanism, recurrent neural network, deep learning