
Research On The Construction Of Curriculum Subject Knowledge Graph For Individualized Learning

Posted on: 2022-11-12  Degree: Master  Type: Thesis
Country: China  Candidate: Z D Guo  Full Text: PDF
GTID: 2507306779475744  Subject: Computer Software and Application of Computer
Abstract/Summary:
When users study a course, the system and context of knowledge points often rely on manually summarized experience. Learners lack a problem-oriented approach to knowledge acquisition, and course knowledge points cannot be inductively organized at a fine granularity according to learners' actual needs. The knowledge graph is one of the most effective methods of knowledge representation and integration in the era of big data. This research explores the task of building a knowledge graph for personalized learning. The specific work is as follows:

1. For learning lexical semantic representations of coarse-grained course knowledge points, a BERT-based domain word vector generation method is proposed. First, a BERT-CRF domain tokenizer is established: starting from pre-trained BERT representations, the model is fine-tuned on domain text jointly with domain word segmentation learning. Domain word vector representations are then derived from the segmentation decoding results (a sketch follows this abstract). Experiments show that the method can learn a tokenizer that meets the domain task's requirements from only a small amount of domain text, and yields higher-quality word vectors in the course knowledge domain than the original BERT.

2. Topic models exploit only statistical features such as word frequency and ignore the external prior knowledge that could help topic acquisition. For the course topic analysis task, a tree-structured neural topic model based on BERT embeddings and knowledge distillation is proposed, integrating ideas from transfer learning. First, the BERT-CRF model produces coarse-grained domain word embeddings, alleviating the granularity mismatch between BERT's token-level embeddings and the bag-of-words representation. Second, to address the sparsity of bag-of-words representations, knowledge distillation combines the pre-trained BERT with the topic model to improve topic quality (the second sketch below illustrates such an objective). Finally, the tree-structured neural topic model is optimized to fit the auxiliary-information-rich BERT word embeddings, and the distilled knowledge from the supervised teacher guides the document reconstruction of the unsupervised topic model. Experiments show that the resulting model combines the strengths of the pre-trained model and the topic model, and summarizes course topics more effectively.

3. To build a course topic knowledge graph that fully expresses users' acquisition intentions, social network analysis methods are used to further screen the topics produced by the tree-structured neural topic model (the third sketch below shows one such screening step), and an interactive personalized learning prototype system is built to enhance users' personalized learning experience.
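The abstract does not give implementation details, so the following is a minimal sketch of how domain word vectors might be derived from the BERT-CRF tokenizer's decoding output, assuming a BMES labeling scheme and per-character BERT embeddings. The function name, the toy labels, and the random vectors are illustrative assumptions, not the thesis's actual code.

```python
# Hypothetical sketch: pool character-level BERT vectors into domain word vectors
# using the segmentation decoded by a BERT-CRF domain tokenizer (BMES scheme).
import numpy as np

def domain_word_vectors(chars, labels, char_vecs):
    """Merge character spans closed by an E (end) or S (single) tag into words,
    mean-pooling the corresponding BERT character embeddings."""
    words, vecs, span = [], [], []
    for ch, tag, vec in zip(chars, labels, char_vecs):
        span.append((ch, vec))
        if tag in ("E", "S"):                      # a word span ends here
            words.append("".join(c for c, _ in span))
            vecs.append(np.mean([v for _, v in span], axis=0))
            span = []
    return words, np.array(vecs)

# Toy usage: the domain tokenizer keeps the term 知识图谱 ("knowledge graph") whole,
# so its four characters pool into a single domain word vector.
chars = list("知识图谱")
labels = ["B", "M", "M", "E"]
words, vecs = domain_word_vectors(chars, labels, np.random.rand(4, 768))
print(words, vecs.shape)                           # ['知识图谱'] (1, 768)
```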
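For the distillation step, a common way to let a supervised BERT teacher guide an unsupervised topic model's document reconstruction is a Hinton-style soft-target loss. The sketch below assumes that formulation; the weighting `alpha`, temperature `T`, and the exact shape of the teacher signal are assumptions, not the thesis's reported hyperparameters.

```python
# Hypothetical sketch of a distilled topic-model objective: the neural topic
# model's bag-of-words reconstruction is pulled toward a BERT teacher's softened
# per-word distribution via a temperature-scaled KL term.
import torch
import torch.nn.functional as F

def distilled_topic_loss(bow, recon_logits, teacher_logits, alpha=0.5, T=2.0):
    """bow:            (batch, vocab) bag-of-words counts
       recon_logits:   (batch, vocab) topic model reconstruction logits
       teacher_logits: (batch, vocab) BERT teacher logits over the vocabulary"""
    # Standard NTM reconstruction: negative log-likelihood of the observed words
    recon_nll = -(bow * F.log_softmax(recon_logits, dim=-1)).sum(-1).mean()
    # Distillation: KL from the temperature-softened teacher to the student
    kd = F.kl_div(F.log_softmax(recon_logits / T, dim=-1),
                  F.softmax(teacher_logits / T, dim=-1),
                  reduction="batchmean") * T * T
    return (1 - alpha) * recon_nll + alpha * kd
```

Blending the two terms lets the sparse bag-of-words signal be supplemented by the teacher's denser soft targets, which is the stated motivation for distillation in the abstract.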
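Finally, the abstract mentions screening topic results with social network analysis before building the knowledge graph. One plausible reading is a centrality-based filter over the topic tree; the sketch below assumes degree centrality via networkx, with toy topic data and an illustrative 0.3 threshold (neither is from the thesis).

```python
# Hypothetical sketch: keep only high-centrality topic nodes from the
# tree-structured topic model's output before adding them to the course
# knowledge graph.
import networkx as nx

topic_tree = {                     # parent topic -> child topics (toy data)
    "machine learning": ["neural network", "decision tree", "BERT"],
    "neural network": ["BERT", "knowledge distillation"],
}

G = nx.Graph()
for parent, children in topic_tree.items():
    for child in children:
        G.add_edge(parent, child)  # edge = topic/subtopic relation from the tree

centrality = nx.degree_centrality(G)
kept = {n for n, c in centrality.items() if c >= 0.3}   # screen peripheral nodes
knowledge_graph = G.subgraph(kept)
print(sorted(knowledge_graph.nodes))
# ['BERT', 'machine learning', 'neural network']
```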
Keywords/Search Tags:BERT model, Tree-structured neural topic model, Knowledge graph, Knowledge distillation