Knowledge graphs organize real-world facts as structured triples, offering highly flexible semantic modeling and representation capabilities, and have been widely used in both open and specialized domains. Knowledge Base Question Answering (KBQA) automatically parses natural language questions posed by users, understands the user's intent, locates the relevant triples (i.e., knowledge) in the knowledge graph, and returns the answer to the user. This approach compensates for the shortcomings of traditional search engines, greatly improves efficiency, and provides direct human-computer interaction. This thesis therefore focuses on the key technologies and practical applications of KBQA, making improvements and innovations on the basis of prior work. The specific contributions are as follows:

(1) To address the complexity of Chinese characters and their semantic information, a one-hop knowledge graph question answering method based on a pre-trained language model, named ChineseBERT-KBQA (ChineseBERT-based Knowledge Base Question Answering), is proposed. ChineseBERT, a pre-trained language model that integrates glyph and pinyin information, improves the performance of traditional semantic parsing methods on the sub-tasks of entity mention recognition and relation prediction. Specifically, this thesis proposes an entity mention recognition model based on ChineseBERT-CRF and a relation prediction model based on ChineseBERT-TextCNN-Softmax, which together improve semantic understanding of Chinese text. Finally, information from the two sub-tasks is combined to predict the final answer. Experimental results on the MOOC Q&A one-hop question answering dataset and the NLPCC 2018 open-domain knowledge graph question answering dataset demonstrate the effectiveness of the proposed method.

(2) The ChineseBERT-KBQA method only handles simple questions and cannot perform multi-hop reasoning over complex questions. In order
to solve this problem, a multi-hop question answering method based on a bilinear graph neural network and two-teacher distillation is proposed, and an educational knowledge Web application is developed on this basis. When traditional multi-stage semantic parsing methods face complex questions that require multi-step reasoning, error accumulation across the stages reduces the accuracy of question answering. Introducing a graph neural network makes it possible to capture neighborhood information in the knowledge graph and generate vector representations of entities, so that answer prediction can be reformulated as computing the semantic similarity between vectors. Based on this idea, this thesis performs reasoning with a bilinear graph neural network and introduces a bilinear aggregator. By combining linear and bilinear aggregation, the interactions between nodes in the knowledge graph can be modeled more comprehensively, yielding richer entity representations and effectively improving the reasoning ability of multi-hop question answering. In addition, two teacher networks are constructed via bidirectional reasoning, and their intermediate supervision signals are fused to guide the student's intermediate reasoning process, alleviating spurious-path reasoning. Compared with existing multi-hop question answering methods, the proposed method achieves better results on the MOOC Q&A and NLPCC-MH multi-hop question answering datasets. The experimental results show that the one-hop and multi-hop question answering methods proposed in this thesis improve answer accuracy to a certain extent, outperform competing question answering methods, and strengthen the machine's semantic understanding of Chinese text.
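The one-hop semantic-parsing pipeline in contribution (1) — recognize the entity mention, predict the relation, then look up the matching triple — can be sketched as follows. This is a toy illustration, not the thesis's models: the ChineseBERT-CRF recognizer and ChineseBERT-TextCNN-Softmax classifier are replaced by dictionary-based stand-ins, and the triples and question wording are invented for the example, so only the control flow mirrors the method.

```python
# A miniature knowledge graph: (head entity, relation, tail entity) triples.
# These triples are illustrative placeholders, not real dataset entries.
TRIPLES = [
    ("Machine Learning", "taught_by", "Prof. Zhang"),
    ("Machine Learning", "prerequisite", "Linear Algebra"),
]

def recognize_mention(question):
    """Stand-in for the entity mention recognizer (ChineseBERT-CRF in the thesis)."""
    for head, _, _ in TRIPLES:
        if head.lower() in question.lower():
            return head
    return None

def predict_relation(question):
    """Stand-in for the relation classifier (ChineseBERT-TextCNN-Softmax in the thesis)."""
    keyword_to_relation = {"teaches": "taught_by", "prerequisite": "prerequisite"}
    for keyword, relation in keyword_to_relation.items():
        if keyword in question.lower():
            return relation
    return None

def answer(question):
    """Combine the two sub-task outputs to locate the answer triple."""
    mention = recognize_mention(question)
    relation = predict_relation(question)
    for head, rel, tail in TRIPLES:
        if head == mention and rel == relation:
            return tail
    return None
```

For example, `answer("Who teaches Machine Learning?")` resolves the mention to "Machine Learning" and the relation to `taught_by`, returning "Prof. Zhang".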
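The aggregation idea in contribution (2) — combining linear aggregation of neighbor features with bilinear (pairwise-product) aggregation — can be sketched numerically. This is a simplified NumPy sketch of the general technique, not the thesis's exact layer: learned weight matrices, nonlinearities, and the mixing coefficient's training are omitted, and `alpha` is an assumed hyperparameter name.

```python
import numpy as np

def linear_aggregate(neighbors):
    # Standard GNN-style aggregation: the mean of neighbor feature vectors.
    return np.mean(neighbors, axis=0)

def bilinear_aggregate(neighbors):
    # Average of element-wise products over all unordered neighbor pairs,
    # capturing pairwise interactions that linear aggregation misses.
    n = len(neighbors)
    pair_sum = np.zeros_like(neighbors[0])
    for i in range(n):
        for j in range(i + 1, n):
            pair_sum += neighbors[i] * neighbors[j]
    return pair_sum / (n * (n - 1) / 2)

def aggregate(neighbors, alpha=0.5):
    # Convex combination of the two aggregators; in a trained model the
    # mixing weight would be learned rather than fixed.
    return (1 - alpha) * linear_aggregate(neighbors) + alpha * bilinear_aggregate(neighbors)
```

With neighbors [1, 0], [0, 1], and [1, 1], the linear part averages to [2/3, 2/3] while the bilinear part averages the three pairwise products to [1/3, 1/3], so the two terms contribute genuinely different interaction information.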
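The two-teacher distillation idea — fusing the intermediate supervision signals of a forward-reasoning and a backward-reasoning teacher to guide the student — can likewise be sketched. This is a minimal sketch under assumed design choices: the fusion rule (simple averaging plus renormalization) and the KL-divergence penalty are common distillation ingredients, not necessarily the exact formulation used in the thesis.

```python
import numpy as np

def fuse_teacher_signals(forward_dist, backward_dist):
    # Average the two teachers' intermediate entity distributions and
    # renormalize; the fused distribution supervises the student's
    # intermediate reasoning step, damping either teacher's false paths.
    fused = (forward_dist + backward_dist) / 2.0
    return fused / fused.sum()

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q): the penalty the student pays for deviating from the
    # fused teacher signal at an intermediate hop.
    return float(np.sum(p * np.log((p + eps) / (q + eps))))
```

Averaging the two directions is what mitigates spurious-path reasoning here: an entity that only one teacher assigns high probability to receives a weaker fused supervision signal than one both teachers agree on.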