
Research And Application Of Knowledge Graph Embedding Technology Based On Attention Mechanism

Posted on: 2024-08-21    Degree: Master    Type: Thesis
Country: China    Candidate: D H Liu    Full Text: PDF
GTID: 2568306923484694    Subject: Electronic information
Abstract/Summary:
A knowledge graph is a semantic knowledge base that represents structured knowledge as triples. A triple consists of a head entity, a relation, and a tail entity, with the head entity connected to the tail entity through the relation. With the development of computer technology, knowledge graphs are widely used in natural language processing, machine learning, biology, industry, chemistry, and other fields. Although knowledge graphs have become a research hotspot, they still face the following problems. First, a knowledge graph is stored in a database in symbolic form, which cannot be applied directly to computing tasks. Second, most knowledge graphs are incomplete, with relations missing between some entities. Knowledge graph embedding technology has emerged to address these problems.

Knowledge graph embedding is the foundation of knowledge graph applications. It projects the entities and relations of a knowledge graph into a vector space to obtain vector representations. These numerical representations can be applied directly to downstream tasks, and can also be used for link prediction to complete missing relations between entities in the knowledge graph.

Thanks to the powerful learning ability of neural networks, significant progress has been made in neural-network-based knowledge graph embedding. However, most models only extract interactive features between entities and relations, ignoring the internal correlations within a triple (among the head entity, relation, and tail entity). Existing attention-based models do compute these correlations, but their performance is not ideal. Based on these problems, the research content of this paper is as follows.

(1) This paper proposes MAKE, a knowledge graph embedding model based on a multi-attention mechanism. The model uses an improved multi-attention mechanism to extract interaction features from triple matrices by computing the internal correlations of triples. To fully exploit the multi-attention mechanism, MAKE adopts a trainable batch normalization method and a novel composite loss function to improve its learning ability. Link prediction experiments with MAKE on the FB15K-237 and WN18RR benchmark datasets show that the model outperforms existing models on various metrics, and ablation experiments further verify the effectiveness of the methods used in MAKE. Building on MAKE, this paper uses graph attention networks to aggregate the semantic information of the knowledge graph structure and designs and implements an encoder method called GAT-MAKE. Unlike MAKE, GAT-MAKE no longer focuses on the features of a single triple; instead, it aggregates node information by computing the correlations between nodes. Link prediction experiments with GAT-MAKE on FB15K-237 and WN18RR show that it performs better than existing models that use graph neural networks as encoders.
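As an illustration of the general idea of attention over the internal correlations of a triple, the following minimal PyTorch sketch scores a triple by applying multi-head self-attention to the sequence of its head, relation, and tail embeddings. This is not the MAKE architecture from this paper; the layer choices (nn.MultiheadAttention, BatchNorm1d, a linear scoring layer) and the dimensions are assumptions made for the example.

# Illustrative sketch only, not the MAKE model described in the abstract.
import torch
import torch.nn as nn

class TripleAttentionScorer(nn.Module):
    def __init__(self, num_entities, num_relations, dim=200, heads=4):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)    # entity embedding table
        self.rel = nn.Embedding(num_relations, dim)   # relation embedding table
        # self-attention over the length-3 sequence (head, relation, tail),
        # which models correlations inside a single triple
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.BatchNorm1d(3 * dim)           # batch normalization of flattened features
        self.score = nn.Linear(3 * dim, 1)            # plausibility score of the triple

    def forward(self, h_idx, r_idx, t_idx):
        # build the (batch, 3, dim) sequence of triple-element embeddings
        seq = torch.stack([self.ent(h_idx), self.rel(r_idx), self.ent(t_idx)], dim=1)
        out, _ = self.attn(seq, seq, seq)             # attend over the triple's elements
        feat = self.norm(out.flatten(1))
        return self.score(feat).squeeze(-1)           # higher score = more plausible triple

model = TripleAttentionScorer(num_entities=14541, num_relations=237)  # FB15K-237 sizes
scores = model(torch.tensor([0, 5]), torch.tensor([1, 7]), torch.tensor([2, 9]))

In practice such a scorer would be trained with negative sampling against corrupted triples, which is where a composite loss function of the kind mentioned above would enter.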
(2) To make full use of the knowledge graph, this paper designs and implements a knowledge graph management platform. Besides basic knowledge graph visualization, the platform consists of four modules: graph construction, graph editing, graph search, and graph embedding. In the graph construction module, users can choose different functions to build a knowledge graph according to its size. In the graph editing module, users can add, modify, and delete nodes and edges through a graphical interface or a knowledge graph query language. In the graph search module, users can search entities or relations directly, or query the graph through knowledge-based question answering. In the graph embedding module, users can use the link prediction function to complete the graph, or export the knowledge graph embedding vectors for other deep learning tasks.
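As a sketch of how exported embedding vectors could drive a link prediction function, the snippet below ranks candidate tail entities for a query (head, relation, ?) using a simple TransE-style distance. The file names and the scoring rule are assumptions for illustration only and do not reflect the platform's actual export format or MAKE's scoring function.

# Illustrative sketch only: link prediction from exported embedding vectors.
import numpy as np

ent = np.load("entity_embeddings.npy")    # shape (num_entities, dim), hypothetical export file
rel = np.load("relation_embeddings.npy")  # shape (num_relations, dim), hypothetical export file

def predict_tails(head_id, relation_id, k=10):
    """Rank all entities as candidate tails for the query (head, relation, ?)."""
    target = ent[head_id] + rel[relation_id]        # TransE assumption: h + r should lie near t
    dist = np.linalg.norm(ent - target, axis=1)     # distance from every entity to the target point
    return np.argsort(dist)[:k]                     # the k most plausible tail entities

top10 = predict_tails(head_id=0, relation_id=3)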
Keywords/Search Tags: Knowledge representation, Knowledge graph embedding, Neural network, Multi-attention, Graph attention neural network