The rapid development of the Internet has produced a large amount of unstructured text, and automatically extracting specific content from this huge volume of text is an important research direction in natural language processing. Relation extraction, one of the sub-tasks in this field, has become a hot research topic. With the application of deep learning to relation extraction, the task has achieved good results, but several problems remain to be solved. This work addresses two of them.

1. For the fully supervised relation extraction task, a model based on the BERT pre-trained language model and an improved capsule network is constructed to address the loss of spatial location information and the inadequate extraction of entity features in mainstream relation extraction models. The data are first pre-processed by inserting special characters to mark entity positions, and vector representations of the entities are obtained with BERT. Entity information is then extracted by the feature extraction layer of the improved capsule network, and the low-level features are clustered with a dynamic routing mechanism to produce the classification over relation labels (a sketch of this routing step is given below). The model is evaluated on the TACRED and SemEval-2010 Task 8 datasets, and the experimental results show that the improved model outperforms the baseline models in the comparison experiments.

2. For the distantly supervised relation extraction task, a model combining reinforcement learning with an improved capsule network is proposed to address the noisy labels generated by distant supervision. A sentence selector based on reinforcement learning removes noisy sentences from the dataset; the cleaned dataset is passed to a relation classifier whose backbone is the improved capsule network, and the selector is rewarded according to the classification results (a sketch of this selector's update is given below). The two modules work together to complete the distantly supervised relation extraction task. A self-attentive routing mechanism is also introduced to alleviate the inefficiency caused by the iterative parameter updates of the capsule network. The model is evaluated on the public NYT-10 dataset, and the experimental results show that the improved model outperforms the baseline models in the comparison experiments.
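To make the routing step concrete, the following is a minimal PyTorch sketch of routing-by-agreement in the style of Sabour et al. (2017), where the length of each output capsule is read as the probability of the corresponding relation. The tensor shapes, the number of routing iterations, and the function names are illustrative assumptions and are not taken from the thesis.

```python
import torch
import torch.nn.functional as F

def squash(s, dim=-1, eps=1e-8):
    """Non-linearity that shrinks short vectors toward 0 and long vectors to length < 1."""
    norm_sq = (s ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * s / torch.sqrt(norm_sq + eps)

def dynamic_routing(u_hat, num_iterations=3):
    """Cluster low-level (primary) capsule predictions into high-level relation capsules.

    u_hat: prediction vectors, shape (batch, num_primary, num_relations, dim_relation)
    """
    batch, num_primary, num_relations, _ = u_hat.shape
    # Routing logits start at zero, i.e. uniform coupling between the two capsule layers.
    b = torch.zeros(batch, num_primary, num_relations, device=u_hat.device)
    for _ in range(num_iterations):
        c = F.softmax(b, dim=-1)                      # coupling coefficients
        s = (c.unsqueeze(-1) * u_hat).sum(dim=1)      # weighted sum over primary capsules
        v = squash(s)                                 # (batch, num_relations, dim_relation)
        # Agreement between predictions and outputs updates the routing logits.
        b = b + (u_hat * v.unsqueeze(1)).sum(dim=-1)
    return v

# Example: 32 primary capsules voting for 10 relation classes with 16-dimensional capsules.
if __name__ == "__main__":
    votes = torch.randn(4, 32, 10, 16)
    relation_capsules = dynamic_routing(votes)
    probs = relation_capsules.norm(dim=-1)            # capsule length ~ relation probability
    print(probs.shape)                                # torch.Size([4, 10])
```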
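The selector's training loop can likewise be sketched as a REINFORCE-style policy-gradient update: the policy keeps or drops each sentence in a bag, and the reward is the classifier's log-probability of the bag's relation label on the cleaned bag. The policy architecture, the reward definition, and the use of a plain linear classifier as a stand-in for the capsule-network classifier are assumptions for illustration, not details taken from the thesis.

```python
import torch
import torch.nn as nn

class SentenceSelector(nn.Module):
    """Policy network: for each sentence embedding, output P(keep)."""
    def __init__(self, dim):
        super().__init__()
        self.scorer = nn.Linear(dim, 1)

    def forward(self, sent_embs):                      # (num_sentences, dim)
        return torch.sigmoid(self.scorer(sent_embs)).squeeze(-1)

def reinforce_step(selector, classifier, sent_embs, bag_label, optimizer):
    """One REINFORCE update: sample keep/drop actions, reward the policy with the
    classifier's log-probability of the correct relation on the cleaned bag."""
    keep_probs = selector(sent_embs)
    actions = torch.bernoulli(keep_probs)              # 1 = keep, 0 = treat as noise
    kept = sent_embs[actions.bool()]
    if kept.shape[0] == 0:                             # avoid passing an empty bag
        kept = sent_embs
    with torch.no_grad():                              # classifier only provides the reward here
        logits = classifier(kept.mean(dim=0, keepdim=True))
        reward = torch.log_softmax(logits, dim=-1)[0, bag_label]
    # Policy gradient: raise the log-probability of the sampled actions, scaled by the reward.
    log_pi = actions * torch.log(keep_probs + 1e-8) + (1 - actions) * torch.log(1 - keep_probs + 1e-8)
    loss = -(reward * log_pi.sum())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return reward.item()

if __name__ == "__main__":
    dim, num_relations = 64, 10
    selector = SentenceSelector(dim)
    classifier = nn.Linear(dim, num_relations)         # stand-in for the capsule-network classifier
    opt = torch.optim.Adam(selector.parameters(), lr=1e-3)
    bag = torch.randn(8, dim)                          # 8 sentences sharing one entity pair
    print(reinforce_step(selector, classifier, bag, bag_label=3, optimizer=opt))
```

In the full setup, the reward comes from the capsule-network relation classifier rather than a linear layer, and the two modules are trained alternately so that cleaner bags improve the classifier and a better classifier gives the selector a more reliable reward signal.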