
Semantic Role Labeling Based On Deep Recurrent Neural Network

Posted on: 2021-04-09  Degree: Master  Type: Thesis
Country: China  Candidate: X Z Pu  Full Text: PDF
GTID: 2558306041979139  Subject: Software engineering
Abstract/Summary:
With the rapid development of artificial intelligence and computer technology, and the explosive growth of data volume in the big-data era, it is increasingly difficult to obtain and process information accurately, quickly, and comprehensively, especially information in textual form. Semantic role labeling has been studied extensively, for example with traditional methods based on statistical learning. With the rapid progress of machine learning, neural networks have been applied ever more widely in natural language processing, and recurrent neural networks (RNNs) in particular have proved effective for semantic role labeling. Many challenging problems remain, however: as network depth increases, deep recurrent networks suffer from vanishing gradients and become difficult to train, and the accuracy of semantic role labeling degrades with sentence length. Building on a close study of existing semantic role labeling models, this thesis addresses two aspects: Chinese semantic role labeling and multilingual semantic role labeling. Its main contributions are as follows:

1. To address the vanishing gradients and training difficulties of deep recurrent networks of increasing depth, a Highway-BiLSTM-CRF model is proposed that connects adjacent BiLSTM layers with highway networks. In addition, embeddings of features such as dependency relations and the distance between predicate and argument are added at the input layer, and a CRF layer fuses global label information to obtain the optimal tag sequence. Experiments on the Chinese Proposition Bank show that Chinese semantic role labeling performance is best at a BiLSTM depth of eight layers.

2. A joint learning method for multilingual semantic role labeling based on self-attention is proposed to strengthen the relationship between distant words. Joint multilingual learning is achieved by giving each language its own input layer and part of the hidden layers, while the remaining hidden layers and the output layer are shared. A self-attention mechanism is introduced in the hidden layers to strengthen the predicate-argument relationship, improve the tagging of long sentences, and enhance relationships across the sequence. Two training strategies are proposed to alleviate overfitting on small datasets. Experimental results on the CoNLL 2009 evaluation data show that the multilingual joint learning method outperforms the monolingual semantic role labeling model.
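The abstract does not include code, but the highway connection used between BiLSTM layers in contribution 1 can be illustrated with a minimal NumPy sketch. The function name, weight shapes, and the tanh transform are illustrative assumptions, not the thesis's implementation; the key idea is the learned gate that blends the layer's transformed output with its untransformed input, which is what lets gradients flow through deep stacks.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def highway_layer(x, W_h, b_h, W_t, b_t):
    """One highway connection (hypothetical shapes: x is (batch, d)).
    Blends the candidate transform H(x) with the input x via a
    learned transform gate T(x); the carry gate is 1 - T(x)."""
    h = np.tanh(x @ W_h + b_h)     # candidate transform H(x)
    t = sigmoid(x @ W_t + b_t)     # transform gate T(x) in (0, 1)
    return t * h + (1.0 - t) * x   # gated mix; near-identity when t -> 0
```

When the gate bias is strongly negative, the layer behaves almost as an identity mapping, which is why stacking many such layers remains trainable.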
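The CRF layer in contribution 1 obtains the optimal tag sequence by combining per-token emission scores with learned label-transition scores. A minimal Viterbi decoding sketch in NumPy is given below; the function name and score layout are assumptions for illustration, not the thesis's code.

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Best tag sequence under a linear-chain CRF.
    emissions: (seq_len, num_tags) per-token scores (e.g. from a BiLSTM);
    transitions: (num_tags, num_tags) score of moving from tag i to tag j."""
    seq_len, num_tags = emissions.shape
    score = emissions[0].copy()                   # best score ending in each tag
    backptr = np.zeros((seq_len, num_tags), dtype=int)
    for t in range(1, seq_len):
        # total[i, j] = best path ending in tag i, then moving to tag j
        total = score[:, None] + transitions + emissions[t][None, :]
        backptr[t] = total.argmax(axis=0)
        score = total.max(axis=0)
    # follow back-pointers from the best final tag
    best = [int(score.argmax())]
    for t in range(seq_len - 1, 1 - 1, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1][1:] if False else best[::-1][-seq_len:]
```

Decoding with the transition matrix, rather than taking each token's argmax independently, is what lets the CRF enforce globally consistent label sequences.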
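The self-attention mechanism in contribution 2 strengthens predicate-argument links regardless of distance, because every token attends directly to every other token. A single-head scaled dot-product sketch in NumPy follows; the function name and projection shapes are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """Single-head scaled dot-product self-attention.
    X: (seq_len, d_model) token representations for one sentence.
    Each output position is a weighted mix over ALL positions, so
    the predicate-argument connection does not weaken with distance."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len) similarities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights
```

This O(1) path length between any two tokens is the property the thesis exploits to improve the tagging of long sentences.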
Keywords/Search Tags: Semantic role labeling, BiLSTM, Self-attention, Highway network