To extract key information from massive unstructured data accurately and efficiently, information extraction has developed rapidly. As a key component of information extraction, event extraction is applied in knowledge graphs, automatic summarization, question answering systems, reading comprehension, and other fields. Event trigger extraction is an important subtask of event extraction. There is correlation information between event arguments and trigger words, but existing methods focus mainly on the trigger word itself and do not make full use of the argument information. To exploit event arguments, this paper proposes a model that fuses event-argument attention with an encoder layer. First, the correlation between event arguments and trigger words is computed to use the argument information explicitly. Then, the relationship between event arguments and trigger words is learned indirectly through the multi-head self-attention mechanism of the encoder layer. The output vectors obtained by the two methods are combined and fed into the encoder layer as features for training. In addition, the word-feature model captures richer semantic information. Experimental results show that the F1 score for trigger extraction on the ACE2005 English corpus reaches 71.95%. On the basis of this extraction model, trigger extraction is further divided into two steps: trigger recognition and trigger classification. By feeding the recognition results into the classification model as attention, the weight of trigger-word information is increased, and the F1 score reaches 77.01%.

Event argument extraction is another important task of event extraction. This paper proposes a model that fuses convolutional-LSTM network attention with event-trigger attention. First, different local feature representations are obtained with multiple convolution kernels, and sentence representations with different contextual features are obtained with a BiLSTM. For both the local features and the sentence representations, the correlations between words, especially between event arguments and trigger words, are learned with a self-attention mechanism, and the two attention hidden layers are combined as the output feature of the convolutional-LSTM network. In addition, event-trigger attention is used to enhance the weight of event arguments. The model is evaluated on the ACE2005 event corpus, achieving an F1 score of 41.13%.
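The explicit argument-trigger correlation described above can be illustrated with ordinary scaled dot-product attention, where candidate trigger vectors attend over event-argument vectors. This is a minimal NumPy sketch; the shapes, variable names, and random toy inputs are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Scaled dot-product attention sketch.

    queries: (n_q, d) candidate trigger-word vectors (assumed)
    keys/values: (n_k, d) event-argument vectors (assumed)
    Returns argument-aware trigger features of shape (n_q, d).
    """
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)           # (n_q, n_k) correlations
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over arguments
    return weights @ values                          # weighted argument mix

# toy example: 3 candidate triggers attending over 2 argument vectors
rng = np.random.default_rng(0)
triggers = rng.normal(size=(3, 8))
arguments = rng.normal(size=(2, 8))
out = scaled_dot_product_attention(triggers, arguments, arguments)
print(out.shape)  # (3, 8)
```

Multi-head self-attention, as used in the encoder layer, repeats this computation over several learned projections of the same inputs and concatenates the results.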
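The multi-kernel convolution step of the argument-extraction model can be sketched as a TextCNN-style feature extractor: each kernel width captures a different n-gram span, and max-pooling over time yields a fixed-length local feature vector. The kernel sizes, filter counts, and random weights below are assumptions for illustration only.

```python
import numpy as np

def multi_kernel_conv1d(embeddings, kernel_sizes=(2, 3, 4), n_filters=4, seed=0):
    """Extract local n-gram features with several convolution widths,
    then max-pool each filter over time (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    seq_len, dim = embeddings.shape
    pooled = []
    for k in kernel_sizes:
        filters = rng.normal(scale=0.1, size=(n_filters, k, dim))
        # valid convolution: one activation per window position
        feats = np.array([
            [(embeddings[t:t + k] * f).sum() for t in range(seq_len - k + 1)]
            for f in filters
        ])                                # (n_filters, seq_len - k + 1)
        pooled.append(feats.max(axis=1))  # max over time -> (n_filters,)
    return np.concatenate(pooled)         # concatenated local features

sentence = np.random.default_rng(1).normal(size=(10, 16))  # 10 words, dim 16
local = multi_kernel_conv1d(sentence)
print(local.shape)  # (12,) = 3 kernel sizes x 4 filters
```

In the full model, these local features would be fused with BiLSTM sentence representations through the self-attention layers described above before classification.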