Causal relation extraction is a class of relation extraction tasks in natural language processing. Relation extraction is an important subfield of information extraction whose main purpose is to identify relationships between entities in text; causal relation extraction targets the cause-effect relationships among those entities. Extracting causal relations from text helps improve the efficiency of various downstream natural language processing tasks, and, combined with related technologies such as knowledge graphs, it can be applied to many fields by building causal relation networks. At first, causal relations were extracted with manually constructed causal relation models. This approach is time-consuming and labor-intensive, and cannot keep pace with today's explosive growth of information. With the continued maturation of deep learning and the rise of various neural network models, causal relation extraction now has more possibilities. However, the task persistently suffers from small amounts of labeled data and large differences across datasets. At the same time, making machines intelligent inevitably requires generalizing from small data and learning continually. To meet this demand, few-shot learning and lifelong learning methods have been proposed, a step that brings machines closer to human learning. Few-shot learning addresses the problem that causal relation data are scarce and therefore difficult to train a model on well; combining it with lifelong learning further handles the problem that training data come from different domains whose causal forms differ greatly. The contribution of this paper is a causal relation extraction model that combines few-shot learning and lifelong learning: it extracts causal relations from datasets with little labeled data and improves the model through lifelong learning, yielding better causal relation extraction. We conduct experiments on four datasets, two small (Causal-TB, EventStoryLine) and two large (SemEval-2010 Task 8, SemEval-2020 Task 2), to verify the effect of the few-shot learning component of our model. The experimental results show the necessity of few-shot learning: we obtain an improvement of about 40% on the small datasets, although the results still do not match those achieved on the large datasets. We also conduct comparative experiments with several mainstream neural network models by replacing the few-shot module with them, comparing our model against LSTM, BiLSTM, CRF, Transformer, and BERT models on both the large and small datasets. Our model achieves improvements of varying degrees over these baselines, showing that it is feasible and performs causal relation extraction more effectively.
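The abstract does not spell out how the few-shot component is formulated. As a purely illustrative sketch, and not the paper's model, the snippet below shows one common few-shot setup, a prototypical network over sentence embeddings, applied to a toy 2-way (causal vs. non-causal) episode; the encoder, vocabulary size, and all hyperparameters are placeholders chosen only for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SentenceEncoder(nn.Module):
    """Toy BiLSTM sentence encoder; a real system might use a pretrained model such as BERT."""
    def __init__(self, vocab_size=5000, emb_dim=100, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)

    def forward(self, token_ids):                  # token_ids: (batch, seq_len)
        x = self.emb(token_ids)
        out, _ = self.lstm(x)                      # (batch, seq_len, 2*hidden)
        return out.mean(dim=1)                     # mean-pool -> (batch, 2*hidden)

def prototypical_episode(encoder, support_x, support_y, query_x, n_classes):
    """One N-way K-shot episode: each class prototype is the mean support embedding,
    and queries are scored by negative squared Euclidean distance to the prototypes."""
    s_emb = encoder(support_x)                     # (N*K, d)
    q_emb = encoder(query_x)                       # (Q, d)
    prototypes = torch.stack(
        [s_emb[support_y == c].mean(dim=0) for c in range(n_classes)]
    )                                              # (N, d)
    logits = -(torch.cdist(q_emb, prototypes) ** 2)  # (Q, N)
    return logits

# Usage: a 2-way (causal / non-causal) 5-shot episode with random token ids.
encoder = SentenceEncoder()
support_x = torch.randint(1, 5000, (10, 20))       # 2 classes * 5 shots, seq_len 20
support_y = torch.tensor([0] * 5 + [1] * 5)
query_x = torch.randint(1, 5000, (4, 20))
logits = prototypical_episode(encoder, support_x, support_y, query_x, n_classes=2)
loss = F.cross_entropy(logits, torch.tensor([0, 0, 1, 1]))
loss.backward()                                    # one episodic training step (optimizer omitted)
```

Episodic training of this kind is one way to learn from few labeled causal examples; how the paper's model integrates it with lifelong learning across domains is described in the later sections.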