
Research On Text Representation Methods With Context Dependent Features Based On LSTM

Posted on: 2020-07-18
Degree: Master
Type: Thesis
Country: China
Candidate: C L Gao
GTID: 2428330572957127
Subject: Computer Science and Technology
Abstract/Summary:
In order to mine knowledge effectively from unstructured text data, studying representation approaches that formalize natural language into computer input signals is the focus of text opinion mining tasks. Text representation methods based on deep learning have a strong capacity for feature learning, and the representations they produce are continuous, dense, and low-dimensional. Within deep learning, the long short-term memory network (LSTM) is a special RNN model with long-term memory. LSTM not only models text with time-series features effectively, but also avoids the vanishing (or exploding) gradient problem in the process of text modeling. In text modeling, however, the LSTM-based text representation method treats all input features equally and does not clearly show the contribution of different features to a specific task. More importantly, the method focuses on learning semantic information from context while ignoring structural features. The work in this thesis is as follows:

1) In modeling the context-dependent features between words, vanilla LSTM-based methods cannot use part-of-speech vector information to strengthen the context-dependent features, so this thesis proposes an attention-based bidirectional LSTM. Experiments were conducted on the open-source NLPCC 2014 dataset and a self-collected dataset to verify the effectiveness of the method. The experimental results show that the classification performance of our method on both datasets is better than that of the baseline models.

2) In modeling the context-dependent features between an aspect and the opinion text, the standard attention mechanism lacks a correction step for the feature weights, so this thesis designs a two-stage attention network to model the context-dependent feature information between entity aspects and opinion texts. The model is evaluated on the SemEval 2016 dataset. The experimental results show that the classification performance of our model is better than that of the baseline models.

3) In modeling the context-dependent features between a topic and the opinion text, attention-based text representation methods lack word-level modeling of context-dependent features, so this thesis proposes a novel attention-based aggregation network model. Experiments were conducted on the SemEval 2016 English stance classification dataset. The experimental results show that the classification performance of our model is better than that of the baseline models.
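To illustrate the attention pooling that the contributions above build on, here is a minimal NumPy sketch of additive attention over a sequence of hidden states such as a bidirectional LSTM would produce: each timestep receives a learned weight, and the text representation is the weighted sum of the hidden states. All names, dimensions, and the random stand-in states are hypothetical, not taken from the thesis.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, W, b, u):
    """Additive attention over T hidden states H (shape T x d):
    score_t = u . tanh(W h_t + b); weights = softmax(scores);
    the text representation is the attention-weighted sum of H."""
    scores = np.tanh(H @ W + b) @ u   # one score per timestep, shape (T,)
    alpha = softmax(scores)           # attention weights, sum to 1
    return alpha, alpha @ H           # weights (T,), representation (d,)

rng = np.random.default_rng(0)
T, d = 5, 8                           # 5 timesteps, hidden size 8
H = rng.standard_normal((T, d))       # stand-in for BiLSTM outputs
W = rng.standard_normal((d, d))       # attention parameters (would be learned)
b = rng.standard_normal(d)
u = rng.standard_normal(d)

alpha, rep = attention_pool(H, W, b, u)
print(alpha.sum())                    # weights form a distribution over timesteps
print(rep.shape)                      # fixed-size representation, here (8,)
```

In the thesis's setting the weights would be learned jointly with the BiLSTM so that informative timesteps (e.g. those carrying opinion-bearing words) receive higher attention; the sketch only shows the forward computation.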
Keywords/Search Tags:Text Opinion Mining, Context Representation, Deep Learning, LSTM, Attention Mechanism