
Research On Fine-grained Text Sentiment Analysis Based On Deep Learning

Posted on: 2022-09-24    Degree: Master    Type: Thesis
Country: China    Candidate: Z L Hu    Full Text: PDF
GTID: 2568307055454884    Subject: Software engineering
Abstract/Summary:
With the rapid development of information technology, a large amount of emotionally inclined content is generated on social media and e-commerce platforms. Analyzing and aggregating the sentiment of these text data is of great value to society and business, and sentiment analysis has become one of the most popular tasks in natural language processing research. Early sentiment analysis research focused mainly on coarse-grained sentiment analysis, which assumes that a document or sentence carries only one sentiment. In practice, however, a piece of text may mention multiple entities, or multiple aspects of a single entity, and different aspects may carry different sentiment tendencies. Fine-grained sentiment analysis can recover this aspect-level sentiment information; it has therefore received extensive attention from the academic community in recent years and has high research and application value.

Previous fine-grained sentiment analysis methods struggle to focus effectively on the sentiment information carried by key words and tend to ignore the connection between aspect words and their context. To address these problems, this thesis proposes a sentiment analysis model based on feature fusion and a self-attention mechanism, the FF-ATT model. The model uses bidirectional gated recurrent units to extract features in the forward and backward directions, capturing the bidirectional semantic information of the text; this yields a more accurate contextual representation and reduces the training time of the model. Since not all words are equally important for sentiment classification, a self-attention mechanism assigns each word a weight according to its importance, so that different words contribute to the classification result in proportion to their influence. In the fusion strategy, the attention score between the aspect words and the context is computed first; the attention-weighted output vectors are then spliced into a final fusion vector, which is fed to the sentiment classifier. The FF-ATT model was evaluated on the Restaurants14, Laptop14, and Twitter datasets, improving both evaluation metrics, accuracy and F1, by about 1% over the baseline models.

In addition, this thesis optimizes the FF-ATT model with the BERT pre-trained word-vector model and a multi-head attention mechanism, and proposes the BERT-DATT model. The model uses BERT to enhance the feature representations of the context and aspect words and to obtain deeper semantics. A multi-head attention mechanism is introduced to capture the dependencies between specific aspect words and the context, providing a more accurate feature representation for sentiment classification. BERT-DATT achieves an accuracy of 83.41% on the Restaurants14 dataset and good results on the other datasets, verifying the effectiveness of the model.
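The aspect-context fusion strategy described above can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the thesis's actual implementation: the function name, dimensions, and the use of a dot product as the attention score are assumptions for clarity. It shows the two steps the abstract names: scoring the context words against the aspect representation, then splicing the attention-weighted context vector with the aspect vector into a fusion vector for the classifier.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def aspect_context_fusion(context, aspect):
    """Hypothetical fusion step.

    context: (T, d) hidden vectors for T context words
    aspect:  (d,)   pooled vector for the aspect words
    returns: (2d,)  spliced fusion vector fed to the classifier
    """
    scores = context @ aspect            # attention score of each context word vs the aspect
    weights = softmax(scores)            # (T,) importance weights
    attended = weights @ context         # (d,) attention-weighted context representation
    return np.concatenate([attended, aspect])

rng = np.random.default_rng(0)
ctx = rng.normal(size=(8, 16))           # 8 context words, 16-dim vectors
asp = rng.normal(size=16)                # pooled aspect-word vector
fused = aspect_context_fusion(ctx, asp)
print(fused.shape)                       # (32,)
```

In the full model the `context` rows would come from the bidirectional GRU (or BERT) encoder rather than random vectors, and the fusion vector would pass through a linear layer and softmax to produce the sentiment class.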
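The multi-head attention step in the BERT-DATT design can likewise be sketched with PyTorch's built-in `nn.MultiheadAttention`. This is a minimal sketch under assumed dimensions, not the thesis's code: the aspect-token representations act as queries and the context-token representations as keys and values, so each head can capture a different dependency between the aspect words and the context.

```python
import torch
import torch.nn as nn

d_model, heads = 64, 4
mha = nn.MultiheadAttention(d_model, heads, batch_first=True)

# Stand-ins for encoder outputs (in the real model these would come from BERT):
context = torch.randn(2, 20, d_model)   # batch of 2, 20 context tokens
aspect = torch.randn(2, 3, d_model)     # batch of 2, 3 aspect tokens

# query = aspect, key = value = context
attended, attn_weights = mha(aspect, context, context)
print(attended.shape)        # torch.Size([2, 3, 64])  enriched aspect representations
print(attn_weights.shape)    # torch.Size([2, 3, 20])  head-averaged attention map
```

The `attended` output gives each aspect token a context-aware representation, which would then be pooled and classified; the attention map shows which context words each aspect word depends on.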
Keywords/Search Tags: Fine-grained, Sentiment analysis, Attention Mechanism, Deep learning, Natural Language Processing