Text sentiment analysis studies the emotions, opinions, and attitudes that people express in text. Aspect-based sentiment analysis is one of its subfields: whereas sentiment analysis explores the overall feeling of a text, aspect-based sentiment analysis directly analyzes the sentiment toward entities mentioned in the text, which has greater practical value. Research on sentiment analysis is typically based on text feature representation and deep feature extraction, and the current mainstream approach models textual features with attention mechanisms.

However, aspect-based sentiment analysis algorithms built on deep learning and attention still face two problems. First, when current sentiment classification models use attention to compute dependencies between representations and extract features, most fail to fully exploit the complementary strengths of local and global features; moreover, fusing local and global features by concatenation or addition easily causes feature loss and feature overwriting. Second, existing studies of aspect-word attention lack a guidance mechanism, so aspect-word attention often wrongly focuses on words that are not syntactically related to the aspect words; in addition, aspect-word attention models a single term, making it difficult to capture the overall semantics of opinion expressions that contain negation prefixes.

Given these problems in current deep-learning-based aspect sentiment analysis algorithms, this paper carries out the following research. To address insufficient attention to aspect terms, which leads to inadequate extraction of aspect features, this paper proposes an aspect sentiment analysis model based on dual-feature-fusion attention. Local and global feature extraction modules are designed to fully capture the semantic associations between aspect words and their context. An improved "quasi" attention is added to the model's global feature extractor, so that during attention fusion the model learns to use subtractive attention to attenuate the negative impact of noise. A feature fusion structure based on conditional layer normalization is designed to fuse local and global features more effectively. Experimental results on the SentiHood and SemEval-2014 Task 4 datasets show that the model's performance improves significantly after incorporating contextual features.

To address the lack of a guidance mechanism in existing work on aspect-word attention, this paper proposes a capsule network model based on graph convolution and an attention mechanism. It combines a capsule network with graph convolution, realizes the capsule network's dynamic routing through attention mechanisms guided by external syntactic knowledge such as the syntactic dependency tree and negation prefixes, and uses pre-trained BERT word vectors to substantially enhance the feature representations of the text and the aspect words. Experiments on the ACL-14 Twitter social comment dataset and four benchmark datasets from the SemEval competitions show that the proposed method achieves better performance.
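The subtractive behaviour of the "quasi" attention can be illustrated with a minimal NumPy sketch. This is only an assumed formulation for illustration, not the exact model in this paper: a conventional softmax head is combined with a sigmoid-scored head scaled by a gate `lam` in [-1, 1], so that a negative gate subtracts weight from noisy tokens instead of merely down-weighting them.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def quasi_attention(Q, K, V, lam):
    """Attention whose effective weights can go negative.

    Illustrative sketch only: a standard softmax head plus a
    sigmoid-scored 'quasi' head scaled by lam in [-1, 1].
    With lam < 0 the combined weights can dip below zero,
    actively subtracting the contribution of noisy tokens.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    a_std = softmax(scores)       # rows sum to 1, entries in (0, 1)
    a_quasi = sigmoid(scores)     # entries in (0, 1), no row-sum constraint
    a = a_std + lam * a_quasi     # effective weights, possibly negative
    return a @ V, a

# With lam = -1 some combined weights become negative (subtractive attention);
# with lam = 0 the layer reduces to ordinary softmax attention.
out, a = quasi_attention(np.eye(2), np.eye(2), np.eye(2), lam=-1.0)
```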
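The idea of steering aspect-word attention with the syntactic dependency tree can likewise be sketched as one graph-convolution layer over the dependency graph. This is a generic GCN layer under assumed notation (`H` token features, `adj` dependency adjacency, `W` a weight matrix), not the exact architecture of the proposed capsule network: each token aggregates features only from words it is syntactically related to, which is the external knowledge that keeps attention off syntactically unrelated words.

```python
import numpy as np

def gcn_layer(H, adj, W):
    """One graph-convolution layer over a syntactic dependency graph.

    Illustrative sketch: adj is the (n, n) adjacency matrix built from
    dependency-tree edges, H the (n, d) token features, W a (d, d')
    weight matrix. Self-loops are added and rows are degree-normalised,
    so each token mixes its own features with those of its syntactic
    neighbours only.
    """
    A = adj + np.eye(adj.shape[0])                # add self-loops
    D_inv = 1.0 / A.sum(axis=1, keepdims=True)    # row-wise degree normalisation
    return np.maximum(0.0, (D_inv * A) @ H @ W)   # aggregate neighbours, then ReLU

# Three tokens; only tokens 0 and 1 share a dependency edge, token 2 is isolated.
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
adj = np.zeros((3, 3))
adj[0, 1] = adj[1, 0] = 1.0
out = gcn_layer(H, adj, np.eye(2))
```

With an identity weight matrix, the isolated token keeps its own features, while connected tokens average over their syntactic neighbourhood.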