
Research On Contextual Implicit Sentiment Analysis At Multiple Language Granularities

Posted on: 2024-08-13    Degree: Master    Type: Thesis
Country: China    Candidate: Y Han    Full Text: PDF
GTID: 2568307106975709    Subject: Electronic information
Abstract/Summary:
In recent years, with the rapid development of the Internet, many comments that contain no explicit sentiment words yet still clearly express a sentiment polarity have appeared on various platforms. The study of such comments is called implicit sentiment analysis; it supports applications such as product marketing and public-opinion analysis, so it is worth pursuing. Based on deep learning, this thesis builds models at three language granularities, word, sentence, and text, to address the problem of accurately extracting implicit sentiment features from context. The specific work and contributions are as follows:

(1) Existing aspect-level implicit sentiment analysis models lose aspect-word information during pre-training and cannot accurately extract deep feature information. To address these problems, this thesis proposes a deep bi-directional gated recurrent unit context-aware attention aspect-aware BERT (DCAB) model. The model first constructs an aspect-aware BERT that generates dynamic word vectors conditioned on the aspect word and feeds them into a deep bi-directional GRU (DBGRU) for encoding; it then applies a context-aware attention mechanism to extract the deep feature information in the context that is relevant to the given aspect word. Experiments show that the accuracy of DCAB on the Rest_ISE and Lap_ISE datasets improves by 2.60% and 1.28%, respectively, over the best baseline model, BBA.

(2) Although DCAB avoids losing aspect-word information during pre-training and extracts deep feature information accurately, it fails to capture contextual semantic and syntactic-structure information at the sentence level, and it cannot capture word-granularity information at the same time. To address these problems, this thesis proposes a context-RoBERTa bi-attention and graph convolutional network multi-task learning (CRBGM) model. The model first constructs a context-aware RoBERTa model to extract important semantic features; a GCN then extracts the syntactic-structure information of the dependency tree, and a two-layer attention mechanism extracts implicit feature information at the word and sentence levels. Experiments show that the accuracy of CRBGM on the VUA-E, TroFi-E, and MOH-E datasets improves by 3.07%, 2.57%, and 1.81%, respectively, over the best baseline model, BBGAM.

(3) Although CRBGM accurately extracts semantic and syntactic-structure information while capturing sentence-level information, at the text level it cannot fully extract the feature information of the different sentiment polarities present in contextual sentences, nor does it account for the sentiment features contained in global information. To address these problems, this thesis proposes a context multi-polarity orthogonal attention interaction mechanism (CMAIM) model. The model uses a BERT pre-trained encoder and a BiLSTM encoder, applies a contextual multi-polarity orthogonal attention mechanism to capture sentence-level feature information related to the three sentiment polarities in the target and contextual sentences, and finally uses a sentiment attention mechanism to capture the global semantic information of the text. Experiments show that the accuracy of CMAIM on the VUA-E, TroFi-E, and MOH-E datasets improves by 1.07%, 2.98%, and 1.29%, respectively, over the best baseline model, MPOA.
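The aspect-conditioned attention step described for DCAB can be sketched in a few lines. This is a minimal illustration, not the thesis's actual implementation: it assumes the recurrent encoder has already produced hidden states `H`, and the bilinear scoring matrix `W` and all shapes are hypothetical choices for the example.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def aspect_context_attention(H, a, W):
    """Context-aware attention conditioned on an aspect vector.

    H : (n, d) hidden states for the n context tokens (e.g. DBGRU outputs)
    a : (d,)   pooled embedding of the aspect word
    W : (d, d) learned bilinear interaction matrix (illustrative)
    Returns the attention weights and the aspect-aware context vector.
    """
    scores = H @ W @ a       # (n,) relevance of each token to the aspect
    alpha = softmax(scores)  # normalise into an attention distribution
    context = alpha @ H      # (d,) attention-weighted sum of hidden states
    return alpha, context

rng = np.random.default_rng(0)
n, d = 6, 8
H = rng.standard_normal((n, d))
a = rng.standard_normal(d)
W = rng.standard_normal((d, d)) * 0.1
alpha, ctx = aspect_context_attention(H, a, W)
```

Tokens most related to the aspect word receive the largest weights in `alpha`, so the pooled vector `ctx` emphasises the context features relevant to that aspect.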
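The GCN step in CRBGM, aggregating each token's syntactic neighbours over the dependency tree, can be sketched as one degree-normalised graph-convolution layer. The adjacency matrix, dimensions, and weight initialisation below are illustrative assumptions, not the model's actual configuration.

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution layer over a dependency tree.

    H : (n, d_in)     token representations (e.g. RoBERTa outputs)
    A : (n, n)        0/1 adjacency matrix of the dependency tree
    W : (d_in, d_out) learned weight matrix
    Each token aggregates its syntactic neighbours (plus itself via a
    self-loop), normalised by node degree, followed by a linear map
    and ReLU.
    """
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)  # per-node degree
    H_agg = (A_hat / deg) @ H               # mean over tree neighbours
    return np.maximum(H_agg @ W, 0.0)       # linear map + ReLU

# Toy 4-token sentence whose dependency edges are (0-1), (1-2), (1-3).
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (1, 3)]:
    A[i, j] = A[j, i] = 1.0
rng = np.random.default_rng(1)
H = rng.standard_normal((4, 8))
W = rng.standard_normal((8, 8))
H_out = gcn_layer(H, A, W)
```

Stacking such layers lets information flow along longer dependency paths, which is how syntactic structure reaches tokens that are distant in the word order but close in the tree.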
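The multi-polarity orthogonal attention idea in CMAIM, one learned query per sentiment polarity plus a constraint pushing the queries apart, can be sketched as follows. The query matrix `Q`, the form of the orthogonality penalty, and all shapes are assumptions made for illustration; the thesis's exact formulation may differ.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multi_polarity_attention(H, Q):
    """H : (n, d) sentence encodings (e.g. BiLSTM outputs)
    Q : (3, d) learned queries, one per polarity (pos/neu/neg)

    Returns one polarity-specific representation per query and an
    orthogonality penalty that, when added to the loss, pushes the
    three polarity queries toward mutually orthogonal directions.
    """
    alpha = softmax(Q @ H.T)  # (3, n): one attention distribution per polarity
    reps = alpha @ H          # (3, d): polarity-specific summaries
    Qn = Q / np.linalg.norm(Q, axis=1, keepdims=True)
    # Off-diagonal cosine similarities between queries, as a Frobenius norm.
    penalty = np.linalg.norm(Qn @ Qn.T - np.eye(3))
    return reps, penalty

rng = np.random.default_rng(2)
H = rng.standard_normal((5, 8))
Q = rng.standard_normal((3, 8))
reps, pen = multi_polarity_attention(H, Q)
```

Minimising the penalty during training keeps the three queries from collapsing onto the same direction, so each attention head specialises in features of a distinct polarity.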
Keywords/Search Tags:Implicit Sentiment Analysis, Language Granularity, Deep Learning, Context Information, Attention Mechanism