
Research On Emotion Transmission Model Based On Multi-Modal Analysis

Posted on: 2023-11-23
Degree: Master
Type: Thesis
Country: China
Candidate: H Xing
Full Text: PDF
GTID: 2558306905970129
Subject: Engineering
Abstract/Summary:
Since the late 20th century, Internet and mobile communication technologies have developed rapidly. People log in to social networks through mobile smart terminals, and these platforms have become an important channel for sharing daily life and expressing opinions and emotions. Most existing research on the sentiment of online social network users analyzes and identifies the emotion carried by single-modal information, and it cannot adequately describe users' emotions as the proportion of multi-modal information on social networks keeps growing. On the other hand, existing models of emotional transmission in social networks are relatively simple: they directly combine emotion with information diffusion rules, while ignoring the close relationships between users in online social networks and the effect of multi-modal information on emotional spreading.

To address the first problem, this thesis proposes HFBM, a neural network model for sentiment classification and prediction on multi-modal data. The model uses a convolutional neural network to extract features from the image modality and a pre-trained BERT model to extract features from the attribute and text modalities. An attention mechanism then fuses the multi-modal feature vectors, and the fused vector is classified with Softmax to obtain the sentiment result. Experiments comparing HFBM with existing models on the same dataset show improvements from multiple measurement perspectives, including accuracy, recall, precision, and F1 score. A sketch of this fusion pipeline is given below.

To address the second problem, this thesis also proposes TDMM-SIR, a multi-modal emotional transmission model based on the theory of three degrees of influence. Its basic idea combines two principles: influence reaches at most three degrees and is attenuated each time it spreads one hop further outward, and multi-modal information is more appealing and infectious because its modalities complement one another, making users who see it more likely to empathize and be infected. The basic SIR emotional transmission model, the SIR model based on three-degree influence (TD-SIR), and the multi-modal SIR model based on three-degree influence (TDMM-SIR) are compared on an online social network. The thesis also visualizes, from a microscopic view, how the emotional state of each node changes during the transmission process, and depicts, from a macroscopic view, how the proportion of people of each emotional type evolves.
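As a concrete illustration of the HFBM-style fusion pipeline described above, the following is a minimal sketch in PyTorch. It is not the thesis' actual implementation: the CNN backbone (ResNet-18), the BERT checkpoint name, the hidden size, and the use of multi-head self-attention over the three modality vectors are all illustrative assumptions.

```python
# Minimal sketch of an HFBM-style multi-modal sentiment classifier.
# All module choices and dimensions below are assumptions for illustration.
import torch
import torch.nn as nn
from torchvision.models import resnet18
from transformers import BertModel

class HFBMSketch(nn.Module):
    def __init__(self, num_classes=3, hidden=768):
        super().__init__()
        # CNN branch for the image modality (ResNet-18 as a stand-in CNN).
        cnn = resnet18(weights=None)
        self.image_encoder = nn.Sequential(*list(cnn.children())[:-1])  # drop the FC head
        self.image_proj = nn.Linear(512, hidden)
        # Pre-trained BERT shared by the text and attribute modalities.
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        # Attention mechanism that fuses the three modality feature vectors.
        self.attn = nn.MultiheadAttention(embed_dim=hidden, num_heads=8, batch_first=True)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, image, text_ids, text_mask, attr_ids, attr_mask):
        img = self.image_proj(self.image_encoder(image).flatten(1))             # [B, hidden]
        txt = self.bert(input_ids=text_ids, attention_mask=text_mask).pooler_output
        att = self.bert(input_ids=attr_ids, attention_mask=attr_mask).pooler_output
        modalities = torch.stack([img, txt, att], dim=1)                         # [B, 3, hidden]
        fused, _ = self.attn(modalities, modalities, modalities)                 # self-attention fusion
        logits = self.classifier(fused.mean(dim=1))                              # pool and classify
        return torch.softmax(logits, dim=-1)                                     # sentiment distribution
```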
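The TDMM-SIR transmission process can likewise be sketched as a simulation on a social graph. The code below is only an illustrative version: the parameter names and values (beta, gamma, decay, multimodal_boost) are assumptions rather than the thesis' settings, but the structure follows the two stated principles, namely that influence reaches at most three degrees with per-hop attenuation and that multi-modal content is more infectious.

```python
# Illustrative sketch of a TDMM-SIR-style emotional transmission simulation.
# Parameter names and default values are assumptions, not the thesis' settings.
import random
import networkx as nx

def tdmm_sir(graph, seeds, beta=0.2, gamma=0.1, decay=0.5,
             multimodal_boost=1.5, steps=50, multimodal_nodes=None):
    """Simulate emotion spreading with three-degree influence attenuation.

    beta: base infection probability at distance 1
    gamma: recovery probability per step
    decay: per-hop attenuation factor (influence vanishes beyond 3 hops)
    multimodal_boost: extra infectivity when the source posts multi-modal content
    """
    multimodal_nodes = multimodal_nodes or set()
    state = {n: "S" for n in graph}          # S(usceptible), I(nfected), R(ecovered)
    for s in seeds:
        state[s] = "I"
    history = []
    for _ in range(steps):
        newly_infected, newly_recovered = set(), set()
        for src in [n for n in graph if state[n] == "I"]:
            boost = multimodal_boost if src in multimodal_nodes else 1.0
            # Influence reaches nodes up to three degrees away, attenuating each hop.
            lengths = nx.single_source_shortest_path_length(graph, src, cutoff=3)
            for node, dist in lengths.items():
                if dist == 0 or state[node] != "S":
                    continue
                p = min(1.0, beta * boost * (decay ** (dist - 1)))
                if random.random() < p:
                    newly_infected.add(node)
            if random.random() < gamma:
                newly_recovered.add(src)
        for n in newly_infected:
            state[n] = "I"
        for n in newly_recovered:
            state[n] = "R"
        # Macroscopic view: proportion of each emotional state at this step.
        history.append({s: sum(1 for v in state.values() if v == s) for s in "SIR"})
    return history
```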
Keywords/Search Tags: social network, multi-modality, sentiment analysis, sentiment transmission model, three-degree influence