
The Research On Emotion Recognition Based On Multi-modal Physiological Signals

Posted on: 2023-09-22    Degree: Master    Type: Thesis
Country: China    Candidate: M J Liang    Full Text: PDF
GTID: 2530306821992939    Subject: Software engineering
Abstract/Summary:
Emotion is both a physiological state that integrates human feelings, thoughts, and behaviors, and a psychological and physiological response produced by external stimuli. EEG signals are widely used in emotion recognition because of their high temporal resolution, non-invasiveness, and good real-time availability. Emotional activity involves the interaction and transmission of information across multiple regions of the brain. The minimum spanning tree can be used to study the dynamic state of the brain during emotion-processing tasks, helping to understand how the complexity of brain networks changes with emotional state. In recent years, eye-tracking has also become an important technology in many fields such as psychology and cognitive science, and eye-tracking signals have become more readily available; subconscious behavior can be inferred by identifying the user's focus of attention. Emotional representation includes both external behavioral performance and internal physiological arousal, which reflect changes in emotional state from different aspects. Single-modality EEG data cannot fully and accurately characterize a person's emotional state, but eye-movement data collected synchronously with it can, to a certain extent, supplement the emotion-related information. Therefore, by exploiting the complementarity of multimodal signals in representing emotional states, a more accurate emotion recognition model can be built, and valuable explorations in feature selection for emotion classification can be conducted.

Based on the above, this dissertation introduces functional connectivity measures, minimum spanning trees, and eye-movement signals into emotion recognition research. Using the SEED-IV dataset of synchronized EEG and eye-movement signals provided by the BCMI Laboratory of Shanghai Jiao Tong University, a new emotion recognition method based on multimodal physiological signals is proposed, achieving good results. The main
research work of this dissertation is as follows:

(1) The differential entropy feature, which has a strong ability to represent emotion, is selected for emotion recognition research, and the phase lag index, a functional connectivity measure, is introduced to construct minimum spanning trees under different emotional states, revealing subtle topological differences between brain networks across emotional states. The high-frequency beta and gamma bands are the key frequency bands through which the brain represents human emotional state, and the left and right temporal lobes are the key brain areas for emotion generation. The brain network shows more random connections in these high-frequency bands under high-arousal emotional states. When the computed minimum-spanning-tree attribute values alone are fed into the emotion recognition model as brain-network features, the average classification accuracy is about 55%, while the average classification accuracy of the differential entropy features alone is stable above 75%. After multi-feature fusion of the frequency-domain differential entropy features and the minimum-spanning-tree brain-network attributes, the average classification accuracy improves further, ranging from 78% to 85%. These results show that differential entropy and the minimum spanning tree are complementary EEG features, and that multi-feature fusion combines the emotional information they contain, thereby improving emotion recognition. In addition, a retest experiment on the collected data shows that the EEG-based emotion recognition model has a certain stability over time.

(2) Eye-movement features were introduced into emotion recognition research, and the way different eye-movement features represent emotional states was analyzed using statistical tests. Among them, the diameters of the left and right pupils show the
most significant differences under different emotional stimuli and follow a stable pattern: the pupil diameter under positive emotion is smaller than under negative emotion, and the pupil diameter under neutral emotion is the smallest; fixation duration and scan frequency are negatively correlated, with fixation duration longest under neutral emotion; fixation frequency is highest under positive emotion, and blink frequency is highest under negative emotion. In addition, the differential entropy feature of pupil diameter is extracted from the eye-movement data using signal-processing methods. The results show that, after merging the different types of eye-movement features described above, they have an emotion recognition ability comparable to that of EEG features, with an average classification accuracy of 78.10%.

(3) EEG and eye-movement signals were combined for emotion recognition research on multimodal physiological signals. A feature-level fusion strategy of serial concatenation was used to construct the feature matrix, and the individual modal features with significant between-class differences were selected as classifier inputs by two-sample t-tests. Exploiting the complementarity of the different modal signals effectively improves the accuracy of emotion recognition. The results show that the classification performance of multimodal physiological signals is better than that of single-modality EEG or eye-movement signals. Taking eye-movement features as the baseline, fusing in both the differential entropy features and the minimum-spanning-tree attributes stands out, with an average classification accuracy as high as 91.43%, while fusing in only the differential entropy features or only the minimum-spanning-tree attributes yields 89.90% and 83.29%, respectively. Compared with similar studies, the average classification accuracy is also improved, demonstrating the effectiveness of the multimodal emotion recognition model.
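The brain-network pipeline described in (1), phase lag index connectivity followed by a minimum spanning tree, can be sketched as follows. This is a minimal illustration rather than the dissertation's implementation: the 8-channel random segment and the use of 1 − PLI as the edge distance are assumptions made purely for demonstration.

```python
import numpy as np
from scipy.signal import hilbert
from scipy.sparse.csgraph import minimum_spanning_tree

def phase_lag_index(signals):
    """PLI matrix for signals of shape (n_channels, n_samples)."""
    phases = np.angle(hilbert(signals, axis=1))  # instantaneous phase per channel
    n = signals.shape[0]
    pli = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dphi = phases[i] - phases[j]
            # sin() wraps the phase difference; PLI = |mean of its sign|
            pli[i, j] = pli[j, i] = abs(np.mean(np.sign(np.sin(dphi))))
    return pli

rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 1000))  # toy 8-channel EEG segment
pli = phase_lag_index(eeg)

# Treat 1 - PLI as a distance so strongly synchronized pairs form the MST backbone
mst = minimum_spanning_tree(1.0 - pli)
n_edges = mst.nnz  # an MST over n connected nodes always has n - 1 edges
```

MST attributes (e.g. leaf fraction, diameter, maximum degree) computed from `mst` would then serve as the brain-network feature vector fed to the classifier.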
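The differential entropy (DE) feature used throughout reduces, for an approximately Gaussian band-limited signal, to ½·ln(2πeσ²). A minimal sketch of band-wise DE extraction follows; the sampling rate, filter order, and band boundaries are illustrative assumptions, not SEED-IV's exact preprocessing.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def differential_entropy(x):
    # For an approximately Gaussian signal: DE = 0.5 * ln(2 * pi * e * var)
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def bandpass(x, lo, hi, fs):
    # 4th-order Butterworth band-pass, zero-phase filtered
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

fs = 200
rng = np.random.default_rng(1)
signal = rng.standard_normal(fs * 4)  # 4 s of toy single-channel EEG

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}
de = {name: differential_entropy(bandpass(signal, lo, hi, fs))
      for name, (lo, hi) in bands.items()}
```

Computing this per channel and per band and concatenating the results yields the frequency-domain DE feature vector for one trial.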
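The feature-level fusion of (3), serial concatenation of the modality features followed by two-sample t-tests that keep only dimensions differing significantly between classes, can be sketched as below. The trial counts, feature dimensions, and injected class effect are toy assumptions for illustration only.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
# Toy features: 40 trials each for two emotion classes (hypothetical shapes)
eeg_de = rng.standard_normal((80, 62 * 5))  # 62 channels x 5 bands of DE
mst_attr = rng.standard_normal((80, 6))     # MST attributes (e.g. leaf fraction)
eye_feat = rng.standard_normal((80, 31))    # eye-movement features
labels = np.repeat([0, 1], 40)
eeg_de[labels == 1, :10] += 1.0             # inject a class difference in a few dims

# Serial (feature-level) fusion: concatenate modalities per trial
fused = np.hstack([eeg_de, mst_attr, eye_feat])

# Keep only dimensions whose two-sample t-test is significant (p < 0.05)
t, p = ttest_ind(fused[labels == 0], fused[labels == 1], axis=0)
selected = fused[:, p < 0.05]
```

The `selected` matrix would then be the input to the final classifier; per-dimension significance testing like this discards modality features that carry no class information.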
Keywords/Search Tags:emotion recognition, multimodality, minimum spanning tree, differential entropy, eye movement, EEG