
Emotion Recognition Research Based On Electroencephalogram Signal

Posted on: 2022-08-09
Degree: Master
Type: Thesis
Country: China
Candidate: Y T X Ou
Full Text: PDF
GTID: 2480306338486684
Subject: Software engineering
Abstract/Summary:
Emotion is fundamental to human experience and is closely tied to logical decision-making, perception, and interpersonal communication; it plays an important role in human cognition. In recent years, with the rapid development of human-computer interaction in computer-aided applications, giving computers the ability to recognize emotions has gradually become a research hotspot. Emotion recognition builds a bridge for affective interaction between humans and computers, endowing computers with the ability to detect, process, and respond to human emotions. As a signal of the central nervous system, the electroencephalogram (EEG) is closely related to the brain's emotional activity and has become a principal means of emotion recognition.

As a high-level brain activity, emotional arousal depends on the coordinated action of multiple brain regions. Emotional events are also sparse and infrequent: EEG signals are continuous sequences distributed across different brain regions, while emotional information may be salient only in certain time segments or regions. How to model the information interaction between brain regions, and how to attend to the salient emotional segments and regions, is therefore the key to extracting effective emotional features and improving recognition accuracy. Furthermore, in some key scenarios the human-computer interaction system can also collect other external behavioral data to describe emotional changes more comprehensively; fusing the emotional information of EEG signals with that of other modalities in a multimodal setting is the core of improving recognition performance. To address these problems, this thesis studies unimodal and multimodal emotion recognition based on EEG signals. The main work is as follows:

First, a spatiotemporal graph convolutional neural network is designed for classifying unimodal emotional EEG. The model addresses emotional sparsity by introducing a spatiotemporal attention mechanism, and it mines the functional connections between electrode channels at different scalp locations through an adaptively learned EEG topological connection matrix. Feature learning is then carried out on this graph structure via graph convolution, and emotion prediction is finally performed. In addition, a multi-view fusion strategy is designed that attaches a different feature space to each node of the graph so as to fuse multiple EEG features. The recognition results of the spatiotemporal graph convolutional model are analyzed and compared under single-view and multi-view features, and extensive experiments are carried out over the network parameters. The results show that the proposed model clearly outperforms traditional classification models, reaching an average emotion recognition accuracy of 84.91%.

Second, an emotion classification model based on a multi-layer self-attention mechanism is proposed for multimodal emotion recognition over EEG and eye-movement signals. The model targets the problems of multimodal feature fusion and recognition in two ways: 1) it enhances inter-modal and intra-modal interaction of emotional information; 2) based on the correlation differences between emotion categories and the features of different modalities, it constructs an affective correlation matrix that strengthens highly correlated features and enhances their influence on classification. The experimental results show that the proposed multimodal model improves on the baseline in accuracy, achieving an average emotion classification accuracy of 91.09%.
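The adaptive graph convolution over electrode channels described above can be sketched, in outline, as follows. This is an illustrative NumPy sketch only: the electrode count (62), the per-channel feature dimension, and the use of a row-wise softmax to normalize the learnable adjacency scores are assumptions for the example, not the thesis's actual configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_conv_layer(H, A_scores, W):
    """One graph-convolution step over EEG electrode channels.

    H        : (channels, features) node features per electrode
               (e.g. band-power or differential-entropy features)
    A_scores : (channels, channels) freely trainable scores; softmax turns
               them into an adaptive adjacency ("topological connection")
               matrix, so functional connectivity is learned, not fixed
    W        : (features, out_features) shared linear feature transform
    """
    A = softmax(A_scores, axis=-1)       # adaptive connection matrix, rows sum to 1
    return np.maximum(A @ H @ W, 0.0)    # propagate over the graph, then ReLU

# Hypothetical sizes: 62 electrodes, 5 frequency-band features, 8 outputs
rng = np.random.default_rng(0)
n_channels, n_feat, n_out = 62, 5, 8
H = rng.standard_normal((n_channels, n_feat))
A_scores = rng.standard_normal((n_channels, n_channels))
W = rng.standard_normal((n_feat, n_out))

H_out = graph_conv_layer(H, A_scores, W)
print(H_out.shape)  # (62, 8)
```

In a trained model, `A_scores` and `W` would be updated by backpropagation so that the normalized adjacency comes to reflect task-relevant inter-channel connectivity; the spatiotemporal attention and multi-view node features of the thesis's model are omitted here for brevity.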
Keywords/Search Tags: electroencephalogram (EEG), emotion recognition, multimodal fusion, graph convolutional network, attention mechanism