Emotions are physiological and psychological activities that everyone experiences. Emotional changes have important effects on people's work efficiency, attitudes toward life, decision-making and judgment. As society becomes more intelligent, people hope that human-computer interaction products can have emotional intelligence, that is, perceive the user's emotional changes and respond appropriately. Therefore, exploring the processing mechanism of emotions and improving emotion recognition accuracy are of great significance for moving artificial intelligence toward emotional intelligence and for raising the level of intelligence of human society.

Among brain imaging technologies, electroencephalography (EEG) has become one of the main methods of emotion research because of its high temporal resolution, low cost and good device portability. Emotion, as a complex high-level psychological and physiological activity, usually requires the coordination of multiple brain regions, and there must be an intrinsic relationship between the EEG signals generated by different brain regions during emotion processing. From the perspective of the brain network, this paper focuses on two issues: how to explain the changing patterns of the brain network during the processing of different emotions, and how to improve the performance of emotion recognition for practical applications.

1. To address the fact that both the changes in brain network connection patterns and the differences in information processing efficiency during the processing of different emotions remained unclear, this study constructed brain networks for different emotion processing stages and analyzed how the networks vary across emotional states. The steady-state visual evoked potential (SSVEP) offers a high signal-to-noise ratio and high temporal resolution. Based on an SSVEP emotion-evoking experiment, this paper constructs positive, neutral and negative emotional brain networks using the coherence method. By analyzing the network connection patterns and network attributes of different emotion processing stages, the differences in dynamic neural patterns across emotional states were explored. The experimental results show that in the early stage P2 (210-250 ms) of emotional processing, the network connection coupling of negative emotion is stronger than that of positive and neutral emotions; in the middle and late P3 stage (365-450 ms), the long-range connections between the prefrontal and occipital lobes and between the prefrontal and temporal lobes differ significantly between emotions (p < 0.05); in the late stage SW (500-670 ms), the network connection coupling of negative emotion is significantly lower (p < 0.05) than that of positive and neutral emotions. At the same time, the network connections in emotional states show a lateralization effect: connections with stronger coupling in the left frontal lobe are related to positive emotion, and connections with stronger coupling in the right frontal lobe are related to negative emotion. Brain networks in positive and negative emotional states have higher information processing efficiency than those in the neutral emotional state. This work is significant for exploring the neural mechanism of emotions and provides a theoretical basis for subsequent emotion recognition based on EEG network features.
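As a rough illustration of the coherence-based network construction and the efficiency measure referred to above, the sketch below builds an adjacency matrix from pairwise spectral coherence and reports global efficiency. The channel count, sampling rate, frequency band and threshold are illustrative assumptions, not the values used in the study.

```python
import numpy as np
import networkx as nx
from scipy.signal import coherence

def coherence_network(epoch, fs=250, band=(4.0, 45.0), threshold=0.6):
    """Build an undirected brain network from one EEG epoch.

    epoch     : array of shape (n_channels, n_samples)
    fs        : sampling rate in Hz (assumed value)
    band      : frequency range over which coherence is averaged (assumed)
    threshold : coherence above which a connection is kept (assumed)
    """
    n_ch = epoch.shape[0]
    adj = np.zeros((n_ch, n_ch))
    for i in range(n_ch):
        for j in range(i + 1, n_ch):
            f, cxy = coherence(epoch[i], epoch[j], fs=fs, nperseg=fs)
            in_band = (f >= band[0]) & (f <= band[1])
            adj[i, j] = adj[j, i] = cxy[in_band].mean()
    # Keep only strongly coupled channel pairs to obtain a sparse network.
    graph = nx.from_numpy_array(np.where(adj >= threshold, adj, 0.0))
    return adj, graph

# Example with random data standing in for a 32-channel, 4-second epoch.
rng = np.random.default_rng(0)
adj, graph = coherence_network(rng.standard_normal((32, 1000)))
# Global efficiency (mean inverse shortest-path length) is one common way to
# quantify the "information processing efficiency" of such a network.
print("global efficiency:", nx.global_efficiency(graph))
```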
2. To address the weak ability of current EEG frequency-domain features to express the differences between emotion categories, this paper proposes an emotion recognition method based on high-frequency brain network features. The method analyzes local and global brain response patterns under emotional states in more finely subdivided EEG frequency bands and finds that high-frequency features, especially those of the high gamma band (50-80 Hz), characterize the differences between emotions more effectively. Based on this analysis, an emotion recognition method that combines local and global features of the high gamma band is proposed. Experimental results show that in the high-frequency bands both local features and global network features exhibit significant statistical differences between emotions; the high gamma band has the highest network connection density, with long-range connections between the prefrontal lobe and the occipital region. The classification accuracies of high-frequency band features are higher than those of low-frequency band features, and the fused high gamma band feature achieves the highest classification accuracy of 87.27% among positive, neutral and negative emotions. In addition, the proposed method achieves an average accuracy of 63.36% for six-category emotion classification on a public dataset, 8% higher than the best previously reported result. These results show that the proposed method makes full use of the complementarity between the local and global features of the high gamma band and effectively improves emotion recognition accuracy.

3. To address the susceptibility of emotion recognition to the non-stationarity of EEG signals in practical applications, this paper proposes an adaptive classification model to improve the performance of online emotion recognition. Because of this non-stationarity, the statistical distributions of emotional EEG signals collected from the same person at different times differ significantly, so realizing time-shifted emotion recognition has become a key issue in online emotion recognition research. In this paper, a 69-session emotional EEG database was established (23 subjects, each recorded on 3 different days), and the stable neural patterns of emotional EEG were analyzed. The progressive transductive support vector machine (PTSVM) is used as the basic classification model; a region labeling rule for the three-class case is proposed, which extends the classic PTSVM from binary to three-class classification, and the confidence of the predicted labels is adaptively improved through a K-nearest-neighbor algorithm and an iterative process. The result is an adaptive classification model that addresses the time-shifted classification problem of emotional EEG. The experimental results show that the EEG features that stably characterize the differences between emotions come mainly from the prefrontal lobe, both temporal lobes, the left occipito-parietal region, the right occipital region and the posterior occipital region. The time-shifted classification accuracy for the three emotions of positive, neutral and negative reaches 63.56%, which is higher than that of the traditional SVM classifier and 4% higher than the best existing adaptive classification model. These results show that the adaptive classification model proposed in this paper can effectively improve the performance of time-shifted emotional EEG classification.
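The PTSVM-based adaptive model itself is not spelled out in this summary, so the sketch below only approximates the transductive idea: an SVM trained on earlier sessions repeatedly labels data from a later session, a K-nearest-neighbor agreement check stands in for the label-confidence refinement, and the most confident consistent predictions are folded back into the training set. The function and parameter names (adaptive_transductive_svm, batch, k) are illustrative assumptions, and the decision-function margin is a stand-in for the region labeling rule described above, not a reproduction of it.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

def adaptive_transductive_svm(X_src, y_src, X_tgt, n_iter=10, batch=20, k=5):
    """Self-training sketch of the adaptive, transductive idea.

    X_src, y_src : labeled EEG features from earlier recording sessions
    X_tgt        : unlabeled features from a later session (shifted distribution)
    Each round, an SVM labels the target data, a KNN check keeps only
    predictions that agree with their nearest labeled neighbours, and the
    most confident of those are moved into the labeled training set.
    """
    X_lab, y_lab = np.asarray(X_src), np.asarray(y_src)
    X_tgt = np.asarray(X_tgt)
    unlabeled = np.arange(len(X_tgt))
    for _ in range(n_iter):
        if unlabeled.size == 0:
            break
        svm = SVC(kernel="rbf", C=1.0).fit(X_lab, y_lab)
        knn = KNeighborsClassifier(n_neighbors=k).fit(X_lab, y_lab)
        pred = svm.predict(X_tgt[unlabeled])
        # Rough confidence proxy: largest one-vs-rest decision value
        # (assumes three emotion classes, as in the three-class setting above).
        conf = svm.decision_function(X_tgt[unlabeled]).max(axis=1)
        agree = pred == knn.predict(X_tgt[unlabeled])
        picked = [i for i in np.argsort(-conf) if agree[i]][:batch]
        if not picked:
            break
        chosen = unlabeled[picked]
        X_lab = np.vstack([X_lab, X_tgt[chosen]])
        y_lab = np.concatenate([y_lab, pred[picked]])
        unlabeled = np.setdiff1d(unlabeled, chosen)
    # Final model trained on source data plus self-labeled target data.
    return SVC(kernel="rbf", C=1.0).fit(X_lab, y_lab)
```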