
Affective Brain-Computer Interactions

Posted on: 2019-07-02  Degree: Doctor  Type: Dissertation
Country: China  Candidate: W L Zheng  Full Text: PDF
GTID: 1364330590970375  Subject: Computer Science and Technology
Abstract/Summary:
Emotion plays an important role in human-to-human communication in daily life. Besides logical intelligence, emotional intelligence is considered an important part of human intelligence, representing the ability to perceive, understand, and respond to emotions. However, existing human-computer interaction systems still lack emotional intelligence. Affective brain-computer interactions aim to narrow the communication gap between humans and machines by developing computational models of emotion. In this thesis, we explore the theoretical basis, models, algorithms, implementation technologies, experimental validation, and prototype applications of affective brain-computer interactions. The main contributions are as follows.

1) Multimodal emotion recognition and vigilance estimation methods based on EEG, EOG, and eye movements are developed using deep neural networks, which improve performance over conventional models. The critical frequency bands and brain regions are revealed, and an optimal electrode placement that uses fewer electrodes and is more practical is determined for emotion recognition. The neural patterns of three emotions (happy, sad, and neutral) are identified and shown to be consistent across individuals and sessions. We found the following characteristics of EEG signals for emotion recognition: a) beta and gamma band responses over the temporal areas increase for the happy emotion; b) the neural patterns of the neutral and sad emotions are similar; and c) the neutral emotion shows higher alpha band responses at parietal and occipital sites, while the sad emotion shows higher delta band responses at parietal and occipital sites and higher gamma band responses at prefrontal sites.

2) We propose a multimodal emotion recognition framework using EEG and eye movements to model both internal cognitive states and external subconscious behaviors, with various modality fusion strategies including feature-level fusion, decision-level fusion, and a bimodal deep auto-encoder. The experimental results demonstrate significant improvements over single modalities: about 10% higher accuracy for three-class emotion recognition and about 15% higher accuracy for four-class emotion recognition. The complementary characteristics of EEG and eye movements are investigated. To deal with individual differences across subjects and the non-stationary nature of EEG signals, we introduce transfer learning for constructing personalized EEG-based affective models. The transductive parameter transfer algorithm achieves the best performance, with 76.31% accuracy and a significant improvement of about 20% accuracy over the baseline without domain adaptation.

3) We develop multimodal vigilance estimation systems with EEG and forehead EOG in both laboratory simulations and real driving environments. A novel forehead EOG setup and a vigilance annotation method using eye-tracking glasses are proposed. To capture the temporal dynamics of vigilance, temporal dependency models (continuous conditional random fields, CCRF, and continuous conditional neural fields, CCNF) are introduced. We first evaluate the computational models on EEG and EOG data recorded with commercial EEG recording systems using wet electrodes. To improve the wearability and feasibility of vigilance estimation, we develop a wearable device with flexible dry electrodes and large-scale integrated circuits for forehead EOG recording. The experimental results on both laboratory driving simulations and real-world driving environments demonstrate the effectiveness of the proposed vigilance estimation system.
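As a rough illustration of the continuous vigilance estimation pipeline outlined in contribution 3, the sketch below fuses hypothetical EEG and forehead EOG features at the feature level, fits a plain ridge regression, and applies moving-average smoothing as a simple stand-in for the CCRF/CCNF temporal dependency models used in the thesis. All array shapes, feature dimensions, and parameters are illustrative assumptions, not the thesis implementation.

import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_windows = 800                                     # hypothetical number of time windows in a session
eeg_feats = rng.standard_normal((n_windows, 25))    # hypothetical EEG features per window
eog_feats = rng.standard_normal((n_windows, 36))    # hypothetical forehead EOG features per window
vigilance = rng.uniform(0.0, 1.0, n_windows)        # continuous vigilance labels in [0, 1]

# Feature-level fusion: concatenate the two modalities per time window.
fused = np.hstack([eeg_feats, eog_feats])

# Train/test split along time (no shuffling, to respect temporal order).
split = int(0.7 * n_windows)
model = Ridge(alpha=1.0).fit(fused[:split], vigilance[:split])
raw_pred = model.predict(fused[split:])

# Moving-average smoothing as a crude surrogate for CCRF/CCNF-style
# temporal dependency modeling over the vigilance time series.
kernel = np.ones(5) / 5.0
smoothed_pred = np.convolve(raw_pred, kernel, mode="same")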
We develop three publicly available datasets for emotion recognition and vigilance estimation: the SJTU Emotion EEG Dataset (SEED) for happy, sad, and neutral emotions; SEED-IV for happy, sad, fear, and neutral emotions; and SEED-VIG for continuous vigilance estimation. These datasets have received more than 300 applications from universities and research institutes all over the world.
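To make the band-wise EEG analysis in contribution 1 more concrete, the sketch below band-pass filters a raw multichannel EEG segment into the delta, theta, alpha, beta, and gamma bands and computes log band power per channel. The sampling rate, band boundaries, and the use of simple log band power (rather than the exact features used with the SEED datasets) are assumptions for illustration only.

import numpy as np
from scipy.signal import butter, filtfilt

FS = 200  # assumed sampling rate in Hz
BANDS = {
    "delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
    "beta": (14, 31), "gamma": (31, 50),
}

def band_features(eeg, fs=FS):
    """eeg: (n_channels, n_samples) -> concatenated per-band log power, shape (n_channels * n_bands,)."""
    feats = []
    for lo, hi in BANDS.values():
        # 4th-order Butterworth band-pass filter for the current band.
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg, axis=-1)
        # Log band power per channel; under a Gaussian assumption, differential
        # entropy (0.5 * log(2*pi*e*variance)) is an affine function of this value.
        feats.append(np.log(np.mean(filtered ** 2, axis=-1) + 1e-12))
    return np.concatenate(feats)

segment = np.random.default_rng(0).standard_normal((62, 4 * FS))  # hypothetical 62-channel, 4-second segment
features = band_features(segment)  # shape: (310,) = 62 channels x 5 bands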
Keywords/Search Tags: Affective Computing, Emotion Recognition, Affective Brain-Computer Interactions, Vigilance Estimation, EEG, EOG, Eye Movements