This paper addresses the first issue in generating implicit emotional tags: emotion detection. First, the raw EEG data from the DEAP dataset, collected while subjects watched music videos, were preprocessed to remove noise; then three non-linear dynamic features, the Shannon entropy, the correlation dimension, and the C0 complexity, were computed over sliding time windows of different lengths to produce four groups of EEG features; principal component analysis was then used for feature selection; finally, k-Nearest Neighbor (KNN) and Gaussian naive Bayes (GNB) classifiers were applied to learn and classify emotional states. Experimental results show that (1) classification accuracy can be significantly improved through feature selection, and (2) the 4 s non-overlapping sliding window yields the best classification accuracy, with KNN outperforming GNB by achieving maximum average accuracies of 92.5% and 92.6% on valence and arousal, respectively. This paper thus provides a reliable basis for EEG-based implicit emotional tagging of music videos. At the same time, we found that existing EEG data formats cannot meet the need for platform- and application-independent storage and exchange of EEG data; therefore, an XML-based emotional EEG data format named eegML is proposed. Significant experimental factors, as well as higher-level feature data calculated from the raw EEG signals, are integrated into eegML, which can significantly facilitate EEG data mining and exploration. The design of eegML also has reference value for the platform- and application-independent storage and exchange of other physiological signals.
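The abstract does not give the eegML schema, so the fragment below is purely hypothetical: every element and attribute name is invented to illustrate the stated design goal of storing experiment factors and computed feature data alongside references to the raw signals in one platform-independent XML document.

```xml
<!-- Hypothetical eegML fragment; element/attribute names are illustrative only. -->
<eegML version="1.0">
  <experiment dataset="DEAP" stimulus="music video">
    <subject id="s01"/>
    <recording channels="32" samplingRateHz="128" rawDataRef="s01_trial01.dat"/>
  </experiment>
  <features windowSeconds="4" overlap="none">
    <feature name="shannonEntropy" channel="Fp1">4.12 3.98 4.05</feature>
    <feature name="c0Complexity" channel="Fp1">0.31 0.29 0.33</feature>
  </features>
  <labels valence="high" arousal="low"/>
</eegML>
```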
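The detection pipeline described above (windowed non-linear features, feature selection, then KNN/GNB classification) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it computes only the Shannon entropy feature (omitting correlation dimension and C0 complexity), uses synthetic data in place of preprocessed DEAP trials, and all shapes, window sizes, and hyperparameters (128 Hz sampling, 4 s windows, k=5, 10 principal components) are assumptions for demonstration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def shannon_entropy(window, bins=16):
    """Shannon entropy of the amplitude histogram of one window."""
    hist, _ = np.histogram(window, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def windowed_features(signal, fs=128, win_s=4):
    """Split one EEG channel into non-overlapping 4 s windows
    (the best-performing setting reported in the paper) and
    compute Shannon entropy for each window."""
    step = fs * win_s
    n = len(signal) // step
    return np.array([shannon_entropy(signal[i * step:(i + 1) * step])
                     for i in range(n)])

# Synthetic stand-in for preprocessed trials: 40 trials, 32 channels,
# 60 s at an assumed 128 Hz (DEAP-like dimensions, random data).
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((40, 32, 128 * 60))
y = rng.integers(0, 2, size=40)  # hypothetical high/low valence labels

# Feature matrix: one entropy value per channel per window, flattened per trial.
X = np.array([[windowed_features(ch) for ch in trial]
              for trial in X_raw]).reshape(40, -1)

# PCA for feature selection, then compare the two classifiers.
for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("GNB", GaussianNB())]:
    pipe = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name} cross-validated accuracy: {acc:.3f}")
```

With random labels the reported accuracies are meaningless; the sketch only shows how the windowing, feature extraction, PCA, and classifier-comparison steps fit together.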