Real-time emotion recognition using electroencephalography (EEG) signals plays a key role in human-computer interaction and affective computing. Existing emotion recognition models, which use stimuli such as music and pictures in controlled lab settings and cover a limited number of emotion classes, have low ecological validity. Moreover, identifying significant EEG features and electrodes is important for effective emotion recognition. Our proposed model uses the DEAP dataset, which consists of physiological signals collected from 32 participants as they watched 40 movie clips (each of 60 seconds). The main objective of this study is to explore multi-domain (time, frequency, and wavelet) features and thereby identify a stable set of features that contributes to emotion classification over a larger number of emotion classes. Our proposed model identifies nine emotion classes (happy, pleased, relaxed, excited, neutral, calm, distressed, miserable, and depressed) with an average accuracy of 65.92%. To this end, we use a support vector machine classifier with 10-fold and leave-one-out cross-validation. We achieve a significant emotion classification accuracy, which could be vital for developing affective computing solutions that handle a larger number of emotional states.
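
The sketch below illustrates the kind of pipeline the abstract describes: multi-domain (time, frequency, and wavelet) feature extraction from EEG epochs followed by an SVM with 10-fold cross-validation. The specific feature choices, the db4 wavelet, the band definitions, and the random stand-in data are all illustrative assumptions, not the authors' exact method.

```python
# Minimal sketch, assuming preprocessed EEG epochs shaped
# (n_trials, n_channels, n_samples) at 128 Hz and integer labels
# for nine emotion classes. Random data stands in for DEAP here.
import numpy as np
import pywt
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, KFold

def extract_features(sig, fs=128):
    """Time-, frequency-, and wavelet-domain features for one channel."""
    feats = []
    # Time domain: simple statistical descriptors (assumed feature set).
    feats += [sig.mean(), sig.std(), np.ptp(sig)]
    # Frequency domain: mean band power via Welch's method.
    f, psd = welch(sig, fs=fs, nperseg=fs)
    for lo, hi in [(4, 8), (8, 13), (13, 30), (30, 45)]:  # theta..gamma
        feats.append(psd[(f >= lo) & (f < hi)].mean())
    # Wavelet domain: energy of each level of a db4 decomposition.
    coeffs = pywt.wavedec(sig, "db4", level=4)
    feats += [np.sum(c ** 2) for c in coeffs]
    return feats

def featurize(X, fs=128):
    # X: (n_trials, n_channels, n_samples) -> (n_trials, n_features)
    return np.array([[v for ch in trial for v in extract_features(ch, fs)]
                     for trial in X])

# Hypothetical usage: 40 trials, 32 channels, 60 s at 128 Hz, 9 labels.
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 32, 128 * 60))
y = rng.integers(0, 9, size=40)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, featurize(X), y,
                         cv=KFold(n_splits=10, shuffle=True, random_state=0))
print(f"10-fold CV accuracy: {scores.mean():.3f}")
```

Standardizing features before the SVM matters here because band powers and wavelet energies live on very different scales; leave-one-out cross-validation would follow the same pattern with `LeaveOneOut()` in place of `KFold`.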