In this study, we propose the use of electroencephalography (EEG), electrooculography (EOG), and kinematic motion data captured through wearable sensors to classify the emotional states of individuals while they play a serious computer game (Whack-a-Mole). Twenty-one participants wore an OpenBCI headset and JINS MEME eyewear while playing the Whack-a-Mole game at three levels of difficulty. We used a variety of classifiers [i.e., a support vector machine (SVM), logistic regression (LR), random forest (RF), and ensemble classifier (EC)] to classify the participants' emotional states based on their EEG, EOG, and kinematic motion data. The classifiers were trained using the International Affective Picture System (IAPS). The EC and RF showed the best overall performance. Using tenfold cross-validation across all subjects, the accuracies obtained were 73% for Arousal and 80% for Valence. Our results suggest that EEG and EOG biosignals, together with kinematic motion data acquired using off-the-shelf wearable sensors and machine-learning techniques such as the EC, can be used to classify emotional states while individuals play the Whack-a-Mole game.
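The classification pipeline described above (SVM, LR, and RF combined into an ensemble, evaluated with tenfold cross-validation) can be sketched as follows. This is an illustrative sketch only, not the authors' code: the feature matrix, label vector, and hyperparameters are placeholders, and the soft-voting ensemble is one plausible reading of "ensemble classifier (EC)".

```python
# Illustrative sketch (assumptions: synthetic placeholder features,
# soft-voting ensemble as the "EC") using scikit-learn.
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(210, 16))    # placeholder EEG/EOG/kinematic features
y = rng.integers(0, 2, size=210)  # placeholder high/low Arousal labels

svm = SVC(probability=True, random_state=0)
lr = LogisticRegression(max_iter=1000)
rf = RandomForestClassifier(n_estimators=100, random_state=0)
ec = VotingClassifier(
    estimators=[("svm", svm), ("lr", lr), ("rf", rf)],
    voting="soft",  # average predicted probabilities across models
)

# Tenfold cross-validation, as in the study's evaluation protocol.
scores = cross_val_score(ec, X, y, cv=10)
print(f"mean accuracy: {scores.mean():.3f}")
```

With real data, `X` would hold per-trial features extracted from the EEG, EOG, and motion signals, and `y` the binary Arousal (or Valence) labels derived from the IAPS-based training procedure.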