The use of P300-based brain–computer interfaces (BCIs) in daily life should take the user's emotional state into account, because emotional conditions are likely to influence event-related potentials (ERPs) and, consequently, the performance of P300-based BCIs. This study investigated whether external emotional stimuli affect the performance of a P300-based BCI built specifically for controlling home appliances. While subjects controlled an electric light device using a P300-based BCI, we presented them with a set of emotional auditory stimuli, selected for each subject based on individual valence scores evaluated a priori. There were four auditory conditions: high valence, low valence, noise, and no sound. Subjects controlled the electric light device with the BCI in real time with a mean accuracy of 88.14%. Neither the overall accuracy nor the P300 features over most EEG channels differed significantly among the four auditory conditions (p > 0.05). When we measured emotional states using frontal alpha asymmetry (FAA) and compared FAA across the auditory conditions, we likewise found no significant difference (p > 0.05). Our results suggest that there is no clear evidence that external emotional stimuli influence P300-based BCI performance or P300 features while people control devices with the BCI in real time. This study may provide useful information for those concerned with implementing a P300-based BCI in practice.
               