Emotions can be conveyed through a variety of channels in the auditory domain, such as the human voice or music. Recent studies suggest that expertise in one sound category can impact the processing of emotional sounds in other sound categories. We focused here on how the neural processing of emotional information varies as a function of sound category and the expertise of participants. The electroencephalogram (EEG) of 20 non-musicians and 17 musicians was recorded while they listened to speech prosody, vocalizations (such as screams and laughter), and musical sounds. The amplitude of oscillatory EEG activity in the theta, alpha, beta, and gamma bands was quantified, and Independent Component Analysis (ICA) was used to identify underlying components of brain activity in each band. Sound-category-dependent activations were found in frontal theta and alpha, with greater activation for musicians than for non-musicians. Differences in the beta band were mainly due to differential processing of speech. The results...
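The analysis pipeline described above can be illustrated with a minimal sketch, not taken from the study itself: band-limited amplitude is extracted per frequency band and ICA is then applied to find underlying components. The sampling rate, filter design, band edges, synthetic data, and component count below are illustrative assumptions only.

```python
# Illustrative sketch of band-amplitude quantification followed by ICA.
# All numeric parameters are assumptions, not values from the study.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.decomposition import FastICA

fs = 250.0  # assumed sampling rate in Hz
bands = {"theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

# Synthetic stand-in for a (channels x samples) EEG recording.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, int(60 * fs)))

def band_amplitude(data, low, high, fs):
    """Band-pass each channel and return its Hilbert amplitude envelope."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, data, axis=1)
    return np.abs(hilbert(filtered, axis=1))

components = {}
for name, (low, high) in bands.items():
    amp = band_amplitude(eeg, low, high, fs)
    # ICA over the (time x channels) amplitude matrix to estimate
    # underlying components of band-limited activity.
    ica = FastICA(n_components=10, random_state=0)
    components[name] = ica.fit_transform(amp.T)  # time x components

print({band: src.shape for band, src in components.items()})
```

In practice, a dedicated EEG toolbox such as MNE-Python would typically handle filtering and ICA on real recordings; the sketch above only shows the overall structure of the analysis.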