Current databases of facial expressions represent only a small subset of expressions, usually the basic emotions (fear, disgust, surprise, happiness, sadness, and anger). To overcome these limitations, we introduce a database of pictures of facial expressions reflecting the richness of mental states. A total of 93 expressions of mental states were interpreted by two professional actors, and high-quality pictures were taken under controlled conditions in front and side views. The database was validated in two experiments. First, a four-alternative forced-choice paradigm was employed to test subjects' ability to select the term associated with each expression. Second, the task was to locate each face within a 2-D space of valence and arousal. Results from both experiments demonstrate that subjects can reliably recognize a great diversity of emotional states from facial expressions. Although subjects' performance was better for front-view images, the advantage over the side view was modest. This is the first demonstration of the high degree of accuracy that human viewers achieve when identifying complex mental states from only partially visible facial features. The McGill Face Database provides a wide range of facial expressions that can be linked to mental-state terms and can be accurately characterized in terms of arousal and valence.
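To make the two validation procedures concrete, the sketch below shows, in Python, how responses from a four-alternative forced-choice task might be scored against the 25% chance level, and how mean valence and arousal ratings could place each expression in the 2-D affect space. This is a minimal, hypothetical illustration under assumed data structures, not the authors' analysis pipeline; all expression labels, ratings, and function names are invented for the example.

```python
# Hypothetical scoring sketch for a 4AFC validation task and a valence-arousal
# rating task. All data below are invented placeholders, not results from the
# McGill Face Database experiments.
from statistics import mean

CHANCE_LEVEL = 1 / 4  # four response alternatives -> 25% accuracy by chance

# Each 4AFC trial: (expression shown, term chosen by the subject)
afc_trials = [
    ("contemplative", "contemplative"),
    ("contemplative", "bored"),
    ("skeptical", "skeptical"),
    ("skeptical", "skeptical"),
]

# Each expression -> list of (valence, arousal) judgments on a -1..+1 scale
affect_ratings = {
    "contemplative": [(0.1, -0.3), (0.0, -0.4)],
    "skeptical": [(-0.2, 0.1), (-0.3, 0.2)],
}

def afc_accuracy(trials):
    """Proportion of trials on which the chosen term matched the target."""
    return mean(1.0 if shown == chosen else 0.0 for shown, chosen in trials)

def mean_affect(ratings):
    """Average valence and arousal per expression (its location in 2-D affect space)."""
    return {
        expr: (mean(v for v, _ in points), mean(a for _, a in points))
        for expr, points in ratings.items()
    }

if __name__ == "__main__":
    acc = afc_accuracy(afc_trials)
    print(f"4AFC accuracy: {acc:.2f} (chance = {CHANCE_LEVEL:.2f})")
    for expr, (val, aro) in mean_affect(affect_ratings).items():
        print(f"{expr}: valence={val:+.2f}, arousal={aro:+.2f}")
```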
               