People perceive the mind in two dimensions: intellectual and affective. Advances in artificial intelligence enable people to perceive the intellectual mind of a robot through semantic interactions. Conversely, it remains controversial whether a robot has an affective mind of its own in the absence of any intellectual actions or semantic interactions. We investigated pain experiences while observers watched three facial expressions of a virtual agent modeling affective states (painful, unhappy, and neutral). The cold pain detection threshold of 19 healthy subjects was first measured while they watched a black screen; changes in this threshold were then evaluated while they watched each facial expression. Subjects were also asked to rate the pain intensity conveyed by each expression. Changes in the cold pain detection threshold were compared across expressions after adjusting for the respective pain-intensity ratings. Only while subjects watched the painful expression did the cold pain detection threshold increase significantly. By directly evaluating intuitive pain responses to the facial expressions of a virtual agent, we found that observers 'share' empathic neural responses, which can emerge intuitively according to the observed pain intensity, with a robot (a virtual agent).
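The abstract does not include the authors' analysis code, so the sketch below is only a minimal illustration of the described comparison: threshold changes modeled against facial expression, adjusted for each subject's pain-intensity rating, with a random intercept per subject for the repeated-measures design. The synthetic data, column names (delta_threshold, pain_intensity, condition), and effect sizes are all hypothetical assumptions, not the study's actual pipeline.

```python
# Illustrative sketch only; all names and effect sizes are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects = 19
conditions = ["painful", "unhappy", "neutral"]

rows = []
for subj in range(n_subjects):
    subj_offset = rng.normal(0, 0.3)  # between-subject variability
    for cond in conditions:
        # Assumed effects: only the painful face raises the threshold.
        true_shift = {"painful": 0.8, "unhappy": 0.1, "neutral": 0.0}[cond]
        # Rated pain intensity of the face (0-10 scale, hypothetical).
        intensity = rng.uniform(0, 10) if cond == "painful" else rng.uniform(0, 3)
        rows.append({
            "subject": subj,
            "condition": cond,
            "pain_intensity": intensity,
            # Change in cold pain detection threshold vs. black-screen baseline.
            "delta_threshold": true_shift + subj_offset + rng.normal(0, 0.3),
        })
df = pd.DataFrame(rows)

# Mixed-effects model: threshold change ~ expression, adjusted for the
# pain-intensity rating, with a random intercept per subject.
model = smf.mixedlm(
    "delta_threshold ~ C(condition, Treatment('neutral')) + pain_intensity",
    data=df,
    groups=df["subject"],
).fit()
print(model.summary())
```

A mixed-effects specification is one reasonable way to handle the within-subject design (each subject sees all three expressions); the paper may equally have used a repeated-measures ANCOVA, which tests the same adjusted comparison.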