There is a current claim that humans can effortlessly detect others’ hidden mental states simply by observing their movements and transforming the visual input into motor knowledge used to predict behaviour. Using a classical paradigm for quantifying motor predictions, we tested the role of visual feedback during a reach-and-load-lifting task performed either alone or with the help of a partner. Wrist flexor and extensor muscle activity was recorded from the supporting hand. When participants performed the task by themselves, early muscle changes that prevented limb instability revealed the contribution of visual input to postural anticipation. When the partner performed the unloading, a condition mimicking a split-brain situation, motor prediction followed a pattern that evolved over the course of the task and changed as successive somatosensory feedback was integrated. Our findings demonstrate that during social behaviour, in addition to relying on self-motor representations, individuals cooperate by continuously integrating sensory signals from various sources.