Early models of multisensory integration posited that cross-modal signals converge only in higher-order association cortices and that vision automatically dominates. Recent studies have challenged this view. This study examined the significance of motion-axis alignment and spatial alignment across visual and tactile stimuli, as well as the effect of hand visibility on visuo-tactile interactions. Using binocular rivalry, opposed motions were presented to each eye and participants tracked the perceived visual direction while a tactile motion, a leftward or rightward sweep across the fingerpad, was presented intermittently. Tactile effects on visual percepts depended on the alignment of motion axes: rivalry between upward/downward visual motions was not modulated at all by leftward/rightward tactile motion. When both modalities shared a common axis of motion, however, tactile signals could alter visual percepts: a tactile stimulus could maintain the dominance duration of a congruent visual stimulus and shorten its suppression period. These effects were also conditional on the spatial alignment of the visual and tactile stimuli, being eliminated when the tactile device was displaced 15 cm to the right of the visual stimulus. Visibility of the hand touching the tactile stimulus facilitated congruent switches relative to a visual-only baseline but conferred no significant advantage overall. Together, these results reveal a low-level sensory interaction that is conditional on visual and tactile stimuli sharing a common motion axis and location in space.