Mathematical models have been proposed for how the brain interprets sensory information to produce estimates of self-orientation and self-motion. This process, spatial orientation perception, requires dynamically integrating multiple sensory modalities, including visual, vestibular, and somatosensory cues. Here, we review progress in the mathematical modeling of spatial orientation perception, focusing on dynamic multisensory models and the experimental paradigms in which they have been validated. These models are primarily "black box" or "as if" models of how the brain processes spatial orientation cues. Yet they have proven effective both scientifically, by generating quantitative hypotheses that can be assessed empirically, and operationally, for example in investigating aircraft pilot spatial disorientation. The primary family of models considered, the observer model, implements estimation-theory approaches, hypothesizing that internal models (i.e., neural systems replicating the behavior/dynamics of physical systems) are used to produce expected sensory measurements. Expected signals are then compared with actual sensory afference, yielding sensory conflict, which is weighted to drive central perceptions of gravity, angular velocity, and translation. This approach effectively predicts a wide range of experimental scenarios using a small set of free parameters held fixed across scenarios. We conclude with the limitations and applications of existing mathematical models and important areas of future work.
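To make the observer structure concrete, the sketch below implements a deliberately simplified, one-dimensional version of the angular-velocity loop: both the semicircular canals and their internal-model copy are treated as first-order high-pass filters, and the gain-weighted conflict between actual and expected afference forms the velocity estimate. The time constant, feedback gain, and filter form are illustrative assumptions for this sketch, not the fitted parameters of any published observer model.

```python
import numpy as np

DT = 0.01     # integration step (s)
TAU = 5.7     # canal time constant (s); a commonly cited value, assumed here
K = 2.0       # conflict feedback gain; illustrative, not a fitted value

def simulate(omega, dt=DT, tau=TAU, k=K):
    """Perceived angular velocity for a stimulus profile `omega` (deg/s).

    Canal afference and its internal-model prediction are both modeled as
    first-order high-pass filters; the estimate is the gain-weighted
    sensory conflict (actual minus expected afference).
    """
    c = 0.0        # canal filter state (the physical sensor)
    c_hat = 0.0    # internal-model filter state (the brain's copy)
    out = np.empty_like(omega)
    for i, w in enumerate(omega):
        a = w - c                              # actual canal afference
        # Solve the instantaneous loop omega_hat = k * (a - (omega_hat - c_hat))
        omega_hat = k * (a + c_hat) / (1.0 + k)
        conflict = a - (omega_hat - c_hat)     # equals omega_hat / k by construction
        # Advance both high-pass filter states (forward Euler)
        c += dt * (w - c) / tau
        c_hat += dt * (omega_hat - c_hat) / tau
        out[i] = omega_hat
    return out

# 60 s constant-rate spin at 60 deg/s, then an abrupt stop: perceived
# velocity fades during the spin and reverses sign after the stop.
t = np.arange(0.0, 120.0, DT)
stimulus = np.where(t < 60.0, 60.0, 0.0)
perceived = simulate(stimulus)
```

Even this reduced loop reproduces two signature behaviors the full models capture: perceived rotation fades during sustained constant-velocity spin, and an illusory counter-rotation appears after an abrupt stop, with the feedback gain lengthening the perceptual decay relative to the raw canal time constant.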