Motion effects are a vital component of 4D interactive applications, in which special physical effects, such as motion, vibration, and wind, accompany audiovisual stimuli. In 4D films and VR games, scenes showing human locomotion appear frequently, and motion effects emphasizing such movements can enhance viewers' immersive experiences. This paper proposes a data-driven framework for automatically generating motion effects that provide users with walking sensations. Measurements are made using motion sensors attached to the human body during locomotion in different gaits, e.g., walking, running, and stomping. The captured data are processed and converted into multi-degree-of-freedom commands for a motion platform. We demonstrate that the data-driven motion commands can be represented in a much lower-dimensional space using principal component analysis. This finding leads to an algorithm for synthesizing new motion commands that elicit the walking sensations of a target gait. The perceptual performance of our method is validated in two user studies. This work demonstrates the feasibility of mimicking walking sensations with a motion platform driven by human locomotion data, and it contributes an algorithm for automatically generating motion effects that convey the impressions of different gaits.
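The dimensionality-reduction step described above can be sketched as follows. This is a minimal illustration using synthetic data, not the paper's actual sensor recordings: each row stands for one multi-degree-of-freedom motion-platform command, and PCA is computed via the SVD of the mean-centered command matrix. All variable names and data shapes here are hypothetical.

```python
import numpy as np

# Hypothetical stand-in for captured locomotion data: 200 command samples
# across 6 platform degrees of freedom, dominated by 2 latent gait components.
rng = np.random.default_rng(0)
n_samples, n_dof = 200, 6
latent = rng.standard_normal((n_samples, 2))
mixing = rng.standard_normal((2, n_dof))
commands = latent @ mixing + 0.01 * rng.standard_normal((n_samples, n_dof))

# PCA via SVD of the mean-centered data.
mean = commands.mean(axis=0)
centered = commands - mean
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = S**2 / np.sum(S**2)  # variance ratio per principal component

# Keep the fewest components covering 95% of the variance.
k = int(np.searchsorted(np.cumsum(explained), 0.95) + 1)
reduced = centered @ Vt[:k].T            # low-dimensional representation
reconstructed = reduced @ Vt[:k] + mean  # back-projection to command space

print("components kept:", k)
```

In this setting, new commands could be synthesized by sampling or interpolating in the `reduced` space and back-projecting through `Vt[:k]`, which is the general idea behind generating commands for a target gait from a compact representation.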