Data-driven modeling and rendering is a general approach in haptics that aims to provide highly accurate haptic perceptual experiences by simulating complex real physical dynamics, such as deformable or textured objects. A prevalent problem with present data-driven haptics methods is that the computational cost of modeling grows rapidly, even becoming intractable, as the interaction complexity or the amount of data increases. This paper proposes a data-driven method for modeling viscoelastic deformable objects with greatly improved computational efficiency. This advantage is enabled by the use of fractional derivatives as modeling features and regression forests as data-interpolation models. For the benchmark case of normal interaction with deformable objects, we describe a computational framework for data-driven haptic modeling and rendering. Its performance is validated by physical experiments assessing modeling accuracy and cost, and by a perceptual experiment assessing the similarity between real and virtual objects. The experiments demonstrate that our method offers highly realistic haptic perceptual experiences at a modeling cost at least ten times lower than other state-of-the-art methods.
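The abstract does not give implementation details, but a minimal sketch may help illustrate the two ingredients it names: a fractional derivative of the recorded interaction signal used as a modeling feature, and a regression forest used as the data-interpolation model. The sketch below assumes a Grünwald–Letnikov discretization of the fractional derivative and scikit-learn's RandomForestRegressor; the 1 kHz sampling rate, the half-order derivative, the synthetic depth/force signals, and the helper gl_fractional_derivative are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def gl_fractional_derivative(x, alpha, dt, memory=64):
    """Grunwald-Letnikov approximation of the alpha-order derivative of a
    uniformly sampled signal x, truncated to a finite memory of past samples."""
    # Weights w_k = (-1)^k * C(alpha, k), built by the standard recursion.
    w = np.ones(memory)
    for k in range(1, memory):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    d = np.zeros(len(x))
    for n in range(len(x)):
        kmax = min(n + 1, memory)
        # Weighted sum over the most recent kmax samples, newest first.
        d[n] = np.dot(w[:kmax], x[n::-1][:kmax]) / dt ** alpha
    return d

# Hypothetical recorded interaction: penetration depth (m) sampled at 1 kHz
# and the corresponding contact force (N); here synthesized for illustration.
dt = 0.001
depth = 0.01 * np.abs(np.sin(np.linspace(0.0, 4.0 * np.pi, 4000)))
d_half = gl_fractional_derivative(depth, alpha=0.5, dt=dt)
force = 50.0 * depth + 2.0 * d_half + 0.05 * np.random.randn(len(depth))

# Feature vector per sample: depth and its half-order derivative.
# A regression forest then interpolates the feature-to-force mapping.
X = np.column_stack([depth, d_half])
forest = RandomForestRegressor(n_estimators=50, max_depth=10, random_state=0)
forest.fit(X, force)

# At rendering time, the trained forest predicts force from current features.
predicted = forest.predict(X[:5])
```

In the actual method, the features and forest hyperparameters would of course be derived from recorded deformable-object data rather than this toy signal; the sketch only shows how fractional-derivative features and a regression forest can be combined in an input-to-force interpolation pipeline.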