
Ergodicity reveals assistance and learning from physical human-robot interaction


A measure of task information encoded by motion identifies differences between movements that are not captured by standard measures. This paper applies information theoretic principles to the investigation of physical human-robot interaction. Drawing from the study of human perception and neural encoding, information theoretic approaches offer a perspective that enables quantitatively interpreting the body as an information channel and bodily motion as an information-carrying signal. We show that ergodicity, which can be interpreted as the degree to which a trajectory encodes information about a task, correctly predicts changes due to reduction of a person’s existing deficit or the addition of algorithmic assistance. The measure also captures changes from training with robotic assistance. Other common measures for assessment failed to capture at least one of these effects. This information-based interpretation of motion can be applied broadly, in the evaluation and design of human-machine interactions, in learning by demonstration paradigms, or in human motion analysis.
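The abstract does not define the ergodic measure it refers to, but a common formulation in this line of work (the Mathew–Mezić spectral metric) compares the time-averaged statistics of a trajectory against a target spatial distribution in a Fourier basis: a trajectory is more ergodic the more closely its visitation frequencies match the distribution of task-relevant information. The sketch below is an illustrative one-dimensional version on the domain [0, 1] with a cosine basis; the function name `ergodic_metric`, the basis truncation, and the choice of domain are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def ergodic_metric(traj, target_pdf, n_coeffs=20):
    """Spectral ergodicity of a 1D trajectory w.r.t. a target density on [0, 1].

    Lower values mean the trajectory's time-averaged statistics better match
    the target distribution (i.e., the motion "encodes" the task better).
    """
    xs = np.linspace(0.0, 1.0, 1001)
    dx = xs[1] - xs[0]
    phi = target_pdf(xs)
    phi = phi / (phi.sum() * dx)            # normalize to a probability density
    traj = np.asarray(traj)
    eps = 0.0
    for k in range(n_coeffs):
        basis = np.cos(k * np.pi * xs)      # cosine basis element f_k
        hk = np.sqrt((basis**2).sum() * dx) # normalization constant h_k
        phi_k = (phi * basis).sum() * dx / hk          # target coefficient
        c_k = np.mean(np.cos(k * np.pi * traj)) / hk   # trajectory coefficient
        lam = (1.0 + k**2) ** -1.0          # Sobolev weight, 1D case
        eps += lam * (c_k - phi_k) ** 2
    return eps

# A trajectory that sweeps the domain uniformly matches a uniform target
# far better than one that stays near a single point.
uniform_target = lambda x: np.ones_like(x)
e_sweep = ergodic_metric(np.linspace(0.0, 1.0, 5000), uniform_target)
e_stuck = ergodic_metric(np.full(5000, 0.2), uniform_target)
```

Under this interpretation, assistance or training that shifts a person's movements toward the task's information distribution shows up as a drop in this metric, which is the kind of change the paper reports standard kinematic measures can miss.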

Keywords: motion; information; assistance; physical human-robot interaction; human-robot interaction

Journal Title: Science Robotics
Year Published: 2019


