Abstract For assistive robots to integrate seamlessly into human environments, they must understand the intentions of their human partners and adapt their motion plans accordingly. In this paper, an estimator-controller method is presented that estimates the dynamic motion of the human's hand together with the underlying motion intent, and that learns robot control gains for synchronizing the robot end-effector motion with the human's hand motion. For human intention estimation, a multiple-model estimation framework is used that switches among several nonlinear human motion models. An adaptive controller is developed for the robot to track the human's motion; its gains are learned from data collected while a human and a robot jointly perform a collaborative motion task, moving an object together. A stability analysis of the controller is provided that takes the uncertainty of the human motion estimate into account, yielding a uniformly ultimately bounded (UUB) tracking-error bound based on the estimated human motion uncertainty. A case study of a human and a robot moving an object together is discussed.
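The abstract only names the multiple-model estimation framework; as a concrete illustration, below is a minimal sketch of such an estimator in Python. It assumes two simple linear hand-motion hypotheses (constant velocity and constant acceleration) instead of the paper's nonlinear models, runs one Kalman filter per hypothesis, and updates the model probabilities from measurement likelihoods. All names, dimensions, and noise values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kalman_step(x, P, F, Q, H, R, z):
    """One predict/update cycle; returns state, covariance, and
    the Gaussian likelihood of the measurement under this model."""
    x = F @ x                                  # predict state
    P = F @ P @ F.T + Q                        # predict covariance
    y = z - H @ x                              # innovation
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ y                              # update state
    P = (np.eye(len(x)) - K @ H) @ P           # update covariance
    k = len(z)
    lik = (np.exp(-0.5 * y @ np.linalg.solve(S, y))
           / np.sqrt((2 * np.pi) ** k * np.linalg.det(S)))
    return x, P, lik

dt = 0.02   # assumed 50 Hz hand-position measurements (1-D for brevity)

# Hypothesis 1: constant velocity (state: position, velocity)
F1 = np.array([[1.0, dt], [0.0, 1.0]])
H1 = np.array([[1.0, 0.0]])

# Hypothesis 2: constant acceleration (state: position, velocity, accel.)
F2 = np.array([[1.0, dt, 0.5 * dt**2], [0.0, 1.0, dt], [0.0, 0.0, 1.0]])
H2 = np.array([[1.0, 0.0, 0.0]])

models = [
    {"x": np.zeros(2), "P": np.eye(2), "F": F1, "Q": 1e-4 * np.eye(2), "H": H1},
    {"x": np.zeros(3), "P": np.eye(3), "F": F2, "Q": 1e-4 * np.eye(3), "H": H2},
]
mu = np.array([0.5, 0.5])    # prior model probabilities
R = np.array([[1e-3]])       # measurement noise (assumed)

# Toy sequence of measured hand positions
for z in np.array([[0.00], [0.01], [0.03], [0.06], [0.10]]):
    liks = np.empty(len(models))
    for i, m in enumerate(models):
        m["x"], m["P"], liks[i] = kalman_step(
            m["x"], m["P"], m["F"], m["Q"], m["H"], R, z)
    mu = mu * liks
    mu /= mu.sum()           # posterior model probabilities

print("model probabilities:", mu)  # higher weight = inferred motion intent
```

In the paper's framework the motion models are nonlinear and the estimator switches between them; the static likelihood-weighted mixture above only conveys the basic idea of running candidate models in parallel and reading the intent off the model probabilities.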
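For readers unfamiliar with the stability claim, a standard statement of uniform ultimate boundedness for a tracking error e(t) is sketched below in LaTeX; the symbols b and T and the link to the estimation uncertainty are generic textbook notation, not the paper's exact bound.

```latex
% Uniform ultimate boundedness (UUB) of the tracking error e(t):
% there exist an ultimate bound b > 0 and a settling time T(b, e(t_0))
% such that
\[
  \lVert e(t) \rVert \le b \qquad \text{for all } t \ge t_0 + T .
\]
% In an analysis of this kind, the ultimate bound b typically grows with
% the uncertainty of the human-motion estimate (e.g., a bound on the
% estimation-error covariance), consistent with the abstract's claim
% that the UUB bound is based on the estimated human motion uncertainty.
```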