
Autonomous Runtime Composition of Sensor-Based Skills Using Concurrent Task Planning


Constraint-based robot programming allows for implementing sensor-based skills that react to disturbances on the one hand, and composable skills that dynamically create and reconfigure complex robot behaviors on the other. In this letter, we address a class of problems where composing appropriate skills prior to execution is impractical because of unpredictable disturbance events. To this end, we propose an autonomous replanning and acting framework that computes and executes reactive composed skills at runtime. We formulate a model of constraint-based skills and present methods for executing concurrent skill compositions at runtime without interrupting the ongoing task execution. To autonomously compute reactive composed skills at runtime, we propose a temporal description of planning actions, corresponding to robot skills, that ensures concurrency at the task planning level. We validate our work on a dual-arm robot system that performs an industrial assembly task while reacting to obstacle- and human-induced disturbances at runtime.
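The abstract names two ingredients: a model of constraint-based skills and a temporal (durative) description of planning actions that lets the task planner schedule skills concurrently. The sketch below is a minimal illustrative reading of those ideas, not the authors' implementation; all class names, fields, and the mutex check are assumptions introduced only for illustration.

"""
Illustrative sketch (assumed structure, not from the paper):
  1. a constraint-based skill, modeled as task-space constraints plus a
     sensor-based termination condition, and
  2. a temporal planning action with at-start / over-all / at-end conditions,
     which is the kind of description that allows a planner to prove two
     skills may overlap in time.
"""
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Constraint:
    name: str                              # e.g. "keep_end_effector_above_table"
    satisfied: Callable[[Dict], bool]      # evaluated against the sensed state

@dataclass
class Skill:
    name: str
    constraints: List[Constraint]          # enforced while the skill is active
    done: Callable[[Dict], bool]           # sensor-based termination condition

@dataclass
class TemporalAction:
    """Durative-action style description of a skill for the task planner."""
    skill: Skill
    duration: float
    at_start: List[str] = field(default_factory=list)   # facts required at start
    over_all: List[str] = field(default_factory=list)   # invariants over the duration
    at_end: List[str] = field(default_factory=list)     # facts produced at the end

def may_overlap(a: "TemporalAction", b: "TemporalAction") -> bool:
    """Crude mutex check: two actions may run concurrently if neither one's
    over-all invariants clash with the facts the other produces."""
    return not (set(a.over_all) & set(b.at_end)) and not (set(b.over_all) & set(a.at_end))

if __name__ == "__main__":
    left = TemporalAction(
        skill=Skill("left_arm_insert", [], done=lambda s: s.get("inserted", False)),
        duration=4.0, over_all=["left_workspace_free"], at_end=["part_inserted"])
    right = TemporalAction(
        skill=Skill("right_arm_fetch", [], done=lambda s: s.get("fetched", False)),
        duration=3.0, over_all=["right_workspace_free"], at_end=["part_fetched"])
    print("concurrent execution allowed:", may_overlap(left, right))

In the paper's setting, such a concurrency check would be made by the temporal task planner at runtime, so that a replanned skill for one arm can be started without interrupting the skill already executing on the other arm.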

Keywords: task planning; robot; sensor-based skills

Journal Title: IEEE Robotics and Automation Letters
Year Published: 2021
