Robots are expected to operate autonomously in unstructured, real-world environments, for tasks such as locating buried objects in search and rescue applications. When robots operate within opaque granular materials, tactile and proprioceptive feedback can be more informative than visual feedback. However, since tactile measurements are local and sparse, it can be difficult to efficiently build a global, tactile-based model of a search area. In this work, we developed a framework for tactile perception, mapping, and haptic exploration for the autonomous localization of objects buried in granular materials. Haptic exploration was performed within a densely packed sand mixture using a sensor model that accounts for granular material characteristics and aids in the interpretation of interaction forces between the robot and its environment. The haptic exploration strategy was designed to efficiently locate a buried object and refine its outline while minimizing potentially damaging physical interactions with the object. Coverage path planning techniques were used to select haptic exploration movements from candidates that aimed to reduce map uncertainty. A continuous occupancy map was generated that fused local, sparse tactile information into a global Bayesian Hilbert Map. We demonstrated our framework in simulation and on a real robot operating in granular materials.
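The core mapping idea above, fusing sparse, local contact/no-contact tactile observations into a continuous occupancy estimate, can be illustrated with a simplified sketch. The code below is a hypothetical, minimal approximation in the spirit of Hilbert maps: logistic regression over RBF "hinge" features placed on a grid. A full Bayesian Hilbert Map additionally maintains a posterior distribution over the weights (which supplies the map uncertainty used to rank exploration candidates); that is omitted here for brevity, and all function names, parameters, and synthetic data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_features(X, centers, gamma=20.0):
    """Map 2D points to RBF features anchored at fixed hinge points."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_occupancy(X, y, centers, lr=0.5, steps=500, lam=1e-3):
    """Logistic regression: y=1 for contact (occupied), y=0 for free space.
    Simplified point-estimate stand-in for a Bayesian weight posterior."""
    Phi = rbf_features(X, centers)
    w = np.zeros(Phi.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Phi @ w))
        w -= lr * (Phi.T @ (p - y) / len(y) + lam * w)
    return w

def query(Xq, w, centers):
    """Continuous occupancy probability at arbitrary query points."""
    return 1.0 / (1.0 + np.exp(-rbf_features(Xq, centers) @ w))

# Synthetic sparse tactile data: contacts clustered on a buried object near
# (0.5, 0.5), free-space probes elsewhere in a unit search area.
rng = np.random.default_rng(0)
contacts = rng.normal([0.5, 0.5], 0.05, size=(20, 2))
free = rng.uniform(0, 1, size=(40, 2))
free = free[np.linalg.norm(free - [0.5, 0.5], axis=1) > 0.2]
X = np.vstack([contacts, free])
y = np.r_[np.ones(len(contacts)), np.zeros(len(free))]

# Fixed grid of hinge points spanning the search area.
gx, gy = np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8))
centers = np.c_[gx.ravel(), gy.ravel()]

w = fit_occupancy(X, y, centers)
p_obj = query(np.array([[0.5, 0.5]]), w, centers)[0]   # on the object
p_far = query(np.array([[0.05, 0.95]]), w, centers)[0]  # far free space
```

Because the map is continuous, it can be queried at any candidate probe location, which is what makes ranking exploration movements by expected uncertainty reduction straightforward in this representation.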