In this paper, a novel vision-based measurement (VBM) approach is proposed to estimate the contact force and classify materials in a single grasp. It is the first event-based tactile sensing approach, utilizing the recent technology of neuromorphic cameras. This approach offers higher sensitivity, lower latency, and lower computational cost and power consumption than conventional vision-based techniques. Moreover, the dynamic vision sensor (DVS) has a wider dynamic range, which improves sensitivity and performance in poor lighting conditions. Two time-series machine learning methods, namely the time delay neural network (TDNN) and the Gaussian process (GP), are developed to estimate the contact force during a grasp. A deep neural network (DNN) is proposed to classify the object materials. Forty-eight experiments are conducted on four different materials to validate the proposed methods and compare them against measurements from a piezoresistive force sensor. A leave-one-out cross-validation technique is implemented to evaluate and analyze the performance of the proposed machine learning methods. The contact force is successfully estimated with a mean squared error of 0.16 and 0.17 N for the TDNN and GP, respectively. The four materials are classified with an average accuracy of 79.17% on unseen experimental data. The results demonstrate the applicability of event-based sensors to grasping applications.
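The abstract names two techniques that can be illustrated concretely: Gaussian-process regression of a continuous signal (here, contact force from sensor features) and leave-one-out cross-validation, where each experiment serves once as the held-out test set. The sketch below is illustrative only and is not the paper's implementation: the one-dimensional inputs, the synthetic target signal, the RBF kernel hyperparameters, and the noise level are all assumptions chosen for a self-contained example.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.2, variance=1.0):
    # Squared-exponential (RBF) kernel between two 1-D input arrays
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    # GP posterior mean: K_s @ (K + noise*I)^-1 @ y
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train)
    return K_s @ np.linalg.solve(K, y_train)

# Stand-ins for event-based features (x) and measured contact force (y)
x = np.linspace(0.0, 1.0, 10)
y = np.sin(2 * np.pi * x)

# Leave-one-out cross-validation: hold out one sample per fold
errors = []
for i in range(len(x)):
    mask = np.arange(len(x)) != i
    pred = gp_predict(x[mask], y[mask], x[i:i + 1])
    errors.append((pred[0] - y[i]) ** 2)

mse = float(np.mean(errors))
print(f"LOO mean squared error: {mse:.4f}")
```

In the paper the same leave-one-out protocol is applied across the 48 experiments rather than across individual samples, but the mechanics are identical: train on all folds but one, predict the held-out fold, and average the squared errors.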