In this study, a convolutional neural network (CNN) classification model was developed for intention recognition using electromyography (EMG) signals acquired from the subject. For sensory feedback, a rule-based wearable proprioceptive haptic device was proposed as a new method for conveying the grip information of a robotic prosthesis. We then constructed a closed-loop integrated system consisting of the CNN-based EMG classification model, the proposed haptic device, and a robotic prosthetic hand. Finally, an experiment was conducted in which the closed-loop integrated system was used to evaluate the intention recognition and sensory feedback performance simultaneously for a subject. The trained EMG classification model and the proposed haptic device each achieved 97% or higher accuracy across 10 grip states. Although some errors occurred in intention recognition with the EMG classification model, the subject's grip intention was in general recognized accurately, and the grip pattern was also transmitted accurately to the subject by the proposed haptic device. The integrated system, combining intention recognition via the CNN-based EMG classification model with sensory feedback via the proposed haptic device, is expected to be useful for robotic prosthetic hand control by participants with limb loss.
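To make the classification step concrete, below is a minimal illustrative sketch of a 1-D CNN that maps windowed multi-channel EMG segments to one of 10 grip states. The abstract does not report the actual architecture, so the electrode count, window length, layer sizes, and class name `EMGGripCNN` are assumptions for illustration only (10 output classes match the 10 grip states described above).

```python
# Illustrative sketch only: channel count, window length, and layer sizes are
# assumptions; only the 10-class output follows from the abstract.
import torch
import torch.nn as nn

class EMGGripCNN(nn.Module):
    """1-D CNN that maps a windowed multi-channel EMG segment to one of 10 grip states."""

    def __init__(self, n_channels: int = 8, n_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),  # temporal convolution over EMG samples
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis regardless of window length
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, window_samples)
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = EMGGripCNN()
    window = torch.randn(4, 8, 200)  # e.g. 4 windows of 200 samples from 8 hypothetical EMG electrodes
    print(model(window).shape)       # -> torch.Size([4, 10]), one logit per grip state
```

In a closed-loop setup such as the one described, the predicted grip state would drive the robotic prosthetic hand while the corresponding grip information is relayed back to the user through the haptic device.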
               