Recent advances in flexible wearable devices have driven the remarkable development of devices for human–machine interfaces, which are of great value to emerging cybernetics, robotics, and Metaverse systems. However, the effectiveness of existing approaches is limited by sensor data quality and by classification models with high computational costs. Here, a novel gesture recognition system combining triboelectric smart wristbands with an adaptive accelerated learning (AAL) model is proposed. The sensor array is deployed according to the wrist anatomy and captures hand motions at a distance, exhibiting highly sensitive, high-quality sensing beyond existing methods. Importantly, the anatomical design produces a close correspondence between the actions of dominant muscle/tendon groups and the gestures, and the resulting distinctive features in the sensor signals allow gestures to be differentiated with data from only 7 sensors. The AAL model achieves 97.56% identification accuracy across 21 gesture classes while requiring only one-third of the operands of the original neural network. The system is further applied to real-time somatosensory teleoperation with a latency below 1 s, revealing new possibilities for endowing cyber-human interaction with disruptive innovation and an immersive experience.
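The abstract does not specify the architecture of the AAL model, so the following is only an illustrative sketch: a generic lightweight 1D-CNN classifier that maps windows of 7-channel triboelectric wristband signals to 21 gesture classes. The layer sizes, window length, and use of depthwise-separable convolutions (chosen here to suggest a reduced operand count) are assumptions, not the authors' method.

```python
# Illustrative sketch only: the paper's AAL model is not described in the
# abstract. This generic lightweight classifier takes 7 sensor channels
# (per the abstract) and outputs 21 gesture classes; all architectural
# details (window length, layer widths, depthwise-separable convolutions)
# are assumptions for illustration.
import torch
import torch.nn as nn

NUM_SENSORS = 7      # sensor channels on the wristband (from the abstract)
NUM_CLASSES = 21     # gesture classes (from the abstract)
WINDOW_LEN = 256     # samples per classification window (assumed)

class LightweightGestureNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Depthwise-separable convolutions keep the multiply-accumulate
        # count low, in the spirit of the abstract's reduced-operand claim.
        self.features = nn.Sequential(
            nn.Conv1d(NUM_SENSORS, NUM_SENSORS, kernel_size=7,
                      padding=3, groups=NUM_SENSORS),   # depthwise
            nn.Conv1d(NUM_SENSORS, 32, kernel_size=1),  # pointwise
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # pool over time to a single vector
        )
        self.classifier = nn.Linear(64, NUM_CLASSES)

    def forward(self, x):
        # x: (batch, NUM_SENSORS, WINDOW_LEN)
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = LightweightGestureNet()
    dummy = torch.randn(8, NUM_SENSORS, WINDOW_LEN)  # a batch of signal windows
    logits = model(dummy)
    print(logits.shape)  # torch.Size([8, 21])
```

In a real-time teleoperation setting such as the one reported, a model of roughly this size could classify each incoming signal window on an embedded processor well within the sub-second latency the abstract cites, though the actual deployment pipeline is not detailed there.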