
Hand gesture recognition from wrist-worn camera for human-machine interaction


In this work, we study the use of hand gestures for human-machine interaction from wrist-worn sensors. Towards this goal, we design a wrist-worn prototype to capture an RGB video stream of hand gestures. We then build a new wrist-worn gesture dataset (named WiGes) with various subjects interacting with home appliances in different environments. To the best of our knowledge, this is the first benchmark released for studying hand gestures from a wrist-worn camera. We then evaluate various CNN models for vision-based recognition and analyze in depth the models that offer the best trade-off between accuracy, memory requirement, and computational cost. Among the studied architectures, MoviNet produces the highest accuracy. We therefore introduce a new MoviNet-based two-stream architecture that takes both RGB and optical flow into account. Our proposed architecture increases Top-1 accuracy by 1.36% and 3.67% under two evaluation protocols. Our dataset, baselines, and model analysis provide instructive recommendations for human-machine interaction using hand-held devices.
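To illustrate the two-stream idea described in the abstract, the minimal PyTorch sketch below shows a late-fusion model with one RGB stream and one optical-flow stream. It is not the authors' implementation: lightweight 3D-convolutional backbones stand in for the MoviNet streams, and the fusion scheme, feature dimension, gesture-class count, and clip shapes are assumptions made purely for illustration.

import torch
import torch.nn as nn

class StreamBackbone(nn.Module):
    """Placeholder 3D-CNN stand-in for a MoviNet stream (illustrative only)."""
    def __init__(self, in_channels, feat_dim=256):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(in_channels, 32, kernel_size=3, stride=(1, 2, 2), padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(32, 64, kernel_size=3, stride=(1, 2, 2), padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),          # global spatio-temporal pooling
        )
        self.proj = nn.Linear(64, feat_dim)

    def forward(self, x):
        # x: (batch, channels, frames, height, width)
        f = self.features(x).flatten(1)
        return self.proj(f)

class TwoStreamGestureNet(nn.Module):
    """Late-fusion two-stream gesture classifier: RGB stream + optical-flow stream."""
    def __init__(self, num_classes, feat_dim=256):
        super().__init__()
        self.rgb_stream = StreamBackbone(in_channels=3, feat_dim=feat_dim)   # RGB frames
        self.flow_stream = StreamBackbone(in_channels=2, feat_dim=feat_dim)  # (dx, dy) flow fields
        self.classifier = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, rgb_clip, flow_clip):
        # Concatenate the per-stream features, then classify the fused representation.
        fused = torch.cat([self.rgb_stream(rgb_clip), self.flow_stream(flow_clip)], dim=1)
        return self.classifier(fused)

if __name__ == "__main__":
    model = TwoStreamGestureNet(num_classes=10)      # class count is an assumption
    rgb = torch.randn(2, 3, 16, 112, 112)            # batch of 16-frame RGB clips
    flow = torch.randn(2, 2, 16, 112, 112)           # matching optical-flow clips
    logits = model(rgb, flow)
    print(logits.shape)                              # torch.Size([2, 10])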

Keywords: wrist-worn; hand gesture; human-machine interaction

Journal Title: IEEE Access
Year Published: 2023
