
Skeleton edge motion networks for human action recognition


Abstract: The human skeleton is receiving increasing attention from the human action recognition community due to its robustness to complex image backgrounds. Previous methods usually rely on joint-based representations, i.e., joint locations, while edge-based movement remains poorly investigated. In this paper, we propose a new human action recognition method, skeleton edge motion networks (SEMN), to further exploit the motion information of human body parts. Specifically, we model the movement of each skeleton edge through the angle change of the edge and the movement of its corresponding body joints. We then build the skeleton edge motion networks by stacking multiple spatial-temporal blocks to learn a robust deep representation from skeleton sequences. Furthermore, we propose a new progressive ranking loss that helps the network preserve temporal order information in a self-supervised manner. Experimental results on five popular human action recognition datasets (PennAction, UTD-MHAD, NTU RGB+D, NTU RGB+D 120, and CSL) demonstrate the effectiveness of the proposed method.
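The abstract describes edge motion as a combination of the angle change of each skeleton edge and the displacement of its endpoint joints. The following is a minimal sketch of that idea, not the authors' released code: the edge list, array shapes, and the restriction to a 2-D orientation angle are illustrative assumptions, and the actual SEMN feature construction may differ.

```python
import numpy as np

# Hypothetical skeleton topology: each edge is a (parent_joint, child_joint) pair.
# This is an assumed layout for illustration, not the paper's skeleton definition.
EDGES = [(0, 1), (1, 2), (2, 3), (1, 4), (4, 5)]

def edge_motion_features(joints: np.ndarray) -> np.ndarray:
    """joints: (T, J, C) array of joint coordinates over T frames (C = 2 or 3).
    Returns a (T-1, E, 1 + 2*C) array: per-edge angle change between consecutive
    frames plus the displacements of the edge's two endpoint joints."""
    T, J, C = joints.shape
    feats = []
    for (a, b) in EDGES:
        vec = joints[:, b] - joints[:, a]           # edge vector per frame, (T, C)
        # Orientation angle in the x-y plane (a simplification for clarity).
        ang = np.arctan2(vec[:, 1], vec[:, 0])      # (T,)
        dang = np.diff(ang)                         # angle change between frames, (T-1,)
        # Wrap to (-pi, pi] so a small rotation never looks like a large one.
        dang = (dang + np.pi) % (2 * np.pi) - np.pi
        djoint_a = np.diff(joints[:, a], axis=0)    # endpoint displacements, (T-1, C)
        djoint_b = np.diff(joints[:, b], axis=0)
        feats.append(np.concatenate([dang[:, None], djoint_a, djoint_b], axis=1))
    return np.stack(feats, axis=1)                  # (T-1, E, 1 + 2*C)

# Example: 30 frames, 6 joints, 2-D coordinates.
seq = np.random.rand(30, 6, 2)
print(edge_motion_features(seq).shape)  # (29, 5, 5)
```

In the paper, features of this kind would then be fed into stacked spatial-temporal blocks; that network and the progressive ranking loss are not reproduced here.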

Keywords: motion; action recognition; edge; skeleton edge; skeleton; human action

Journal Title: Neurocomputing
Year Published: 2021


