VMV-GCN: Volumetric Multi-View Based Graph CNN for Event Stream Classification

Event cameras perceive pixel-level brightness changes and output asynchronous event streams, offering notable advantages in high temporal resolution, high dynamic range, and low power consumption for challenging vision tasks. To apply existing learning models to event data, many researchers integrate sparse events into dense frame-based representations that convolutional neural networks can process directly. Although these works achieve high performance on event-based classification, their models require many parameters to process dense event frames, which conflicts with the inherent sparsity of event data. To exploit the sparse nature of events, we propose a voxel-wise graph learning model (VMV-GCN) for spatio-temporal feature learning on event streams. Specifically, we design a volumetric multi-view fusion module (VMVF) to extract spatial and temporal information from views of voxelized event data. We then take representative event voxels as vertices and connect them with a novel dual-graph construction strategy. By aggregating neighborhood information based on the relationships of vertices, the proposed dynamic neighborhood feature learning module (DNFL) captures discriminative spatio-temporal features on dynamically updated graphs. Experiments show that our method achieves state-of-the-art performance with low model complexity on event-based classification tasks such as object classification and action recognition.
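To make the pipeline structure concrete, the minimal NumPy sketch below illustrates, under stated assumptions, the two steps the abstract outlines: binning an asynchronous event stream into a sparse voxel grid, and aggregating features over a k-nearest-neighbor graph built on the occupied voxels. All function names, the grid size, the per-voxel features, and the mean aggregation are illustrative assumptions; the paper's actual VMVF and DNFL modules are learned layers, not these fixed operations.

# Illustrative sketch only: voxelize events, build a k-NN graph over
# occupied voxels, and do one round of neighborhood feature aggregation.
# Names, grid sizes, and the mean aggregation are assumptions, not the
# paper's learned VMVF/DNFL modules.
import numpy as np

def voxelize_events(events, grid=(16, 16, 8)):
    """Bin events (x, y, t, polarity) into a sparse voxel grid.

    events: (N, 4) array with x, y in pixels, t in seconds, p in {-1, +1}.
    Returns voxel center coordinates (V, 3) and per-voxel features (V, 2):
    event count and mean polarity.
    """
    xyz = events[:, :3]
    # Normalize each axis to [0, 1), then quantize to integer voxel indices.
    lo, hi = xyz.min(0), xyz.max(0)
    idx = np.floor((xyz - lo) / (hi - lo + 1e-9) * np.array(grid)).astype(int)
    keys, inv = np.unique(idx, axis=0, return_inverse=True)
    inv = inv.ravel()
    count = np.bincount(inv).astype(np.float64)
    mean_pol = np.bincount(inv, weights=events[:, 3]) / count
    centers = (keys + 0.5) / np.array(grid)        # voxel centers in [0, 1]
    feats = np.stack([count, mean_pol], axis=1)    # (V, 2) voxel features
    return centers, feats

def knn_graph(centers, k=4):
    """Connect each voxel vertex to its k nearest spatio-temporal neighbors."""
    d = np.linalg.norm(centers[:, None] - centers[None], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]           # (V, k) neighbor indices

def aggregate(feats, nbrs):
    """One round of neighborhood aggregation (mean of neighbor features).

    A fixed stand-in for a learned graph-convolution layer such as DNFL.
    """
    return np.concatenate([feats, feats[nbrs].mean(axis=1)], axis=1)

# Toy event stream: 500 random events over a 64x64 sensor and a 50 ms window.
rng = np.random.default_rng(0)
ev = np.column_stack([rng.integers(0, 64, 500),
                      rng.integers(0, 64, 500),
                      rng.uniform(0, 0.05, 500),
                      rng.choice([-1.0, 1.0], 500)]).astype(np.float64)
centers, feats = voxelize_events(ev)
nbrs = knn_graph(centers, k=4)
out = aggregate(feats, nbrs)
print(centers.shape, out.shape)   # (V, 3) and (V, 4)

Note that the paper rebuilds its graphs dynamically as vertex features evolve; the single static k-NN graph here stands in for that step only to show the overall data flow.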

Keywords: classification; event; graph; multi-view; volumetric multi-view; VMV-GCN

Journal Title: IEEE Robotics and Automation Letters
Year Published: 2022
