
RANSP: Ranking attention network for saliency prediction on omnidirectional images

Abstract: Various convolutional neural network (CNN)-based methods have been shown to boost the performance of saliency prediction on omnidirectional images (ODIs). However, these methods achieve sub-optimal accuracy because not all of the features extracted by the CNN model are useful for the final fine-grained saliency prediction; some are redundant and may even have a negative impact on it. To tackle this problem, we propose a novel Ranking Attention Network for Saliency Prediction (RANSP) of head fixations on ODIs. Specifically, a part-guided attention (PA) module and a channel-wise feature (CF) extraction module are integrated into a unified framework and trained in an end-to-end manner for fine-grained saliency prediction. To better utilize the channel-wise feature maps, we further propose a new Ranking Attention Module (RAM), which automatically ranks and selects these feature maps based on their scores for fine-grained saliency prediction. Extensive experiments and ablation studies demonstrate the effectiveness of our method for saliency prediction on ODIs.
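
The abstract describes the RAM as ranking and selecting channel-wise feature maps based on scores. The sketch below illustrates one plausible reading of that idea in PyTorch; the pooling-based scorer, the hard top-k mask, the class name, and all layer sizes are assumptions made for illustration, not the published RANSP architecture.

import torch
import torch.nn as nn

class RankingAttentionSketch(nn.Module):
    """Hypothetical ranking-attention block: score, rank, and keep top-k channels."""

    def __init__(self, channels: int, keep: int):
        super().__init__()
        self.keep = keep  # number of channel-wise feature maps to retain (assumed)
        # Assumed scoring head: one score per channel from globally pooled responses
        self.scorer = nn.Sequential(
            nn.Linear(channels, channels // 4),
            nn.ReLU(inplace=True),
            nn.Linear(channels // 4, channels),
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, C, H, W) channel-wise feature maps
        b, c, _, _ = feats.shape
        pooled = feats.mean(dim=(2, 3))                          # (B, C) global average pooling
        scores = self.scorer(pooled)                             # (B, C) per-channel scores
        top = scores.topk(self.keep, dim=1).indices              # indices of the top-ranked channels
        mask = torch.zeros_like(scores).scatter_(1, top, 1.0)    # hard selection mask
        weights = torch.sigmoid(scores) * mask                   # weight kept channels, zero the rest
        return feats * weights.view(b, c, 1, 1)

# Usage with illustrative shapes:
ram = RankingAttentionSketch(channels=256, keep=64)
out = ram(torch.randn(2, 256, 32, 32))   # (2, 256, 32, 32); unselected channels are suppressed

A hard top-k mask like this discards the lower-ranked feature maps outright; the actual RANSP scoring and selection scheme, and how it feeds the fine-grained prediction head, may differ from this sketch.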

Keywords: network; ranking attention; saliency prediction; saliency

Journal Title: Neurocomputing
Year Published: 2021
