
Mixed High-Order Non-Local Attention Network for Single Image Super-Resolution

Attention has been widely used in many tasks, since it guides a network to concentrate on the most important regions of an input pattern. Nevertheless, many advanced works focus on first-order attention designs, e.g., channel and spatial attention, and ignore higher-order attention mechanisms. In this work, we propose the Mixed High-Order Attention (MHA) module to model the complex, high-order information in the attention mechanism, capturing subtle texture and producing a discriminative attention map. In addition, the receptive field of a convolution is local, so it cannot capture global context and long-range dependencies. We therefore introduce a non-local block to obtain global attention features, and propose the Mixed High-Order Non-local Attention Network (MHNAN) to improve the richness of attention. Extensive experiments demonstrate the superiority of our MHNAN for super-resolution over several state-of-the-art models.
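The abstract does not give the exact architecture, so the following is only a minimal PyTorch-style sketch of the two ideas it names: a non-local block that lets every spatial position attend to every other position (global context), and an illustrative second-order channel attention that weights channels using covariance statistics rather than a simple mean. All class names, shapes, and hyperparameters here are illustrative assumptions, not the authors' implementation of MHA/MHNAN.

```python
import torch
import torch.nn as nn


class NonLocalBlock(nn.Module):
    """Embedded-Gaussian style non-local block: each position attends to all
    others, capturing long-range dependencies a local convolution misses.
    (Illustrative sketch, not the paper's exact block.)"""

    def __init__(self, channels, reduction=2):
        super().__init__()
        inter = channels // reduction
        self.theta = nn.Conv2d(channels, inter, 1)  # query projection
        self.phi = nn.Conv2d(channels, inter, 1)    # key projection
        self.g = nn.Conv2d(channels, inter, 1)      # value projection
        self.out = nn.Conv2d(inter, channels, 1)    # project back to input width

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)   # (b, hw, c')
        k = self.phi(x).flatten(2)                     # (b, c', hw)
        v = self.g(x).flatten(2).transpose(1, 2)       # (b, hw, c')
        attn = torch.softmax(q @ k, dim=-1)            # (b, hw, hw) global attention map
        y = (attn @ v).transpose(1, 2).reshape(b, -1, h, w)
        return x + self.out(y)                         # residual connection


class SecondOrderChannelAttention(nn.Module):
    """Illustrative 'higher-order' channel attention: channel weights are
    computed from second-order (covariance) statistics of the features
    instead of a first-order channel mean."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, h, w = x.shape
        feat = x.flatten(2)                              # (b, c, hw)
        feat = feat - feat.mean(dim=2, keepdim=True)     # zero-mean per channel
        cov = feat @ feat.transpose(1, 2) / (h * w)      # (b, c, c) channel covariance
        stats = cov.mean(dim=2)                          # per-channel second-order statistic
        weights = self.fc(stats).view(b, c, 1, 1)
        return x * weights                               # rescale channels


if __name__ == "__main__":
    # Quick shape check on a dummy feature map.
    x = torch.randn(2, 64, 32, 32)
    x = NonLocalBlock(64)(x)
    x = SecondOrderChannelAttention(64)(x)
    print(x.shape)  # torch.Size([2, 64, 32, 32])
```

In a super-resolution backbone, blocks like these would typically be interleaved with residual convolutional groups so that local texture modeling, higher-order channel reweighting, and global (non-local) attention complement one another; how MHNAN actually mixes them is specified in the full paper, not reproduced here.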

Keywords: order; attention; mixed high; non local; high order; network

Journal Title: IEEE Access
Year Published: 2021
