
Image super-resolution via channel attention and spatial attention



Deep convolutional networks have been widely applied to super-resolution (SR) tasks and have achieved excellent performance. However, even though the self-attention mechanism is a hot topic, it has not been applied to SR tasks. In this paper, we propose a new attention-based network that offers more flexible and efficient performance than other generative adversarial network (GAN)-based methods. Specifically, we employ a convolutional block attention module (CBAM) and embed it into a dense block to efficiently exchange information across feature maps. Furthermore, we construct our own spatial module based on the self-attention mechanism, which not only captures long-distance spatial connections but also provides more stable feature extraction. Experimental results demonstrate that our attention-based network improves both visual quality and quantitative evaluation results.
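The abstract refers to CBAM, a published module that applies channel attention followed by spatial attention to a feature map. Since the full paper is not reproduced on this page, the sketch below is only a minimal PyTorch rendering of the standard CBAM pattern; the class names, the reduction ratio of 16, and the 7x7 spatial kernel are conventional assumptions rather than details taken from this article, and the authors' own self-attention spatial module and dense-block embedding are not attempted here.

import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Channel attention: squeeze spatial dims, then weight each channel."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.max_pool = nn.AdaptiveMaxPool2d(1)
        # Shared bottleneck MLP applied to both pooled descriptors.
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )

    def forward(self, x):
        attn = torch.sigmoid(self.mlp(self.avg_pool(x)) + self.mlp(self.max_pool(x)))
        return x * attn  # broadcast (B, C, 1, 1) over (B, C, H, W)

class SpatialAttention(nn.Module):
    """Spatial attention: pool over channels, then weight each location."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        avg_map = x.mean(dim=1, keepdim=True)
        max_map, _ = x.max(dim=1, keepdim=True)
        attn = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn  # broadcast (B, 1, H, W) over (B, C, H, W)

class CBAM(nn.Module):
    """Channel attention followed by spatial attention."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x):
        return self.sa(self.ca(x))

# Example: refine a hypothetical 64-channel feature map before passing it on.
feats = torch.randn(1, 64, 48, 48)
refined = CBAM(64)(feats)
print(refined.shape)  # torch.Size([1, 64, 48, 48])

In a dense-block setting, a module like this would typically be applied to the concatenated feature maps so that the attention weights are recomputed from the accumulated features at each stage.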

Keywords: image super-resolution; channel attention; spatial attention

Journal Title: Applied Intelligence
Year Published: 2022



