
Multi-Scale U-Shape MLP for Hyperspectral Image Classification


Hyperspectral images (HSIs) have significant applications in various domains, since they record rich semantic and spatial information across spectral bands together with the spatial variability of spectral signatures. Two critical challenges in pixel-wise HSI classification are capturing the correlated information between local and global scales and containing the large number of model parameters. To tackle these challenges, we propose a multi-scale U-shape multi-layer perceptron (MUMLP), a model consisting of a designed multi-scale channel (MSC) block and a U-shape multi-layer perceptron (UMLP) structure. The MSC block transforms the channel dimension and mixes spectral-band features to adequately embed deep-level representations. The UMLP adopts an encoder–decoder structure built from multi-layer perceptron layers, which is capable of compressing the large number of parameters. Extensive experiments demonstrate that our model outperforms state-of-the-art methods across the board on three widely adopted public datasets, namely Pavia University (PaviaU), Houston 2013, and Houston 2018.
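
As a rough illustration of the U-shape encoder–decoder MLP idea described in the abstract, the sketch below compresses per-pixel spectra to a bottleneck and expands them back with skip connections before classification. The layer widths, activations, skip scheme, and class head are assumptions for illustration only, not the authors' exact MUMLP/MSC design.

```python
import torch
import torch.nn as nn

class UShapeMLP(nn.Module):
    """Illustrative U-shape (encoder–decoder) MLP over the spectral dimension.

    NOTE: all hyperparameters and the skip-connection scheme are assumptions
    for illustration; the paper's UMLP and MSC blocks may differ.
    """
    def __init__(self, bands: int, hidden: int = 64, bottleneck: int = 16, classes: int = 9):
        super().__init__()
        # Encoder: progressively compress the spectral features.
        self.enc1 = nn.Sequential(nn.Linear(bands, hidden), nn.GELU())
        self.enc2 = nn.Sequential(nn.Linear(hidden, bottleneck), nn.GELU())
        # Decoder: expand back, mirroring the encoder (the "U" shape).
        self.dec1 = nn.Sequential(nn.Linear(bottleneck, hidden), nn.GELU())
        self.dec2 = nn.Sequential(nn.Linear(hidden, bands), nn.GELU())
        self.head = nn.Linear(bands, classes)

    def forward(self, x):           # x: (batch, bands) per-pixel spectra
        e1 = self.enc1(x)
        e2 = self.enc2(e1)
        d1 = self.dec1(e2) + e1     # skip connection between matching scales
        d2 = self.dec2(d1) + x      # skip connection back to the input scale
        return self.head(d2)

# Example: a batch of 8 Pavia University pixels (103 spectral bands, 9 classes).
model = UShapeMLP(bands=103, classes=9)
logits = model(torch.randn(8, 103))
print(logits.shape)  # torch.Size([8, 9])
```

Sharing weights across pixels and working purely with linear layers keeps the parameter count small relative to a convolutional or transformer backbone, which is the compression benefit the abstract attributes to the UMLP structure.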

Keywords: multi-scale U-shape MLP; multi-layer perceptron; hyperspectral image classification

Journal Title: IEEE Geoscience and Remote Sensing Letters
Year Published: 2022
