In practical hyperspectral image cross-scene classification (HSICC) tasks, the arduous work of obtaining labels and the distribution inconsistency caused by spectral shift pose great challenges to deep learning methods. Unsupervised domain adaptation aims to exploit knowledge from the annotated source domain and transfer it to the unlabeled target domain, thereby boosting the performance of unsupervised classification. Nevertheless, existing HSICC approaches cannot effectively exploit class structure information from target data. To address this, this article proposes an unsupervised joint adversarial domain adaptation (UJADA) architecture for HSICC that further narrows the distribution gap between distinct domains. The proposed method contains two modules: a domain adversarial module that learns domain-invariant features and a biclassifier adversarial module that explores task-specific decision boundaries between classes; both share a feature generator built on a densely connected spectral–spatial convolution network. UJADA simultaneously considers domain- and class-level feature alignment between source and target hyperspectral images in a unified adversarial learning process. Furthermore, a classifier determinacy disparity metric is introduced to measure, in a fine-grained manner, the discrepancy between the output probabilities of the two task-specific label predictors on target data, thereby ensuring the discriminability of transferable features. Comprehensive experiments and ablation studies conducted on two public cross-scene data pairs and our newly acquired ultralow-altitude hyperspectral images under different illumination conditions demonstrate the superior performance of the proposed algorithm, which will greatly promote the practical application of hyperspectral intelligent perception technology.
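To make the biclassifier adversarial idea concrete, the sketch below computes a simple disagreement score between two task-specific classifiers on target samples: the mean L1 distance between their class-probability outputs. This is an illustrative stand-in (the classic MCD-style discrepancy), not the article's exact classifier determinacy disparity metric, whose precise form is given in the paper; all function names here are hypothetical.

```python
import math

def softmax(logits):
    """Numerically stable softmax over one sample's logit vector."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def biclassifier_discrepancy(logits_a, logits_b):
    """Mean L1 distance between two classifiers' probability outputs
    over a batch of target samples. Stand-in for the paper's
    finer-grained classifier determinacy disparity."""
    total = 0.0
    for la, lb in zip(logits_a, logits_b):
        pa, pb = softmax(la), softmax(lb)
        total += sum(abs(x - y) for x, y in zip(pa, pb))
    return total / len(logits_a)

# Toy example: two classifiers, two target samples, three classes.
la = [[2.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
lb = [[0.0, 2.0, 0.0], [0.0, 1.0, 0.0]]
d = biclassifier_discrepancy(la, lb)  # > 0 where they disagree
```

In the adversarial scheme the abstract describes, a score like this is maximized with respect to the two classifiers (to expose target samples near decision boundaries) and minimized with respect to the shared feature generator, pushing target features away from those boundaries.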