
Multilevel Adaptive Knowledge Distillation Network for Incremental SAR Target Recognition

Existing synthetic aperture radar (SAR) automatic target recognition (ATR) methods have shown impressive results in static scenarios, yet their performance drops sharply as new target categories are continually added. In response, this letter proposes a novel ATR method, the multilevel adaptive knowledge distillation network (MLAKDN), for incremental SAR target recognition. Specifically, an adaptive weighted distillation strategy is first proposed, which mitigates forgetting of old categories by distilling their multistage soft-label information at the classification level. Then, a feature distillation method based on a gradient maximum criterion is developed to filter and distill discriminative features, recalling further old-category knowledge at the feature level. Meanwhile, a model rebalancing technique is designed to balance the model across new and old categories. Finally, a weighted incremental classification loss is presented to train the whole model. Experiments on the moving and stationary target acquisition and recognition (MSTAR) dataset and the synthetic and measured paired labeled experiment (SAMPLE) dataset show that the proposed method outperforms several state-of-the-art methods on incremental SAR target recognition tasks.
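
The abstract does not give the exact formulation of these components, but the following PyTorch-style sketch illustrates how the three loss terms it describes could fit together in one incremental training step. All function names, the temperature, the keep ratio, and the fixed loss weights are illustrative assumptions, not the letter's actual method.

```python
import torch
import torch.nn.functional as F

def soft_label_distillation(old_logits, new_logits, T=2.0):
    # Classification-level distillation: match the new model's predictions
    # on old categories to the frozen old model's soft labels. The KL form
    # and temperature T are standard distillation choices assumed here,
    # not taken from the letter.
    p_old = F.softmax(old_logits.detach() / T, dim=1)
    log_p_new = F.log_softmax(new_logits / T, dim=1)
    return F.kl_div(log_p_new, p_old, reduction="batchmean") * (T * T)

def gradient_max_feature_mask(features, loss, keep_ratio=0.5):
    # One plausible reading of the "gradient maximum criterion": rank the
    # channels of a (B, C, H, W) feature map by the magnitude of their
    # loss gradients and keep only the top fraction as "discriminative".
    # keep_ratio and the channel-wise granularity are assumptions.
    grads = torch.autograd.grad(loss, features, retain_graph=True)[0]
    scores = grads.abs().mean(dim=(0, 2, 3))  # per-channel importance
    k = max(1, int(keep_ratio * scores.numel()))
    mask = torch.zeros_like(scores)
    mask[scores.topk(k).indices] = 1.0
    return mask.view(1, -1, 1, 1)

def incremental_loss(new_logits, labels, old_logits, new_feats, old_feats,
                     lambda_kd=1.0, lambda_feat=0.5):
    # Weighted incremental classification loss: cross-entropy on all
    # categories plus the two distillation terms. The fixed lambdas stand
    # in for the adaptive weighting described in the abstract.
    ce = F.cross_entropy(new_logits, labels)
    n_old = old_logits.size(1)
    kd = soft_label_distillation(old_logits, new_logits[:, :n_old])
    mask = gradient_max_feature_mask(new_feats, ce)
    feat_kd = F.mse_loss(new_feats * mask, old_feats.detach() * mask)
    return ce + lambda_kd * kd + lambda_feat * feat_kd
```

In an incremental step, `old_logits` and `old_feats` would come from a frozen copy of the previous-stage model evaluated on the same batch; the multistage soft labels and the model rebalancing technique mentioned in the abstract are omitted from this sketch.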

Keywords: recognition; distillation; knowledge; target recognition; target; incremental SAR

Journal Title: IEEE Geoscience and Remote Sensing Letters
Year Published: 2023
