
Mine-Distill-Prototypes for Complete Few-Shot Class-Incremental Learning in Image Classification

Recently, few-shot learning (FSL) has received increasing attention because of the difficulty of sample collection in some application scenarios, such as maritime surveillance using synthetic aperture radar (SAR) or infrared images. In such real-world scenarios, it is a common requirement that the model recognize novel classes incrementally, namely class-incremental learning (CIL). Considering this requirement, this article proposes a novel problem of recognizing novel classes incrementally when both the base and novel class samples are scarce. It is called complete few-shot CIL (C-FSCIL) to distinguish it from FSCIL, which assumes sufficient samples of the base classes. Specifically, the following challenges of C-FSCIL are addressed: 1) distance measurement is used to recognize novel classes incrementally, but the encoder is hard to learn well when base class samples are scarce, making some features unsuitable for distance calculation and degrading performance; and 2) the catastrophic forgetting problem becomes harder to alleviate than in FSCIL because of the scarcity of base class samples. To tackle both challenges, a mine-distill-prototypes (MDP) algorithm is proposed, which consists of two parts: 1) a prototypes-distillation (PD) network that learns to distill the features and prototypes into a lower-dimensional space in which ineffective features are eliminated, and 2) a prototypes-weight (PW) network and a prototypes-selection (PS) training strategy that address the catastrophic forgetting problem by capturing the relationship between the base and novel prototypes. The superior performance of the proposed algorithm is demonstrated by experiments on three datasets.
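The abstract describes distance-based recognition over class prototypes computed in a distilled, lower-dimensional feature space. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration of nearest-prototype classification in which a simple linear projection stands in for the prototypes-distillation (PD) step, and prototypes are added incrementally per session. All names (PrototypeClassifier, distill, add_classes) and dimensions are illustrative assumptions.

# Illustrative sketch only (not the authors' code): nearest-prototype classification
# with a learned low-dimensional projection standing in for prototype distillation.
import torch
import torch.nn as nn

class PrototypeClassifier(nn.Module):
    def __init__(self, feat_dim=512, distilled_dim=64):
        super().__init__()
        # Hypothetical stand-in for the PD step: a linear map into a
        # lower-dimensional space where distances to prototypes are computed.
        self.distill = nn.Linear(feat_dim, distilled_dim, bias=False)
        self.prototypes = {}  # class id -> distilled prototype vector

    @torch.no_grad()
    def add_classes(self, features, labels):
        # features: (N, feat_dim) support embeddings from a (frozen) encoder.
        # Register one prototype per class as the mean of its distilled features;
        # called once per incremental session (base session, then novel sessions).
        z = self.distill(features)
        for c in labels.unique():
            self.prototypes[int(c)] = z[labels == c].mean(dim=0)

    @torch.no_grad()
    def classify(self, features):
        # Assign each query to the class of its nearest (Euclidean) prototype.
        z = self.distill(features)                                    # (Q, distilled_dim)
        classes = sorted(self.prototypes)
        protos = torch.stack([self.prototypes[c] for c in classes])   # (C, distilled_dim)
        dists = torch.cdist(z, protos)
        return torch.tensor(classes)[dists.argmin(dim=1)]

# Usage: register base classes first, then a few-shot novel session.
clf = PrototypeClassifier()
clf.add_classes(torch.randn(10, 512), torch.randint(0, 5, (10,)))   # base classes 0-4
clf.add_classes(torch.randn(5, 512), torch.randint(5, 7, (5,)))     # novel classes 5-6
print(clf.classify(torch.randn(3, 512)))

Note that this sketch omits what the paper is actually about: learning the PD projection so that ineffective features are removed, and the PW network plus PS training strategy that relate base and novel prototypes to mitigate catastrophic forgetting.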

Keywords: complete shot; distill; class; incremental learning; mine distill; class incremental

Journal Title: IEEE Transactions on Geoscience and Remote Sensing
Year Published: 2023
