
Learning to Learn Task-Adaptive Hyperparameters for Few-Shot Learning.


The objective of few-shot learning is to design a system that can adapt to a given task with only a few examples while still achieving generalization. Model-agnostic meta-learning (MAML), which has recently gained popularity for its simplicity and flexibility, learns a good initialization for fast adaptation to a task in the few-data regime. However, its performance has been relatively limited, especially when novel tasks differ from the tasks seen during training. In this work, instead of searching for a better initialization, we focus on designing a better fast adaptation process. Consequently, we propose ALFA, a new task-adaptive weight update rule that greatly enhances the fast adaptation process. Specifically, we introduce a small meta-network that generates per-step hyperparameters for each given task: learning rate and weight decay coefficients. The experimental results validate that learning a good weight update rule for fast adaptation is an equally important component that has drawn relatively little attention in recent few-shot learning approaches. Surprisingly, fast adaptation from a random initialization with ALFA can already outperform MAML. Furthermore, the proposed weight update rule is shown to consistently improve the task-adaptation capability of MAML across diverse problem domains: few-shot classification, cross-domain few-shot classification, regression, visual tracking, and video frame interpolation.
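The sketch below illustrates the kind of task-adaptive inner loop the abstract describes: a small meta-network produces per-step, per-layer learning rates and weight decay coefficients from the current task state (gradients and weights), which then drive the fast-adaptation update. This is not the authors' code; the class and function names (HyperparamGenerator, inner_loop_adapt), the choice of task state, and the scaling of the generated hyperparameters are illustrative assumptions. In the full method, the generator itself would be meta-trained across tasks in an outer loop.

```python
import torch
import torch.nn as nn


class HyperparamGenerator(nn.Module):
    """Small meta-network: per-layer task state -> per-step (learning rate, weight decay)."""

    def __init__(self, num_layers: int, hidden: int = 32):
        super().__init__()
        # Per layer we feed two scalars: mean gradient and mean weight value (an assumption).
        self.net = nn.Sequential(
            nn.Linear(2 * num_layers, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2 * num_layers),
        )

    def forward(self, grads, params):
        # Layer-wise task state built from the current gradients and weights.
        state = torch.cat([
            torch.stack([g.mean() for g in grads]),
            torch.stack([p.mean() for p in params]),
        ])
        out = self.net(state)
        n = len(grads)
        lr = torch.sigmoid(out[:n]) * 0.1    # per-layer, per-step learning rates
        wd = torch.sigmoid(out[n:]) * 1e-2   # per-layer, per-step weight decay
        return lr, wd


def inner_loop_adapt(params, compute_loss, generator, steps=5):
    """Task-adaptive fast adaptation: theta <- theta - lr * grad - wd * theta."""
    params = [p.clone() for p in params]
    for _ in range(steps):
        loss = compute_loss(params)
        grads = torch.autograd.grad(loss, params, create_graph=True)
        lr, wd = generator(grads, params)
        params = [p - lr[i] * g - wd[i] * p
                  for i, (p, g) in enumerate(zip(params, grads))]
    return params


# Toy usage: adapt a 2-parameter linear model to a single regression "task".
if __name__ == "__main__":
    torch.manual_seed(0)
    w = torch.zeros(1, requires_grad=True)
    b = torch.zeros(1, requires_grad=True)
    x = torch.randn(10, 1)
    y = 3.0 * x + 1.0

    def mse(ps):
        w_, b_ = ps
        return ((x * w_ + b_ - y) ** 2).mean()

    gen = HyperparamGenerator(num_layers=2)
    adapted = inner_loop_adapt([w, b], mse, gen, steps=5)
    print("loss after adaptation:", mse(adapted).item())
```

Because the update is built with create_graph=True, gradients can flow back through the adaptation steps into the generator, which is what would allow it to be meta-trained end to end.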

Keywords: adaptation; task adaptive; fast adaptation; task; shot learning; weight

Journal Title: IEEE Transactions on Pattern Analysis and Machine Intelligence
Year Published: 2023
