Proxyless Neural Architecture Adaptation at Once

Recently, Neural Architecture Search (NAS) methods have been introduced and show impressive performance on many benchmarks. Among these NAS studies, the Neural Architecture Transformer (NAT) aims to adapt a given neural architecture to improve performance while maintaining computational cost. The architecture adaptation task can exploit known high-performance architectures, and NAT's adaptation results showed performance improvements on various architectures in their experiments. However, we verified through multiple trials that NAT lacks reproducibility. Moreover, it requires an additional architecture adaptation process before network weight training. In this paper, we propose a proxyless neural architecture adaptation method that is reproducible and efficient. The proposed method does not need a proxy task for architecture adaptation: it improves the architecture directly during the conventional training process, so the trained neural network can be used as-is. Moreover, the proposed method can be applied to both supervised and self-supervised learning, and it shows stable performance improvements across various architectures and datasets. Extensive experiments on two benchmark datasets, CIFAR-10 and Tiny ImageNet, show that the proposed method clearly outperforms NAT and is applicable to various models and datasets.
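
To make the core idea concrete, the sketch below illustrates what "proxyless" adaptation generally means: architecture parameters are optimized jointly with the network weights inside the ordinary training loop, so no separate proxy search phase is needed and the trained network is usable directly. This is a minimal, hypothetical illustration in a DARTS-style mixed-operation idiom, not the authors' actual method; the MixedOp module, the candidate operations, and the alpha parameters are all assumptions made for the example.

    # Hypothetical sketch (not the paper's code): architecture parameters
    # are trained together with the weights in one conventional loop.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MixedOp(nn.Module):
        """A layer whose output is a softmax-weighted mix of candidate ops."""
        def __init__(self, channels):
            super().__init__()
            # Candidate operations (illustrative choices, not from the paper).
            self.ops = nn.ModuleList([
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.Conv2d(channels, channels, 5, padding=2),
                nn.Identity(),
            ])
            # Architecture parameters, learned alongside the weights.
            self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

        def forward(self, x):
            weights = F.softmax(self.alpha, dim=0)
            return sum(w * op(x) for w, op in zip(weights, self.ops))

    # A single optimizer covers both weights and architecture parameters,
    # so the architecture adapts during standard training with no proxy task.
    model = nn.Sequential(MixedOp(16), nn.ReLU(), MixedOp(16))
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    x = torch.randn(4, 16, 8, 8)       # dummy input batch
    target = torch.randn(4, 16, 8, 8)  # dummy regression target
    loss = F.mse_loss(model(x), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Because the architecture parameters live in the same optimizer as the weights, the network that comes out of training is immediately usable, which mirrors the efficiency claim in the abstract.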

Keywords: architecture adaptation; architecture; neural architecture; performance; proxyless neural

Journal Title: IEEE Access
Year Published: 2022

