
BNNAS++: Towards Unbiased Neural Architecture Search With Batch Normalization



Neural Architecture Search (NAS) has achieved significant progress in many computer vision tasks, yet training and searching for high-performance architectures over a large search space is time-consuming. NAS with BatchNorm (BNNAS) is an efficient NAS algorithm that speeds up supernet training tenfold over conventional schemes by fixing the convolutional layers and training only the BatchNorm layers. In this paper, we systematically examine the limitations of BNNAS and present improved techniques. We observe that BNNAS is prone to selecting convolutions and fails to find plausible architectures with small FLOPs, especially on search spaces containing operations other than convolutions. This is because the number of learnable BatchNorm parameters is imbalanced across operations, i.e., BatchNorm is attached only to convolutional layers. To fix this issue, we present an unbiased BNNAS for training. However, our empirical evidence shows that the performance indicator used in BNNAS is ineffective on the unbiased BNNAS. To this end, we propose a novel performance indicator that decomposes model performance into expressivity, trainability, and uncertainty, preserving rank consistency on the unbiased BNNAS. Our proposed NAS method, named BNNAS++, is robust across various search spaces and can efficiently find high-performance architectures with small FLOPs. We validate the effectiveness of BNNAS++ on NAS-Bench-101, NAS-Bench-201, the DARTS search space, and the MobileNet search space, showing superior performance over existing methods.
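The core BNNAS recipe described above — freezing the convolutional weights at initialization and training only the BatchNorm parameters — can be sketched in a few lines of PyTorch. This is a minimal illustration of that idea, not the authors' implementation; the names `FrozenConvBN` and `bn_only_parameters` are hypothetical.

```python
import torch
import torch.nn as nn

class FrozenConvBN(nn.Module):
    """Conv block whose convolution is fixed at random initialization;
    only the BatchNorm affine parameters (gamma/beta) are learnable."""
    def __init__(self, in_ch, out_ch, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size,
                              padding=kernel_size // 2, bias=False)
        self.conv.weight.requires_grad_(False)  # freeze the convolution
        self.bn = nn.BatchNorm2d(out_ch)        # BatchNorm stays trainable
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

def bn_only_parameters(model):
    """Yield only BatchNorm parameters, so the optimizer never
    touches the frozen convolutional weights."""
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            yield from m.parameters()

model = nn.Sequential(FrozenConvBN(3, 16), FrozenConvBN(16, 32))
optimizer = torch.optim.SGD(bn_only_parameters(model), lr=0.1, momentum=0.9)
```

Per the abstract, fixing the convolutions and training only BatchNorm is what yields the tenfold supernet training speed-up. The sketch also makes the bias the paper identifies visible: every trainable parameter here belongs to a BatchNorm attached to a convolution, so parameter-free operations such as pooling or skip connections contribute nothing learnable, skewing the search toward convolutions. The unbiased BNNAS and the expressivity/trainability/uncertainty indicator are the paper's remedies; their exact form is not given in the abstract, so they are not sketched here.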

Keywords: search space; BNNAS; search; neural architecture; architecture search; performance

Journal Title: IEEE Access
Year Published: 2022
