Neural architecture search (NAS) has been widely studied to design high-performance network architectures automatically. However, existing approaches require substantial search time and computational resources due to their intensive architecture evaluations. Moreover, recently developed NAS algorithms perform poorly when multiple competing and conflicting objectives must be optimized simultaneously, e.g., test accuracy and the number of parameters. In this paper, a low-cost NAS (LoNAS) method is proposed to address these problems. First, a variable-architecture encoding strategy based on a novel Reg Block is designed to construct high-accuracy network architectures with few parameters. Second, a training-free proxy based on the neural tangent kernel (NTK) is proposed to accelerate the search process. Finally, a three-stage evolutionary algorithm (EA) based on multiple-criteria environmental selection, together with a set of block-based mutation operators, is designed to better balance exploration and exploitation. The experimental results show that LoNAS finds network architectures whose test accuracy and parameter counts are competitive with state-of-the-art architectures. Moreover, LoNAS uses less search time and fewer computational resources, consuming only 0.02 GPU-days on a single GPU on CIFAR-10 and CIFAR-100. Furthermore, the architectures found by LoNAS on CIFAR-10 and CIFAR-100 exhibit good transferability to ImageNet-16-120, with test accuracy surpassing that of state-of-the-art network architectures.
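To make the training-free proxy concrete, below is a minimal sketch of an NTK-based architecture score in the style used by training-free NAS methods such as TE-NAS: the empirical NTK Gram matrix is computed at initialization and its condition number serves as a trainability indicator (lower is commonly taken to be better). This is an illustrative assumption about the general technique, not LoNAS's exact proxy; the function name `ntk_condition_number` and the toy models are hypothetical.

```python
import torch
import torch.nn as nn

def ntk_condition_number(model: nn.Module, inputs: torch.Tensor) -> float:
    """Score an architecture at initialization, without any training.

    Builds the Jacobian J of the (scalar-summed) logits with respect to all
    parameters, one row per input example, forms the empirical NTK Gram
    matrix J @ J.T, and returns its condition number lambda_max / lambda_min.
    """
    rows = []
    for x in inputs:
        out = model(x.unsqueeze(0)).sum()          # reduce logits to a scalar
        grads = torch.autograd.grad(out, model.parameters())
        rows.append(torch.cat([g.reshape(-1) for g in grads]))
    jac = torch.stack(rows)                        # shape: (batch, n_params)
    ntk = jac @ jac.t()                            # empirical NTK Gram matrix
    eigvals = torch.linalg.eigvalsh(ntk)           # ascending, real (PSD)
    lam_min = eigvals[0].clamp_min(1e-12)          # guard against numerics
    return (eigvals[-1] / lam_min).item()

if __name__ == "__main__":
    # Toy usage: rank two candidate widths by the proxy, with zero training.
    batch = torch.randn(8, 16)
    for width in (32, 64):
        net = nn.Sequential(nn.Linear(16, width), nn.ReLU(),
                            nn.Linear(width, 10))
        print(f"width={width}: cond(NTK) = {ntk_condition_number(net, batch):.1f}")
```

Because each candidate is scored from a single batch of forward/backward passes at initialization, a proxy of this kind replaces full training in the EA's fitness evaluation, which is what makes a 0.02 GPU-day search budget plausible.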