We propose two novel surrogate measures that predict the validation accuracy of the classifier produced by a given neural architecture, eliminating the need to train it and thereby speeding up neural architecture search (NAS). The surrogate measures are based on a solution similarity network, in which the distance between solutions is computed on binary encodings of graph sub-components of the neural architectures. These surrogate measures are implemented within local search and differential evolution algorithms and tested on the NAS-Bench-101 and NAS-Bench-301 benchmarks. The results show that the similarity-network-based predictors are comparable to state-of-the-art predictors in the literature in terms of the correlation between predicted and true accuracy values, while reaching these high correlations significantly faster on NAS-Bench-101. Furthermore, in some cases, using these predictors significantly improves the search performance of the equivalent algorithm (differential evolution or local search) that does not use a predictor.
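The abstract does not spell out the exact binary encoding or how the similarity network is turned into a prediction, so the sketch below is only illustrative. It assumes a NAS-Bench-101-style cell encoding (flattened upper-triangular adjacency matrix plus one-hot operation labels), a Hamming distance between encodings, and a hypothetical similarity-weighted k-nearest-neighbour aggregation over already-evaluated architectures; the names `encode`, `hamming`, and `predict_accuracy` are invented for this example.

```python
import numpy as np

# Illustrative sketch only: the encoding and aggregation rule below are
# assumptions, not the paper's exact surrogate measures.

OPS = ["conv1x1-bn-relu", "conv3x3-bn-relu", "maxpool3x3"]  # NAS-Bench-101 op set


def encode(adjacency, ops):
    """Binary encoding of a cell: upper-triangular adjacency bits + one-hot op labels."""
    adjacency = np.asarray(adjacency)
    n = adjacency.shape[0]
    adj_bits = adjacency[np.triu_indices(n, k=1)]
    op_bits = np.zeros((n, len(OPS)), dtype=int)
    for i, op in enumerate(ops):
        if op in OPS:  # input/output nodes carry no operation label
            op_bits[i, OPS.index(op)] = 1
    return np.concatenate([adj_bits, op_bits.ravel()])


def hamming(a, b):
    """Distance between two binary encodings, used to build the similarity network."""
    return int(np.sum(a != b))


def predict_accuracy(candidate, evaluated, k=10):
    """Surrogate prediction: similarity-weighted average accuracy of the k
    nearest already-evaluated architectures (hypothetical aggregation rule).

    candidate: (adjacency, ops) tuple for the architecture to score.
    evaluated: list of (adjacency, ops, validation_accuracy) triples.
    """
    cand_enc = encode(*candidate)
    scored = sorted(
        ((hamming(cand_enc, encode(adj, ops)), acc) for adj, ops, acc in evaluated),
        key=lambda t: t[0],
    )[:k]
    weights = np.array([1.0 / (1.0 + d) for d, _ in scored])
    accs = np.array([acc for _, acc in scored])
    return float(np.dot(weights, accs) / weights.sum())
```

Inside a search loop (local search or differential evolution), such a surrogate would score candidate architectures against the archive of already-trained solutions, so that only the most promising candidates need full training and evaluation.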