PWSNAS: Powering Weight Sharing NAS With General Search Space Shrinking Framework.

Neural architecture search (NAS) depends heavily on an efficient and accurate performance estimator. To speed up the evaluation process, recent advances such as differentiable architecture search (DARTS) and one-shot approaches, instead of training every model from scratch, train a weight-sharing super-network that reuses parameters among different candidates, so that all child models can be evaluated efficiently. Though these methods significantly boost search efficiency, they inherently suffer from inaccurate and unstable performance estimation. To this end, we propose a general and effective framework for powering weight-sharing NAS, namely PWSNAS, which shrinks the search space automatically: candidate operators are discarded if they are less important. With this strategy, our approach provides a promising, smaller search space by progressively simplifying the original one, making it easier for existing NAS methods to find superior architectures. In particular, we present two strategies to guide the shrinking process: detecting redundant operators with a new angle-based metric, and decreasing the degree of weight sharing of a super-network by increasing its parameters, which differentiates PWSNAS from existing shrinking methods. Comprehensive analysis experiments on NAS-Bench-201 verify the superiority of the proposed metric over existing accuracy-based and magnitude-based metrics. PWSNAS can be readily applied to state-of-the-art NAS methods, e.g., single-path one-shot NAS (SPOS), FairNAS, ProxylessNAS, DARTS, and progressive DARTS (PDARTS). We evaluate PWSNAS and demonstrate consistent performance gains over baseline methods.
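The shrinking step described above can be illustrated with a minimal sketch. The code below assumes that the angle-based metric compares each candidate operator's weights at initialization with its weights after super-network training, and that the operators with the smallest angles on each edge are dropped in each shrinking round. All names here (angle_score, shrink_search_space, ops_per_edge, and so on) are hypothetical and are not taken from the authors' implementation.

# Hedged sketch of angle-based search-space shrinking (illustrative names only).
import numpy as np

def angle_score(init_weights: np.ndarray, trained_weights: np.ndarray) -> float:
    """Angle (radians) between flattened initial and trained weight vectors.

    A larger angle suggests the operator's shared weights moved more during
    super-network training; operators with small angles are treated as
    redundant and become candidates for removal.
    """
    v0 = init_weights.ravel()
    v1 = trained_weights.ravel()
    cos = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1) + 1e-12)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def shrink_search_space(ops_per_edge, init_w, trained_w, drop_per_round=1):
    """Drop the lowest-scoring candidate operators on each edge.

    ops_per_edge: dict mapping an edge id to its list of candidate operator names.
    init_w / trained_w: dicts keyed by (edge, op) holding the weight arrays
    of that operator before and after super-network training.
    """
    shrunk = {}
    for edge, ops in ops_per_edge.items():
        ranked = sorted(ops, key=lambda op: angle_score(init_w[(edge, op)],
                                                        trained_w[(edge, op)]))
        shrunk[edge] = ranked[drop_per_round:]  # keep all but the weakest operators
    return shrunk

Repeating this routine over several rounds progressively yields the smaller search space that the abstract refers to, which any downstream weight-sharing NAS method (e.g., SPOS or DARTS) can then search over.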

Keywords: weight sharing; sharing nas; search space; powering weight; search

Journal Title: IEEE Transactions on Neural Networks and Learning Systems
Year Published: 2022
