
pdlADMM: An ADMM-based framework for parallel deep learning training with efficiency


Abstract: The Alternating Direction Method of Multipliers (ADMM) has proven to be a useful alternative to popular gradient-based optimizers and has been successfully applied to training DNN models. However, existing ADMM-based approaches generally fail to achieve a good trade-off between rapid convergence and fast training, and they do not support parallel DNN training on multiple GPUs. These drawbacks seriously hinder them from effectively training DNN models on modern GPU computing platforms, which are typically equipped with multiple GPUs. In this paper, we propose pdlADMM, which can effectively train DNNs in a data-parallel manner. The key insight of pdlADMM is that it derives efficient solutions for each sub-problem by comprehensively considering three main factors: computational complexity, convergence, and suitability for parallel computing. As the number of GPUs grows, pdlADMM retains rapid convergence while the computational load on each GPU tends to decline. Extensive experiments demonstrate the effectiveness of our proposal. Compared to two other state-of-the-art ADMM-based approaches, pdlADMM converges significantly faster, obtains better accuracy, and achieves very competitive training speed.
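The paper's actual sub-problem solvers are not reproduced on this page. As a rough, hedged illustration of the data-parallel ADMM pattern the abstract describes (each worker solves its own sub-problem locally, with a single averaging step as communication), here is a minimal consensus-ADMM sketch for distributed least squares. This is not pdlADMM itself; all function names and parameters are illustrative assumptions.

```python
import numpy as np

def consensus_admm(A_parts, b_parts, rho=1.0, iters=300):
    """Consensus ADMM for distributed least squares:
    minimize sum_i 0.5 * ||A_i x - b_i||^2,
    where each shard (A_i, b_i) is held by one worker (e.g., one GPU)."""
    n = A_parts[0].shape[1]
    k = len(A_parts)
    u = [np.zeros(n) for _ in range(k)]   # scaled dual variables, one per worker
    z = np.zeros(n)                       # global consensus variable
    # Each worker pre-factors its regularized normal-equation matrix once;
    # in a real system this happens in parallel on each device.
    facts = [np.linalg.inv(A.T @ A + rho * np.eye(n)) for A in A_parts]
    rhs0 = [A.T @ b for A, b in zip(A_parts, b_parts)]
    for _ in range(iters):
        # x-update: every worker solves its local sub-problem independently
        x = [F @ (r + rho * (z - ui)) for F, r, ui in zip(facts, rhs0, u)]
        # z-update: averaging is the only communication step
        z = np.mean([xi + ui for xi, ui in zip(x, u)], axis=0)
        # u-update: dual ascent on the consensus constraint x_i = z
        u = [ui + xi - z for ui, xi in zip(u, x)]
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 5))
b = rng.standard_normal(40)
# Split the data across two "workers" and recover the global solution
z = consensus_admm([A[:20], A[20:]], [b[:20], b[20:]])
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
```

In this toy setting the consensus iterate `z` converges to the centralized least-squares solution; the abstract's claim is that, with carefully chosen sub-problem solvers, the same decomposition scales to DNN layers across multiple GPUs.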

Keywords: based framework; parallel deep; framework parallel; pdladmm; admm based; pdladmm admm

Journal Title: Neurocomputing
Year Published: 2021



