
EC-DNN: A new method for parallel training of deep neural networks


Abstract: Parallelization frameworks have recently become a necessity for speeding up the training of deep neural networks (DNNs). In the typical parallelization framework, called MA-DNN, the parameters of the local models are periodically averaged to obtain a global model. However, since a DNN is a highly non-convex model, averaging parameters cannot guarantee that the resulting global model performs better than the local models. To tackle this problem, we introduce a new parallelization framework, called EC-DNN. In this framework, we propose to aggregate the local models by the simple ensemble, i.e., by averaging the outputs of the local models instead of their parameters. Because most prevalent loss functions are convex with respect to the output of a DNN, the performance of the global model produced by the simple ensemble is guaranteed to be at least as good as the average performance of the local models. To obtain further improvement, we extend the simple ensemble to the generalized ensemble, which produces the global model by a weighted sum, rather than the plain average, of the outputs of the local models. However, the model size would explode, since each round of ensembling multiplies the size of the global model by the number of local models. We therefore carry out model compression after each ensemble step to reduce the global model to the same size as the local ones. Our experimental results show that EC-DNN achieves better speedup than MA-DNN without loss of accuracy, and sometimes even improves accuracy.
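Since the abstract's argument rests on the difference between averaging parameters and averaging outputs, here is a minimal NumPy sketch of the two aggregation rules. This is not the paper's implementation: the function names, tensor shapes, and the `weights` argument are assumptions for illustration. The convexity guarantee cited in the abstract is Jensen's inequality: for a loss ℓ that is convex in the model output, ℓ(mean_k f_k(x)) ≤ mean_k ℓ(f_k(x)).

```python
# Minimal sketch (not the authors' code) contrasting MA-DNN parameter
# averaging with EC-DNN output ensembling. Names and shapes are
# illustrative assumptions, not the paper's API.
import numpy as np

def ma_dnn_aggregate(local_params):
    """MA-DNN: average the parameter vectors of K local models.

    Because a DNN's loss is non-convex in its parameters, the averaged
    model carries no performance guarantee relative to the local models.
    """
    return np.mean(np.asarray(local_params), axis=0)

def ec_dnn_aggregate(local_outputs, weights=None):
    """EC-DNN: ensemble the *outputs* of K local models.

    With weights=None this is the simple ensemble (plain average);
    passing a weight vector gives the generalized ensemble (weighted
    sum). For a loss convex in the output, Jensen's inequality bounds
    the ensemble's loss by the (weighted) average of the local losses.
    """
    outputs = np.asarray(local_outputs)      # shape: (K, batch, num_classes)
    if weights is None:
        return outputs.mean(axis=0)          # simple ensemble
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                          # normalize to a convex combination
    return np.tensordot(w, outputs, axes=1)  # generalized ensemble
```

Note that ensembling K local models multiplies the global model's size by K at every aggregation round, which is why the paper follows each ensemble step with model compression back to the size of a single local model; that compression step is beyond this sketch.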

Keywords: training deep neural networks; global model; local models

Journal Title: Neurocomputing
Year Published: 2018
