Distributed Nesterov Gradient Methods Over Arbitrary Graphs

In this letter, we introduce a distributed Nesterov gradient method, $\mathcal{ABN}$, that does not require doubly stochastic weights. Instead, the implementation is based on a simultaneous application of both row- and column-stochastic weights, which makes $\mathcal{ABN}$ applicable to arbitrary (strongly connected) graphs. Since constructing column-stochastic weights requires additional information (the number of out-neighbors) that is not available under certain communication protocols, we derive a variant, FROZEN, that requires only row-stochastic weights, at the expense of additional iterations for eigenvector estimation. We numerically study these algorithms for various objective functions and network parameters, and show that the proposed distributed Nesterov gradient methods achieve acceleration compared with current state-of-the-art methods for distributed optimization.
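
To make the row-/column-stochastic weighting concrete, the following is a minimal Python/NumPy sketch of a generic AB-style update with a Nesterov-type momentum term. It is not the exact $\mathcal{ABN}$ recursion from the letter: the local objectives, the directed ring network, and the step-size and momentum values below are all assumed purely for illustration.

    import numpy as np

    # Illustrative sketch only, not the exact ABN recursion from the paper.
    # A row-stochastic matrix R drives consensus on the iterates, a
    # column-stochastic matrix C drives gradient tracking, and a
    # Nesterov-type momentum term accelerates the descent.

    n, d = 5, 2                                       # agents, problem dimension
    rng = np.random.default_rng(0)

    # Hypothetical local quadratics f_i(x) = 0.5 * ||A_i x - b_i||^2
    A = rng.standard_normal((n, d, d))
    b = rng.standard_normal((n, d))

    def grad(i, x):
        # Gradient of agent i's local objective.
        return A[i].T @ (A[i] @ x - b[i])

    # Directed ring with self-loops: R row-stochastic, C column-stochastic.
    adj = np.eye(n) + np.roll(np.eye(n), 1, axis=1)
    R = adj / adj.sum(axis=1, keepdims=True)          # each row sums to 1
    C = adj / adj.sum(axis=0, keepdims=True)          # each column sums to 1

    alpha, beta = 0.01, 0.4                           # step size, momentum (assumed)
    x = np.zeros((n, d))
    x_prev = x.copy()
    y = np.array([grad(i, x[i]) for i in range(n)])   # gradient tracker

    for _ in range(3000):
        s = x + beta * (x - x_prev)                   # Nesterov-style extrapolation
        x_prev, x = x, R @ s - alpha * y              # consensus + descent step
        y = (C @ y                                    # column-stochastic tracking
             + np.array([grad(i, x[i]) for i in range(n)])
             - np.array([grad(i, x_prev[i]) for i in range(n)]))

    # Compare against the centralized minimizer of sum_i f_i.
    H = sum(A[i].T @ A[i] for i in range(n))
    x_star = np.linalg.solve(H, sum(A[i].T @ b[i] for i in range(n)))
    print("max agent error:", np.abs(x - x_star).max())

Note that each agent can build its row of R locally by normalizing over its in-neighbors, whereas building a column of C requires knowing the agent's out-degree; this is exactly the information that the FROZEN variant avoids needing.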

Keywords: Nesterov gradient; distributed Nesterov; stochastic weights

Journal Title: IEEE Signal Processing Letters
Year Published: 2019
