
Achieving geometric convergence for distributed optimization with Barzilai-Borwein step sizes


We consider a distributed multi-agent optimization problem over a time-invariant undirected graph, where each agent possesses a local objective function and all agents collaboratively minimize the average of all objective functions through local computations and communications among neighbors. Recently, a class of distributed gradient methods has been proposed that achieves both exact and geometric convergence when a constant step size is used. The geometric convergence of these methods is ensured for conservatively selected step sizes, but how to choose an appropriate step size while running the algorithms has not been fully addressed. The Barzilai-Borwein (BB) method is a simple and effective technique for choosing step sizes; it requires little storage and only inexpensive computation, and has been widely applied in various areas. In this paper, we introduce the BB method to distributed optimization. Based on an adapt-then-combine variation of the dynamic average consensus approach and using multi-consensus inner loops, we propose a distributed gradient method with BB step sizes (DGM-BB-C). Our method automatically computes each agent's step size using only that agent's local information, independently of the other agents, and larger step sizes are always permissible. Our method attains the exact optimum even when the number of consensus steps stays constant. We prove that DGM-BB-C converges geometrically to the optimal solution. Simulation results on a distributed sensing problem show that our method is superior to some advanced methods in terms of iterations, gradient evaluations, communications, and the related cost framework. These results validate our theoretical findings.
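To illustrate the step-size rule the abstract refers to, below is a minimal centralized sketch of the Barzilai-Borwein (BB1) step size on a toy quadratic. The function, starting point, and iteration count are illustrative assumptions; this is not the paper's distributed DGM-BB-C algorithm, only the underlying BB update each agent would compute from its local iterate and gradient differences.

```python
import numpy as np

def grad_descent_bb(grad, x0, alpha0=1e-3, iters=50):
    """Gradient descent with the BB1 step size alpha_k = s^T s / s^T y,
    where s = x_k - x_{k-1} and y = grad(x_k) - grad(x_{k-1})."""
    x_prev = x0
    g_prev = grad(x_prev)
    x = x_prev - alpha0 * g_prev          # one plain gradient step to start
    for _ in range(iters):
        g = grad(x)
        s = x - x_prev                    # iterate difference
        y = g - g_prev                    # gradient difference
        if s @ s < 1e-30:                 # already converged; avoid 0/0
            break
        alpha = (s @ s) / (s @ y)         # BB1 step size
        x_prev, g_prev = x, g
        x = x - alpha * g
    return x

# Example: minimize f(x) = 0.5 x^T A x - b^T x; the minimizer solves A x = b
A = np.array([[3.0, 0.5], [0.5, 2.0]])
b = np.array([1.0, 1.0])
x_star = grad_descent_bb(lambda x: A @ x - b, np.zeros(2))
```

Note that the step size adapts to local curvature from two stored vectors only, which is why the abstract describes the BB method as requiring little storage and inexpensive computation.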

Keywords: step sizes; step; geometric convergence; method; optimization

Journal Title: Science China Information Sciences
Year Published: 2021

