We consider a distributed consensus optimization problem over a server-client (federated) network, where all clients are connected to a central server. Current distributed algorithms fail to capture the heterogeneity in clients' local computation capacities. Motivated by the method of multipliers in centralized optimization, we derive a Newton-type primal-dual method with a distributed implementation utilizing the server-client topology. Then we propose FedHybrid as a hybrid primal-dual method that allows heterogeneous clients to perform different types of updates. Specifically, those clients with higher computational capabilities and/or cheaper costs to perform computation can implement Newton-type updates locally, while other clients can adopt much simpler gradient-type updates. Theoretically, we propose a novel merit function by combining the dual optimality gap and the primal tracking error. We prove that FedHybrid converges linearly to the exact optimal point for strongly convex functions, regardless of clients' choices of gradient-type or Newton-type updates. Finally, we present numerical studies demonstrating the efficacy of our method in practice. To the best of our knowledge, this is the first hybrid method allowing heterogeneous local updates for distributed consensus optimization with provable convergence and rate guarantees.
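The abstract does not spell out FedHybrid's update equations, so the following is only a minimal Python sketch of a hybrid primal-dual consensus loop in the same spirit (here an ADMM-style scheme, not the paper's actual method): Newton-type clients minimize their local augmented Lagrangian exactly, gradient-type clients take a single cheap gradient step, and all variable names, step sizes, and update rules below are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 4

# Strongly convex local objectives f_i(x) = 0.5 x^T A_i x + b_i^T x (assumed data).
A, b = [], []
for _ in range(n):
    M = 0.5 * rng.standard_normal((d, d))
    A.append(np.eye(d) + M @ M.T)
    b.append(rng.standard_normal(d))

def local_grad(i, x):
    return A[i] @ x + b[i]

# Hypothetical split: clients 0-1 have the compute budget for Newton-type
# updates; clients 2-3 take simpler gradient-type updates.
newton_clients = {0, 1}

rho, eta = 1.0, 0.1                     # penalty and gradient step size (assumed)
z = np.zeros(d)                         # server's consensus variable
xs = [np.zeros(d) for _ in range(n)]    # client primal variables
lams = [np.zeros(d) for _ in range(n)]  # client dual variables

for t in range(300):
    for i in range(n):
        # Gradient of the local augmented Lagrangian at x_i.
        g = local_grad(i, xs[i]) + lams[i] + rho * (xs[i] - z)
        if i in newton_clients:
            # Newton-type update: exact local minimization (f_i is quadratic).
            H = A[i] + rho * np.eye(d)
            xs[i] = xs[i] - np.linalg.solve(H, g)
        else:
            # Gradient-type update: one cheap gradient step.
            xs[i] = xs[i] - eta * g
    # Server aggregates, then each client takes a dual ascent step.
    z = np.mean([xs[i] + lams[i] / rho for i in range(n)], axis=0)
    for i in range(n):
        lams[i] = lams[i] + rho * (xs[i] - z)

# Compare against the closed-form optimum of the aggregate quadratic.
x_star = -np.linalg.solve(sum(A), sum(b))
print("consensus error:", np.linalg.norm(z - x_star))

Note that both client types drive the same dual ascent step; in this sketch it is the dual variables that correct for the heterogeneous primal updates and pull the iterates to the exact optimum, mirroring the paper's claim that convergence holds regardless of each client's choice of update type.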