Federated learning with stochastic quantization

This paper studies the distributed federated learning problem when the information exchanged between the server and the workers is quantized. A novel quantized federated averaging algorithm is developed by applying a stochastic quantization scheme to the local and global model parameters. Specifically, the server broadcasts the quantized global model parameter to the workers; the workers update their local model parameters using their own data sets and upload quantized versions to the server; the server then updates the global model parameter by aggregating all the quantized local model parameters with its previous global model parameter. The algorithm can be interpreted as a quantized variant of the federated averaging algorithm. Convergence is analyzed theoretically for both convex and strongly convex loss functions with Lipschitz gradients. Extensive experiments on realistic data demonstrate the effectiveness of the proposed algorithm.
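The round structure described above can be sketched as follows. This is a minimal NumPy sketch, not the paper's implementation: it assumes an unbiased uniform-grid stochastic quantizer, a placeholder local_update step standing in for the workers' local training, and an illustrative equal weighting between the previous global model and the averaged uploads (the paper's exact aggregation weights are not reproduced here).

```python
import numpy as np

def stochastic_quantize(x, num_levels=16):
    """Unbiased stochastic quantization of a vector onto a uniform grid.

    Each entry is rounded to one of `num_levels` levels between the vector's
    min and max; rounding up happens with probability equal to the fractional
    position on the grid, so E[Q(x)] = x.
    """
    lo, hi = x.min(), x.max()
    if hi == lo:
        return x.copy()
    scale = (hi - lo) / (num_levels - 1)
    normalized = (x - lo) / scale                # position on the level grid
    floor = np.floor(normalized)
    prob_up = normalized - floor                 # chance of rounding up
    rounded = floor + (np.random.rand(*x.shape) < prob_up)
    return lo + rounded * scale


def quantized_fedavg_round(global_params, worker_data, local_update,
                           num_levels=16, mixing=0.5):
    """One round of a quantized federated averaging scheme (illustrative).

    `local_update(params, data)` is an assumed placeholder for the workers'
    local training step (e.g. a few epochs of SGD on their own data), and
    `mixing` is an assumed weight between the previous global model and the
    average of the quantized uploads.
    """
    # Server broadcasts the quantized global model parameter to the workers.
    broadcast = stochastic_quantize(global_params, num_levels)

    # Each worker updates its local model on its own data set and uploads
    # a quantized version of the result.
    uploads = [
        stochastic_quantize(local_update(broadcast.copy(), data), num_levels)
        for data in worker_data
    ]

    # Server aggregates the quantized local models with its previous global model.
    return mixing * global_params + (1.0 - mixing) * np.mean(uploads, axis=0)
```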

Keywords: model parameters; stochastic quantization; federated learning; model; global model

Journal Title: International Journal of Intelligent Systems
Year Published: 2022
