This paper studies the distributed federated learning problem when the information exchanged between the server and the workers is quantized. A novel quantized federated averaging algorithm is developed by applying a stochastic quantization scheme to both the local and global model parameters. Specifically, the server broadcasts the quantized global model parameter to the workers; each worker updates its local model parameter using its own data set and uploads a quantized version to the server; the server then updates the global model parameter by aggregating all the quantized local model parameters together with its previous global model parameter. This algorithm can be interpreted as a quantized variant of the federated averaging algorithm. Its convergence is analyzed theoretically for both convex and strongly convex loss functions with Lipschitz gradients. Extensive experiments on realistic data sets demonstrate the effectiveness of the proposed algorithm.
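The round structure described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the specific quantizer, local update rule, number of local steps, and the aggregation weight `alpha` are all assumptions, and a generic unbiased stochastic-rounding quantizer stands in for the paper's scheme.

```python
import numpy as np

def stochastic_quantize(x, num_levels=16):
    """Generic unbiased stochastic quantizer (assumption: the paper's exact
    scheme is not specified here). Each coordinate is rounded to a uniform
    grid over [min(x), max(x)], up or down with probability equal to the
    fractional distance, so E[q(x)] = x."""
    lo, hi = x.min(), x.max()
    if hi == lo:                      # constant vector: nothing to quantize
        return x.copy()
    step = (hi - lo) / (num_levels - 1)
    scaled = (x - lo) / step
    floor = np.floor(scaled)
    prob_up = scaled - floor          # P(round up) = fractional part
    rounded = floor + (np.random.rand(*x.shape) < prob_up)
    return lo + rounded * step

def quantized_fedavg_round(global_w, worker_grads, lr=0.1, alpha=0.5):
    """One round of the quantized federated averaging loop described in the
    abstract: (1) broadcast the quantized global parameter, (2) workers take
    a local step on their own data (a single gradient step is assumed here),
    (3) workers upload quantized local parameters, (4) the server combines
    the average with its previous global parameter (weight alpha assumed)."""
    q_global = stochastic_quantize(global_w)
    local_ws = [stochastic_quantize(q_global - lr * g) for g in worker_grads]
    return alpha * global_w + (1 - alpha) * np.mean(local_ws, axis=0)
```

The unbiasedness of the quantizer is what makes the convergence analysis for (strongly) convex losses tractable, since the quantization error enters only as zero-mean noise with bounded variance.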