Article

Federated learning with stochastic quantization

Journal

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS
Volume 37, Issue 12, Pages 11600-11621

Publisher

WILEY-HINDAWI
DOI: 10.1002/int.23056

Keywords

distributed optimization; federated learning; quantization; stochastic gradient descent

Funding

  1. National Natural Science Foundation of China [61976013, 62192784, 62172056]

Abstract

This paper investigates the distributed federated learning problem with quantized exchanged information. A novel quantized federated averaging algorithm is proposed and analyzed for both convex and strongly convex loss functions, and extensive experiments on realistic data validate its effectiveness.

This paper studies the distributed federated learning problem when the information exchanged between the server and the workers is quantized. A novel quantized federated averaging algorithm is developed by applying a stochastic quantization scheme to the local and global model parameters. Specifically, the server broadcasts the quantized global model parameter to the workers; the workers update their local model parameters using their own data sets and upload the quantized versions to the server; the server then updates the global model parameter by aggregating all the quantized local model parameters with its previous global model parameter. The algorithm can thus be interpreted as a quantized variant of the federated averaging algorithm. Convergence is analyzed theoretically for both convex and strongly convex loss functions with Lipschitz gradients. Extensive experiments on realistic data demonstrate the effectiveness of the proposed algorithm.
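
As a rough illustration of the scheme the abstract describes, the sketch below implements a b-bit uniform stochastic quantizer (unbiased randomized rounding) together with one quantized federated-averaging loop on a synthetic least-squares problem. The objective, step sizes, bit width, mixing weight, and helper names (stochastic_quantize, local_sgd) are illustrative assumptions, not details taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def stochastic_quantize(x, bits=4):
        """b-bit uniform stochastic quantizer: each entry is randomly rounded
        to an adjacent quantization level so that E[Q(x)] = x (unbiased)."""
        levels = 2 ** bits - 1
        lo, hi = x.min(), x.max()
        if hi == lo:                          # constant vector: nothing to quantize
            return x.copy()
        scale = (hi - lo) / levels
        normalized = (x - lo) / scale         # entries now lie in [0, levels]
        floor = np.floor(normalized)
        prob_up = normalized - floor          # P(round up) = fractional part
        rounded = floor + (rng.random(x.shape) < prob_up)
        return lo + rounded * scale

    def local_sgd(w, X, y, lr=0.01, steps=5):
        """Worker update: a few SGD steps on the local least-squares loss."""
        for _ in range(steps):
            i = rng.integers(len(y))
            grad = (X[i] @ w - y[i]) * X[i]   # gradient of 0.5 * (x_i^T w - y_i)^2
            w = w - lr * grad
        return w

    # Synthetic data split across workers (a stand-in for the paper's data sets).
    d, n_workers = 10, 5
    w_true = rng.normal(size=d)
    data = []
    for _ in range(n_workers):
        X = rng.normal(size=(50, d))
        data.append((X, X @ w_true + 0.1 * rng.normal(size=50)))

    w_global = np.zeros(d)
    mix = 0.5  # assumed weight on the previous global parameter in the aggregation
    for t in range(200):
        # Server broadcasts the quantized global model parameter.
        w_broadcast = stochastic_quantize(w_global)
        # Workers update locally on their own data and upload quantized parameters.
        uploads = [stochastic_quantize(local_sgd(w_broadcast, X, y)) for X, y in data]
        # Server aggregates the quantized local parameters with its previous
        # global model parameter.
        w_global = mix * w_global + (1 - mix) * np.mean(uploads, axis=0)

    print("distance to true parameter:", np.linalg.norm(w_global - w_true))

Unbiasedness is the standard property that lets convergence analyses of this kind treat the rounding error as zero-mean noise added to the SGD recursion; the mixing weight on the previous global parameter here is only a guess at the aggregation rule the abstract outlines.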
