Article

Federated learning with stochastic quantization

Journal

INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS
Volume 37, Issue 12, Pages 11600-11621

Publisher

WILEY-HINDAWI
DOI: 10.1002/int.23056

Keywords

distributed optimization; federated learning; quantization; stochastic gradient descent

Funding

  1. National Natural Science Foundation of China [61976013, 62192784, 62172056]

Abstract

This paper studies the distributed federated learning problem in which the information exchanged between the server and the workers is quantized. A novel quantized federated averaging algorithm is developed by applying a stochastic quantization scheme to the local and global model parameters. Specifically, the server broadcasts the quantized global model parameter to the workers; the workers update their local model parameters using their own data sets and upload the quantized versions to the server; the server then updates the global model parameter by aggregating all the quantized local model parameters with its previous global model parameter. The algorithm can be interpreted as a quantized variant of the federated averaging algorithm. Its convergence is analyzed theoretically for both convex and strongly convex loss functions with Lipschitz continuous gradients. Extensive experiments on realistic data demonstrate the effectiveness of the proposed algorithm.
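
The abstract specifies the communication protocol but not the exact quantizer or server update rule, so the following is a minimal NumPy sketch, assuming a standard unbiased (QSGD-style) stochastic quantizer and plain local SGD at each worker; the names `stochastic_quantize`, `quantized_fedavg`, `grad_fn`, `mixing`, and `num_levels` are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def stochastic_quantize(v, num_levels=16):
    """Unbiased stochastic quantization (QSGD-style, an assumption here):
    each entry of v is mapped onto num_levels uniform levels in
    [0, max|v|], rounding up or down at random so that E[Q(v)] = v."""
    scale = np.max(np.abs(v))
    if scale == 0.0:
        return np.zeros_like(v)
    normalized = np.abs(v) / scale * num_levels
    lower = np.floor(normalized)
    prob_up = normalized - lower          # fractional part = round-up probability
    levels = lower + (np.random.rand(*v.shape) < prob_up)
    return np.sign(v) * scale * levels / num_levels

def quantized_fedavg(global_w, workers, rounds=100, local_steps=5,
                     lr=0.1, mixing=0.5, num_levels=16):
    """One possible quantized federated averaging loop following the
    abstract: the server broadcasts Q(w) (downlink), each worker runs
    local SGD on its own data and uploads Q(w_k) (uplink), and the
    server aggregates the quantized local models together with its
    previous global model parameter."""
    for _ in range(rounds):
        w_broadcast = stochastic_quantize(global_w, num_levels)      # downlink
        uploads = []
        for grad_fn in workers:   # grad_fn: the worker's stochastic gradient oracle
            w_local = w_broadcast.copy()
            for _ in range(local_steps):
                w_local = w_local - lr * grad_fn(w_local)            # local SGD
            uploads.append(stochastic_quantize(w_local, num_levels)) # uplink
        global_w = (mixing * global_w
                    + (1.0 - mixing) * np.mean(uploads, axis=0))
    return global_w

# Illustrative use: two workers holding different least-squares objectives.
rng = np.random.default_rng(0)
A1, A2 = rng.normal(size=(20, 5)), rng.normal(size=(20, 5))
b1, b2 = rng.normal(size=20), rng.normal(size=20)
workers = [lambda w, A=A1, b=b1: A.T @ (A @ w - b) / len(b),
           lambda w, A=A2, b=b2: A.T @ (A @ w - b) / len(b)]
w_final = quantized_fedavg(np.zeros(5), workers)
```

The unbiasedness of the quantizer, E[Q(v)] = v, is the property that typically drives convergence guarantees for schemes of this kind, and the `mixing` coefficient mirrors the abstract's statement that the server combines the quantized local models with its previous global model parameter.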
