Article

Federated Learning With Lossy Distributed Source Coding: Analysis and Optimization

Journal

IEEE TRANSACTIONS ON COMMUNICATIONS
Volume 71, Issue 8, Pages 4561-4576

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCOMM.2023.3277882

Keywords

Federated learning; model aggregation; rate-distortion theory; distributed source coding; Berger-Tung coding; majorization-minimization

Summary

Recently, federated learning (FL) has emerged as an efficient and privacy-friendly machine learning (ML) paradigm. However, the communication cost of model aggregation remains a challenge in FL. This paper proposes a general framework for analyzing the performance of model aggregation based on rate-distortion theory and establishes a connection between aggregation distortion and FL convergence performance. It also formulates an aggregation distortion minimization problem and develops algorithms for solving it.

Abstract

Recently, federated learning (FL), which replaces data sharing with model sharing, has emerged as an efficient and privacy-friendly machine learning (ML) paradigm. One of the main challenges in FL is the high communication cost of model aggregation. Many compression/quantization schemes have been proposed to reduce this cost. However, the following question remains unanswered: what is the fundamental trade-off between the communication cost and the FL convergence performance? In this paper, we address this question. Specifically, we first put forward a general framework for model aggregation performance analysis based on rate-distortion theory. Under the proposed framework, we derive an inner bound of the rate-distortion region of model aggregation. We then conduct an FL convergence analysis to connect the aggregation distortion to the FL convergence performance. We formulate an aggregation distortion minimization problem to improve the FL convergence performance, and develop two algorithms to solve it. Numerical results on aggregation distortion, convergence performance, and communication cost demonstrate that baseline model aggregation schemes still leave substantial room for improvement.
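To make the notion of aggregation distortion concrete, the following is a minimal Python sketch. It is not the paper's Berger-Tung-based distributed source coding scheme or its optimization algorithms; it assumes a simple per-coordinate uniform quantizer, i.i.d. Gaussian client updates, and hypothetical parameter values, and it measures the mean squared error between the lossy aggregate and the lossless average as the per-coordinate bit budget is swept.

```python
# Illustrative sketch only (not the paper's scheme): each client uniformly
# quantizes its local model update at a given bit budget, the server averages
# the quantized updates, and "aggregation distortion" is measured as the mean
# squared error between the lossy aggregate and the lossless average.
# All settings (number of clients, model dimension, update statistics,
# quantizer range) are hypothetical.
import numpy as np

rng = np.random.default_rng(0)


def uniform_quantize(x, bits, x_max):
    """Midtread uniform quantizer on [-x_max, x_max] with roughly 2**bits levels."""
    step = 2 * x_max / (2 ** bits - 1)
    return np.clip(np.round(x / step) * step, -x_max, x_max)


def aggregation_distortion(num_clients=10, dim=1000, bits=4, x_max=1.0):
    """Per-coordinate MSE between the lossy and lossless aggregates for one round."""
    # Hypothetical local updates (e.g., stochastic gradients), i.i.d. Gaussian.
    updates = rng.normal(0.0, 0.3, size=(num_clients, dim))
    lossless_avg = updates.mean(axis=0)
    lossy_avg = uniform_quantize(updates, bits, x_max).mean(axis=0)
    return float(np.mean((lossy_avg - lossless_avg) ** 2))


if __name__ == "__main__":
    # Sweep the bit budget to see the rate vs. aggregation-distortion trade-off.
    for bits in (2, 4, 6, 8):
        print(f"{bits} bits/coordinate -> distortion {aggregation_distortion(bits=bits):.2e}")
```

The sketch only illustrates the monotone trade-off between the per-client rate and the aggregation distortion; the paper characterizes the achievable (rate, distortion) pairs via distributed source coding and then ties the distortion term to the FL convergence bound.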
