Article

Communication-Efficient Federated Learning Based on Compressed Sensing

Journal

IEEE INTERNET OF THINGS JOURNAL
Volume 8, Issue 20, Pages 15531-15541

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/JIOT.2021.3073112

Keywords

Servers; Internet of Things; Quantization (signal); Data models; Signal processing algorithms; Tools; Training data; 1-bit quantization; compressed sensing (CS); federated learning (FL); Internet of Things (IoT)

Funding

  1. National Natural Science Foundation of China [61790551, 61925106]


This article investigates federated learning (FL) in an IoT environment and proposes two new FL algorithms based on compressed sensing (CS). Experiments show that these algorithms outperform the baseline algorithms.
In this article, we investigate the problem of federated learning (FL) in a communication-constrained Internet of Things (IoT) environment, where multiple IoT clients train a global model collectively by communicating model updates with a central server instead of sending raw data sets. To ease the communication burden in IoT systems, several approaches have been proposed for FL tasks, including sparsification methods and data quantization strategies. To overcome the shortcomings of the existing methods, we propose two new FL algorithms based on compressed sensing (CS), referred to as the CS-FL algorithm and the 1-bit CS-FL algorithm, both of which compress the upstream and downstream data communicated between the clients and the central server. The proposed algorithms improve upon existing algorithms by letting the clients send analog and 1-bit data, respectively, to the server after compression with a random measurement matrix. Building on this, in CS-FL and 1-bit CS-FL the clients update the model locally using the result of sparse reconstruction obtained by iterative hard thresholding (IHT) and binary IHT (BIHT), respectively. Experiments conducted on the MNIST and Fashion-MNIST data sets reveal the superiority of the proposed algorithms over the baseline algorithms: SignSGD with a majority vote, FL based on sparse ternary compression, and FedAvg.
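As a rough illustration of the pipeline the abstract describes, the sketch below compresses a sparse model update with a shared random Gaussian measurement matrix and recovers it with IHT (analog measurements, as in CS-FL) or BIHT (1-bit measurements, as in 1-bit CS-FL). This is not the authors' implementation; the dimensions, sparsity level, step sizes, and iteration counts are illustrative assumptions.

```python
# Minimal sketch of CS-style compression and reconstruction for a model update.
# All parameter values below are assumptions for illustration only.
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x, zero the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    out[idx] = x[idx]
    return out

def iht(y, A, s, n_iter=100, step=1.0):
    """Iterative hard thresholding: recover a sparse x from analog y = A @ x."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = hard_threshold(x + step * A.T @ (y - A @ x), s)
    return x

def biht(y_sign, A, s, n_iter=100, step=1.0):
    """Binary IHT: recover the direction of a sparse x from 1-bit y = sign(A @ x)."""
    m = A.shape[0]
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = hard_threshold(x + (step / m) * A.T @ (y_sign - np.sign(A @ x)), s)
        norm = np.linalg.norm(x)
        if norm > 0:              # 1-bit measurements lose magnitude; keep unit norm
            x /= norm
    return x

# Toy example standing in for one client's sparse model update.
rng = np.random.default_rng(0)
d, m, s = 1000, 300, 10                      # update dimension, measurements, sparsity
update = np.zeros(d)
support = rng.choice(d, s, replace=False)
update[support] = rng.normal(size=s)

A = rng.normal(size=(m, d)) / np.sqrt(m)     # shared random measurement matrix

y_analog = A @ update                        # what a CS-FL client would upload
y_onebit = np.sign(A @ update)               # what a 1-bit CS-FL client would upload

rec_analog = iht(y_analog, A, s)             # reconstruction from analog measurements
rec_onebit = biht(y_onebit, A, s)            # direction estimate from 1-bit measurements
```

In an actual FL round, the reconstructed update would be applied to the model on the receiving side, and the same compression could be used for the downstream broadcast from the server back to the clients, which is the two-way compression the abstract refers to.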

