Article

Resource-Aware Knowledge Distillation for Federated Learning

Journal

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TETC.2023.3252600

Keywords

Federated learning; heterogeneous resources; knowledge distillation


This article introduces a knowledge transfer-based federated learning framework for resource-limited distributed systems, addressing the challenges federated learning faces in IoT settings. The proposed approach uses knowledge distillation to improve efficiency and outperforms competing schemes.
The rise of deep learning and the Internet of Things (IoT) has driven a number of smart-world applications, most of which are deployed in distributed environments. Federated learning, a privacy-preserving collaborative learning paradigm, has shown considerable potential to leverage the rich distributed data at network edges. Nonetheless, the heterogeneity of IoT devices and their network environments impedes federated learning applications in IoT systems. In particular, stale gradient updates from slower local learners degrade the effectiveness of federated learning, and transmitting weight updates from a large number of users causes congestion at the network edge and incurs prohibitive communication costs. To overcome these challenges, we propose a knowledge transfer-based federated learning framework for resource-limited distributed systems. We formulate a knowledge distillation-based federated learning optimization problem that accounts for dynamic local resources. The proposed approach carries out federated learning with the help of knowledge distillation, avoiding heavy consumption of scarce network bandwidth. Theoretical analysis establishes the convergence of the learning process, and experimental results on three public datasets show that the proposed framework substantially improves the efficiency of federated learning and outperforms state-of-the-art schemes.
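The abstract does not spell out the distillation objective the framework optimizes; as a minimal sketch, the standard temperature-scaled knowledge-distillation loss that such approaches typically build on can be written as follows. All function names and values here are illustrative assumptions, not taken from the paper.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature: higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) between temperature-softened output
    distributions, scaled by T^2 as is conventional in distillation."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# In a distillation-based federated round, a device can exchange logits
# on a small shared dataset (a few floats per example) rather than full
# model weights, which is where the communication savings come from.
teacher = [2.0, 1.0, 0.1]
student = [1.5, 1.2, 0.3]
loss = distillation_loss(student, teacher)
```

The loss is zero when student and teacher agree and grows as their softened predictions diverge, so each local learner can match the aggregated knowledge without ever shipping its weights.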
