Article

Computation and Communication Efficient Adaptive Federated Optimization of Federated Learning for Internet of Things

Journal

ELECTRONICS
Volume 12, Issue 16, Article 3451

Publisher

MDPI
DOI: 10.3390/electronics12163451

Keywords

federated learning; distributed machine learning; communication efficiency; privacy protection

Abstract

The proliferation of the Internet of Things (IoT) and the widespread use of devices with sensing, computing, and communication capabilities have motivated intelligent applications empowered by artificial intelligence. Classical artificial intelligence algorithms require centralized data collection and processing, which are challenging in realistic intelligent IoT applications because of growing data privacy concerns and distributed datasets. Federated Learning (FL) has emerged as a privacy-preserving distributed learning framework that enables IoT devices to train global models by sharing model parameters. However, the inefficiency caused by frequent parameter transmissions significantly reduces FL performance. Existing acceleration algorithms fall into two main types, local update and parameter compression, which address the trade-off between communication and computation and the trade-off between communication and precision, respectively. Jointly considering these two trade-offs and adaptively balancing their impacts on convergence has remained unresolved. To solve this problem, this paper proposes a novel efficient adaptive federated optimization (FedEAFO) algorithm that improves the efficiency of FL by minimizing the learning error while jointly considering the local update and parameter compression variables. FedEAFO enables FL to adaptively adjust these two variables and balance the trade-offs among computation, communication, and precision. Experimental results illustrate that, compared with state-of-the-art algorithms, FedEAFO achieves higher accuracies faster.
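To make the two levers named in the abstract concrete, below is a minimal sketch of a federated round in which each client runs a configurable number of local SGD steps (local update) and transmits only a top-k sparsified model update (parameter compression). The toy linear-regression data, the helper names local_train, top_k_sparsify, and federated_round, the top-k compression choice, and the fixed schedule that raises both levers over time are illustrative assumptions; FedEAFO instead chooses these two variables adaptively by minimizing the learning error, and its actual update rules are given in the paper, not here.

```python
# Illustrative sketch only: jointly tuning local updates and parameter
# compression in FL. Names and the adaptation rule are assumptions for
# illustration, NOT the FedEAFO algorithm itself.
import numpy as np

def local_train(w, X, y, local_steps, lr=0.01):
    """Run `local_steps` SGD steps of linear regression on one client (local update)."""
    w = w.copy()
    for _ in range(local_steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def top_k_sparsify(update, compression_ratio):
    """Keep only the largest-magnitude fraction of the update (parameter compression)."""
    k = max(1, int(len(update) * compression_ratio))
    sparse = np.zeros_like(update)
    idx = np.argsort(np.abs(update))[-k:]
    sparse[idx] = update[idx]
    return sparse

def federated_round(w_global, clients, local_steps, compression_ratio):
    """One FL round: local training, compressed updates, server averaging."""
    updates = []
    for X, y in clients:
        w_local = local_train(w_global, X, y, local_steps)
        updates.append(top_k_sparsify(w_local - w_global, compression_ratio))
    return w_global + np.mean(updates, axis=0)

# Toy data: each client holds a shard generated from a common linear model.
rng = np.random.default_rng(0)
w_true = rng.normal(size=10)
clients = []
for _ in range(5):
    X = rng.normal(size=(100, 10))
    clients.append((X, X @ w_true + 0.1 * rng.normal(size=100)))

w = np.zeros(10)
local_steps, compression_ratio = 1, 0.3   # start cheap and coarse
for t in range(30):
    w = federated_round(w, clients, local_steps, compression_ratio)
    loss = np.mean([(X @ w - y) ** 2 for X, y in clients])
    # Placeholder schedule: spend more computation and precision as training
    # proceeds. In the paper this balance is instead chosen adaptively by
    # minimizing the learning error, not by a fixed schedule.
    if t % 10 == 9:
        local_steps += 1
        compression_ratio = min(1.0, compression_ratio + 0.2)
    print(f"round {t:02d}  loss {loss:.4f}  E={local_steps}  ratio={compression_ratio:.1f}")
```

Raising local_steps reduces how often parameters are exchanged (communication vs. computation), while raising compression_ratio sends more of each update (communication vs. precision); the point of FedEAFO is to balance these two trade-offs jointly and adaptively rather than independently.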


