4.6 Article

FedVAE: Communication-Efficient Federated Learning With Non-IID Private Data

Journal

IEEE SYSTEMS JOURNAL
Volume 17, Issue 3, Pages 4798-4808

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/JSYST.2023.3274197

Keywords

Anomaly detection; federated learning (FL); healthcare; variational autoencoder (VAE)

Abstract

Federated learning (FL), which collaboratively trains a shared global model without exchanging or centralizing local data, offers a promising approach to privacy preservation. However, it faces two main challenges: high communication cost, and low model quality caused by imbalanced or non-independent and identically distributed (non-IID) data. In this article, we propose FedVAE, an FL framework based on the variational autoencoder (VAE) for remote patient monitoring. FedVAE contains two lightweight VAEs: one projects data onto a lower-dimensional space with a similar distribution, alleviating the excessive communication overhead and slow convergence caused by non-IID data; the other avoids training bias due to imbalanced data distribution by generating minority-class samples. Overall, the proposed FedVAE improves the performance of FL models while consuming only a small amount of communication bandwidth. Experimental results show that FedVAE reaches an area under the curve (AUC) of 0.9937, which is even higher than that of the traditional centralized model (0.9931). In addition, fine-tuning the global model with personalization raises the average AUC to 0.9947. Moreover, compared with vanilla FL, FedVAE improves AUC by 0.87% while reducing communication traffic by at least 95%.
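The abstract assigns two roles to a lightweight VAE: projecting raw features into a low-dimensional latent space so that the federated model (and its per-round updates) stays small, and synthesizing minority-class samples to counter imbalance. The PyTorch sketch below is only a minimal illustration of these two generic ideas, not the authors' implementation; the dimensions (INPUT_DIM, LATENT_DIM) and helper names (to_latent, generate_minority) are assumptions made for the example.

```python
# Minimal sketch (not the FedVAE paper's code): a small VAE used in two ways,
# assuming tabular patient-monitoring features of dimension INPUT_DIM.
import torch
import torch.nn as nn
import torch.nn.functional as F

INPUT_DIM, LATENT_DIM = 128, 8   # hypothetical sizes for illustration

class VAE(nn.Module):
    def __init__(self, input_dim=INPUT_DIM, latent_dim=LATENT_DIM):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(input_dim, 64), nn.ReLU())
        self.mu = nn.Linear(64, latent_dim)
        self.logvar = nn.Linear(64, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                 nn.Linear(64, input_dim))

    def encode(self, x):
        h = self.enc(x)
        return self.mu(h), self.logvar(h)

    def reparameterize(self, mu, logvar):
        # Sample z = mu + sigma * eps with eps ~ N(0, I).
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.dec(z), mu, logvar

def vae_loss(x, x_hat, mu, logvar):
    # Standard VAE objective: reconstruction + KL to the unit-Gaussian prior.
    recon = F.mse_loss(x_hat, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

# Use 1: project raw features to the low-dimensional latent space, so the
# federated classifier (and hence each round's exchanged update) stays small.
def to_latent(vae, x):
    mu, _ = vae.encode(x)
    return mu  # deterministic embedding for downstream FL training

# Use 2: oversample a minority class by decoding draws from the latent prior
# of a VAE trained only on that class's samples.
def generate_minority(vae, n_samples):
    with torch.no_grad():
        z = torch.randn(n_samples, LATENT_DIM)
        return vae.dec(z)
```

In this sketch, exchanging only a classifier that operates on the small latent codes, rather than a model over the full feature space, is what would shrink per-round communication; the abstract credits the projection VAE with alleviating communication overhead and reports an overall traffic reduction of at least 95%.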
