Article

Secure Model-Contrastive Federated Learning With Improved Compressive Sensing

Journal

IEEE Transactions on Information Forensics and Security

Publisher

IEEE (Institute of Electrical and Electronics Engineers Inc.)
DOI: 10.1109/TIFS.2023.3282574

Keywords

Privacy; Costs; Data models; Computational modeling; Compressed sensing; Federated learning; Training; non-IID data; model-contrastive loss; compressive sensing; local differential privacy

Abstract

Federated Learning (FL) has been widely used in fields such as financial risk control, e-government, and smart healthcare. To protect data privacy, many privacy-preserving FL approaches have been designed and deployed in these scenarios. However, existing works impose high communication burdens on clients and lose model accuracy when the data samples separately owned by clients are non-Independently and Identically Distributed (non-IID). To address these issues, this paper proposes a secure Model-Contrastive Federated Learning with improved Compressive Sensing (MCFL-CS) scheme, motivated by contrastive learning. We combine a model-contrastive loss with the cross-entropy loss in the design of the local network architecture, which alleviates the impact of data heterogeneity on model accuracy. We then apply improved compressive sensing and local differential privacy to reduce communication costs and prevent leakage of clients' private data. A formal security analysis shows that the scheme satisfies $(\varepsilon,\delta)$-differential privacy, and extensive experiments on five benchmark datasets demonstrate that, compared with FedAvg, it improves model accuracy by 3.45% on average across all datasets under the non-IID setting while reducing communication costs by more than 95%.
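The combined local objective described in the abstract can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical rendering of a MOON-style model-contrastive term added to the cross-entropy loss; the function names, the temperature, and the weighting coefficient mu are illustrative assumptions and not taken from the paper itself.

import torch
import torch.nn.functional as F

def model_contrastive_loss(z_local, z_global, z_prev, temperature=0.5):
    # Pull the local representation toward the global model's representation
    # (positive pair) and push it away from the previous local model's
    # representation (negative pair), in the style of model-contrastive FL.
    sim_pos = F.cosine_similarity(z_local, z_global, dim=-1) / temperature
    sim_neg = F.cosine_similarity(z_local, z_prev, dim=-1) / temperature
    logits = torch.stack([sim_pos, sim_neg], dim=1)  # positive pair in column 0
    labels = torch.zeros(z_local.size(0), dtype=torch.long, device=z_local.device)
    return F.cross_entropy(logits, labels)

def local_objective(class_logits, targets, z_local, z_global, z_prev, mu=1.0):
    # Total local loss: standard cross-entropy plus the weighted
    # model-contrastive term (mu is a hypothetical trade-off weight).
    ce = F.cross_entropy(class_logits, targets)
    con = model_contrastive_loss(z_local, z_global, z_prev)
    return ce + mu * con

Likewise, the $(\varepsilon,\delta)$-differential-privacy guarantee is typically obtained by clipping each (compressed) model update and adding calibrated Gaussian noise before upload. The sketch below uses the standard Gaussian-mechanism calibration as an assumption; the paper's exact perturbation mechanism and parameters may differ.

import numpy as np

def gaussian_perturb(update, clip_norm=1.0, epsilon=1.0, delta=1e-5):
    # Clip the (compressed) update to bound its sensitivity, then add
    # Gaussian noise calibrated for (epsilon, delta)-DP.
    # Parameter names and defaults are illustrative, not the paper's.
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + np.random.normal(0.0, sigma, size=clipped.shape)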
