Journal
24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS)
Volume 130, Issue -, Pages -
Publisher
MICROTOME PUBLISHING
Funding
- NIH [R01-GM124111, RF1-AG063481]
- NSF [DMS-1847415, CCF-1763314, CCF-1934876]
- Facebook Faculty Research Award
- Alfred Sloan Research Fellowship
Federated learning is a training paradigm in which clients collaboratively learn models while protecting the privacy of their local sensitive data. This paper introduces federated f-differential privacy and proposes PriFedSync, a private federated learning framework that provably achieves this privacy guarantee.
Federated learning (FL) is a training paradigm in which clients collaboratively learn models by repeatedly sharing information, without compromising much on the privacy of their local sensitive data. In this paper, we introduce federated f-differential privacy, a new notion specifically tailored to the federated setting and based on the framework of Gaussian differential privacy. Federated f-differential privacy operates at the record level: it provides a privacy guarantee for each individual record of a client's data against adversaries. We then propose PriFedSync, a generic private federated learning framework that accommodates a large family of state-of-the-art FL algorithms and provably achieves federated f-differential privacy. Finally, we empirically demonstrate the trade-off between privacy guarantee and prediction performance for models trained by PriFedSync on computer vision tasks.
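The abstract does not spell out PriFedSync's mechanism; as a hedged illustration only, the sketch below shows the kind of Gaussian-mechanism-style local update that record-level DP guarantees in federated learning are typically built on: each per-record gradient is clipped to a fixed L2 norm, the clipped gradients are averaged, and calibrated Gaussian noise is added before the client's update is sent for server-side averaging. All function names, the clipping norm C, and the noise multiplier sigma are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def clip(v, C):
    """Rescale a per-record gradient so its L2 norm is at most C."""
    norm = np.linalg.norm(v)
    return v * min(1.0, C / norm) if norm > 0 else v

def private_client_update(records, grad_fn, C=1.0, sigma=1.0, rng=None):
    """One client's noisy update (Gaussian-mechanism style, as in DP-SGD):
    clip each record's gradient to norm C, average, then add Gaussian
    noise whose scale is proportional to C and the noise multiplier."""
    rng = np.random.default_rng(0) if rng is None else rng
    clipped = [clip(grad_fn(r), C) for r in records]
    avg = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, sigma * C / len(records), size=avg.shape)
    return avg + noise

def server_aggregate(client_updates):
    """Server synchronization step: simple FedAvg-style averaging of
    the clients' already-privatized updates."""
    return np.mean(client_updates, axis=0)
```

Because noise is injected before any update leaves the client, the guarantee covers each individual record, matching the record-level flavor of federated f-differential privacy described above; the actual privacy accounting in the paper is done in the Gaussian differential privacy framework rather than with a single (sigma, C) pair.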