3.8 Proceedings Paper

Federated f-Differential Privacy

Publisher

MICROTOME PUBLISHING

Keywords

-

Funding

  1. NIH [R01-GM124111, RF1-AG063481]
  2. NSF [DMS-1847415, CCF-1763314, CCF-1934876]
  3. Facebook Faculty Research Award
  4. Alfred Sloan Research Fellowship


Federated learning is a training paradigm in which clients collaboratively learn models while protecting the privacy of their local sensitive data. This paper introduces federated f-differential privacy and proposes PriFedSync, a private federated learning framework that provably achieves this privacy guarantee.
Federated learning (FL) is a training paradigm where the clients collaboratively learn models by repeatedly sharing information without compromising much on the privacy of their local sensitive data. In this paper, we introduce federated f-differential privacy, a new notion specifically tailored to the federated setting, based on the framework of Gaussian differential privacy. Federated f-differential privacy operates at the record level: it provides the privacy guarantee on each individual record of one client's data against adversaries. We then propose PriFedSync, a generic private federated learning framework that accommodates a large family of state-of-the-art FL algorithms and provably achieves federated f-differential privacy. Finally, we empirically demonstrate the trade-off between privacy guarantee and prediction performance for models trained by PriFedSync in computer vision tasks.
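
For context, the notions named in the abstract can be sketched briefly; the notation below is the standard one from the Gaussian differential privacy framework the paper builds on, not reproduced from the paper itself. For distributions P and Q, the trade-off function is

    T(P, Q)(\alpha) = \inf \{ \beta_\phi : \alpha_\phi \le \alpha \},

where \alpha_\phi and \beta_\phi are the type I and type II errors of a rejection rule \phi. A mechanism M is f-differentially private if T(M(S), M(S')) \ge f for every pair of neighboring datasets S and S'. Gaussian differential privacy is the special case f = G_\mu := T(\mathcal{N}(0,1), \mathcal{N}(\mu,1)), which has the closed form

    G_\mu(\alpha) = \Phi(\Phi^{-1}(1 - \alpha) - \mu),

with \Phi the standard normal CDF. Federated f-differential privacy, as described above, adapts this notion so that the guarantee is stated per record of each client's local data.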

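The abstract describes PriFedSync only at a high level. The following is a minimal, generic sketch of one noisy federated synchronization round, assuming Gaussian-mechanism noise added to clipped client updates; it conveys the general shape of private FL frameworks of this kind and is not the paper's PriFedSync algorithm. All names here (private_sync_round, local_update, clip_norm, noise_multiplier) are illustrative assumptions.

    import numpy as np

    def clip(update, clip_norm):
        # Rescale a client update so its L2 norm is at most clip_norm
        # (bounds each client's contribution before noise is added).
        norm = np.linalg.norm(update)
        return update * (clip_norm / norm) if norm > clip_norm else update

    def private_sync_round(global_model, client_datasets, local_update,
                           clip_norm=1.0, noise_multiplier=1.0, rng=None):
        # One server round: each client computes a local update, clips it,
        # perturbs it with Gaussian noise, and the server averages.
        # Hypothetical illustration only; not the paper's PriFedSync.
        rng = np.random.default_rng() if rng is None else rng
        noisy_deltas = []
        for data in client_datasets:
            delta = local_update(global_model, data) - global_model
            delta = clip(delta, clip_norm)
            delta = delta + rng.normal(0.0, noise_multiplier * clip_norm,
                                       size=delta.shape)
            noisy_deltas.append(delta)
        return global_model + np.mean(noisy_deltas, axis=0)

    # Toy usage: each client nudges the model toward its local data.
    clients = [np.array([1.0, 2.0]), np.array([3.0, 0.0])]
    step = lambda model, data: model + 0.1 * (data - model)
    new_model = private_sync_round(np.zeros(2), clients, step)

In such a sketch, a larger noise_multiplier yields a stronger per-round Gaussian privacy guarantee at the cost of noisier aggregation, which is the kind of privacy/accuracy trade-off the paper evaluates empirically.
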
Authors


Reviews

Primary Rating

3.8
Not enough ratings

Secondary Ratings

Novelty
-
Significance
-
Scientific rigor
-
