Article

UNIVERSALLY UTILITY-MAXIMIZING PRIVACY MECHANISMS

Journal

SIAM JOURNAL ON COMPUTING
Volume 41, Issue 6, Pages 1673-1693

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/09076828X

Keywords

differential privacy; utility maximization; geometric

Funding

  1. NSF [CCF-0448664]
  2. ONR Young Investigator award
  3. AFOSR MURI
  4. Alfred P. Sloan Fellowship
  5. Stanford Graduate Fellowship
  6. NSF Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations [1016885]
  7. NSF Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations [1215965]


A mechanism for releasing information about a statistical database with sensitive data must resolve a trade-off between utility and privacy. Publishing fully accurate information maximizes utility while minimizing privacy, whereas publishing random noise accomplishes the opposite. Privacy can be rigorously quantified using the framework of differential privacy, which requires that a mechanism's output distribution be nearly the same whether or not a given database row is included. The goal of this paper is to formulate and provide strong and general utility guarantees, subject to differential privacy. We pursue mechanisms that guarantee near-optimal utility to every potential user, independently of its side information (modeled as a prior distribution over query results) and preferences (modeled via a symmetric and monotone loss function). Our main result is the following: for each fixed count query and differential privacy level, there is a geometric mechanism M* (a discrete variant of the simple and well-studied mechanism that adds random noise drawn from a Laplace distribution) that is simultaneously expected-loss-minimizing for every possible user, subject to the differential privacy constraint. This is an extremely strong utility guarantee: every potential user u, no matter what its side information and preferences, derives as much utility from M* as from interacting with a differentially private mechanism M_u that is optimally tailored to u. More precisely, for every user u there is an optimal mechanism M_u for it that factors into a user-independent part (the geometric mechanism M*) and a user-specific postprocessing step that depends only on the output of the geometric mechanism and not on the underlying database. The first part of our proof characterizes the optimal differentially private mechanism for a user as a certain basic feasible solution of a linear program with a user-specific objective function and user-independent constraints that encode differential privacy. The second part shows that all of the relevant vertices of the feasible region (ranging over all possible users) are derivable from the geometric mechanism via suitable remappings of its range.
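To make the abstract's geometric mechanism concrete, here is a minimal Python sketch, not taken from the paper: it perturbs a count query with two-sided geometric noise, Pr[Z = z] proportional to alpha^|z| with alpha = e^(-epsilon), and then applies a simple clamp as one example of a user-specific postprocessing step. The function names, the sampling trick (difference of two i.i.d. geometric variables), and the parameter values (n = 100, true count 42, epsilon = ln 2) are illustrative assumptions; the paper itself parameterizes the mechanism by alpha and proves optimality over all remappings, not just truncation.

```python
import math
import random


def geometric_mechanism(true_count: int, epsilon: float) -> int:
    """Release a count query under epsilon-differential privacy by adding
    two-sided geometric noise Z with Pr[Z = z] proportional to alpha^|z|,
    where alpha = exp(-epsilon)."""
    alpha = math.exp(-epsilon)
    # The difference of two i.i.d. geometric(1 - alpha) variables on {0, 1, ...}
    # has exactly this two-sided geometric law.
    u1 = 1.0 - random.random()  # uniform on (0, 1], avoids log(0)
    u2 = 1.0 - random.random()
    g1 = math.floor(math.log(u1) / math.log(alpha))
    g2 = math.floor(math.log(u2) / math.log(alpha))
    return true_count + g1 - g2


def truncate(noisy_count: int, n: int) -> int:
    """One example of a user-specific postprocessing step: clamp the unbounded
    noisy count back into the feasible range {0, ..., n}. It depends only on
    the mechanism's output, never on the underlying database."""
    return max(0, min(n, noisy_count))


if __name__ == "__main__":
    # Hypothetical example: a database of n = 100 rows, true count 42,
    # released with epsilon = ln(2), i.e. alpha = 1/2.
    n, true_count, epsilon = 100, 42, math.log(2)
    noisy = geometric_mechanism(true_count, epsilon)
    print(truncate(noisy, n))
```

The factorization emphasized in the abstract is visible in this sketch: geometric_mechanism never sees the user, and truncate never sees the database.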

Authors

Arpita Ghosh; Tim Roughgarden; Mukund Sundararajan
