That which we call private

8 Aug 2019  ·  Úlfar Erlingsson, Ilya Mironov, Ananth Raghunathan, Shuang Song ·

The guarantees of security and privacy defenses are often strengthened by relaxing the assumptions made about attackers or the context in which defenses are deployed. Such relaxations can be a highly worthwhile topic of exploration---even though they typically entail assuming a weaker, less powerful adversary---because there may indeed be great variability in both attackers' powers and their context. However, no weakening or contextual discounting of attackers' power is assumed for what some have called "relaxed definitions" in the analysis of differential-privacy guarantees. Instead, the definitions so named are the basis of refinements and more advanced analyses of the worst-case implications of attackers---without any change assumed in attackers' powers. Because they more precisely bound the worst-case privacy loss, these improved analyses can greatly strengthen the differential-privacy upper-bound guarantees---sometimes lowering the differential-privacy epsilon by orders of magnitude. As such, to the casual eye, these analyses may appear to imply a reduced privacy loss. This is a false perception: the privacy loss of any concrete mechanism cannot change with the choice of a worst-case-loss upper-bound analysis technique. Practitioners must be careful not to equate real-world privacy with differential-privacy epsilon values, at least not without full consideration of the context.
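To make the abstract's point concrete---the mechanism is fixed, and only the analysis technique changes---here is a minimal sketch comparing two standard worst-case accounting methods for the same composed Gaussian mechanism. The specific parameters (noise multiplier sigma = 4, 100 steps, delta = 1e-5) and the choice of Rényi-DP accounting as the "improved analysis" are illustrative assumptions, not details taken from this paper.

```python
import math

def eps_basic(sigma, steps, delta):
    # Classical Gaussian-mechanism bound: each step is (eps0, delta0)-DP
    # with eps0 = sqrt(2 ln(1.25/delta0)) / sigma, and basic composition
    # simply sums eps over all steps (splitting the delta budget evenly).
    delta0 = delta / steps
    eps0 = math.sqrt(2 * math.log(1.25 / delta0)) / sigma
    return steps * eps0

def eps_rdp(sigma, steps, delta):
    # Renyi-DP accounting: the Gaussian mechanism is
    # (alpha, alpha / (2 sigma^2))-RDP, RDP composes additively, and the
    # standard conversion to (eps, delta)-DP gives
    # eps = steps * alpha / (2 sigma^2) + ln(1/delta) / (alpha - 1),
    # minimized over alpha (here via a simple grid search).
    best = float("inf")
    for alpha in (1 + x / 10 for x in range(1, 1000)):
        eps = steps * alpha / (2 * sigma**2) + math.log(1 / delta) / (alpha - 1)
        best = min(best, eps)
    return best

sigma, steps, delta = 4.0, 100, 1e-5
print(eps_basic(sigma, steps, delta))  # ≈ 142.9
print(eps_rdp(sigma, steps, delta))    # ≈ 15.1
```

Both numbers bound the worst-case privacy loss of the *same* mechanism; the order-of-magnitude gap reflects only the tightness of the analysis, exactly the distinction the abstract warns casual readers not to misread as a reduction in actual privacy loss.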
