Concentrated Differential Privacy: Simplifications, Extensions, and Lower Bounds
"Concentrated differential privacy" was recently introduced by Dwork and
Rothblum as a relaxation of differential privacy, which permits sharper
analyses of many privacy-preserving computations. We present an alternative
formulation of the concept of concentrated differential privacy in terms of the
Renyi divergence between the distributions obtained by running an algorithm on
neighboring inputs. With this reformulation in hand, we prove sharper
quantitative results, establish lower bounds, and raise a few new questions. We
also unify this approach with approximate differential privacy by giving an
appropriate definition of "approximate concentrated differential privacy."
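A sketch of the Rényi-divergence reformulation described above, in standard notation (here $D_\alpha$ denotes the Rényi divergence of order $\alpha$, and $x, x'$ range over neighboring inputs): a randomized algorithm $M$ satisfies $\rho$-zero-concentrated differential privacy (zCDP) if

\[
D_\alpha\bigl(M(x) \,\|\, M(x')\bigr) \le \rho\,\alpha
\quad \text{for all } \alpha \in (1, \infty).
\]

Intuitively, larger orders $\alpha$ probe rarer privacy-loss events, so a bound that grows only linearly in $\alpha$ corresponds to a subgaussian privacy-loss distribution.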
Catalyzing Privacy Law
The United States famously lacks a comprehensive federal data privacy law. In the past year, however, over half the states have proposed broad privacy bills or have established task forces to propose possible privacy legislation. Meanwhile, congressional committees are holding hearings on multiple privacy bills. What is catalyzing this legislative momentum? Some believe that Europe’s General Data Protection Regulation (GDPR), which came into force in 2018, is the driving factor. But with the California Consumer Privacy Act (CCPA), which took effect in January 2020, California has emerged as an alternate contender in the race to set the new standard for privacy. Our close comparison of the GDPR and California’s privacy law reveals that the California law is not GDPR-lite: it retains a fundamentally American approach to information privacy. Reviewing the literature on regulatory competition, we argue that California, not Brussels, is catalyzing privacy law across the United States. And what is happening is not a simple story of powerful state actors. It is more accurately characterized as the result of individual networked norm entrepreneurs, influenced and even empowered by data globalization. Our study helps explain the puzzle of why Europe’s data privacy approach failed to spur US legislation for over two decades. Finally, our study answers critical questions of practical interest to individuals (who will protect my privacy?) and to businesses (whose rules should I follow?).
Redrawing the Boundaries on Purchasing Data from Privacy-Sensitive Individuals
We prove new positive and negative results concerning the existence of
truthful and individually rational mechanisms for purchasing private data from
individuals with unbounded and sensitive privacy preferences. We strengthen the
impossibility results of Ghosh and Roth (EC 2011) by extending them to a much
wider class of privacy valuations. In particular, these include privacy
valuations that are based on (ε, δ)-differentially private
mechanisms for non-zero δ, ones where the privacy costs are measured in
a per-database manner (rather than taking the worst case), and ones that do not
depend on the payments made to players (which might not be observable to an
adversary). To bypass this impossibility result, we study a natural special
setting where individuals have monotonic privacy valuations, which captures
common contexts where certain values for private data are expected to lead to
higher valuations for privacy (e.g. having a particular disease). We give new
mechanisms that are individually rational for all players with monotonic
privacy valuations, truthful for all players whose privacy valuations are not
too large, and accurate if there are not too many players with too-large
privacy valuations. We also prove matching lower bounds showing that in some
respects our mechanisms cannot be improved significantly.
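For reference, the (ε, δ)-differential privacy guarantee on which the above privacy valuations are based is the standard one: a mechanism $M$ is $(\epsilon, \delta)$-differentially private if, for all neighboring databases $x, x'$ and every set $S$ of outputs,

\[
\Pr[M(x) \in S] \le e^{\epsilon} \Pr[M(x') \in S] + \delta.
\]

The case $\delta = 0$ recovers pure differential privacy; the abstract's point is that the impossibility result persists even for valuations tied to the relaxed non-zero-$\delta$ guarantee.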
