The Role of Interactivity in Local Differential Privacy
We study the power of interactivity in local differential privacy. First, we
focus on the difference between fully interactive and sequentially interactive
protocols. Sequentially interactive protocols may query users adaptively in
sequence, but they cannot return to previously queried users. The vast majority
of existing lower bounds for local differential privacy apply only to
sequentially interactive protocols, and before this paper it was not known
whether fully interactive protocols were more powerful. We resolve this
question. First, we classify locally private protocols by their
compositionality, the multiplicative factor k by which the sum of a
protocol's single-round privacy parameters exceeds its overall privacy
guarantee. We then show how to efficiently transform any fully interactive
k-compositional protocol into an equivalent sequentially interactive protocol
with an O(k) blowup in sample complexity. Next, we show that our reduction is
tight by exhibiting a family of problems such that for any k, there is a
fully interactive k-compositional protocol which solves the problem, while no
sequentially interactive protocol can solve the problem without at least an
Ω(k) factor more examples. We then turn our attention to
hypothesis testing problems. We show that for a large class of compound
hypothesis testing problems --- which include all simple hypothesis testing
problems as a special case --- a simple noninteractive test is optimal among
the class of all (possibly fully interactive) tests.
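The compositionality notion above can be made concrete with a small sketch. The code below is an illustrative, assumption-laden example rather than the paper's construction: it pairs the standard randomized-response primitive with a helper that computes the compositionality factor k exactly as defined above, i.e. the sum of per-round privacy parameters divided by the overall guarantee.

```python
import math
import random

def randomized_response(bit, eps):
    # Standard eps-LDP primitive: report the true bit with
    # probability e^eps / (e^eps + 1), otherwise flip it.
    p = math.exp(eps) / (math.exp(eps) + 1)
    return bit if random.random() < p else 1 - bit

def compositionality_factor(round_epsilons, overall_eps):
    # A protocol is k-compositional when the sum of its single-round
    # privacy parameters is k times its overall privacy guarantee.
    return sum(round_epsilons) / overall_eps

# A hypothetical 3-round fully interactive protocol that re-queries the
# same users with per-round parameter 0.5, yet whose overall guarantee
# is still eps = 0.5, is 3-compositional; the reduction described above
# would yield a sequentially interactive version with an O(3) blowup
# in sample complexity.
k = compositionality_factor([0.5, 0.5, 0.5], overall_eps=0.5)
```

The factor k is 1 for protocols whose per-round parameters compose tightly (as in most sequentially interactive protocols) and grows when rounds "overpay" for privacy relative to the overall guarantee.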
Linear and Range Counting under Metric-based Local Differential Privacy
Local differential privacy (LDP) enables private data sharing and analytics
without the need for a trusted data collector. Error-optimal primitives (for,
e.g., estimating means and item frequencies) under LDP have been well studied.
For analytical tasks such as range queries, however, the best known error bound
is dependent on the domain size of private data, which is potentially
prohibitive. This deficiency is inherent as LDP protects the same level of
indistinguishability between any pair of private data values for each data
owner.
In this paper, we utilize an extension of ε-LDP called Metric-LDP or
E-LDP, where a metric E defines heterogeneous privacy guarantees for
different pairs of private data values and thus provides a more flexible knob
than ε does to relax LDP and tune utility-privacy trade-offs. We show
that, under such privacy relaxations, for analytical workloads such as linear
counting, multi-dimensional range counting queries, and quantile queries, we
can achieve significant gains in utility. In particular, for range queries
under E-LDP where the metric E is the L1-distance function scaled by
ε, we design mechanisms with errors independent of the domain sizes;
instead, their errors depend on the metric E, which specifies at what
granularity the private data is protected. We believe that the primitives we
design for E-LDP will be useful in developing mechanisms for other analytical
tasks, and encourage the adoption of LDP in practice.
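As a rough illustration of why a metric guarantee can decouple error from domain size, here is a minimal sketch under stated assumptions (my own example, not the paper's mechanism): under E-LDP with E(x, x') = ε·|x − x'|, adding Laplace noise of scale 1/ε to each user's raw value satisfies the guarantee, and the noise, and hence the estimation error, does not depend on how large the data domain is.

```python
import random

def metric_ldp_release(value, eps):
    # Under E-LDP with E(x, x') = eps * |x - x'| (the L1 metric scaled
    # by eps), releasing value + Laplace(1/eps) noise bounds the
    # likelihood ratio between any two inputs x, x' by
    # exp(eps * |x - x'|): nearby values are strongly protected,
    # distant ones less so.
    # The difference of two Exp(eps) draws is Laplace with scale 1/eps.
    noise = random.expovariate(eps) - random.expovariate(eps)
    return value + noise

# Mean estimation: averaging the noisy reports is unbiased, and the
# error is O(1 / (eps * sqrt(n))) regardless of the domain size.
random.seed(0)
data = [i % 10 for i in range(20_000)]  # values in {0, ..., 9}
reports = [metric_ldp_release(x, eps=1.0) for x in data]
estimate = sum(reports) / len(reports)
```

Note the contrast with plain ε-LDP, where a single report must hide the value among the entire domain, so error bounds for range and linear queries pick up a dependence on the domain size.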