15 research outputs found
Optimal Noise Adding Mechanisms for Approximate Differential Privacy
We study the (nearly) optimal mechanisms in (ε, δ)-approximate
differential privacy for integer-valued query functions and vector-valued
(histogram-like) query functions under a utility-maximization/cost-minimization
framework. We characterize the tradeoff between ε and δ in
utility and privacy analysis for histogram-like query functions (ℓ1
sensitivity), and show that (ε, δ)-differential privacy is a
framework not much more general than (ε, 0)-differential privacy
and (0, δ)-differential privacy in the context of ℓ1 and ℓ2
cost functions, i.e., minimum expected noise magnitude and noise power. In the
same context of ℓ1 and ℓ2 cost functions, we show the
near-optimality of the uniform noise mechanism and the discrete Laplacian mechanism in
the high privacy regime (as ε, δ → 0). We conclude that in
(ε, δ)-differential privacy, the optimal noise magnitude and noise
power are Θ(min(1/ε, 1/δ)) and Θ(min(1/ε², 1/δ²)), respectively, in the
high privacy regime. Comment: 27 pages, 1 figure
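As a concrete illustration of the noise-adding mechanisms discussed above, here is a minimal sketch (not taken from the paper) of the discrete Laplacian mechanism for an integer-valued query; the function name and parameter choices are illustrative assumptions.

```python
import math
import random

def discrete_laplace_noise(eps: float, sensitivity: int = 1) -> int:
    """Two-sided geometric noise with P(z) proportional to exp(-eps*|z|/sensitivity).

    The difference of two i.i.d. geometric variables with ratio
    alpha = exp(-eps/sensitivity) is distributed as a discrete Laplacian.
    """
    alpha = math.exp(-eps / sensitivity)
    geom = lambda: int(math.log(1.0 - random.random()) / math.log(alpha))
    return geom() - geom()

# hypothetical integer-valued counting query
true_count = 42
noisy_count = true_count + discrete_laplace_noise(eps=0.5)
```

Smaller ε (higher privacy) yields a flatter noise distribution, matching the Θ(min(1/ε, 1/δ)) scaling of the optimal noise magnitude in the high privacy regime.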
Privacy and Utility Tradeoff in Approximate Differential Privacy
We characterize the minimum noise amplitude and power for noise-adding
mechanisms in (ε, δ)-differential privacy for a single real-valued
query function. We derive new lower bounds using the duality of linear
programming, and new upper bounds by proposing a new class of
(ε, δ)-differentially private mechanisms, the truncated
Laplacian mechanisms. We show that the multiplicative gap between the lower bounds
and upper bounds goes to zero in various high privacy regimes, proving the
tightness of the lower and upper bounds and thus establishing the optimality of
the truncated Laplacian mechanism. In particular, our results close the
previous constant multiplicative gap in the discrete setting. Numerical
experiments show the improvement of the truncated Laplacian mechanism over the
optimal Gaussian mechanism in all privacy regimes. Comment: 15 pages, 3 figures
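The truncated Laplacian idea can be sketched by restricting Laplace noise to a bounded interval. The rejection sampler below is an illustrative assumption, not the paper's construction (which calibrates the scale and truncation range to ε and δ).

```python
import random

def truncated_laplace_noise(scale: float, bound: float) -> float:
    """Zero-mean Laplace(scale) noise conditioned on [-bound, bound], by rejection.

    The difference of two i.i.d. Exp(1) variables is a standard Laplace sample.
    """
    while True:
        x = scale * (random.expovariate(1.0) - random.expovariate(1.0))
        if abs(x) <= bound:
            return x

# hypothetical single real-valued query with sensitivity 1
answer = 3.7
noisy_answer = answer + truncated_laplace_noise(scale=1.0, bound=4.0)
```

Bounding the support is what lets the mechanism trade a small failure probability δ for strictly smaller expected noise magnitude than the unbounded Laplace mechanism.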
Optimal Mechanism for Randomized Responses under Universally Composable Security Measure
We consider the problem of analyzing a global property of private data through
randomized responses subject to a certain rule, where the private data are used for
another cryptographic protocol, e.g., authentication. For this problem, the
security of the private data was evaluated by a universally composable security
measure, which can be regarded as ε-differential privacy. Here we
focus on the trade-off between the global accuracy and the universally composable
security measure, and derive an optimal solution to the trade-off problem. More
precisely, we adopt the Fisher information of a certain distribution family as
the estimation accuracy of the global property and impose
ε-differential privacy on the randomization mechanism protecting the
private data. Finally, we maximize the Fisher information under the
ε-differential privacy constraint and obtain an optimal mechanism
explicitly. Comment: 12 pages, 6 figures; changed the title and revised the abstract,
introduction, and section
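For intuition, the textbook binary randomized-response mechanism below satisfies ε-differential privacy; it is a baseline sketch only, and the paper's Fisher-information-optimal mechanism may differ. The function names are assumptions.

```python
import math
import random

def randomized_response(bit: int, eps: float) -> int:
    """Report the true bit with probability e^eps / (1 + e^eps), else flip it."""
    p_truth = math.exp(eps) / (1.0 + math.exp(eps))
    return bit if random.random() < p_truth else 1 - bit

def estimate_proportion(reports: list, eps: float) -> float:
    """Unbiased estimate of the true fraction of 1s from the noisy reports."""
    p = math.exp(eps) / (1.0 + math.exp(eps))
    mean = sum(reports) / len(reports)
    return (mean - (1.0 - p)) / (2.0 * p - 1.0)
```

With ε = ln 3, each report is truthful with probability 3/4, and the debiasing step recovers the population proportion in expectation; the Fisher information of the report distribution quantifies how accurately that global property can be estimated.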
Preserving Data-Privacy with Added Noises: Optimal Estimation and Privacy Analysis
Networked systems often rely on distributed algorithms to achieve a global
computation goal with iterative local information exchanges between neighboring
nodes. To preserve data privacy, a node may add a random noise to its original
data in the information exchange at each iteration. Nevertheless, a neighboring node
can estimate another node's original data based on the information it receives. The
estimation accuracy and data privacy can be measured in terms of
(ε, δ)-data-privacy, defined as the requirement that the probability of an ε-accurate
estimation (the difference between an estimate and the original data is within
ε) is no larger than δ (the disclosure probability). How to
optimize the estimation and analyze data privacy is a critical and open issue.
In this paper, a theoretical framework is developed to investigate how to
optimize the estimation of a neighbor's original data using the local information
received, named optimal distributed estimation. Then, we study the disclosure
probability under the optimal estimation for data privacy analysis. We further
apply the developed framework to analyze the data privacy of the
privacy-preserving average consensus algorithm and identify the optimal noises
for the algorithm. Comment: 32 pages, 2 figures
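A generic privacy-preserving average consensus iteration of this flavor can be sketched as follows; the weight matrix, Gaussian noise, and decay schedule are illustrative assumptions rather than the paper's optimal noise design.

```python
import random

def private_consensus_step(states, weights, noise_std):
    """Each node shares its state plus noise; all nodes then average the shares."""
    shared = [x + random.gauss(0.0, noise_std) for x in states]
    return [sum(w * s for w, s in zip(row, shared)) for row in weights]

# 3-node network with doubly stochastic averaging weights (assumed topology)
W = [[0.5, 0.25, 0.25],
     [0.25, 0.5, 0.25],
     [0.25, 0.25, 0.5]]
x = [1.0, 5.0, 9.0]
for k in range(200):
    x = private_consensus_step(x, W, noise_std=0.5 ** k)  # decaying noise
```

The decaying noise masks each node's initial value in early rounds while still letting all nodes agree on (approximately) the true average; a neighbor's best estimate from the noisy exchanges is exactly the kind of optimal distributed estimation the paper analyzes.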
More Flexible Differential Privacy: The Application of Piecewise Mixture Distributions in Query Release
There is an increasing demand to make data "open" to third parties, as data
sharing has great benefits in data-driven decision making. However, with a wide
variety of sensitive data collected, protecting the privacy of individuals,
communities, and organizations is an essential factor in making data "open".
The approaches currently adopted by industry in releasing private data are
often ad hoc and prone to a number of attacks, including re-identification
attacks, as they do not provide adequate privacy guarantees. While differential
privacy has attracted significant interest from academia and industry by
providing rigorous and reliable privacy guarantees, the reduced utility and
inflexibility of current differentially private algorithms for data release
pose a barrier to their use in real life. This paper aims to address these two
challenges. First, we propose a novel mechanism to augment the conventional
utility of differential privacy by fusing two Laplace or geometric
distributions together. We derive closed form expressions for entropy, variance
of added noise, and absolute expectation of noise for the proposed piecewise
mixtures. Then the relevant distributions are utilised to theoretically prove
the privacy and accuracy guarantees of the proposed mechanisms. Second, we show
that our proposed mechanisms have greater flexibility, with three parameters to
adjust, giving better utility in bounding noise, and mitigating larger
inaccuracy, in comparison to typical one-parameter differentially private
mechanisms. We then empirically evaluate the performance of piecewise mixture
distributions with extensive simulations and with a real-world dataset for both
linear count queries and histogram queries. The empirical results show an
increase in all utility measures considered, while maintaining privacy, for the
piecewise mixture mechanisms compared to standard Laplace or geometric
mechanisms.
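As a rough illustration of the extra flexibility, one can sample additive noise from a three-parameter combination of two Laplace distributions. Note this is a plain mixture sketch for intuition; the paper's construction splices the two densities piecewise.

```python
import random

def laplace(scale: float) -> float:
    """Standard zero-mean Laplace sample (difference of two Exp(1) draws)."""
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def two_laplace_mixture(weight: float, scale_inner: float, scale_outer: float) -> float:
    """Draw from weight*Lap(scale_inner) + (1-weight)*Lap(scale_outer).

    Three tunable parameters (weight and two scales) versus the single
    scale of the standard Laplace mechanism.
    """
    scale = scale_inner if random.random() < weight else scale_outer
    return laplace(scale)

# hypothetical linear count query
noisy_count = 100 + two_laplace_mixture(weight=0.8, scale_inner=0.5, scale_outer=2.0)
```

The extra parameters let one concentrate most of the noise mass near zero (small inner scale) while a heavier outer component carries the privacy-critical tails, which is the flexibility the abstract claims over one-parameter mechanisms.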
Performance Evaluation of Differential Privacy Mechanisms in Blockchain based Smart Metering
The concept of differential privacy emerged as a strong notion to protect
database privacy in an untrusted environment. Later on, researchers proposed
several variants of differential privacy in order to preserve privacy in
certain other scenarios, such as real-time cyber physical systems. Since then,
differential privacy has rigorously been applied to various other domains
that require privacy preservation. One such domain is decentralized
blockchain-based smart metering, in which smart meters acting as blockchain
nodes send their real-time data to grid utility databases for real-time
reporting. This data is further used to carry out statistical tasks, such as
load forecasting and demand response calculation. However, if an
intruder gets access to this data, it can leak the privacy of smart meter users. In
this context, differential privacy can be used to protect the privacy of this data.
In this chapter, we carry out a comparison of four variants of differential
privacy (Laplace, Gaussian, Uniform, and Geometric) in a blockchain-based smart
metering scenario. We test these variants on smart metering data and carry out
their performance evaluation by varying different parameters. Experimental
outcomes show that at a low privacy budget (ε) and a low reading
sensitivity value, these privacy-preserving mechanisms provide high
privacy by adding a large amount of noise. However, among these four
privacy-preserving mechanisms, the Geometric mechanism is more suitable for protecting high
peak values and the Laplace mechanism is more suitable for protecting low peak
values at ε = 0.01. Comment: Submitted
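The four noise distributions compared in the chapter can be sketched roughly as below; the calibration constants (especially for the Gaussian and uniform variants) are common textbook choices assumed here, not necessarily those used in the chapter.

```python
import math
import random

SENSITIVITY = 0.5  # assumed maximum change one meter reading can cause

def laplace_noise(eps: float) -> float:
    # scale = sensitivity / eps; Laplace as a difference of two Exp(1) draws
    return (SENSITIVITY / eps) * (random.expovariate(1.0) - random.expovariate(1.0))

def gaussian_noise(eps: float, delta: float) -> float:
    # classic (eps, delta) calibration: sigma = sens * sqrt(2 ln(1.25/delta)) / eps
    sigma = SENSITIVITY * math.sqrt(2.0 * math.log(1.25 / delta)) / eps
    return random.gauss(0.0, sigma)

def uniform_noise(eps: float) -> float:
    # bounded noise with an ad hoc width chosen for illustration;
    # bounded uniform noise alone does not satisfy pure eps-DP
    return random.uniform(-SENSITIVITY / eps, SENSITIVITY / eps)

def geometric_noise(eps: float) -> int:
    # two-sided geometric (discrete Laplace) noise
    alpha = math.exp(-eps / SENSITIVITY)
    geom = lambda: int(math.log(1.0 - random.random()) / math.log(alpha))
    return geom() - geom()

reading = 2.4  # kWh, a hypothetical smart meter value
protected = reading + laplace_noise(eps=0.01)  # low budget => large noise
```

Running each mechanism over a day of readings and comparing error against ε is the kind of parameter sweep the chapter's evaluation performs.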
Differentially Private Inference for Binomial Data
We derive uniformly most powerful (UMP) tests for simple and one-sided
hypotheses for a population proportion within the framework of Differential
Privacy (DP), optimizing finite sample performance. We show that in general, DP
hypothesis tests can be written in terms of linear constraints, and for
exchangeable data can always be expressed as a function of the empirical
distribution. Using this structure, we prove a 'Neyman-Pearson lemma' for
binomial data under DP, where the DP-UMP only depends on the sample sum. Our
tests can also be stated as a post-processing of a random variable, whose
distribution we coin "Truncated-Uniform-Laplace" (Tulap), a generalization of
the Staircase and discrete Laplace distributions. Furthermore, we obtain exact
p-values, which are easily computed in terms of the Tulap random variable.
Using the above techniques, we show that our tests can be applied to give
uniformly most accurate one-sided confidence intervals and optimal confidence
distributions. We also derive uniformly most powerful unbiased (UMPU) two-sided
tests, which lead to uniformly most accurate unbiased (UMAU) two-sided
confidence intervals. We show that our results can be applied to
distribution-free hypothesis tests for continuous data. Our simulation results
demonstrate that all our tests have exact type I error, and are more powerful
than current techniques. Comment: 25 pages before references; 39 pages total. 8 figures. arXiv admin
note: text overlap with arXiv:1805.0923
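Under the assumption b = e^(-ε) and q = 0 (the δ = 0 case), a Tulap draw can be sketched as discrete Laplace noise plus an independent Uniform(-1/2, 1/2); treat the parameterization as illustrative rather than the paper's exact definition.

```python
import math
import random

def tulap_noise(eps: float) -> float:
    """Tulap(0, b=e^-eps, q=0) sketch: two-sided geometric plus Uniform(-1/2, 1/2)."""
    b = math.exp(-eps)
    geom = lambda: int(math.log(1.0 - random.random()) / math.log(b))
    return (geom() - geom()) + (random.random() - 0.5)

# the DP-UMP test post-processes the noisy sample sum
sample_sum = 30  # hypothetical number of successes out of n trials
noisy_sum = sample_sum + tulap_noise(eps=1.0)
```

The uniform component smooths the discrete Laplace into a continuous variable, which is what makes exact p-values computable directly from the Tulap distribution.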
Mathematical comparison of classical and quantum mechanisms in optimization under local differential privacy
Let ε > 0. An n-tuple (p(1), ..., p(n)) of probability vectors is
called ε-differentially private (ε-DP) if e^ε p(i) − p(j)
has no negative entries for all i, j. An
n-tuple (ρ(1), ..., ρ(n)) of density matrices is called classical-quantum
ε-differentially private (CQ ε-DP) if
e^ε ρ(i) − ρ(j) is positive semi-definite for all
i, j. Denote by C_n(ε) the set of all
ε-DP n-tuples, and by CQ_n(ε) the set of all
CQ ε-DP n-tuples. By considering optimization problems under
local differential privacy, we define the subset EC_n(ε)
of CQ_n(ε) that is essentially classical. Roughly
speaking, an element in EC_n(ε) is the image of
an ε-DP n-tuple in C_n(ε) by a completely positive and
trace-preserving linear map (CPTP map). In a preceding study, it is known that
EC_2(ε) = CQ_2(ε). In this paper, we show
that EC_n(ε) ⊊ CQ_n(ε) for every
n ≥ 3, and estimate the difference between CQ_n(ε) and
EC_n(ε) in a certain manner. Comment: 26 pages
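The classical ε-DP condition on tuples of probability vectors is easy to check numerically; the sketch below verifies it for the pair of output distributions of binary randomized response. The function name is chosen here for illustration.

```python
import math

def is_eps_dp(vectors, eps: float) -> bool:
    """True iff e^eps * q - p has no negative entries for every ordered pair (p, q)."""
    e = math.exp(eps)
    return all(e * q[i] - p[i] >= 0.0
               for p in vectors for q in vectors for i in range(len(p)))

# randomized response with flip probability 1/4: an eps-DP pair for eps >= ln 3
pair = [[0.75, 0.25], [0.25, 0.75]]
```

The classical-quantum condition replaces entrywise nonnegativity of e^ε q − p with positive semi-definiteness of e^ε ρ − σ for density matrices.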
Differentially Private Uniformly Most Powerful Tests for Binomial Data
We derive uniformly most powerful (UMP) tests for simple and one-sided
hypotheses for a population proportion within the framework of Differential
Privacy (DP), optimizing finite sample performance. We show that in general, DP
hypothesis tests for exchangeable data can always be expressed as a function of
the empirical distribution. Using this structure, we prove a `Neyman-Pearson
lemma' for binomial data under DP, where the DP-UMP only depends on the sample
sum. Our tests can also be stated as a post-processing of a random variable,
whose distribution we coin "Truncated-Uniform-Laplace" (Tulap), a
generalization of the Staircase and discrete Laplace distributions.
Furthermore, we obtain exact p-values, which are easily computed in terms of
the Tulap random variable. We show that our results also apply to
distribution-free hypothesis tests for continuous data. Our simulation results
demonstrate that our tests have exact type I error, and are more powerful than
current techniques. Comment: 15 pages, 2 figures
Cyber-Resilient Transactive Energy System Design over Insecure Communication Links
In this paper, the privacy and security issues associated with transactive
energy systems over insecure communications are addressed. In particular, it is
ensured that, during market-based interactions: (1) each agent's bidding
information remains private; and (2) any extraneous data injection attack can
be easily detected. A unified cryptography-based approach that can
simultaneously achieve both objectives is developed, where privacy preservation
is realized by the Paillier encryption scheme, and attack detection is achieved
by the Paillier digital signature scheme. Simulation results verify the
effectiveness of the proposed cyber-resilient design for transactive energy
systems. Comment: 11 pages, 8 figures, journal submission
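The Paillier scheme's additive homomorphism — the property that lets agents aggregate encrypted bids without revealing them — can be sketched with toy parameters. The tiny primes below are for illustration only; a real deployment would use roughly 2048-bit keys and a vetted cryptographic library.

```python
import math
import random

# toy Paillier key pair (never use such small primes in practice)
p, q = 17, 19
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)  # Carmichael function of n
mu = pow(lam, -1, n)          # modular inverse used in decryption

def encrypt(m: int) -> int:
    """Paillier encryption with generator g = n + 1."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# additive homomorphism: multiplying ciphertexts adds the plaintexts (mod n)
bid_a, bid_b = 41, 7
aggregate = (encrypt(bid_a) * encrypt(bid_b)) % n2
```

Decrypting `aggregate` recovers bid_a + bid_b without ever decrypting either bid individually, which is exactly the privacy-preservation property used during market-based interactions.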