
    The Optimal Mechanism in Differential Privacy

    We derive the optimal $\epsilon$-differentially private mechanism for a single real-valued query function under a very general utility-maximization (or cost-minimization) framework. The class of noise probability distributions in the optimal mechanism has \emph{staircase-shaped} probability density functions which are symmetric (around the origin), monotonically decreasing, and geometrically decaying. The staircase mechanism can be viewed as a \emph{geometric mixture of uniform probability distributions}, providing a simple algorithmic description for the mechanism. Furthermore, the staircase mechanism naturally generalizes to discrete query output settings as well as more abstract settings. We explicitly derive the optimal noise probability distributions with minimum expectation of noise amplitude and power. Comparing the optimal performance with that of the Laplacian mechanism, we show that in the high privacy regime ($\epsilon$ is small), the Laplacian mechanism is asymptotically optimal as $\epsilon \to 0$; in the low privacy regime ($\epsilon$ is large), the minimum expectation of noise amplitude and the minimum noise power are $\Theta(\Delta e^{-\frac{\epsilon}{2}})$ and $\Theta(\Delta^2 e^{-\frac{2\epsilon}{3}})$ as $\epsilon \to +\infty$, while the expectation of noise amplitude and the noise power of the Laplacian mechanism are $\frac{\Delta}{\epsilon}$ and $\frac{2\Delta^2}{\epsilon^2}$, where $\Delta$ is the sensitivity of the query function. We conclude that the gains are more pronounced in the low privacy regime.
    Comment: 40 pages, 5 figures. Part of this work was presented at the DIMACS Workshop on Recent Work on Differential Privacy across Computer Science, October 24-26, 201
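    The "geometric mixture of uniform distributions" view suggests a direct sampling procedure: draw a random sign, a geometrically distributed stair index, and a uniform position within the chosen stair. Below is a minimal Python sketch along those lines; the parameter `gamma` (the fraction of each stair occupied by the inner, higher-density piece) and its default value are illustrative assumptions, not the authors' reference implementation.

```python
import math
import random

def staircase_noise(eps, delta, gamma=None):
    """Sample staircase-distributed noise as a geometric mixture of
    uniforms (sketch).  eps: privacy parameter; delta: query sensitivity;
    gamma in (0, 1): width fraction of the inner piece of each stair."""
    b = math.exp(-eps)                     # geometric decay factor e^{-eps}
    if gamma is None:
        # Assumed default: a choice aimed at small expected noise amplitude.
        gamma = 1.0 / (1.0 + math.exp(eps / 2.0))
    s = random.choice((-1, 1))             # symmetric around the origin
    g = 0                                  # stair index, P(G = i) = (1 - b) * b**i
    while random.random() < b:
        g += 1
    u = random.random()                    # uniform position inside the stair
    # Choose between the inner (higher-density) and outer piece of stair g.
    p_outer = (1 - gamma) * b / (gamma + (1 - gamma) * b)
    if random.random() < p_outer:
        x = (g + gamma + (1 - gamma) * u) * delta
    else:
        x = (g + gamma * u) * delta
    return s * x
```

    As $\epsilon$ grows, $b = e^{-\epsilon}$ shrinks and the mass concentrates on the innermost stair, which is consistent with the exponentially small noise amplitudes quoted in the abstract.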

    Differentially Private Convex Optimization with Piecewise Affine Objectives

    Differential privacy is a recently proposed notion of privacy that provides strong privacy guarantees without any assumptions on the adversary. This paper studies the problem of computing a differentially private solution to convex optimization problems whose objective function is piecewise affine. Such problems are motivated by applications in which the affine functions that define the objective contain sensitive user information. We propose several privacy-preserving mechanisms and analyze the trade-offs between optimality and the level of privacy for these mechanisms. Numerical experiments are also presented to evaluate their performance in practice.
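    As a concrete illustration of such a mechanism, the sketch below privatizes $\min_x \max_i (a_i^\top x + b_i)$ by adding Laplace noise to the offsets $b_i$ (assumed here to carry the sensitive user information) and solving the perturbed problem as an epigraph linear program. This is a generic perturbation scheme under an assumed $\ell_1$-sensitivity bound `sensitivity`, not necessarily one of the paper's mechanisms.

```python
import numpy as np
from scipy.optimize import linprog

def dp_piecewise_affine_min(A, b, eps, sensitivity, rng=None):
    """Sketch: perturb the sensitive offsets b with Laplace noise of scale
    sensitivity / eps, then solve min_x max_i (A[i] @ x + b_noisy[i])
    exactly via the epigraph LP:  min t  s.t.  A x + b_noisy <= t."""
    rng = np.random.default_rng() if rng is None else rng
    b_noisy = b + rng.laplace(scale=sensitivity / eps, size=b.shape)
    m, n = A.shape
    c = np.zeros(n + 1)
    c[-1] = 1.0                               # minimize the epigraph variable t
    A_ub = np.hstack([A, -np.ones((m, 1))])   # A x - t <= -b_noisy
    res = linprog(c, A_ub=A_ub, b_ub=-b_noisy,
                  bounds=[(None, None)] * (n + 1))
    return res.x[:n], res.fun                 # minimizer and perturbed optimum
```

    Since the solver only ever sees `b_noisy`, the output inherits the privacy of the Laplace perturbation by post-processing; the optimality loss introduced by the noise is the kind of trade-off the paper analyzes.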

    Extremal Mechanisms for Local Differential Privacy

    Local differential privacy has recently surfaced as a strong measure of privacy in contexts where personal information remains private even from data analysts. Working in a setting where both the data providers and data analysts want to maximize the utility of statistical analyses performed on the released data, we study the fundamental trade-off between local differential privacy and utility. This trade-off is formulated as a constrained optimization problem: maximize utility subject to local differential privacy constraints. We introduce a combinatorial family of extremal privatization mechanisms, which we call staircase mechanisms, and show that it contains the optimal privatization mechanisms for a broad class of information-theoretic utilities such as mutual information and $f$-divergences. We further prove that for any utility function and any privacy level, solving the privacy-utility maximization problem is equivalent to solving a finite-dimensional linear program, the outcome of which is the optimal staircase mechanism. However, solving this linear program can be computationally expensive, since it has a number of variables that is exponential in the size of the alphabet the data lives in. To account for this, we show that two simple privatization mechanisms, the binary and randomized response mechanisms, are universally optimal in the low and high privacy regimes, and well approximate the intermediate regime.
    Comment: 52 pages, 10 figures in JMLR 201
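    For concreteness, here is $k$-ary randomized response in Python, one of the two simple mechanisms the abstract singles out as universally optimal in an extreme privacy regime. The $k$-ary form below is the common textbook construction and is assumed to match the paper's usage.

```python
import math
import random

def randomized_response(x, alphabet, eps):
    """k-ary randomized response: report the true symbol x with probability
    e^eps / (e^eps + k - 1), otherwise a uniformly random other symbol.
    Every output is at most e^eps times more likely under one input than
    another, so the mechanism satisfies eps-local differential privacy."""
    k = len(alphabet)
    p_true = math.exp(eps) / (math.exp(eps) + k - 1)
    if random.random() < p_true:
        return x
    return random.choice([a for a in alphabet if a != x])
```

    For example, `randomized_response("b", ["a", "b", "c"], eps=1.0)` reports the true symbol with probability $e / (e + 2) \approx 0.576$ and each of the other two symbols with probability $\approx 0.212$.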

    Development and Analysis of Deterministic Privacy-Preserving Policies Using Non-Stochastic Information Theory

    A deterministic privacy metric using non-stochastic information theory is developed. In particular, minimax information is used to construct a measure of information leakage, which is inversely proportional to the measure of privacy. Anyone can submit a query to a trusted agent with access to a non-stochastic, uncertain private dataset. Optimal deterministic privacy-preserving policies for responding to the submitted query are computed by maximizing the measure of privacy subject to a constraint on the worst-case quality of the response (i.e., the worst-case difference between the agent's response and the output of the query computed on the private dataset). The optimal privacy-preserving policy is proved to be a piecewise constant function, taking the form of a quantization operator applied to the output of the submitted query. The measure of privacy is also used to analyze the performance of the $k$-anonymity methodology (a popular deterministic mechanism for privacy-preserving release of datasets using suppression and generalization techniques), proving that it is in fact not privacy-preserving.
    Comment: improved introduction and numerical example
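    The quantization result lends itself to a short sketch: report only the midpoint of the bin that the true query output falls in. The uniform bin width below is an illustrative assumption; in the paper the optimal quantizer is derived from the worst-case quality constraint.

```python
import math

def quantized_response(y, bin_width):
    """Piecewise-constant privacy-preserving policy (sketch): map the true
    query output y to the midpoint of its quantization bin.  The worst-case
    response error is bin_width / 2; wider bins leak less information at
    the cost of a lower-quality response."""
    k = math.floor(y / bin_width)
    return (k + 0.5) * bin_width
```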