
    Special issue on the theory and practice of differential privacy

    This special issue presents papers based on contributions to the first international workshop on the “Theory and Practice of Differential Privacy” (TPDP), held in London, UK, on 18 April 2015 as part of the European Joint Conferences on Theory and Practice of Software (ETAPS). Differential privacy is a mathematically rigorous definition of the privacy protection provided by a data release mechanism: it offers a strong, guaranteed bound on what can be learned about a user as a result of participating in a differentially private data analysis. Researchers in differential privacy come from several areas of computer science, including algorithms, programming languages, security, databases, and machine learning, as well as from several areas of statistics and data analysis. The workshop was intended as an occasion for researchers from these different areas to discuss recent developments in the theory and practice of differential privacy. The program included 10 contributed talks, an invited talk, and an invited talk held jointly with the workshop “Hot Issues in Security Principles and Trust” (HotSpot 2015). Participants at the workshop were invited to submit papers to this special issue. Six papers were accepted, most of which directly reflect talks presented at the workshop.
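    For reference, the guarantee described above has a standard precise form: a randomized mechanism M is ε-differentially private if, for every pair of datasets D and D' differing in one individual's record and every set S of possible outputs,

        \Pr[M(D) \in S] \;\le\; e^{\varepsilon} \, \Pr[M(D') \in S].

    The parameter ε is the guaranteed bound mentioned above: the smaller it is, the less an observer can infer about any single participant.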

    A Theory AB Toolbox

    Randomized algorithms are a staple of the theoretical computer science literature. By careful use of randomness, algorithms can achieve properties that are simply not possible for deterministic algorithms. Today, these properties are proved on paper, by theoretical computer scientists; we investigate formally verifying these proofs. There are two main challenges: proofs about randomized algorithms can be quite complex, drawing on various facts from probability theory; and proofs are highly customized, so two proofs of the same property for two algorithms can be completely different. To overcome these challenges, we propose taking inspiration from paper proofs by building common tools (abstractions, reasoning principles, perhaps even notations) into a formal verification toolbox. To give an idea of our approach, we consider three common patterns in paper proofs: the union bound, concentration bounds, and martingale arguments.
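    As a concrete instance of the first pattern, the union bound states that for any events A_1, ..., A_n, not necessarily independent,

        \Pr\Big[\,\bigcup_{i=1}^{n} A_i\,\Big] \;\le\; \sum_{i=1}^{n} \Pr[A_i].

    A typical accuracy proof bounds the probability of each individual "bad event" and then sums these bounds, which is why the union bound is a natural first component for a reusable verification toolbox.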

    Hypothesis Testing Interpretations and Rényi Differential Privacy

    Differential privacy is a de facto standard in data privacy, with applications in the public and private sectors. One way to explain differential privacy, which is particularly appealing to statisticians and social scientists, is by means of its statistical hypothesis testing interpretation. Informally, one cannot effectively test whether a specific individual has contributed her data by observing the output of a private mechanism: no test can have both high significance and high power. In this paper, we identify conditions under which a privacy definition given in terms of a statistical divergence satisfies a similar interpretation. These conditions are useful for analyzing the distinguishability power of divergences, and we use them to study the hypothesis testing interpretation of some relaxations of differential privacy based on Rényi divergence. This analysis also yields an improved conversion rule between these definitions and differential privacy.
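    Concretely, for pure ε-differential privacy the interpretation (in the style of Wasserman and Zhou) is a trade-off between the two error probabilities of any test for the individual's participation: if P_I and P_II denote the Type I and Type II error probabilities of an arbitrary rejection rule, then

        1 - P_{II} \;\le\; e^{\varepsilon} \, P_I,

    so any test with small significance level necessarily has small power. The relaxations studied in the paper instead bound, over all adjacent datasets, the Rényi divergence of order α > 1 between the output distributions,

        D_{\alpha}(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1} \, \log \, \mathbb{E}_{x \sim Q}\!\left[\left(\frac{P(x)}{Q(x)}\right)^{\alpha}\right].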

    The Complexity of Verifying Boolean Programs as Differentially Private

    We study the complexity of verifying differential privacy for while-like programs working over Boolean values and making probabilistic choices. Programs in this class can be interpreted as finite-state discrete-time Markov chains (DTMCs). We show that the problem of deciding whether a program is differentially private for specific values of the privacy parameters is PSPACE-complete. To show that this problem is in PSPACE, we adapt classical results about computing hitting probabilities for DTMCs. To show PSPACE-hardness, we use a reduction from the problem of checking whether a program almost surely terminates. We also show that the problem of approximating the privacy parameters that a program provides is PSPACE-hard. Moreover, we investigate the complexity of similar problems for several relaxations of differential privacy: Rényi differential privacy, concentrated differential privacy, and truncated concentrated differential privacy. For these notions, we consider gap versions of the problem of deciding whether a program is private, and we show that all of them are PSPACE-complete.
    Comment: Appeared in CSF 202
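    In the loop-free, finite-state case the decision problem has a simple brute-force flavor, which the following sketch illustrates (a toy Python check under assumed simplifications: a hypothetical one-bit randomized-response mechanism, not the paper's while-language or its algorithm): enumerate the output distributions of a Boolean mechanism on two adjacent inputs and test the differential privacy inequality on every output event.

        import math

        def randomized_response(x: bool, p: float) -> dict:
            # Toy one-bit mechanism (hypothetical example): report the true
            # bit with probability p, its negation with probability 1 - p.
            return {x: p, (not x): 1.0 - p}

        def is_eps_dp(dist0: dict, dist1: dict, eps: float) -> bool:
            # Check Pr[M(D) = o] <= e^eps * Pr[M(D') = o] for every output o,
            # in both directions; for a finite output space, checking the
            # singleton events suffices. The tolerance absorbs floating-point
            # rounding.
            bound, tol = math.exp(eps), 1e-12
            for o in set(dist0) | set(dist1):
                p0, p1 = dist0.get(o, 0.0), dist1.get(o, 0.0)
                if p0 > bound * p1 + tol or p1 > bound * p0 + tol:
                    return False
            return True

        # Adjacent inputs differ in the single participant's bit.
        p = 0.75
        d0, d1 = randomized_response(False, p), randomized_response(True, p)
        eps_tight = math.log(p / (1 - p))  # ln 3, the tight epsilon for p = 0.75
        print(is_eps_dp(d0, d1, eps_tight))         # True
        print(is_eps_dp(d0, d1, eps_tight - 0.05))  # False: below the tight bound

    For programs with loops, the output distribution becomes a hitting distribution of the underlying DTMC, which is where the paper's PSPACE machinery enters.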