
Lower bounds in differential privacy

Abstract

This paper is about private data analysis, in which a trusted curator holding a confidential database responds to real vector-valued queries. A common approach to ensuring privacy for the database elements is to add appropriately generated random noise to the answers, releasing only these noisy responses. In this paper, we investigate various lower bounds on the noise required to maintain different kinds of privacy guarantees.

Comment: Corrected some minor errors and typos. To appear in Theory of Cryptography Conference (TCC) 201
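To make the noise-addition approach described in the abstract concrete, here is a minimal illustrative sketch in Python of adding Laplace noise to a vector-valued query answer, the standard way to achieve epsilon-differential privacy. The function name, parameters, and the example counts below are assumptions for illustration only; the paper itself studies lower bounds on how much noise any such mechanism must add, not a particular mechanism.

import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Return a noisy version of a vector-valued query answer.

    Illustrative sketch only: adds i.i.d. Laplace noise calibrated to the
    query's L1-sensitivity, producing the kind of "noisy responses" the
    abstract refers to.
    """
    rng = rng or np.random.default_rng()
    answer = np.asarray(true_answer, dtype=float)
    scale = sensitivity / epsilon          # noise scale grows as epsilon shrinks
    noise = rng.laplace(loc=0.0, scale=scale, size=answer.shape)
    return answer + noise

# Hypothetical example: a histogram query whose L1-sensitivity is 1,
# since adding or removing one record changes a single bin by 1.
true_counts = np.array([120.0, 57.0, 33.0])
noisy_counts = laplace_mechanism(true_counts, sensitivity=1.0, epsilon=0.5)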
