
    Context-Aware Generative Adversarial Privacy

    Preserving the utility of published datasets while simultaneously providing provable privacy guarantees is a well-known challenge. On the one hand, context-free privacy solutions, such as differential privacy, provide strong privacy guarantees, but often lead to a significant reduction in utility. On the other hand, context-aware privacy solutions, such as information theoretic privacy, achieve an improved privacy-utility tradeoff, but assume that the data holder has access to dataset statistics. We circumvent these limitations by introducing a novel context-aware privacy framework called generative adversarial privacy (GAP). GAP leverages recent advancements in generative adversarial networks (GANs) to allow the data holder to learn privatization schemes from the dataset itself. Under GAP, learning the privacy mechanism is formulated as a constrained minimax game between two players: a privatizer that sanitizes the dataset in a way that limits the risk of inference attacks on the individuals' private variables, and an adversary that tries to infer the private variables from the sanitized dataset. To evaluate GAP's performance, we investigate two simple (yet canonical) statistical dataset models: (a) the binary data model, and (b) the binary Gaussian mixture model. For both models, we derive game-theoretically optimal minimax privacy mechanisms, and show that the privacy mechanisms learned from data (in a generative adversarial fashion) match the theoretically optimal ones. This demonstrates that our framework can be easily applied in practice, even in the absence of dataset statistics. Comment: Improved version of a paper accepted by Entropy Journal, Special Issue on Information Theory in Machine Learning and Data Science.
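
    The constrained minimax game described in this abstract can be made concrete with a short training sketch. The following is a minimal, illustrative PyTorch loop for the binary Gaussian mixture case; the network sizes, distortion weight `rho`, and mixture parameters are assumed values for the example, not the paper's architecture or hyperparameters.

```python
# Minimal, illustrative sketch of a GAP-style minimax game for a binary
# Gaussian mixture source. All model and training choices here are assumptions.
import torch
import torch.nn as nn

privatizer = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 1))  # (x, noise) -> sanitized x_hat
adversary = nn.Sequential(nn.Linear(1, 8), nn.ReLU(), nn.Linear(8, 1))   # x_hat -> logit for private bit

opt_p = torch.optim.Adam(privatizer.parameters(), lr=1e-3)
opt_a = torch.optim.Adam(adversary.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()
rho = 1.0  # weight enforcing the utility (distortion) constraint

for step in range(2000):
    s = (torch.rand(64, 1) < 0.5).float()        # private binary variable
    x = (2 * s - 1) + 0.5 * torch.randn(64, 1)   # public data: Gaussian mixture around +/-1
    z = torch.randn(64, 1)                       # randomization seed for the privatizer

    # Adversary step: improve inference of the private variable from the sanitized data.
    x_hat = privatizer(torch.cat([x, z], dim=1)).detach()
    loss_a = bce(adversary(x_hat), s)
    opt_a.zero_grad(); loss_a.backward(); opt_a.step()

    # Privatizer step: degrade the adversary's inference while limiting distortion.
    x_hat = privatizer(torch.cat([x, z], dim=1))
    loss_p = -bce(adversary(x_hat), s) + rho * ((x_hat - x) ** 2).mean()
    opt_p.zero_grad(); loss_p.backward(); opt_p.step()
```

    The privatizer minimizes the negative of the adversary's loss plus a distortion penalty, which mirrors (in penalized form) the constrained privatizer-versus-adversary game the abstract describes.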

    Entropy in general physical theories

    Information plays an important role in our understanding of the physical world. We hence propose an entropic measure of information for any physical theory that admits systems, states and measurements. In the quantum and classical world, our measure reduces to the von Neumann and Shannon entropy, respectively. It can even be used in a quantum or classical setting where we are only allowed to perform a limited set of operations. In a world that admits superstrong correlations in the form of non-local boxes, our measure can be used to analyze protocols such as superstrong random access encodings and the violation of 'information causality'. However, we also show that in such a world no entropic measure can exhibit all properties we commonly accept in a quantum setting. For example, there exists no 'reasonable' measure of conditional entropy that is subadditive. Finally, we prove a coding theorem for some theories that is analogous to the quantum and classical setting, providing us with an appealing operational interpretation. Comment: 20 pages, revtex, 7 figures, v2: Coding theorem revised, published version.
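
    For reference, the two special cases mentioned in this abstract are the standard textbook definitions (the paper's general entropic measure is not reproduced here):

```latex
% Shannon entropy of a classical distribution p over alphabet X
H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x)

% von Neumann entropy of a quantum state (density operator) \rho
S(\rho) = -\operatorname{Tr}\!\left(\rho \log \rho\right)
```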

    Robust Privacy-Utility Tradeoffs Under Differential Privacy and Hamming Distortion

    A privacy-utility tradeoff is developed for an arbitrary set of finite-alphabet source distributions. Privacy is quantified using differential privacy (DP), and utility is quantified using expected Hamming distortion maximized over the set of distributions. The family of source distribution sets (source sets) is categorized into three classes, based on different levels of prior knowledge they capture. For source sets whose convex hull includes the uniform distribution, symmetric DP mechanisms are optimal. For source sets whose probability values have a fixed monotonic ordering, asymmetric DP mechanisms are optimal. For all other source sets, general upper and lower bounds on the optimal privacy leakage are developed, and necessary and sufficient conditions for tightness are established. Differentially private leakage is an upper bound on mutual information leakage: the two criteria are compared analytically and numerically to illustrate the effect of adopting a stronger privacy criterion.
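
    As a concrete illustration of the symmetric mechanisms this abstract refers to, the classical randomized-response mechanism is epsilon-DP on a finite alphabet, and its expected Hamming distortion is simply the probability of altering the symbol. The sketch below uses assumed values for the alphabet size and epsilon and is not taken from the paper.

```python
# Illustrative randomized-response mechanism: a symmetric eps-DP mechanism on a
# finite alphabet {0, ..., k-1}. Alphabet size and epsilon are assumed values.
import numpy as np

def randomized_response(x, k, eps, rng=None):
    """Keep x with probability e^eps / (e^eps + k - 1); otherwise output a
    uniformly chosen different symbol. The likelihood ratio between any two
    inputs is at most e^eps, so the mechanism is eps-differentially private."""
    rng = rng or np.random.default_rng()
    p_keep = np.exp(eps) / (np.exp(eps) + k - 1)
    if rng.random() < p_keep:
        return x
    other = rng.integers(k - 1)          # uniform over the k - 1 remaining symbols
    return other if other < x else other + 1

eps, k = 1.0, 4
# Expected Hamming distortion = probability that the symbol is changed.
expected_distortion = (k - 1) / (np.exp(eps) + k - 1)
print(randomized_response(2, k, eps), expected_distortion)
```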