
    Information Extraction Under Privacy Constraints

    A privacy-constrained information extraction problem is considered where, for a pair of correlated discrete random variables (X,Y) governed by a given joint distribution, an agent observes Y and wants to convey to a potentially public user as much information about Y as possible without compromising the amount of information revealed about X. To this end, the so-called rate-privacy function is introduced to quantify the maximal amount of information (measured in terms of mutual information) that can be extracted from Y under a privacy constraint between X and the extracted information, where privacy is measured using either mutual information or maximal correlation. Properties of the rate-privacy function are analyzed, and information-theoretic and estimation-theoretic interpretations are presented for both the mutual information and maximal correlation privacy measures. It is also shown that the rate-privacy function admits a closed-form expression for a large family of joint distributions of (X,Y). Finally, the rate-privacy function under the mutual information privacy measure is considered for the case where (X,Y) has a joint probability density function, by studying the problem where the extracted information is a uniform quantization of Y corrupted by additive Gaussian noise. The asymptotic behavior of the rate-privacy function is studied as the quantization resolution grows without bound, and it is observed that not all of the properties of the rate-privacy function carry over from the discrete to the continuous case.
    Comment: 55 pages, 6 figures. Improved the organization and added a detailed literature review.
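    For concreteness, the rate-privacy function described in the abstract can be written as a constrained optimization. The formulation below is a sketch under the mutual information privacy measure; the symbols g_epsilon for the function, epsilon for the privacy threshold, and Z for the extracted information are notational assumptions here rather than quotations from the paper.

        % Rate-privacy function (sketch): Z is generated from Y alone through a
        % (possibly randomized) mapping P_{Z|Y}, so X -- Y -- Z forms a Markov
        % chain; leakage about X is capped at epsilon while the utility I(Y;Z)
        % is maximized.
        \[
          g_{\epsilon}(X;Y) \;=\;
          \sup_{\substack{P_{Z\mid Y}\,:\; X \to Y \to Z \\ I(X;Z)\,\le\,\epsilon}}
          I(Y;Z).
        \]
        % Under the maximal correlation privacy measure, the constraint
        % I(X;Z) <= epsilon is replaced by a bound on the maximal correlation
        % between X and Z.

    Under this sketch, g_0(X;Y) corresponds to the perfect-privacy case: the largest amount of information about Y that can be disclosed while revealing nothing about X.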