Privacy-Constrained Remote Source Coding
We consider the problem of revealing/sharing data in an efficient and secure
way via a compact representation. The representation should ensure reliable
reconstruction of the desired features/attributes while still preserving privacy
of the secret parts of the data. The problem is formulated as a remote lossy
source coding with a privacy constraint where the remote source consists of
public and secret parts. Inner and outer bounds for the optimal tradeoff region
of compression rate, distortion, and privacy leakage rate are given and shown
to coincide for some special cases. When specializing the distortion measure to
a logarithmic loss function, the resulting rate-distortion-leakage tradeoff for
the case of identical side information forms an optimization problem which
corresponds to the "secure" version of the so-called information bottleneck.
Comment: 10 pages, 1 figure, to be presented at ISIT 201
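Under logarithmic loss, the rate-leakage tradeoff described above becomes an information-bottleneck-style optimization over test channels p(z|y). A minimal numerical sketch, using toy binary distributions of my own choosing (not from the paper), that evaluates the rate term I(Y;Z) and the leakage term I(X;Z) for one candidate test channel:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros skipped)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(p_joint):
    """I(A;B) in bits from a joint pmf given as a 2-D array."""
    p_a = p_joint.sum(axis=1)
    p_b = p_joint.sum(axis=0)
    return entropy(p_a) + entropy(p_b) - entropy(p_joint.ravel())

# Toy remote source: secret X and observable Y, correlated via a BSC(0.1).
p_x = np.array([0.5, 0.5])
p_y_given_x = np.array([[0.9, 0.1],
                        [0.1, 0.9]])
p_xy = p_x[:, None] * p_y_given_x

# Candidate compact representation Z of Y: another BSC, here with crossover 0.2.
p_z_given_y = np.array([[0.8, 0.2],
                        [0.2, 0.8]])
p_yz = p_xy.sum(axis=0)[:, None] * p_z_given_y  # joint of (Y, Z)
p_xz = p_xy @ p_z_given_y                        # joint of (X, Z), since X - Y - Z

rate = mutual_information(p_yz)     # compression-rate term I(Y;Z)
leakage = mutual_information(p_xz)  # privacy-leakage term I(X;Z)
print(f"I(Y;Z) = {rate:.3f} bits, I(X;Z) = {leakage:.3f} bits")
```

By the data-processing inequality the leakage can never exceed the rate term here; the optimization the abstract refers to searches over all such test channels rather than evaluating a single one.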
Lossy Source Coding with Reconstruction Privacy
We consider the problem of lossy source coding with side information under a
privacy constraint that the reconstruction sequence at a decoder should be kept
secret to a certain extent from another terminal such as an eavesdropper, a
sender, or a helper. We are interested in how the reconstruction privacy
constraint at a particular terminal affects the rate-distortion tradeoff. In
this work, we allow the decoder to use a random mapping, and give inner and
outer bounds to the rate-distortion-equivocation region for different cases
where the side information is available non-causally and causally at the
decoder. In the special case where each reconstruction symbol depends only on
the source description and current side information symbol, the complete
rate-distortion-equivocation region is provided. A binary example is given,
illustrating a new tradeoff due to the new privacy constraint and a gain from
the use of a stochastic decoder.
Comment: 22 pages, added proofs, to be presented at ISIT 201
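The gain from a stochastic decoder can be illustrated with a toy binary computation (assumed parameters, not the paper's example): a decoder that adds independent Bernoulli noise to its output raises the equivocation of the reconstruction at a terminal that sees the source description, at the cost of extra distortion.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

# Assumed toy setup: the decoder holds a description bit M and outputs
# Xhat = M XOR N with N ~ Bernoulli(delta), independent of everything else.
# A terminal that sees M then has equivocation H(Xhat | M) = h2(delta),
# while the added Hamming distortion is at most delta.
for delta in (0.0, 0.05, 0.11):
    print(f"delta={delta:.2f}: equivocation = {h2(delta):.3f} bits, "
          f"extra distortion <= {delta:.2f}")
```

A deterministic decoder (delta = 0) yields zero equivocation of the reconstruction; randomization buys reconstruction privacy at a distortion cost, which is the new tradeoff the abstract points to.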
Wyner-Ziv Coding over Broadcast Channels: Digital Schemes
This paper addresses lossy transmission of a common source over a broadcast
channel when there is correlated side information at the receivers, with
emphasis on the quadratic Gaussian and binary Hamming cases. A digital scheme
that combines ideas from the lossless version of the problem, i.e.,
Slepian-Wolf coding over broadcast channels, and dirty paper coding, is
presented and analyzed. This scheme uses layered coding where the common layer
information is intended for both receivers and the refinement information is
destined only for one receiver. For the quadratic Gaussian case, a quantity
characterizing the overall quality of each receiver is identified in terms of
channel and side information parameters. It is shown that it is more
advantageous to send the refinement information to the receiver with "better"
overall quality. In the case where all receivers have the same overall quality,
the presented scheme becomes optimal. Unlike its lossless counterpart, however,
the problem eludes a complete characterization.
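For intuition on the quadratic Gaussian case, the standard single-receiver Wyner-Ziv rate-distortion function with side information Y = X + N is R(D) = (1/2) log2(Var(X|Y)/D). A sketch of this textbook formula (not the paper's broadcast "overall quality" quantity), showing why receivers with different side-information noise are asymmetric:

```python
import numpy as np

def wyner_ziv_gaussian_rate(var_x, var_n, D):
    """Wyner-Ziv rate (bits/sample) for X ~ N(0, var_x) with decoder side
    information Y = X + N, N ~ N(0, var_n), under MSE distortion target D."""
    var_x_given_y = var_x * var_n / (var_x + var_n)  # conditional variance of X given Y
    if D >= var_x_given_y:
        return 0.0  # side information alone already meets the target
    return 0.5 * np.log2(var_x_given_y / D)

# A receiver with better side information (smaller var_n) needs fewer bits
# for the same distortion target -- the asymmetry a layered scheme exploits.
for var_n in (1.0, 0.25):
    print(f"var_n={var_n}: R(D=0.1) = {wyner_ziv_gaussian_rate(1.0, var_n, 0.1):.3f} bits")
```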
Information Extraction Under Privacy Constraints
A privacy-constrained information extraction problem is considered where, for
a pair of correlated discrete random variables $(X,Y)$ governed by a given
joint distribution, an agent observes $Y$ and wants to convey to a potentially
public user as much information about $Y$ as possible without compromising the
amount of information revealed about $X$. To this end, the so-called {\em
rate-privacy function} is introduced to quantify the maximal amount of
information (measured in terms of mutual information) that can be extracted
from $Y$ under a privacy constraint between $X$ and the extracted information,
where privacy is measured using either mutual information or maximal
correlation. Properties of the rate-privacy function are analyzed and
information-theoretic and estimation-theoretic interpretations of it are
presented for both the mutual information and maximal correlation privacy
measures. It is also shown that the rate-privacy function admits a closed-form
expression for a large family of joint distributions of $(X,Y)$. Finally, the
rate-privacy function under the mutual information privacy measure is
considered for the case where $(X,Y)$ has a joint probability density function
by studying the problem where the extracted information is a uniform
quantization of $Y$ corrupted by additive Gaussian noise. The asymptotic
behavior of the rate-privacy function is studied as the quantization resolution
grows without bound and it is observed that not all of the properties of the
rate-privacy function carry over from the discrete to the continuous case.
Comment: 55 pages, 6 figures. Improved the organization and added detailed
literature review
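The rate-privacy function can be approximated by brute force for a toy pair. A sketch under assumptions of my own (a binary symmetric pair and a binary-output restriction on the extracted variable Z, so the search only lower-bounds the true rate-privacy function, which optimizes over all Z alphabets):

```python
import numpy as np
from itertools import product

def entropy(p):
    """Shannon entropy in bits (zeros skipped)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mi(p_joint):
    """Mutual information in bits from a 2-D joint pmf."""
    return entropy(p_joint.sum(1)) + entropy(p_joint.sum(0)) - entropy(p_joint.ravel())

# Assumed toy pair: X uniform, Y = X passed through a BSC(0.1).
p_x = np.array([0.5, 0.5])
p_y_given_x = np.array([[0.9, 0.1], [0.1, 0.9]])
p_xy = p_x[:, None] * p_y_given_x
p_y = p_xy.sum(0)

def rate_privacy(eps, grid=101):
    """Grid search: max I(Y;Z) over binary channels Z|Y with I(X;Z) <= eps."""
    best = 0.0
    for a, b in product(np.linspace(0, 1, grid), repeat=2):
        p_z_given_y = np.array([[1 - a, a], [b, 1 - b]])
        leak = mi(p_xy @ p_z_given_y)  # I(X;Z), via the Markov chain X - Y - Z
        if leak <= eps + 1e-9:
            best = max(best, mi(p_y[:, None] * p_z_given_y))  # I(Y;Z)
    return best

for eps in (0.0, 0.1, 1.0):
    print(f"g({eps}) ~= {rate_privacy(eps):.3f} bits (binary-Z lower bound)")
```

For this symmetric pair, perfect privacy (eps = 0) forces Z to be independent of Y, so the search returns zero there; relaxing the leakage budget lets I(Y;Z) grow toward H(Y).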