Nonasymptotic noisy lossy source coding
This paper shows new general nonasymptotic achievability and converse bounds
and performs their dispersion analysis for the lossy compression problem in
which the compressor observes the source through a noisy channel. While this
problem is asymptotically equivalent to a noiseless lossy source coding problem
with a modified distortion function, nonasymptotically there is a noticeable
gap in how fast their minimum achievable coding rates approach the common
rate-distortion function, as evidenced both by the refined asymptotic analysis
(dispersion) and the numerical results. The size of the gap between the
dispersions of the noisy problem and the asymptotically equivalent noiseless
problem depends on the stochastic variability of the channel through which the
compressor observes the source.
Comment: IEEE Transactions on Information Theory, 201
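The dispersion analysis the abstract refers to is commonly summarized by the normal (Gaussian) approximation to the minimum achievable rate at blocklength n. A minimal sketch of that approximation follows; the function name and the numerical values are illustrative assumptions, not taken from the paper:

```python
from math import sqrt
from statistics import NormalDist

def gaussian_approx_rate(rd, V, n, eps):
    """Standard dispersion (normal) approximation to the minimum coding rate
    at blocklength n with excess-distortion probability eps:
        R(n, eps) ~ R(D) + sqrt(V / n) * Q^{-1}(eps).
    rd is the rate-distortion function value R(D); V is the dispersion."""
    q_inv = NormalDist().inv_cdf(1.0 - eps)  # inverse Gaussian Q-function
    return rd + sqrt(V / n) * q_inv

# Illustrative (made-up) values: R(D) = 0.5 bit, V = 0.3 bit^2, eps = 0.01.
# A larger dispersion V means a slower approach to R(D) as n grows, which is
# exactly the kind of gap the abstract describes between the noisy problem
# and its asymptotically equivalent noiseless counterpart.
for n in (100, 1000, 10000):
    print(n, round(gaussian_approx_rate(0.5, 0.3, n, 0.01), 4))
```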
The Three-Terminal Interactive Lossy Source Coding Problem
The three-node multiterminal lossy source coding problem is investigated. We
derive an inner bound to the general rate-distortion region of this problem
which is a natural extension of the seminal work by Kaspi'85 on the interactive
two-terminal source coding problem. It is shown that this (rather involved)
inner bound contains several rate-distortion regions of some relevant source
coding settings. In this way, besides the non-trivial extension of the
interactive two terminal problem, our results can be seen as a generalization
and hence unification of several previous works in the field. Specializing to
particular cases we obtain novel rate-distortion regions for several lossy
source coding problems. We finish by describing some of the open problems and
challenges. The general three-node multiterminal lossy source coding
problem, however, appears to present formidable mathematical complexity.
Comment: New version with changes suggested by reviewers. Revised and
resubmitted to IEEE Transactions on Information Theory. 92 pages, 11 figures,
1 table
Critical Behavior in Lossy Source Coding
The following critical phenomenon was recently discovered. When a memoryless
source is compressed using a variable-length fixed-distortion code, the fastest
convergence rate of the (pointwise) compression ratio to the optimal rate
$R(D)$ bits/symbol is either $O(1/\sqrt{n})$ or $O((\log n)/n)$. We show it is
always $O(1/\sqrt{n})$, except for discrete, uniformly distributed sources.
Comment: 2 figures
Lossy Source Coding with Reconstruction Privacy
We consider the problem of lossy source coding with side information under a
privacy constraint that the reconstruction sequence at a decoder should be kept
secret to a certain extent from another terminal such as an eavesdropper, a
sender, or a helper. We are interested in how the reconstruction privacy
constraint at a particular terminal affects the rate-distortion tradeoff. In
this work, we allow the decoder to use a random mapping, and give inner and
outer bounds to the rate-distortion-equivocation region for different cases
where the side information is available non-causally and causally at the
decoder. In the special case where each reconstruction symbol depends only on
the source description and current side information symbol, the complete
rate-distortion-equivocation region is provided. A binary example is given
that illustrates a new tradeoff arising from the privacy constraint, as well
as a gain from the use of a stochastic decoder.
Comment: 22 pages, added proofs, to be presented at ISIT 201
Lossy Source Coding via Spatially Coupled LDGM Ensembles
We study a new encoding scheme for lossy source compression based on
spatially coupled low-density generator-matrix codes. We develop a
belief-propagation guided-decimation algorithm, and show that this algorithm
allows one to approach the optimal distortion of spatially coupled ensembles.
Moreover, using the survey propagation formalism, we also observe that the
optimal distortions of the spatially coupled and individual code ensembles are
the same. Since regular low-density generator-matrix codes are known to achieve
the Shannon rate-distortion bound under optimal encoding as the degrees grow,
our results suggest that spatial coupling can be used to reach the
rate-distortion bound, under a {\it low complexity} belief-propagation
guided-decimation algorithm.
This problem is analogous to the MAX-XORSAT problem in computer science.
Comment: Submitted to ISIT 201
Second-Order Coding Rates for Conditional Rate-Distortion
This paper characterizes the second-order coding rates for lossy source
coding with side information available at both the encoder and the decoder. We
first provide non-asymptotic bounds for this problem and then specialize the
non-asymptotic bounds for three different scenarios: discrete memoryless
sources, Gaussian sources, and Markov sources. We obtain the second-order
coding rates for these settings. It is interesting to observe that the
second-order coding rate for Gaussian source coding with Gaussian side
information available at both the encoder and the decoder is the same as that
for Gaussian source coding without side information. Furthermore, regardless of
the variance of the side information, the dispersion is 1/2 nats squared per
source symbol.
Comment: 20 pages, 2 figures, second-order coding rates, finite blocklength,
network information theory
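The second-order result for the Gaussian case can be sketched numerically. The sketch below assumes the standard Gaussian rate-distortion function R(D) = (1/2) ln(sigma^2/D) nats and a dispersion of V = 1/2 nats squared; the function names and sample parameters are illustrative assumptions:

```python
from math import log, sqrt
from statistics import NormalDist

def gaussian_rd(var, D):
    """Rate-distortion function of a memoryless Gaussian source (nats/symbol)."""
    return 0.5 * log(var / D) if D < var else 0.0

def second_order_rate(var, D, n, eps, V=0.5):
    """Normal-approximation rate at blocklength n and excess-distortion
    probability eps: R(D) + sqrt(V/n) * Q^{-1}(eps), with V in nats^2.
    V = 1/2 is the assumed Gaussian dispersion."""
    q_inv = NormalDist().inv_cdf(1.0 - eps)  # inverse Q-function
    return gaussian_rd(var, D) + sqrt(V / n) * q_inv

# The finite-blocklength rate decreases toward R(D) as n grows; because the
# dispersion does not depend on the side-information variance, this curve is
# the same with or without Gaussian side information at both terminals.
for n in (100, 1000, 10000):
    print(n, round(second_order_rate(var=1.0, D=0.25, n=n, eps=0.01), 4))
```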