A Universal Scheme for Wyner-Ziv Coding of Discrete Sources
We consider the Wyner-Ziv (WZ) problem of lossy compression where the decompressor observes a noisy version of the source, whose statistics are unknown. A new family of WZ coding algorithms is proposed and their universal optimality is proven. Compression consists of sliding-window processing followed by Lempel-Ziv (LZ) compression, while the decompressor is based on a modification of the discrete universal denoiser (DUDE) algorithm to take advantage of side information. The new algorithms not only universally attain the fundamental limits, but also suggest a paradigm for practical WZ coding. The effectiveness of our approach is illustrated with experiments on binary images and English text, using a low-complexity algorithm motivated by our class of universally optimal WZ codes.
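The two-pass structure of the DUDE can be sketched for the binary symmetric channel. This is a minimal illustration of the baseline denoiser only, without the side-information modification the paper proposes; the function name and interface are assumptions for the sketch, and the channel crossover probability `delta` is taken as known.

```python
from collections import defaultdict

def dude_bsc(z, k, delta):
    """Baseline binary DUDE: denoise z (a list of 0/1 symbols) observed
    through a BSC with crossover probability delta, using two-sided
    contexts of length k on each side and Hamming loss.
    This is a sketch of the standard DUDE, not the paper's WZ variant."""
    n = len(z)
    # Pass 1: for each two-sided context, count the observed center symbols.
    counts = defaultdict(lambda: [0, 0])
    for i in range(k, n - k):
        ctx = (tuple(z[i - k:i]), tuple(z[i + 1:i + 1 + k]))
        counts[ctx][z[i]] += 1
    # Pass 2: the BSC/Hamming specialization of the DUDE rule flips z_i iff
    # m[1 - z_i] / m[z_i] >= ((1-delta)^2 + delta^2) / (2 delta (1-delta)).
    thresh = ((1 - delta) ** 2 + delta ** 2) / (2 * delta * (1 - delta))
    out = list(z)
    for i in range(k, n - k):
        ctx = (tuple(z[i - k:i]), tuple(z[i + 1:i + 1 + k]))
        m = counts[ctx]
        if m[1 - z[i]] / m[z[i]] >= thresh:  # m[z[i]] >= 1: position i was counted
            out[i] = 1 - z[i]
    # Boundary symbols (i < k or i >= n - k) are left unchanged.
    return out
```

On a long mostly-zero sequence, an isolated 1 sits in a context whose counts overwhelmingly favor 0, so the threshold test flips it back.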
Side-information Scalable Source Coding
The problem of side-information scalable (SI-scalable) source coding is
considered in this work, where the encoder constructs a progressive
description, such that the receiver with high quality side information will be
able to truncate the bitstream and reconstruct in the rate-distortion sense,
while the receiver with low quality side information will have to receive
further data in order to decode. We provide inner and outer bounds for general
discrete memoryless sources. The achievable region is shown to be tight for the
case that either of the decoders requires a lossless reconstruction, as well as
the case with degraded deterministic distortion measures. Furthermore, we show
that the gap between the achievable region and the outer bounds can be bounded
by a constant when the squared-error distortion measure is used. The notion of
perfectly scalable coding is introduced, in which both stages operate on the
Wyner-Ziv bound, and necessary and sufficient conditions are given for sources
satisfying a mild support condition. Using SI-scalable coding and successive
refinement Wyner-Ziv coding as basic building blocks, a complete
characterization is provided for the important quadratic Gaussian source with
multiple jointly Gaussian side-informations, where the side information quality
does not have to be monotonic along the scalable coding order. A partial result
is provided for the doubly symmetric binary source with Hamming distortion when
the worse side information is a constant, for which one of the outer bounds is
strictly tighter than the other.
Comment: 35 pages, submitted to IEEE Transactions on Information Theory
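For reference, the Wyner-Ziv bound that both stages of a perfectly scalable code must attain is, in the quadratic Gaussian case, simply the conditional rate-distortion function (a standard fact for jointly Gaussian source and side information under squared-error distortion; the notation below is ours, not the paper's):

```latex
% Quadratic Gaussian Wyner-Ziv rate-distortion function:
% X and side information Y jointly Gaussian, squared-error distortion D,
% \sigma^2_{X|Y} the conditional variance of X given Y, \log^+ t = \max(\log t, 0).
R_{WZ}(D) \;=\; R_{X|Y}(D) \;=\; \frac{1}{2}\,\log^{+}\!\frac{\sigma^2_{X|Y}}{D}
```

In this special case there is no rate loss relative to an encoder that also observes Y, which is what makes "operating on the Wyner-Ziv bound" at both stages a meaningful benchmark.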
Lossy Compression with Near-uniform Encoder Outputs
It is well known that lossless compression of a discrete memoryless source
with near-uniform encoder output is possible at a rate above its entropy if and
only if the encoder is randomized. This work focuses on deriving conditions for
near-uniform encoder output(s) in the Wyner-Ziv and the distributed lossy
compression problems. We show that in the Wyner-Ziv problem, near-uniform
encoder output and operation close to the WZ-rate limit are simultaneously
possible, whereas in the distributed lossy compression problem, jointly
near-uniform outputs are achievable in the interior of the distributed lossy
compression rate region if the sources share non-trivial Gács-Körner
common information.
Comment: Submitted to the 2016 IEEE International Symposium on Information Theory (11 pages, 3 figures)
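The Gács-Körner condition above can be made concrete: the Gács-Körner common information of (X, Y) is the entropy of the connected-component label in the bipartite graph that joins x and y whenever p(x, y) > 0, and it is non-trivial exactly when that graph is disconnected. A minimal sketch under that characterization (the function name and the example pmfs are illustrative assumptions, not from the paper):

```python
import math

def gacs_korner(p):
    """Gacs-Korner common information (in bits) of a joint pmf given as a
    dict {(x, y): prob}: the entropy of the connected component of the
    bipartite support graph that links x to y whenever p(x, y) > 0."""
    # Build adjacency between x-nodes ('x', x) and y-nodes ('y', y).
    adj = {}
    for (x, y), prob in p.items():
        if prob > 0:
            adj.setdefault(('x', x), set()).add(('y', y))
            adj.setdefault(('y', y), set()).add(('x', x))
    # Label connected components by depth-first search.
    comp, n_comp = {}, 0
    for node in adj:
        if node in comp:
            continue
        stack = [node]
        comp[node] = n_comp
        while stack:
            u = stack.pop()
            for v in adj[u]:
                if v not in comp:
                    comp[v] = n_comp
                    stack.append(v)
        n_comp += 1
    # Probability mass of each component, then the entropy of that label.
    mass = [0.0] * n_comp
    for (x, y), prob in p.items():
        if prob > 0:
            mass[comp[('x', x)]] += prob
    return -sum(m * math.log2(m) for m in mass if m > 0)
```

A pmf whose support splits into two blocks of mass 1/2 has one bit of common information; any full-support pmf has zero, which is the "trivial" case in which jointly near-uniform outputs cannot be guaranteed.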
Multiuser Successive Refinement and Multiple Description Coding
We consider the multiuser successive refinement (MSR) problem, where the
users are connected to a central server via links with different noiseless
capacities, and each user wishes to reconstruct in a successive-refinement
fashion. An achievable region is given for the two-user two-layer case and it
provides the complete rate-distortion region for the Gaussian source under the
MSE distortion measure. The key observation is that this problem includes the
multiple description (MD) problem (with two descriptions) as a subsystem, and
the techniques useful in the MD problem can be extended to this case. We show
that the coding scheme based on the universality of random binning is
sub-optimal, because multiple Gaussian side informations only at the decoders
do incur performance loss, in contrast to the case of a single side information
at the decoder. We further show that, unlike the single-user case, when there
are multiple users, the loss of performance by a multistage coding approach can
be unbounded for the Gaussian source. The result suggests that in such a
setting, the benefit of using successive refinement is not likely to justify
the accompanying performance loss. The MSR problem is also related to the
source coding problem where each decoder has its individual side information,
while the encoder has the complete set of the side informations. The MSR
problem further includes several variations of the MD problem, for which the
specialization of the general result is investigated and the implication is
discussed.
Comment: 10 pages, 5 figures. To appear in IEEE Transactions on Information Theory. References updated and typos corrected.
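The contrast with the single-user case can be illustrated numerically: a Gaussian source under MSE is successively refinable (Equitz-Cover), so splitting R(D) = (1/2) log2(sigma^2/D) into two stages costs nothing, and the unbounded multistage loss shown above is a genuinely multiuser effect. A small sketch of the single-user accounting (variable names and the numeric distortion targets are our own choices):

```python
import math

def gaussian_rate(var, d):
    """Rate-distortion function of a Gaussian source under MSE (in bits):
    R(d) = 0.5 * log2(var / d) for d < var, and 0 otherwise."""
    return 0.5 * math.log2(var / d) if d < var else 0.0

# Two-stage successive refinement to distortions d1 > d2:
# stage 1 spends R(d1); stage 2 adds only the increment R(d2) - R(d1).
var, d1, d2 = 1.0, 0.25, 0.0625
r1 = gaussian_rate(var, d1)
r_increment = gaussian_rate(var, d2) - gaussian_rate(var, d1)
# The total two-stage rate equals the one-shot rate R(d2): no refinement loss.
assert abs((r1 + r_increment) - gaussian_rate(var, d2)) < 1e-12
```

For a single user the two stages always sum to the one-shot rate; the MSR result says that once multiple users with different link capacities must share the layers, no such lossless split need exist, and the gap can grow without bound.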