Lossy Compression with Near-uniform Encoder Outputs
It is well known that lossless compression of a discrete memoryless source
with near-uniform encoder output is possible at a rate above its entropy if and
only if the encoder is randomized. This work focuses on deriving conditions for
near-uniform encoder output(s) in the Wyner-Ziv and the distributed lossy
compression problems. We show that in the Wyner-Ziv problem, near-uniform
encoder output and operation close to the WZ rate limit are simultaneously
possible, whereas in the distributed lossy compression problem, jointly
near-uniform outputs are achievable in the interior of the distributed lossy
compression rate region if the sources share non-trivial Gács-Körner
common information.
Comment: Submitted to the 2016 IEEE International Symposium on Information Theory (11 pages, 3 figures).
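The Gács-Körner common information invoked in this abstract has a simple combinatorial characterization: it is the entropy of the "common part", i.e., of the variable that labels connected components of the bipartite support graph of the joint pmf. The following minimal Python sketch illustrates that characterization; it is not code from the paper, and the function name and example pmf are hypothetical.

```python
from math import log2
from itertools import product

def gk_common_information(p_xy):
    """Gacs-Korner common information of a joint pmf (nested list), in bits.

    K(X;Y) equals H(f(X)), where f labels the connected components of the
    bipartite graph on supp(X) x supp(Y) with an edge wherever p(x, y) > 0.
    Illustrative sketch only.
    """
    nx, ny = len(p_xy), len(p_xy[0])
    # Union-find over the nx + ny nodes of the bipartite support graph.
    parent = list(range(nx + ny))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    for x, y in product(range(nx), range(ny)):
        if p_xy[x][y] > 0:
            parent[find(x)] = find(nx + y)  # merge x's and y's components

    # The probability mass of each component is the pmf of the common part.
    mass = {}
    for x in range(nx):
        root = find(x)
        mass[root] = mass.get(root, 0.0) + sum(p_xy[x])
    return -sum(m * log2(m) for m in mass.values() if m > 0)

# Block-diagonal support: the block index is a non-trivial common part.
p = [[0.25, 0.25, 0.0, 0.0],
     [0.0, 0.0, 0.25, 0.25]]
print(gk_common_information(p))  # 1.0 bit: the block index is a fair coin
```

When the support graph is fully connected (e.g., two independent uniform bits), there is a single component and the common information is zero, which is the "trivial" case the abstract excludes.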
Universal Sampling Rate Distortion
We examine the coordinated and universal rate-efficient sampling of a subset
of correlated discrete memoryless sources followed by lossy compression of the
sampled sources. The goal is to reconstruct a predesignated subset of sources
within a specified level of distortion. The combined sampling mechanism and
rate distortion code are universal in that they are devised to perform robustly
without exact knowledge of the underlying joint probability distribution of the
sources. In both Bayesian and non-Bayesian settings, single-letter
characterizations are provided for the universal sampling rate distortion
function for fixed-set sampling, independent random sampling and memoryless
random sampling. It is illustrated how these sampling mechanisms are
successively better. Our achievability proofs bring forth new schemes for joint
source distribution-learning and lossy compression.
The Likelihood Encoder for Lossy Compression
A likelihood encoder is studied in the context of lossy source compression.
The analysis of the likelihood encoder is based on the soft-covering lemma. It
is demonstrated that the use of a likelihood encoder together with the
soft-covering lemma yields simple achievability proofs for classical source
coding problems. The cases of the point-to-point rate-distortion function, the
rate-distortion function with side information at the decoder (i.e. the
Wyner-Ziv problem), and the multi-terminal source coding inner bound (i.e. the
Berger-Tung problem) are examined in this paper. Furthermore, a non-asymptotic
analysis is used for the point-to-point case to examine the upper bound on the
excess distortion provided by this method. The likelihood encoder is also
related to a recent alternative technique using properties of random binning.
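For the point-to-point case treated in this abstract, the classical closed form for a Bernoulli(p) source under Hamming distortion, R(D) = h(p) - h(D), gives a convenient numerical touchstone. The sketch below implements only that standard textbook formula, not the paper's likelihood-encoder construction; the function names are hypothetical.

```python
from math import log2

def h2(p):
    """Binary entropy in bits, with the convention h2(0) = h2(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def binary_rd(p, d):
    """Rate-distortion function of a Bernoulli(p) source under Hamming
    distortion: R(D) = h2(p) - h2(D) for 0 <= D < min(p, 1 - p), else 0.
    Standard textbook formula, shown only as an illustrative reference."""
    if d >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(d)

print(binary_rd(0.5, 0.0))              # 1.0: lossless needs H(X) = 1 bit
print(round(binary_rd(0.5, 0.11), 3))   # 0.5: tolerating 11% bit flips halves the rate
```

Any achievability scheme, the likelihood encoder included, must operate at rates above this curve, which is what a non-asymptotic excess-distortion analysis quantifies at finite blocklength.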
Thouless-Anderson-Palmer Approach for Lossy Compression
We study an ill-posed linear inverse problem in which a binary sequence is
reproduced using a sparse matrix. According to a previous study, this model
can theoretically provide an optimal compression scheme for an arbitrary
distortion level, though the encoding procedure remains an NP-complete problem.
In this paper, we focus on the consistency condition for a Markov-type
dynamics model to derive an iterative algorithm following the
Thouless-Anderson-Palmer approach. Numerical results show that the algorithm
can empirically saturate the theoretical limit for the sparse construction of
our codes, which is also very close to the rate-distortion function.
Comment: 10 pages, 3 figures.