Source Coding with Fixed Lag Side Information
We consider source coding with fixed lag side information at the decoder. We
focus on the special case of perfect side information with unit lag
corresponding to source coding with feedforward (the dual of channel coding
with feedback) introduced by Pradhan. We use this duality to develop a linear
complexity algorithm which achieves the rate-distortion bound for any
memoryless finite alphabet source and distortion measure.

Comment: 10 pages, 3 figures
Side-information Scalable Source Coding
The problem of side-information scalable (SI-scalable) source coding is
considered in this work, where the encoder constructs a progressive
description, such that the receiver with high quality side information will be
able to truncate the bitstream and reconstruct in the rate distortion sense,
while the receiver with low quality side information will have to receive
further data in order to decode. We provide inner and outer bounds for general
discrete memoryless sources. The achievable region is shown to be tight for the
case that either of the decoders requires a lossless reconstruction, as well as
the case with degraded deterministic distortion measures. Furthermore, we show
that the gap between the achievable region and the outer bounds can be bounded
by a constant when the squared-error distortion measure is used. The notion of
perfectly scalable coding is introduced for the case when both stages operate
on the Wyner-Ziv bound, and necessary and sufficient conditions for perfect
scalability are given for sources satisfying a mild support condition. Using
SI-scalable coding and successive
refinement Wyner-Ziv coding as basic building blocks, a complete
characterization is provided for the important quadratic Gaussian source with
multiple jointly Gaussian side informations, where the side-information quality
does not have to be monotonic along the scalable coding order. A partial result
is provided for the doubly symmetric binary source with Hamming distortion when
the worse side information is a constant, for which one of the outer bounds is
strictly tighter than the other.

Comment: 35 pages, submitted to IEEE Transactions on Information Theory
Interactive Relay Assisted Source Coding
This paper investigates a source coding problem in which two terminals
communicating through a relay wish to estimate one another's source within some
distortion constraint. The relay has access to side information that is
correlated with the sources. Two different schemes based on the order of
communication, \emph{distributed source coding/delivery} and \emph{two cascaded
rounds}, are proposed and inner and outer bounds for the resulting
rate-distortion regions are provided. Examples are provided to show that
neither rate-distortion region contains the other.

Comment: Invited Paper submitted to GlobalSIP: IEEE Global Conference on
Signal and Information Processing 201
Secure Multiterminal Source Coding with Side Information at the Eavesdropper
The problem of secure multiterminal source coding with side information at
the eavesdropper is investigated. This scenario consists of a main encoder
(referred to as Alice) that wishes to compress a single source while
simultaneously satisfying the desired requirements on the distortion level at a
legitimate receiver (referred to as Bob) and on the equivocation rate --the
average uncertainty-- at an eavesdropper (referred to as Eve). The presence of
a (public) rate-limited link between Alice and Bob is further assumed. In this
setting, Eve perfectly observes the information bits sent by Alice to Bob and
has also access to a correlated source which can be used as side information. A
second encoder (referred to as Charlie) helps Bob in estimating Alice's source
by sending a compressed version of its own correlated observation via a
(private) rate-limited link, which is only observed by Bob. In fact, the
problem at hand can be seen as a unification of the Berger-Tung and the secure
source coding setups. Inner and outer bounds on the so-called
rates-distortion-equivocation region are derived. The inner region turns out
to be tight for two cases: (i) uncoded side information at Bob and (ii)
lossless reconstruction of both sources at Bob --secure distributed lossless
compression. Application examples to secure lossy source coding of Gaussian
and binary sources in the presence of Gaussian and binary/ternary (resp.) side
information are also considered. Optimal coding schemes are characterized for
some cases of interest where the statistical differences between the side
information at the decoders and the presence of a non-zero distortion at Bob
can be fully exploited to guarantee secrecy.

Comment: 26 pages, 16 figures, 2 tables
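The equivocation rate constrained in this setup is Eve's average uncertainty about the source, i.e. a conditional entropy per source symbol. A small sketch computing H(A|E) directly from a joint pmf; the binary-symmetric example in the usage note is illustrative, not a case from the paper:

```python
import numpy as np

def conditional_entropy(p_joint):
    """Conditional entropy H(A|E) in bits, given the joint pmf
    p_joint[a, e] over the source symbol A and Eve's observation E."""
    p_e = p_joint.sum(axis=0)                # marginal of E
    h = 0.0
    for a in range(p_joint.shape[0]):
        for e in range(p_joint.shape[1]):
            p = p_joint[a, e]
            if p > 0:
                h -= p * np.log2(p / p_e[e])  # -p(a,e) log2 p(a|e)
    return h
```

For a uniform binary source observed by Eve through a binary symmetric channel with crossover probability 0.1, this gives H(A|E) = h(0.1), about 0.469 bits: Eve's side information leaves roughly half a bit of uncertainty per symbol.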