Joint Wyner-Ziv/Dirty Paper coding by modulo-lattice modulation
The combination of source coding with decoder side-information (Wyner-Ziv
problem) and channel coding with encoder side-information (Gel'fand-Pinsker
problem) can be optimally solved using the separation principle. In this work
we show an alternative scheme for the quadratic-Gaussian case, which merges
source and channel coding. This scheme achieves the optimal performance by
applying modulo-lattice modulation to the analog source. It thus avoids the
complexity of quantization and channel decoding, leaving only the task of
"shaping". Furthermore, for high signal-to-noise ratio (SNR), the scheme
approaches the optimal performance using an SNR-independent encoder, and is
thus robust to unknown SNR at the encoder.
Comment: Submitted to IEEE Transactions on Information Theory. Presented in
part at ISIT-2006, Seattle. New version after review.
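The idea of replacing quantization and channel decoding with a single modulo-lattice operation can be illustrated with a toy one-dimensional "lattice" (a scaled integer grid). This is a hedged sketch, not the paper's scheme: the actual construction uses high-dimensional lattices, MMSE scaling, and an encoder-known channel interference term, all of which are omitted here; the gain `beta` and the noise levels are illustrative assumptions. The encoder sends a dithered modulo image of the scaled analog source, and the decoder subtracts the dither and its side information before reducing modulo the same cell.

```python
import numpy as np

rng = np.random.default_rng(0)

def cmod(x, delta):
    """Centered modulo: map x into [-delta/2, delta/2)."""
    return (x + delta / 2) % delta - delta / 2

n = 200_000
P = 1.0                       # transmit power constraint
delta = np.sqrt(12 * P)       # cell size: dithered uniform output has power P
beta = 3.0                    # modulation gain (assumed; trades distortion vs. modulo overload)
sig_q, sig_z = 0.1, 0.1       # side-information noise and channel noise (assumed)

v = rng.normal(0.0, 1.0, n)              # analog Gaussian source
w = v + rng.normal(0.0, sig_q, n)        # decoder side information (Wyner-Ziv)
d = rng.uniform(-delta / 2, delta / 2, n)  # dither shared by encoder and decoder

x = cmod(beta * v + d, delta)            # encoder: modulo-lattice modulation, no quantizer
y = x + rng.normal(0.0, sig_z, n)        # AWGN channel
t = cmod(y - d - beta * w, delta)        # decoder: t = z - beta*q unless the modulo overloads
v_hat = w + t / beta                     # so v_hat = v + z/beta with high probability

mse = np.mean((v_hat - v) ** 2)          # approx sig_z**2 / beta**2, well below sig_q**2
```

Note that the whole "coding" operation is the shaping step `cmod`: both the side-information mismatch `q` and the dither vanish inside the modulo reduction, provided `|z - beta*q|` stays below `delta/2`.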
Wyner-Ziv Coding over Broadcast Channels: Digital Schemes
This paper addresses lossy transmission of a common source over a broadcast
channel when there is correlated side information at the receivers, with
emphasis on the quadratic Gaussian and binary Hamming cases. A digital scheme
that combines ideas from the lossless version of the problem, i.e.,
Slepian-Wolf coding over broadcast channels, and dirty paper coding, is
presented and analyzed. This scheme uses layered coding where the common layer
information is intended for both receivers and the refinement information is
destined only for one receiver. For the quadratic Gaussian case, a quantity
characterizing the overall quality of each receiver is identified in terms of
channel and side information parameters. It is shown that it is more
advantageous to send the refinement information to the receiver with "better"
overall quality. In the case where all receivers have the same overall quality,
the presented scheme becomes optimal. Unlike its lossless counterpart, however,
the problem eludes a complete characterization.
Joint Source-Channel Coding with Time-Varying Channel and Side-Information
Transmission of a Gaussian source over a time-varying Gaussian channel is
studied in the presence of time-varying correlated side information at the
receiver. A block fading model is considered for both the channel and the side
information, whose states are assumed to be known only at the receiver. The
optimality of separate source and channel coding in terms of average end-to-end
distortion is shown when the channel is static while the side information state
follows a discrete or a continuous and quasiconcave distribution. When both the
channel and side information states are time-varying, separate source and
channel coding is suboptimal in general. A partially informed encoder lower
bound is studied by providing the channel state information to the encoder.
Several achievable transmission schemes are proposed based on uncoded
transmission, separate source and channel coding, joint decoding as well as
hybrid digital-analog transmission. Uncoded transmission is shown to be optimal
for a class of continuous and quasiconcave side information state
distributions, while the channel gain may have an arbitrary distribution. To
the best of our knowledge, this is the first example in which uncoded
transmission achieves the optimal performance thanks to the time-varying nature
of the states, while it is suboptimal in the static version of the same
problem. Then, the optimal \emph{distortion exponent}, which quantifies the
exponential decay rate of the expected distortion in the high SNR regime, is
characterized for Nakagami distributed channel and side information states, and
it is shown to be achieved by hybrid digital-analog and joint decoding schemes
in certain cases, illustrating the suboptimality of pure digital or analog
transmission in general.
Comment: Submitted to IEEE Transactions on Information Theory.
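The uncoded-transmission baseline from this abstract is easy to make concrete for a single channel state. The sketch below assumes a unit-variance Gaussian source sent as a scaled analog signal over one realization of a gain-`h` AWGN channel, with correlated side information at the receiver; the constants are illustrative, and the block model with random states is not simulated. Since the channel output and the side information are conditionally independent noisy views of the source, the MMSE estimator simply adds their precisions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
P, h = 1.0, 0.8            # transmit power and channel gain (assumed values)
sig_z, sig_q = 0.5, 0.7    # channel and side-information noise std (assumed)

v = rng.normal(0.0, 1.0, n)               # unit-variance Gaussian source
x = np.sqrt(P) * v                        # uncoded (analog) transmission
y = h * x + rng.normal(0.0, sig_z, n)     # AWGN channel, gain known at the receiver
w = v + rng.normal(0.0, sig_q, n)         # receiver side information

# MMSE combining: precisions of prior, channel view, and side info add up.
d = 1.0 / (1.0 + (h ** 2) * P / sig_z ** 2 + 1.0 / sig_q ** 2)   # analytic MMSE
v_hat = d * (h * np.sqrt(P) / sig_z ** 2 * y + w / sig_q ** 2)   # posterior mean

mse = np.mean((v_hat - v) ** 2)           # matches the analytic distortion d
```

With time-varying states one would average this per-state distortion over the fading distributions of `h` and `sig_q`, which is exactly the quantity the distortion-exponent analysis tracks in the high-SNR limit.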