Joint Source-Channel Coding with Time-Varying Channel and Side-Information
Transmission of a Gaussian source over a time-varying Gaussian channel is
studied in the presence of time-varying correlated side information at the
receiver. A block fading model is considered for both the channel and the side
information, whose states are assumed to be known only at the receiver. The
optimality of separate source and channel coding in terms of average end-to-end
distortion is shown when the channel is static while the side information state
follows a discrete or a continuous and quasiconcave distribution. When both the
channel and side information states are time-varying, separate source and
channel coding is suboptimal in general. A partially informed encoder lower
bound is studied by providing the channel state information to the encoder.
Several achievable transmission schemes are proposed based on uncoded
transmission, separate source and channel coding, joint decoding as well as
hybrid digital-analog transmission. Uncoded transmission is shown to be optimal
for a class of continuous and quasiconcave side information state
distributions, while the channel gain may have an arbitrary distribution. To
the best of our knowledge, this is the first example in which uncoded
transmission achieves optimal performance thanks to the time-varying nature
of the states while being suboptimal in the static version of the same
problem. Then, the optimal \emph{distortion exponent}, which quantifies the
exponential decay rate of the expected distortion in the high-SNR regime, is
characterized for Nakagami-distributed channel and side information states,
and it is shown to be achieved by hybrid digital-analog and joint decoding
schemes in certain cases, illustrating the suboptimality of pure digital or
analog transmission in general.
Comment: Submitted to IEEE Transactions on Information Theory
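The distortion exponent can be illustrated numerically. As a sketch (not the paper's scheme; all parameter values are illustrative): for uncoded transmission of a unit-variance Gaussian source over a Rayleigh-fading channel with receiver CSI, the distortion at power gain g is 1/(1 + g*SNR), and the finite-difference slope of -log E[D] versus log SNR approximates the exponent, which for this toy case is 1 up to a logarithmic factor (E[D] behaves like ln(SNR)/SNR).

```python
import math

def avg_distortion(snr, n_panels=2000):
    """E[1/(1 + g*snr)] for Rayleigh-fading power gain g ~ Exp(1): the
    MMSE distortion of uncoded transmission of a unit-variance Gaussian
    source, averaged over the fading state.  Trapezoid rule on a
    geometric grid, since the integrand varies over many scales of g."""
    lo, hi = 1e-12, 50.0             # e^{-50} makes the tail negligible
    ratio = (hi / lo) ** (1.0 / n_panels)
    g = lo
    f = math.exp(-g) / (1.0 + g * snr)
    total = lo                       # [0, lo] contributes ~lo (integrand ~1 there)
    for _ in range(n_panels):
        g2 = g * ratio
        f2 = math.exp(-g2) / (1.0 + g2 * snr)
        total += 0.5 * (f + f2) * (g2 - g)
        g, f = g2, f2
    return total

def distortion_exponent_slope(snr_lo=1e4, snr_hi=1e6):
    """Finite-difference slope of -log E[D] against log SNR."""
    return (math.log(avg_distortion(snr_lo)) - math.log(avg_distortion(snr_hi))) \
        / (math.log(snr_hi) - math.log(snr_lo))
```

At these SNRs the slope is close to, but slightly below, 1 because of the ln(SNR) factor; coded schemes and the hybrid strategies discussed above are compared through exactly this kind of decay rate.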
On optimum parameter modulation-estimation from a large deviations perspective
We consider the problem of jointly optimum modulation and estimation of a
real-valued random parameter, conveyed over an additive white Gaussian noise
(AWGN) channel, where the performance metric is the large deviations behavior
of the estimator, namely, the exponential decay rate (as a function of the
observation time) of the probability that the estimation error would exceed a
certain threshold. Our basic result is in providing an exact characterization
of the fastest achievable exponential decay rate, among all possible
modulator-estimator (transmitter-receiver) pairs, where the modulator is
limited only in the signal power, but not in bandwidth. This exponential rate
turns out to be given by the reliability function of the AWGN channel. We also
discuss several ways to achieve this optimum performance, and one of them is
based on quantization of the parameter, followed by optimum channel coding and
modulation, which gives rise to a separation-based transmitter, if one views
this setting from the perspective of joint source-channel coding. This is in
spite of the fact that, in general, when error exponents are considered, the
source-channel separation theorem does not hold true. We also discuss
observations, modifications and extensions of this result in several
directions, including other channels and the case of multidimensional
parameter vectors. One of our findings concerning the latter is that there is
an abrupt threshold effect in the dimensionality of the parameter vector: below
a certain critical dimension, the probability of excess estimation error may
still decay exponentially, but beyond this value, it must converge to unity.
Comment: 26 pages; Submitted to the IEEE Transactions on Information Theory
Source-Channel Diversity for Parallel Channels
We consider transmitting a source across a pair of independent, non-ergodic
channels with random states (e.g., slow fading channels) so as to minimize the
average distortion. The general problem is unsolved. Hence, we focus on
comparing two commonly used source and channel encoding systems which
correspond to exploiting diversity either at the physical layer through
parallel channel coding or at the application layer through multiple
description source coding.
For on-off channel models, source coding diversity offers better performance.
For channels with a continuous range of reception quality, we show the reverse
is true. Specifically, we introduce a new figure of merit called the distortion
exponent which measures how fast the average distortion decays with SNR. For
continuous-state models such as additive white Gaussian noise channels with
multiplicative Rayleigh fading, optimal channel coding diversity at the
physical layer is more efficient than source coding diversity at the
application layer in that the former achieves a better distortion exponent.
Finally, we consider a third architecture: multiple description encoding with
joint source-channel decoding. We show that this architecture achieves the
same distortion exponent as systems with optimal channel coding diversity for
continuous-state channels, and maintains the advantages of multiple
description systems for on-off channels. Thus, the multiple description
system with joint decoding achieves the best performance, among the three
architectures considered, on both continuous-state and on-off channels.
Comment: 48 pages, 14 figures
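The on-off comparison can be sketched with a toy model (assumed, not the paper's exact setup): two parallel channels, each independently "off" with probability p_off, a unit-variance source, and assumed distortion levels d_central, d_side, and d_code for the central, side, and single channel-coded descriptions.

```python
def md_avg_distortion(p_off, d_central, d_side):
    """Source coding diversity: one description per channel.  Both
    arrive -> central distortion; exactly one -> side distortion;
    neither -> no information, distortion = source variance (1.0)."""
    p_both = (1.0 - p_off) ** 2
    p_one = 2.0 * p_off * (1.0 - p_off)
    p_none = p_off ** 2
    return p_both * d_central + p_one * d_side + p_none * 1.0

def cc_avg_distortion(p_off, d_code):
    """Channel coding diversity: a single description protected by a
    code across both channels, assumed (in this toy model) to decode
    whenever at least one channel is on, and to fail otherwise."""
    return (1.0 - p_off ** 2) * d_code + p_off ** 2 * 1.0
```

Which system wins depends on how d_central and d_side trade off against d_code at equal total rate; the claim above is that for on-off channels this trade-off favors source coding diversity.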
Lossy joint source-channel coding in the finite blocklength regime
This paper finds new tight finite-blocklength bounds for the best achievable
lossy joint source-channel code rate, and demonstrates that joint
source-channel code design brings considerable performance advantage over a
separate one in the non-asymptotic regime. A joint source-channel code maps a
block of $k$ source symbols onto a length-$n$ channel codeword, and the
fidelity of reproduction at the receiver end is measured by the probability
$\epsilon$ that the distortion exceeds a given threshold $d$. For memoryless
sources and channels, it is demonstrated that the parameters of the best joint
source-channel code must satisfy
$nC - kR(d) \approx \sqrt{nV + k\mathcal{V}(d)}\, Q^{-1}(\epsilon)$,
where $C$ and $V$ are the channel capacity and channel
dispersion, respectively; $R(d)$ and $\mathcal{V}(d)$ are the source
rate-distortion and rate-dispersion functions; and $Q$ is the standard Gaussian
complementary cdf. Symbol-by-symbol (uncoded) transmission is known to achieve
the Shannon limit when the source and channel satisfy a certain probabilistic
matching condition. In this paper we show that even when this condition is not
satisfied, symbol-by-symbol transmission is, in some cases, the best known
strategy in the non-asymptotic regime.
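The dispersion relation between the blocklengths — n*C - k*R(d) approximately equal to sqrt(n*V + k*V(d)) * Qinv(eps), with the symbols as defined in the abstract — can be used to size a code numerically. A minimal sketch under that reading, with illustrative parameter values (C, V, R(d), V(d) taken as given, in nats): scan for the largest k a length-n codeword can carry.

```python
import math
from statistics import NormalDist

def max_source_symbols(n, C, V, R, Vd, eps):
    """Largest number k of source symbols a length-n channel codeword
    can carry at excess-distortion probability eps, per the dispersion
    approximation  n*C - k*R >= sqrt(n*V + k*Vd) * Qinv(eps).
    C, V: channel capacity and dispersion; R, Vd: source rate-distortion
    and rate-dispersion at the target distortion (all in nats)."""
    q_inv = NormalDist().inv_cdf(1.0 - eps)   # Q^{-1}(eps), Q = Gaussian complementary cdf
    k = 0
    while n * C - (k + 1) * R >= math.sqrt(n * V + (k + 1) * Vd) * q_inv:
        k += 1
    return k
```

For example, max_source_symbols(1000, 0.5, 0.5, 0.5, 0.5, 0.01) falls well short of the capacity-only estimate k = n*C/R = 1000, which is the finite-blocklength penalty the dispersion terms quantify.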