Joint Source-Channel Coding with Time-Varying Channel and Side-Information
Transmission of a Gaussian source over a time-varying Gaussian channel is
studied in the presence of time-varying correlated side information at the
receiver. A block fading model is considered for both the channel and the side
information, whose states are assumed to be known only at the receiver. The
optimality of separate source and channel coding in terms of average end-to-end
distortion is shown when the channel is static while the side information state
follows a discrete or a continuous and quasiconcave distribution. When both the
channel and side information states are time-varying, separate source and
channel coding is suboptimal in general. A partially informed encoder lower
bound is studied by providing the channel state information to the encoder.
Several achievable transmission schemes are proposed based on uncoded
transmission, separate source and channel coding, joint decoding as well as
hybrid digital-analog transmission. Uncoded transmission is shown to be optimal
for a class of continuous and quasiconcave side information state
distributions, while the channel gain may follow an arbitrary distribution. To
the best of our knowledge, this is the first example in which uncoded
transmission achieves the optimal performance thanks to the time-varying nature
of the states, while it is suboptimal in the static version of the same
problem. Then, the optimal \emph{distortion exponent}, which quantifies the
exponential decay rate of the expected distortion in the high-SNR regime, is
characterized for Nakagami distributed channel and side information states, and
it is shown to be achieved by hybrid digital-analog and joint decoding schemes
in certain cases, illustrating the suboptimality of pure digital or analog
transmission in general.
Comment: Submitted to IEEE Transactions on Information Theory
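As background context (not a claim from the abstract above), the classic static point-to-point result helps explain why uncoded transmission can be competitive at all: for a Gaussian source sent over an AWGN channel with matched bandwidth and no side information, simple scaled analog transmission followed by MMSE estimation achieves exactly the same distortion as optimal separate source and channel coding. The distortion exponent mentioned in the abstract is conventionally defined as $\Delta = -\lim_{\mathrm{SNR}\to\infty} \log \mathrm{E}[D] / \log \mathrm{SNR}$. The sketch below checks the matched-bandwidth equality numerically; the function names are illustrative, not from the paper.

```python
import math

def uncoded_distortion(var_s, snr):
    """MMSE distortion of uncoded (scaled analog) transmission of a
    Gaussian source with variance var_s over an AWGN channel at the
    given SNR, with one channel use per source sample."""
    return var_s / (1.0 + snr)

def separate_coding_distortion(var_s, snr):
    """Distortion of optimal separate source/channel coding: send at
    capacity C = 0.5*log2(1+SNR) and apply the Gaussian distortion-rate
    function D(R) = var_s * 2**(-2R)."""
    capacity = 0.5 * math.log2(1.0 + snr)
    return var_s * 2.0 ** (-2.0 * capacity)

# Both strategies give var_s / (1 + SNR) in this matched static setting.
d_uncoded = uncoded_distortion(1.0, 10.0)
d_separate = separate_coding_distortion(1.0, 10.0)
```

The abstract's point is that this equivalence breaks once time-varying states and receiver side information enter: there, uncoded transmission can strictly beat separation, which fails in general.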
A Unified Approach for Network Information Theory
In this paper, we take a unified approach to network information theory and
prove a coding theorem that can recover most of the achievability results in
network information theory that are based on random coding. The final
single-letter expression has a very simple form, which was made possible by
many novel elements such as a unified framework that represents various network
problems in a simple and unified way, a unified coding strategy that consists
of a few basic ingredients but can emulate many known coding techniques if
needed, and new proof techniques beyond the use of standard covering and
packing lemmas. For example, in our framework, sources, channels, states and
side information are treated in a unified way and various constraints such as
cost and distortion constraints are unified as a single joint-typicality
constraint.
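To make the notion of a joint-typicality constraint concrete (this is standard textbook background, not the paper's framework), a strongly typical pair of sequences is one whose empirical pair frequencies stay within a tolerance of the joint pmf. A minimal illustrative checker, with hypothetical names:

```python
from collections import Counter

def is_jointly_typical(xs, ys, pxy, eps):
    """Check strong (empirical-distribution) joint typicality: the
    empirical frequency of every symbol pair in (xs, ys) must be within
    eps of its probability under the joint pmf pxy, and pairs outside
    the support of pxy must not occur."""
    n = len(xs)
    counts = Counter(zip(xs, ys))
    for (x, y), p in pxy.items():
        if abs(counts.get((x, y), 0) / n - p) > eps:
            return False
    return all(pair in pxy for pair in counts)

# A doubly symmetric binary pair pmf and a length-10 realization whose
# empirical pair frequencies match pxy exactly.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
xs = [0, 0, 0, 0, 1, 1, 1, 1, 0, 1]
ys = [0, 0, 0, 1, 1, 1, 1, 0, 0, 1]
```

Constraints such as cost or distortion limits can then be folded into the same test by including the relevant cost or distortion variables in the joint distribution, which is the flavor of unification the abstract describes.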
Our theorem can be useful in proving many new achievability results easily
and in some cases gives simpler rate expressions than those obtained using
conventional approaches. Furthermore, our unified coding can strictly
outperform existing schemes. For example, we obtain a generalized
decode-compress-amplify-and-forward bound as a simple corollary of our main
theorem and show it strictly outperforms previously known coding schemes. Using
our unified framework, we formally define and characterize three types of
network duality based on channel input-output reversal and network flow
reversal combined with packing-covering duality.
Comment: 52 pages, 7 figures, submitted to IEEE Transactions on Information
Theory; a shorter version will appear in Proc. IEEE ISIT 201