Characterization of Information Channels for Asymptotic Mean Stationarity and Stochastic Stability of Non-stationary/Unstable Linear Systems
Stabilization of non-stationary linear systems over noisy communication
channels is considered. Stochastically stable sources, as well as unstable but
noise-free or bounded-noise systems, have been extensively studied in the
information theory and control theory literature since the 1970s, with a renewed
interest in the past decade. There have also been studies on non-causal and
causal coding of unstable/non-stationary linear Gaussian sources. In this
paper, tight necessary and sufficient conditions for stochastic stabilizability
of unstable (non-stationary) possibly multi-dimensional linear systems driven
by Gaussian noise over discrete channels (possibly with memory and feedback)
are presented. Stochastic stability notions include recurrence, asymptotic mean
stationarity and sample path ergodicity, and the existence of finite second
moments. Our constructive proof uses random-time state-dependent stochastic
drift criteria for stabilization of Markov chains. For asymptotic mean
stationarity (and thus sample path ergodicity), it is sufficient that the
capacity of a channel is (strictly) greater than the sum of the logarithms of
the unstable pole magnitudes for memoryless channels and a class of channels
with memory. This condition is also necessary under a mild technical condition.
Sufficient conditions for the existence of finite average second moments for
such systems driven by unbounded noise are provided.
Comment: To appear in IEEE Transactions on Information Theory
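The stabilizability threshold stated above (channel capacity strictly greater than the sum of the logarithms of the unstable pole magnitudes) can be checked numerically. The sketch below is illustrative only: the system matrix, the choice of a binary symmetric channel, and the crossover probability are hypothetical and not taken from the paper.

```python
import numpy as np

def unstable_log_sum(A):
    """Sum of log2|lambda_i| over eigenvalues of A with |lambda_i| > 1."""
    eigs = np.linalg.eigvals(A)
    return sum(np.log2(abs(lam)) for lam in eigs if abs(lam) > 1)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel: 1 - h(eps) bits per use."""
    if eps in (0.0, 1.0):
        return 1.0
    h = -eps * np.log2(eps) - (1 - eps) * np.log2(1 - eps)
    return 1.0 - h

# Hypothetical 2-D system: one unstable mode at 1.5, one stable mode at 0.5.
A = np.array([[1.5, 0.0],
              [0.0, 0.5]])
threshold = unstable_log_sum(A)   # log2(1.5), about 0.585 bits
C = bsc_capacity(0.05)            # about 0.714 bits per use

print(f"sum log2|poles| = {threshold:.3f}, capacity = {C:.3f}")
print("sufficient condition for asymptotic mean stationarity:", C > threshold)
```

Here the capacity exceeds the unstable-pole log-sum, so the sufficient condition for asymptotic mean stationarity (and sample path ergodicity) described in the abstract is met for this toy system.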
Slepian-Wolf Coding for Broadcasting with Cooperative Base-Stations
We propose a base-station (BS) cooperation model for broadcasting a discrete
memoryless source in a cellular or heterogeneous network. The model allows the
receivers to use helper BSs to improve network performance, and it permits the
receivers to have prior side information about the source. We establish the
model's information-theoretic limits in two operational modes: In Mode 1, the
helper BSs are given information about the channel codeword transmitted by the
main BS, and in Mode 2 they are provided correlated side information about the
source. Optimal codes for Mode 1 use \emph{hash-and-forward coding} at the
helper BSs; while, in Mode 2, optimal codes use source codes from Wyner's
\emph{helper source-coding problem} at the helper BSs. We prove the optimality
of both approaches by way of a new list-decoding generalisation of [8, Thm. 6],
and, in doing so, show an operational duality between Modes 1 and 2.Comment: 16 pages, 1 figur
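The value of receiver side information in this setting can be quantified with the classical Slepian-Wolf rate: losslessly decoding a source X at a receiver that already holds correlated side information Y requires rate at least H(X|Y), rather than H(X). The sketch below computes this for a doubly symmetric binary pair; the source model and parameter are illustrative assumptions, not the paper's cooperative BS model.

```python
import numpy as np

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# X ~ Bernoulli(1/2); side information Y = X xor N with N ~ Bernoulli(q).
# Slepian-Wolf: lossless recovery of X from rate R and Y needs R >= H(X|Y) = h(q).
q = 0.1
rate_with_side_info = h2(q)   # about 0.469 bits/symbol
rate_without = 1.0            # H(X) = 1 bit/symbol without side information

print(f"H(X|Y) = {rate_with_side_info:.3f} vs H(X) = {rate_without:.3f} bits/symbol")
```

Even modestly correlated side information (q = 0.1) roughly halves the required rate, which is the kind of gain the cooperative helper BSs in the model exploit.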
Information Nonanticipative Rate Distortion Function and Its Applications
This paper investigates applications of the nonanticipative Rate Distortion
Function (RDF) in a) zero-delay Joint Source-Channel Coding (JSCC) design based
on average and excess distortion probability, b) bounding the Optimal
Performance Theoretically Attainable (OPTA) by noncausal and causal codes, and
c) computing the Rate Loss (RL) of zero-delay and causal codes with respect to
noncausal codes. These applications are described using two running examples,
the Binary Symmetric Markov Source with parameter p, (BSMS(p)) and the
multidimensional partially observed Gaussian-Markov source. For the
multidimensional Gaussian-Markov source with square error distortion, the
solution of the nonanticipative RDF is derived, its operational meaning using
JSCC design via a noisy coding theorem is shown by providing the optimal
encoding-decoding scheme over a vector Gaussian channel, and the RL of causal
and zero-delay codes with respect to noncausal codes is computed.
For the BSMS(p) with Hamming distortion, the solution of the nonanticipative
RDF is derived, the RL of causal codes with respect to noncausal codes is
computed, and an uncoded noisy coding theorem based on excess distortion
probability is shown. The information nonanticipative RDF is shown to be
equivalent to the nonanticipatory epsilon-entropy, which corresponds to the
classical RDF with an additional causality or nonanticipative condition imposed
on the optimal reproduction conditional distribution.
Comment: 34 pages, 12 figures; part of this paper was accepted for publication
in the IEEE International Symposium on Information Theory (ISIT), 2014 and in
the book Coordination Control of Distributed Systems in the series Lecture
Notes in Control and Information Sciences, 201
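For orientation on the BSMS(p) running example: its entropy rate is h(p), and the classical Shannon lower bound for the noncausal RDF with Hamming distortion is h(p) - h(D), which is known to be tight in the small-distortion regime. The sketch below evaluates these quantities numerically; the parameter values are illustrative, and the bound shown is the classical noncausal one against which the paper's rate loss of causal codes is measured, not the paper's nonanticipative RDF itself.

```python
import numpy as np

def h2(x):
    """Binary entropy function in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

# Binary symmetric Markov source BSMS(p): entropy rate h(p) bits/symbol.
# Classical noncausal RDF lower bound with Hamming distortion: h(p) - h(D),
# tight for small D. Causal and zero-delay codes need at least this rate.
p, D = 0.25, 0.05
entropy_rate = h2(p)
noncausal_bound = h2(p) - h2(D)

print(f"entropy rate h(p) = {entropy_rate:.3f} bits/symbol")
print(f"noncausal RDF bound h(p) - h(D) = {noncausal_bound:.3f} bits/symbol")
```

The gap between the nonanticipative RDF computed in the paper and this noncausal curve is exactly the rate loss (RL) the abstract refers to.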
Lecture Notes on Network Information Theory
These lecture notes have been converted to a book titled Network Information
Theory published recently by Cambridge University Press. This book provides a
significantly expanded exposition of the material in the lecture notes as well
as problems and bibliographic notes at the end of each chapter. The authors are
currently preparing a set of slides based on the book that will be posted in
the second half of 2012. More information about the book can be found at
http://www.cambridge.org/9781107008731/. The previous (and obsolete) version of
the lecture notes can be found at http://arxiv.org/abs/1001.3404v4/