Information Nonanticipative Rate Distortion Function and Its Applications
This paper investigates applications of the nonanticipative Rate Distortion
Function (RDF) in: a) zero-delay Joint Source-Channel Coding (JSCC) design based
on average and excess distortion probability; b) bounding the Optimal
Performance Theoretically Attainable (OPTA) by noncausal and causal codes; and
c) computing the Rate Loss (RL) of zero-delay and causal codes with respect to
noncausal codes. These applications are described using two running examples,
the Binary Symmetric Markov Source with parameter p (BSMS(p)) and the
multidimensional partially observed Gaussian-Markov source. For the
multidimensional Gaussian-Markov source with square error distortion, the
solution of the nonanticipative RDF is derived, its operational meaning using
JSCC design via a noisy coding theorem is shown by providing the optimal
encoding-decoding scheme over a vector Gaussian channel, and the RL of causal
and zero-delay codes with respect to noncausal codes is computed.
For the BSMS(p) with Hamming distortion, the solution of the nonanticipative
RDF is derived, the RL of causal codes with respect to noncausal codes is
computed, and an uncoded noisy coding theorem based on excess distortion
probability is shown. The information nonanticipative RDF is shown to be
equivalent to the nonanticipatory epsilon-entropy, which corresponds to the
classical RDF with an additional causality or nonanticipative condition imposed
on the optimal reproduction conditional distribution.
Comment: 34 pages, 12 figures; part of this paper was accepted for publication
in the IEEE International Symposium on Information Theory (ISIT), 2014, and in
the book Coordination Control of Distributed Systems, of the series Lecture
Notes in Control and Information Sciences, 201
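For orientation on the quantities compared in the abstract above, the following minimal Python sketch evaluates the binary entropy function and the classical (noncausal) RDF of the BSMS(p) with Hamming distortion in its low-distortion region, R(D) = H(p) - H(D); this is only the standard noncausal benchmark against which the rate loss is measured, not the paper's nonanticipative RDF formula, and the numerical values in the example are illustrative.

    from math import log2

    def h_b(x):
        # Binary entropy function in bits.
        if x <= 0.0 or x >= 1.0:
            return 0.0
        return -x * log2(x) - (1 - x) * log2(1 - x)

    def classical_rdf_bsms(p, d):
        # Classical (noncausal) RDF of the BSMS(p) with Hamming distortion,
        # valid only in the low-distortion region: R(D) = H(p) - H(D).
        return max(h_b(p) - h_b(d), 0.0)

    def rate_loss(rate_causal, rate_noncausal, d):
        # Rate loss at distortion d: extra rate of a causal or zero-delay
        # scheme over the noncausal benchmark at the same distortion level.
        return rate_causal(d) - rate_noncausal(d)

    # Illustrative evaluation of the noncausal benchmark at p = 0.25, D = 0.05.
    print(classical_rdf_bsms(0.25, 0.05))   # about 0.525 bits per source symbol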
Applications of Information Nonanticipative Rate Distortion Function
The objective of this paper is to further investigate various applications of
information Nonanticipative Rate Distortion Function (NRDF) by discussing two
working examples, the Binary Symmetric Markov Source with parameter p
(BSMS(p)) with Hamming distance distortion, and the multidimensional
partially observed Gaussian-Markov source. For the BSMS(p), we give the
solution to the NRDF, and we use it to compute the Rate Loss (RL) of causal
codes with respect to noncausal codes. For the multidimensional Gaussian-Markov
source, we give the solution to the NRDF, we show its operational meaning via
joint source-channel matching over a vector of parallel Gaussian channels, and
we compute the RL of causal and zero-delay codes with respect to noncausal
codes.
Comment: 5 pages, 3 figures, accepted for publication in the IEEE International
Symposium on Information Theory (ISIT) proceedings, 201
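As a companion to the joint source-channel matching over parallel Gaussian channels mentioned above, here is a minimal sketch of classical reverse water-filling for independent Gaussian components with squared-error distortion; it shows only the standard noncausal rate-distortion benchmark, not the paper's NRDF solution or its channel realization, and the variances in the example are made up.

    import numpy as np

    def reverse_waterfill(variances, total_distortion, iters=200):
        # Classical reverse water-filling: pick a water level theta so that
        # sum_i min(theta, sigma_i^2) = D; then D_i = min(theta, sigma_i^2) and
        # R = 0.5 * sum_i log2(sigma_i^2 / D_i).
        variances = np.asarray(variances, dtype=float)
        D = min(total_distortion, variances.sum())    # rate is zero beyond this
        lo, hi = 0.0, variances.max()
        for _ in range(iters):                        # bisection on theta
            theta = 0.5 * (lo + hi)
            if np.minimum(theta, variances).sum() > D:
                hi = theta
            else:
                lo = theta
        d_i = np.minimum(theta, variances)
        rate = 0.5 * float(np.sum(np.log2(variances / d_i)))
        return rate, d_i

    # Illustrative component variances and distortion budget.
    rate, per_component = reverse_waterfill([4.0, 1.0, 0.25], total_distortion=0.9)
    print(rate, per_component)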
Sequential Necessary and Sufficient Conditions for Capacity Achieving Distributions of Channels with Memory and Feedback
We derive sequential necessary and sufficient conditions for any channel
input conditional distribution $\{P_{X_t|X^{t-1},Y^{t-1}}:~t=0,\ldots,n\}$ to
maximize the finite-time horizon directed information defined by
$C^{FB}_{X^n \rightarrow Y^n} \triangleq \sup I(X^n \rightarrow Y^n)$, where
$I(X^n \rightarrow Y^n) = \sum_{t=0}^{n} I(X^t; Y_t|Y^{t-1})$ and the supremum
is taken over the channel input conditional distributions, for channel
distributions $\{P_{Y_t|Y^{t-1},X_t}:~t=0,\ldots,n\}$ and
$\{P_{Y_t|Y_{t-M}^{t-1},X_t}:~t=0,\ldots,n\}$, where
$X^t \triangleq \{X_0,\ldots,X_t\}$ and $Y^t \triangleq \{Y_0,\ldots,Y_t\}$ are
the channel input and output random processes, and $M$ is a finite nonnegative
integer.
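To make the definition above concrete, the following brute-force Python sketch (not from the paper) computes I(X^n -> Y^n) = sum_{t=0}^{n} I(X^t; Y_t | Y^{t-1}) from an exhaustively specified joint distribution of the input and output sequences; the marginalization routine and the memoryless BSC example are illustrative assumptions, feasible only for tiny horizons and alphabets.

    import itertools
    from collections import defaultdict
    from math import log2

    def marginal(joint, keep_x, keep_y):
        # Marginalize joint[(x_seq, y_seq)] onto the first keep_x input symbols
        # and the first keep_y output symbols.
        out = defaultdict(float)
        for (x, y), pr in joint.items():
            out[(x[:keep_x], y[:keep_y])] += pr
        return out

    def directed_information(joint, n):
        # I(X^n -> Y^n) = sum_{t=0}^{n} I(X^t; Y_t | Y^{t-1}),
        # computed by brute-force marginalization of the full joint pmf.
        di = 0.0
        for t in range(n + 1):
            p_xt_yt   = marginal(joint, t + 1, t + 1)   # P(x^t, y^t)
            p_xt_ytm1 = marginal(joint, t + 1, t)       # P(x^t, y^{t-1})
            p_yt      = marginal(joint, 0, t + 1)       # P(y^t)
            p_ytm1    = marginal(joint, 0, t)           # P(y^{t-1})
            for (x, y), pr in p_xt_yt.items():
                if pr > 0:
                    num = pr * p_ytm1[((), y[:t])]
                    den = p_xt_ytm1[(x, y[:t])] * p_yt[((), y)]
                    di += pr * log2(num / den)
        return di

    # Example: two uses (n = 1) of a memoryless BSC(0.1) with i.i.d. uniform
    # inputs; the directed information then equals 2*(1 - H_b(0.1)), about 1.06 bits.
    eps, joint = 0.1, {}
    for x in itertools.product((0, 1), repeat=2):
        for y in itertools.product((0, 1), repeat=2):
            pr = 0.25
            for xt, yt in zip(x, y):
                pr *= (1 - eps) if xt == yt else eps
            joint[(x, y)] = pr
    print(directed_information(joint, n=1))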
We apply the necessary and sufficient conditions to application examples of
time-varying channels with memory, and we derive recursive closed-form
expressions for the optimal distributions that maximize the finite-time horizon
directed information. Further, we derive the feedback capacity from the
asymptotic properties of the optimal distributions, by investigating the limit
$\lim_{n \rightarrow \infty} \frac{1}{n+1} C^{FB}_{X^n \rightarrow Y^n}$
without any a priori assumptions, such as stationarity, ergodicity, or
irreducibility of the channel distribution. The necessary and sufficient
conditions can be easily extended to a variety of channels with memory, beyond
the ones considered in this paper.
Comment: 57 pages, 9 figures; part of the paper was accepted for publication in
the proceedings of the IEEE International Symposium on Information Theory
(ISIT), Barcelona, Spain, 10-15 July 2016 (date of submission of the conference
paper: 25/1/2016)
Causal Rate Distortion Function on Abstract Alphabets: Optimal Reconstruction and Properties
A causal rate distortion function with a general fidelity criterion is
formulated on abstract alphabets and a coding theorem is derived. Existence of
the minimizing kernel is shown using the topology of weak convergence of
probability measures. The optimal reconstruction kernel is derived and shown to
be causal, and certain properties of the causal rate distortion function are
presented.
Comment: 5 pages, submitted to the International Symposium on Information
Theory (ISIT) 201
Capacity of Binary State Symmetric Channel with and without Feedback and Transmission Cost
We consider a unit memory channel, called the Binary State Symmetric Channel
(BSSC), in which the channel state is the modulo-2 addition of the current
channel input and the previous channel output. We derive closed-form
expressions for the capacity, and the corresponding channel input distribution,
of the BSSC with and without feedback and transmission cost. We also show that
the capacity of the BSSC is not increased by feedback, and that it is achieved
by a first-order symmetric Markov process.
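The state recursion of the BSSC can be made explicit with a short simulation sketch; the recursion s_t = x_t XOR y_{t-1} follows the description above, while the state-dependent crossover probabilities used here are purely an illustrative assumption, not the channel law derived in the paper.

    import random

    def simulate_bssc(inputs, crossover, y_init=0, seed=0):
        # Unit memory channel whose state is s_t = x_t XOR y_{t-1}, as described
        # above. ASSUMPTION: given state s_t, the output is the input flipped
        # with probability crossover[s_t]; the paper's exact channel law may differ.
        rng = random.Random(seed)
        y_prev, states, outputs = y_init, [], []
        for x in inputs:
            s = x ^ y_prev                              # current channel state
            y = x ^ int(rng.random() < crossover[s])    # state-dependent flip
            states.append(s)
            outputs.append(y)
            y_prev = y
        return states, outputs

    # Illustrative run with made-up per-state crossover probabilities.
    states, outputs = simulate_bssc([0, 1, 1, 0, 1], crossover={0: 0.1, 1: 0.3})
    print(states, outputs)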