2,858 research outputs found
Anytime information theory
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2001. Includes bibliographical references (p. 171-175).
We study the reliable communication of delay-sensitive bit streams through noisy channels. To bring the issues into sharp focus, we concentrate on the specific problem of communicating the values of an unstable real-valued discrete-time Markov random process through a finite-capacity noisy channel so as to achieve finite end-to-end average squared error. On the source side, we give a coding theorem for such unstable processes showing that the rate-distortion bound is achievable even in the infinite-horizon case if we are willing to tolerate bounded delays in encoding and decoding. On the channel side, we define a new parametric notion of capacity called anytime capacity, which corresponds to a sense of reliable transmission that is stronger than the traditional Shannon sense but less demanding than the sense underlying zero-error capacity. We show that anytime capacity exists for memoryless channels without feedback and is connected to standard random coding error exponents. The main result of the thesis is a new source/channel separation theorem that encompasses unstable processes and establishes that the stronger notion of anytime capacity is required to deal with delay-sensitive bit streams. This theorem is then applied in the control systems context to show that anytime capacity is also what is required to evaluate channels if we intend to use them as part of a feedback link from sensing to actuation. Finally, the theorem is used to shed light on the concept of "quality of service requirements" by examining a toy mathematical example for which we prove the absolute necessity of differentiated service without appealing to human preferences.
by Anant Sahai. Ph.D.
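The data-rate intuition behind this abstract — an unstable process expands by a factor a per step, so tracking it to bounded error needs strictly more than log2(a) bits per step — can be sketched with a one-line error recursion (our toy illustration, not the thesis's construction):

```python
def error_recursion(a: float, R: int, T: int = 50) -> float:
    """Worst-case tracking error for x_{t+1} = a*x_t + w_t, |w_t| <= 0.5,
    when R bits per step describe the current prediction error.

    Quantization contracts the error by a / 2**R each step before the new
    disturbance is added, so the error stays bounded iff R > log2(a).
    (Toy model of the data-rate phenomenon; not the thesis's coding scheme.)
    """
    e = 1.0
    for _ in range(T):
        e = (a / 2 ** R) * e + 0.5
    return e

print(error_recursion(2.0, 2))  # R > log2(2): settles at 1.0
print(error_recursion(2.0, 1))  # R = log2(2): grows by 0.5 per step, 26.0 after 50
```

With a = 2, two bits per step keep the error fixed at 1.0, while one bit per step (exactly the expansion rate) lets it grow without bound.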
Mean Square Capacity of Power Constrained Fading Channels with Causal Encoders and Decoders
This paper is concerned with the mean square stabilization problem of
discrete-time LTI systems over a power constrained fading channel. Unlike
existing works, the channel considered in this paper suffers from both fading
and additive noise. We allow any form of causal channel encoders/decoders,
rather than the linear encoders/decoders commonly studied in the literature.
Sufficient conditions and necessary conditions for mean square stabilizability
are given in terms of channel parameters, such as transmission power and
fading and additive noise statistics, in relation to the unstable eigenvalues
of the open-loop system matrix. The corresponding mean square capacity of the
power constrained fading channel under causal encoders/decoders is given. It
is proved that this mean square capacity is smaller than the corresponding
Shannon channel capacity. Finally, numerical examples are presented which
demonstrate that causal encoders/decoders yield less restrictive
stabilizability conditions than the linear encoders/decoders studied in
existing works.
Comment: Accepted by the 54th IEEE Conference on Decision and Control
Characterization of Information Channels for Asymptotic Mean Stationarity and Stochastic Stability of Non-stationary/Unstable Linear Systems
Stabilization of non-stationary linear systems over noisy communication
channels is considered. Stochastically stable sources, and unstable but
noise-free or bounded-noise systems, have been extensively studied in the
information theory and control theory literature since the 1970s, with a renewed
interest in the past decade. There have also been studies on non-causal and
causal coding of unstable/non-stationary linear Gaussian sources. In this
paper, tight necessary and sufficient conditions for stochastic stabilizability
of unstable (non-stationary) possibly multi-dimensional linear systems driven
by Gaussian noise over discrete channels (possibly with memory and feedback)
are presented. Stochastic stability notions include recurrence, asymptotic mean
stationarity and sample path ergodicity, and the existence of finite second
moments. Our constructive proof uses random-time state-dependent stochastic
drift criteria for stabilization of Markov chains. For asymptotic mean
stationarity (and thus sample path ergodicity), it is sufficient that the
capacity of a channel is (strictly) greater than the sum of the logarithms of
the unstable pole magnitudes for memoryless channels and a class of channels
with memory. This condition is also necessary under a mild technical condition.
Sufficient conditions for the existence of finite average second moments for
such systems driven by unbounded noise are provided.
Comment: To appear in IEEE Transactions on Information Theory
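The stabilizability threshold quoted above — channel capacity strictly greater than the sum of the logarithms of the unstable pole magnitudes — is straightforward to compute for a given open-loop matrix. A minimal sketch (function name ours):

```python
import numpy as np

def min_required_rate(A: np.ndarray) -> float:
    """Sum of log2 magnitudes of the unstable eigenvalues of A.

    Per the condition in the abstract above, asymptotic mean stationarity is
    achievable when channel capacity strictly exceeds this quantity (for
    memoryless channels and a class of channels with memory).
    """
    eigs = np.linalg.eigvals(A)
    return float(sum(np.log2(abs(lam)) for lam in eigs if abs(lam) > 1))

A = np.array([[2.0, 1.0],
              [0.0, 0.5]])    # eigenvalues: 2 (unstable) and 0.5 (stable)
print(min_required_rate(A))   # log2(2) = 1.0 bit per channel use
```

Only the unstable eigenvalues contribute; a Schur-stable matrix gives a threshold of zero.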
The Binary Energy Harvesting Channel with a Unit-Sized Battery
We consider a binary energy harvesting communication channel with a
finite-sized battery at the transmitter. In this model, the channel input is
constrained by the available energy at each channel use, which is driven by an
external energy harvesting process, the size of the battery, and the previous
channel inputs. We consider an abstraction where energy is harvested in binary
units and stored in a battery with the capacity of a single unit, and the
channel inputs are binary. Viewing the available energy in the battery as a
state, this is a state-dependent channel with input-dependent states, memory in
the states, and causal state information available at the transmitter only. We
find an equivalent representation for this channel based on the timings of the
symbols, and determine the capacity of the resulting equivalent timing channel
via an auxiliary random variable. We give achievable rates based on certain
selections of this auxiliary random variable which resemble lattice coding for
the timing channel. We develop upper bounds for the capacity by using a
genie-aided method, and also by quantifying the leakage of the state
information to the receiver. We show that the proposed achievable rates are
asymptotically capacity achieving for small energy harvesting rates. We extend
the results to the case of ternary channel inputs. Our achievable rates give
the capacity of the binary channel within 0.03 bits/channel use, the ternary
channel within 0.05 bits/channel use, and outperform basic Shannon strategies
that only consider instantaneous battery states, for all parameter values.
Comment: Submitted to IEEE Transactions on Information Theory, August 201
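The state dynamics described above — binary harvests, a one-unit battery, and inputs constrained by stored energy — can be simulated in a few lines. This is an illustrative model of the constraint (names and the harvest-before-transmit ordering are our assumptions), not the paper's coding scheme:

```python
import random

def simulate_unit_battery(T: int, q: float, policy, seed: int = 0) -> float:
    """Long-run fraction of 1s sent on the unit-battery model.

    Energy arrives in binary units (Bernoulli(q)); the battery stores at
    most one unit; sending X_t = 1 requires and consumes a stored unit.
    The battery level is the channel state, known causally at the
    transmitter only. `policy(battery)` chooses the attempted input.
    """
    rng = random.Random(seed)
    battery, sent = 0, 0
    for _ in range(T):
        battery = min(1, battery + (rng.random() < q))  # harvest, clip at capacity
        x = policy(battery)
        assert x <= battery, "input is constrained by available energy"
        battery -= x
        sent += x
    return sent / T

# Greedy policy: transmit whenever a unit is stored.
rate = simulate_unit_battery(10_000, q=0.5, policy=lambda b: b)
print(round(rate, 2))  # close to q: throughput of 1s is capped by the harvest rate
```

The simulation makes the input constraint concrete; the paper's contribution is the capacity analysis of this channel via its timing-channel representation.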
A Nonstochastic Information Theory for Communication and State Estimation
In communications, unknown variables are usually modelled as random
variables, and concepts such as independence, entropy and information are
defined in terms of the underlying probability distributions. In contrast,
control theory often treats uncertainties and disturbances as bounded unknowns
having no statistical structure. The area of networked control combines both
fields, raising the question of whether it is possible to construct meaningful
analogues of stochastic concepts such as independence, Markovness, entropy and
information without assuming a probability space. This paper introduces a
framework for doing so, leading to the construction of a maximin information
functional for nonstochastic variables. It is shown that the largest maximin
information rate through a memoryless, error-prone channel in this framework
coincides with the block-coding zero-error capacity of the channel. Maximin
information is then used to derive tight conditions for uniformly estimating
the state of a linear time-invariant system over such a channel, paralleling
recent results of Matveev and Savkin.
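The block-coding zero-error capacity that maximin information recovers is hard to compute in general, but a single channel use already gives a lower bound: log2 of the largest set of pairwise non-confusable inputs. A brute-force sketch for small alphabets (function name ours):

```python
from itertools import combinations
from math import log2

def one_shot_zero_error_bits(n_inputs: int, confusable_pairs) -> float:
    """log2 of the largest set of pairwise non-confusable channel inputs.

    One use of such a set communicates with zero error, so this lower-bounds
    the block-coding zero-error capacity that the abstract above identifies
    with the maximin information rate. (Brute force; small alphabets only.)
    """
    conf = {frozenset(p) for p in confusable_pairs}
    for k in range(n_inputs, 1, -1):
        for s in combinations(range(n_inputs), k):
            if all(frozenset(p) not in conf for p in combinations(s, 2)):
                return log2(k)
    return 0.0

# Shannon's pentagon channel: five inputs, each confusable with its neighbours.
pentagon = [(i, (i + 1) % 5) for i in range(5)]
print(one_shot_zero_error_bits(5, pentagon))  # 1.0 bit per use
```

For the pentagon the one-shot bound is 1 bit, while block coding achieves the true zero-error capacity of (1/2) log2 5 per use — the classic gap between single-letter and block-coding zero-error rates.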
Time-triggering versus event-triggering control over communication channels
Time-triggered and event-triggered control strategies for stabilization of an
unstable plant over a rate-limited communication channel subject to unknown,
bounded delay are studied and compared. Event triggering carries implicit
information, revealing the state of the plant. However, the delay in the
communication channel causes information loss, as it makes the state
information out of date. There is a critical delay value at which the loss of
information due to the communication delay exactly compensates for the implicit
information carried by the triggering events. This occurs when the maximum
delay equals the inverse of the entropy rate of the plant. In this context,
extensions of our previous results for event-triggering strategies are
presented for vector systems and are compared with the data-rate theorem for
time-triggered control, which is extended here to a setting with unknown delay.
Comment: To appear in the 56th IEEE Conference on Decision and Control (CDC), Melbourne, Australia. arXiv admin note: text overlap with arXiv:1609.0959
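For a scalar continuous-time plant dx/dt = a*x with a > 0, the entropy rate is a / ln(2) bits per second, so the critical delay described above works out to ln(2) / a seconds. A minimal sketch (our scalar reading of the abstract; the paper's exact setup may differ):

```python
from math import log

def critical_delay(a: float) -> float:
    """Critical delay = 1 / (entropy rate) for the scalar plant dx/dt = a*x.

    The plant's entropy rate is a / ln(2) bits per second, so the critical
    delay is ln(2) / a seconds. (Illustrative scalar reading of the abstract,
    not the paper's general vector-system result.)
    """
    if a <= 0:
        raise ValueError("plant must be unstable: a > 0")
    entropy_rate_bits_per_sec = a / log(2)
    return 1.0 / entropy_rate_bits_per_sec

print(critical_delay(1.0))  # ln(2), about 0.693 s
print(critical_delay(2.0))  # twice as unstable -> half the tolerable delay
```

Beyond this delay, the staleness of the state information outweighs the implicit information carried by the triggering times, and event triggering loses its advantage.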