Dissipation of information in channels with input constraints
One of the basic tenets in information theory, the data processing inequality
states that output divergence does not exceed the input divergence for any
channel. For channels without input constraints, various estimates on the
amount of such contraction are known, Dobrushin's coefficient for the total
variation being perhaps the most well-known. This work investigates channels
with average input cost constraint. It is found that while the contraction
coefficient typically equals one (no contraction), the information nevertheless
dissipates. A certain non-linear function, the \emph{Dobrushin curve} of the
channel, is proposed to quantify the amount of dissipation. Tools for
evaluating the Dobrushin curve of additive-noise channels are developed based
on coupling arguments. Some basic applications in stochastic control,
uniqueness of Gibbs measures and fundamental limits of noisy circuits are
discussed.
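As a toy illustration of the contraction described above (an assumption for concreteness, not an example from the paper): for the binary symmetric channel BSC(delta), the Dobrushin coefficient for total variation is 1 - 2*delta, so one channel use shrinks the total variation between any two input distributions by exactly that factor.

```python
# Toy illustration (not from the paper): total-variation contraction through
# a binary symmetric channel BSC(delta), whose Dobrushin coefficient is
# 1 - 2*delta.

def bsc_push(p, delta):
    """Push a Bernoulli(p) input law through BSC(delta); returns P(Y = 1)."""
    return p * (1 - delta) + (1 - p) * delta

def tv(p, q):
    """Total variation distance between Bernoulli(p) and Bernoulli(q)."""
    return abs(p - q)

delta = 0.1
p, q = 0.9, 0.2                      # two arbitrary input distributions
ratio = tv(bsc_push(p, delta), bsc_push(q, delta)) / tv(p, q)
print(ratio)                          # ~0.8, matching 1 - 2*delta
```

For channels with input cost constraints, the point of the paper is that no such universal factor strictly below one exists, and the Dobrushin curve replaces this single coefficient.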
As an application, it is shown that in a chain of $n$ power-constrained relays
and Gaussian channels the end-to-end mutual information and maximal squared
correlation decay as $\Theta(\frac{\log\log n}{\log n})$, which is in stark
contrast with the exponential decay in chains of discrete channels. Similarly,
the behavior of noisy circuits (composed of gates with bounded fan-in) and
broadcasting of information on trees (of bounded degree) does not experience
threshold behavior in the signal-to-noise ratio (SNR). Namely, unlike the case
of discrete channels, the probability of bit error stays bounded away from
zero regardless of the SNR.
Comment: revised; includes appendix B on the contraction coefficient for
mutual information on general alphabets
Mutual Information in Frequency and its Application to Measure Cross-Frequency Coupling in Epilepsy
We define a metric, mutual information in frequency (MI-in-frequency), to
detect and quantify the statistical dependence between different frequency
components in the data, referred to as cross-frequency coupling and apply it to
electrophysiological recordings from the brain to infer cross-frequency
coupling. The current metrics used to quantify the cross-frequency coupling in
neuroscience cannot detect if two frequency components in non-Gaussian brain
recordings are statistically independent or not. Our MI-in-frequency metric,
based on Shannon's mutual information between the Cramér representations of
stochastic processes, overcomes this shortcoming and can detect statistical
dependence in frequency between non-Gaussian signals. We then describe two
data-driven estimators of MI-in-frequency, one based on kernel density
estimation and the other on the nearest neighbor algorithm, and validate
their performance on simulated data. We then use MI-in-frequency to estimate
mutual information between two data streams that are dependent across time,
without making any parametric model assumptions. Finally, we use the
MI-in-frequency metric to investigate the cross-frequency coupling in the
seizure onset zone from electrocorticographic recordings during seizures. The inferred
cross-frequency coupling characteristics are essential to optimize the spatial
and spectral parameters of electrical stimulation-based treatments of epilepsy.
Comment: This paper is accepted for publication in IEEE Transactions on Signal
Processing and contains 15 pages, 9 figures and 1 table
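The nearest-neighbor estimator mentioned above can be sketched in the spirit of the Kraskov–Stögbauer–Grassberger (KSG) construction; this is an illustrative assumption about the general technique, not the paper's exact estimator, and all names below are hypothetical. It estimates I(X;Y) in nats from samples via k-th-nearest-neighbor distances in the joint space:

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def digamma_int(n):
    """Digamma at a positive integer: psi(n) = -gamma + sum_{k=1}^{n-1} 1/k."""
    return -EULER_GAMMA + sum(1.0 / k for k in range(1, n))

def ksg_mi(xs, ys, k=3):
    """KSG-style nearest-neighbor estimate of I(X;Y) in nats.

    Brute-force O(n^2) neighbor search; fine for small samples.
    """
    n = len(xs)
    total = 0.0
    for i in range(n):
        # max-norm distance to the k-th nearest neighbor in the joint space
        d = sorted(
            max(abs(xs[i] - xs[j]), abs(ys[i] - ys[j]))
            for j in range(n) if j != i
        )[k - 1]
        # counts of marginal neighbors strictly within that distance
        nx = sum(1 for j in range(n) if j != i and abs(xs[i] - xs[j]) < d)
        ny = sum(1 for j in range(n) if j != i and abs(ys[i] - ys[j]) < d)
        total += (digamma_int(k) + digamma_int(n)
                  - digamma_int(nx + 1) - digamma_int(ny + 1))
    return total / n

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(200)]
ys_dep = [x + 0.1 * random.gauss(0.0, 1.0) for x in xs]   # strongly dependent
ys_ind = [random.gauss(0.0, 1.0) for _ in range(200)]     # independent of xs
mi_dep = ksg_mi(xs, ys_dep)
mi_ind = ksg_mi(xs, ys_ind)
print(mi_dep, mi_ind)   # dependent pair scores much higher than independent
```

The same nonparametric idea, applied to the spectral (Cramér) components of two signals rather than their time samples, is what lets MI-in-frequency detect dependence between frequency bands without Gaussian assumptions.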