Entropic bounds on coding for noisy quantum channels
In analogy with its classical counterpart, a noisy quantum channel is
characterized by a loss, a quantity that depends on the channel input and the
quantum operation performed by the channel. The loss reflects the transmission
quality: if the loss is zero, quantum information can be perfectly transmitted
at a rate measured by the quantum source entropy. Using block coding based
on sequences of n entangled symbols, the average loss (defined as the overall
loss of the joint n-symbol channel divided by n, as n tends to infinity) can
be made lower than the loss for a single use of the channel. In this context,
we examine several upper bounds on the rate at which quantum information can be
transmitted reliably via a noisy channel, that is, with an asymptotically
vanishing average loss while the one-symbol loss of the channel is non-zero.
These bounds on the channel capacity rely on the entropic Singleton bound on
quantum error-correcting codes [Phys. Rev. A 56, 1721 (1997)]. Finally, we
analyze the Singleton bounds when the noisy quantum channel is supplemented
with a classical auxiliary channel.

Comment: 20 pages RevTeX, 10 Postscript figures. Expanded Section II, added 1 figure, changed title. To appear in Phys. Rev. A (May 98).
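Restated in symbols (the notation below is illustrative, not the paper's): the average loss is

    \bar{L} = \lim_{n \to \infty} \frac{L^{(n)}}{n},

where L^{(n)} is the overall loss of the joint n-symbol channel, and reliable transmission means that \bar{L} can be driven to zero even though the one-symbol loss L^{(1)} is non-zero.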
Rényi Divergence and Kullback-Leibler Divergence
Rényi divergence is related to Rényi entropy much like Kullback-Leibler
divergence is related to Shannon's entropy, and comes up in many settings. It
was introduced by Rényi as a measure of information that satisfies almost the
same axioms as Kullback-Leibler divergence, and depends on a parameter that is
called its order. In particular, the Rényi divergence of order 1 equals the
Kullback-Leibler divergence.
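For reference, a standard closed form of the order-α divergence for discrete distributions (not quoted from the paper) is

    D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \sum_i p_i^{\alpha} q_i^{1-\alpha},
    \qquad \alpha \in (0, 1) \cup (1, \infty),

with D_1 defined by continuity as the Kullback-Leibler divergence \sum_i p_i \log(p_i / q_i).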
We review and extend the most important properties of Rényi divergence and
Kullback-Leibler divergence, including convexity, continuity, limits of
σ-algebras and the relation of the special order 0 to the Gaussian
dichotomy and contiguity. We also show how to generalize the Pythagorean
inequality to orders different from 1, and we extend the known equivalence
between channel capacity and minimax redundancy to continuous channel inputs
(for all orders) and present several other minimax results.

Comment: To appear in IEEE Transactions on Information Theory.
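The order-1 limit is easy to check numerically. A minimal sketch using the closed form above (function and variable names are mine, not the paper's):

    import numpy as np

    def renyi_divergence(p, q, alpha):
        # Renyi divergence of order alpha between discrete distributions p, q
        # (assumes q > 0 wherever p > 0, so every term stays finite).
        p, q = np.asarray(p, float), np.asarray(q, float)
        if np.isclose(alpha, 1.0):
            # Order 1 is the Kullback-Leibler divergence.
            mask = p > 0
            return np.sum(p[mask] * np.log(p[mask] / q[mask]))
        return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

    p, q = [0.2, 0.5, 0.3], [0.4, 0.4, 0.2]
    # As alpha -> 1 the Renyi divergence approaches the KL divergence.
    for a in (0.5, 0.999, 1.0, 2.0):
        print(a, renyi_divergence(p, q, a))

The values at α = 0.999 and α = 1 nearly coincide, illustrating the continuity in the order.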
Minimum and maximum entropy distributions for binary systems with known means and pairwise correlations
Maximum entropy models are increasingly being used to describe the collective
activity of neural populations with measured mean neural activities and
pairwise correlations, but the full space of probability distributions
consistent with these constraints has not been explored. We provide upper and
lower bounds on the entropy for the {\em minimum} entropy distribution over
arbitrarily large collections of binary units with any fixed set of mean values
and pairwise correlations. We also construct specific low-entropy distributions
for several relevant cases. Surprisingly, the minimum entropy solution has
entropy scaling logarithmically with system size for any set of first- and
second-order statistics consistent with arbitrarily large systems. We further
demonstrate that some sets of these low-order statistics can only be realized
by small systems. Our results show how only small amounts of randomness are
needed to mimic low-order statistical properties of highly entropic
distributions, and we discuss some applications for engineered and biological
information transmission systems.

Comment: 34 pages, 7 figures.
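The constrained family in question can be explored directly for very small systems. A minimal sketch for n = 3 binary units (the target statistics and the SLSQP search are my choices; the entropy minimization below only returns a local optimum, whereas the paper derives global bounds):

    import itertools
    import numpy as np
    from scipy.optimize import minimize, LinearConstraint

    n = 3
    states = np.array(list(itertools.product([0, 1], repeat=n)))  # all 2^n states

    mu = np.full(n, 0.5)   # target mean of each unit (illustrative values)
    c2 = 0.3               # target pairwise moment E[x_i x_j] (illustrative)

    def entropy(p):
        p = np.clip(p, 1e-12, 1.0)
        return -np.sum(p * np.log(p))

    # Equality constraints: normalization, first moments, second moments.
    rows, vals = [np.ones(2**n)], [1.0]
    for i in range(n):
        rows.append(states[:, i].astype(float)); vals.append(mu[i])
    for i, j in itertools.combinations(range(n), 2):
        rows.append((states[:, i] * states[:, j]).astype(float)); vals.append(c2)
    cons = LinearConstraint(np.array(rows), vals, vals)

    p0 = np.full(2**n, 1.0 / 2**n)        # start from the uniform distribution
    bounds = [(0.0, 1.0)] * 2**n
    lo = minimize(entropy, p0, bounds=bounds, constraints=cons, method="SLSQP")
    hi = minimize(lambda p: -entropy(p), p0, bounds=bounds,
                  constraints=cons, method="SLSQP")
    print("min entropy (local):", lo.fun, " max entropy:", -hi.fun)

For large n this direct search is hopeless, since the simplex has dimension 2^n - 1; the paper's contribution is analytic bounds on the minimum entropy in that regime.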