Resolvability on Continuous Alphabets
We characterize the resolvability region for a large class of point-to-point
channels with continuous alphabets. In our direct result, we not only prove the
existence of good resolvability codebooks, but also adapt an approach based on
the Chernoff-Hoeffding bound to the continuous case, showing that the probability of
drawing an unsuitable codebook is doubly exponentially small. For the converse
part, we show that our previous elementary result carries over easily to the
continuous case under a mild continuity assumption.
Comment: v2: Corrected inaccuracies in the proof of the direct part. Statement of
Theorem 3 slightly adapted; other results unchanged. v3: Extended version of the
camera-ready version submitted to ISIT 201
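As a minimal sketch of the concentration tool the direct part builds on, the following compares the plain (two-sided) Hoeffding bound against a Monte Carlo estimate of the deviation probability; all names and parameters here are illustrative, not from the paper:

```python
import math
import random

# Hoeffding's inequality for i.i.d. variables bounded in [0, 1]:
#   P(|S_n/n - mu| >= t) <= 2 * exp(-2 * n * t**2)
# We check the bound empirically for fair coin flips (mu = 0.5).

def deviation_probability(n, t, trials=5000, seed=0):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) >= t:
            hits += 1
    return hits / trials

n, t = 200, 0.1
empirical = deviation_probability(n, t)
bound = 2 * math.exp(-2 * n * t ** 2)
print(f"empirical: {empirical:.4f}  Hoeffding bound: {bound:.4f}")
```

The paper's direct part applies this kind of bound to each codebook event and takes a union bound, which is where the doubly exponentially small failure probability comes from.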
Generic Stationary Measures and Actions
Let $G$ be a countably infinite group, and let $\mu$ be a generating
probability measure on $G$. We study the space of $\mu$-stationary Borel
probability measures on a topological space, and in particular on $X^G$,
where $X$ is any perfect Polish space. We also study the space of
$\mu$-stationary, measurable $G$-actions on a standard, nonatomic probability
space.
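For reference, the standard notion of stationarity used throughout this abstract can be written as follows (notation mine, matching the usual convention):

```latex
% A Borel probability measure $\nu$ on a $G$-space $Y$ is
% $\mu$-stationary if it is invariant on average under the
% $\mu$-random walk:
\nu \;=\; \mu * \nu \;:=\; \sum_{g \in G} \mu(g)\, g_{*}\nu,
% where $g_{*}\nu$ denotes the pushforward of $\nu$ under the
% action of $g$.
```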
Equip the space of stationary measures with the weak* topology. When $\mu$
has finite entropy, we show that a generic measure is an essentially free
extension of the Poisson boundary of $(G,\mu)$. When $X$ is compact, this
implies that the simplex of $\mu$-stationary measures on $X^G$ is a Poulsen
simplex. We show that this is also the case for the simplex of stationary
measures on $\{0,1\}^G$.
We furthermore show that if the action of $G$ on its Poisson boundary is
essentially free then a generic measure is isomorphic to the Poisson boundary.
Next, we consider the space of stationary actions, equipped with a standard
topology known as the weak topology. Here we show that when $G$ has property
(T), the ergodic actions are meager. We also construct a group $G$ without
property (T) such that the ergodic actions are not dense, for some $\mu$.
Finally, for a weaker topology on the set of actions, which we call the very
weak topology, we show that a dynamical property (e.g., ergodicity) is
topologically generic if and only if it is generic in the space of measures.
There we also show a Glasner-King type 0-1 law stating that every dynamical
property is either meager or residual.
Comment: To appear in the Transactions of the AMS, 49 pages
R\'enyi Divergence and Kullback-Leibler Divergence
R\'enyi divergence is related to R\'enyi entropy much like Kullback-Leibler
divergence is related to Shannon's entropy, and comes up in many settings. It
was introduced by R\'enyi as a measure of information that satisfies almost the
same axioms as Kullback-Leibler divergence, and depends on a parameter that is
called its order. In particular, the R\'enyi divergence of order 1 equals the
Kullback-Leibler divergence.
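The order-1 limit mentioned above is easy to check numerically. A minimal sketch (function names and the example distributions are mine) using the standard formula $D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log\sum_i p_i^\alpha q_i^{1-\alpha}$:

```python
import math

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(p || q) in nats for discrete distributions.

    D_alpha = 1/(alpha - 1) * log( sum_i p_i^alpha * q_i^(1 - alpha) ),
    for alpha > 0, alpha != 1; supports are assumed to coincide.
    """
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

def kl_divergence(p, q):
    """Kullback-Leibler divergence in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]

kl = kl_divergence(p, q)
near_one = renyi_divergence(p, q, 1.0001)  # approaches KL as alpha -> 1
print(f"KL: {kl:.6f}  D_1.0001: {near_one:.6f}")
```

Evaluating at orders just above and below 1 shows the divergence converging to the Kullback-Leibler value, consistent with the order-1 case stated in the abstract; $D_\alpha$ is also nondecreasing in $\alpha$.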
We review and extend the most important properties of R\'enyi divergence and
Kullback-Leibler divergence, including convexity, continuity, limits of
$\sigma$-algebras and the relation of the special order 0 to the Gaussian
dichotomy and contiguity. We also show how to generalize the Pythagorean
inequality to orders different from 1, and we extend the known equivalence
between channel capacity and minimax redundancy to continuous channel inputs
(for all orders) and present several other minimax results.
Comment: To appear in IEEE Transactions on Information Theory
Comparison of Channels: Criteria for Domination by a Symmetric Channel
This paper studies the basic question of whether a given channel can be
dominated (in the precise sense of being more noisy) by a $q$-ary symmetric
channel. The concept of the "less noisy" relation between channels originated in
network information theory (broadcast channels) and is defined in terms of
mutual information or Kullback-Leibler divergence. We provide an equivalent
characterization in terms of $\chi^2$-divergence. Furthermore, we develop a
simple criterion for domination by a $q$-ary symmetric channel in terms of the
minimum entry of the stochastic matrix defining the channel. The criterion
is strengthened for the special case of additive noise channels over finite
Abelian groups. Finally, it is shown that domination by a symmetric channel
implies (via comparison of Dirichlet forms) a logarithmic Sobolev inequality
for the original channel.
Comment: 31 pages, 2 figures. Presented at the 2017 IEEE International Symposium
on Information Theory (ISIT)
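To make the two objects in the criterion concrete, here is a small sketch (function names are mine, not the paper's) of the $q$-ary symmetric channel as a row-stochastic matrix, together with the minimum-entry quantity the criterion is stated in terms of:

```python
def q_ary_symmetric_channel(q, delta):
    """Row-stochastic matrix of the q-ary symmetric channel:
    the input symbol is kept with probability 1 - delta and
    replaced by each of the other q - 1 symbols with probability
    delta / (q - 1)."""
    off = delta / (q - 1)
    return [[1 - delta if i == j else off for j in range(q)]
            for i in range(q)]

def min_entry(channel):
    """Minimum entry of a stochastic matrix -- the quantity the
    paper's simple domination criterion is phrased in terms of."""
    return min(min(row) for row in channel)

W = q_ary_symmetric_channel(q=4, delta=0.3)
assert all(abs(sum(row) - 1.0) < 1e-12 for row in W)  # rows sum to 1
print(f"minimum entry: {min_entry(W):.4f}")  # delta / (q - 1)
```

This only illustrates the objects involved; the actual domination criterion and its strengthening for additive noise channels are developed in the paper.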
Information Theoretic Proofs of Entropy Power Inequalities
While most useful information theoretic inequalities can be deduced from the
basic properties of entropy or mutual information, up to now Shannon's entropy
power inequality (EPI) is an exception: Existing information theoretic proofs
of the EPI hinge on representations of differential entropy using either Fisher
information or minimum mean-square error (MMSE), which are derived from de
Bruijn's identity. In this paper, we first present a unified view of these
proofs, showing that they share two essential ingredients: 1) a data processing
argument applied to a covariance-preserving linear transformation; 2) an
integration over a path of a continuous Gaussian perturbation. Using these
ingredients, we develop a new and brief proof of the EPI through a mutual
information inequality, which replaces Stam and Blachman's Fisher information
inequality (FII) and an inequality for MMSE by Guo, Shamai and Verd\'u used in
earlier proofs. The result has the advantage of being very simple in that it
relies only on the basic properties of mutual information. These ideas are then
generalized to various extended versions of the EPI: Zamir and Feder's
generalized EPI for linear transformations of the random variables, Takano and
Johnson's EPI for dependent variables, Liu and Viswanath's
covariance-constrained EPI, and Costa's concavity inequality for the entropy
power.
Comment: Submitted for publication in the IEEE Transactions on Information
Theory, revised version
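The EPI itself is easy to sanity-check in the Gaussian case, where it holds with equality. A minimal numeric sketch (helper names are mine), using the entropy power $N(X) = e^{2h(X)}/(2\pi e)$ and the Gaussian differential entropy $h = \tfrac{1}{2}\ln(2\pi e \sigma^2)$:

```python
import math

def entropy_power(h):
    """Entropy power N = exp(2h) / (2*pi*e) for differential entropy h (nats)."""
    return math.exp(2 * h) / (2 * math.pi * math.e)

def gaussian_entropy(var):
    """Differential entropy of N(0, var) in nats: 0.5 * ln(2*pi*e*var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

# Independent Gaussians X ~ N(0, 2) and Y ~ N(0, 3), so X + Y ~ N(0, 5).
nx = entropy_power(gaussian_entropy(2.0))
ny = entropy_power(gaussian_entropy(3.0))
nxy = entropy_power(gaussian_entropy(5.0))

# For a Gaussian, the entropy power equals the variance, and the EPI
# N(X + Y) >= N(X) + N(Y) is tight: N(X + Y) = N(X) + N(Y).
print(f"N(X)={nx:.3f}  N(Y)={ny:.3f}  N(X+Y)={nxy:.3f}")
```

Non-Gaussian independent inputs of the same variances would make the inequality strict; the proofs surveyed above establish this via the mutual information inequality described in the abstract.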