3 research outputs found
Reversals of R\'enyi Entropy Inequalities under Log-Concavity
We establish a discrete analog of the R\'enyi entropy comparison due to
Bobkov and Madiman. For log-concave variables on the integers, the min entropy
is within log e of the usual Shannon entropy. Additionally we investigate the
entropic Rogers-Shephard inequality studied by Madiman and Kontoyiannis, and
establish a sharp R\'enyi version for certain parameters in both the continuous
and discrete cases.
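As an illustrative numerical check (a sketch, not the paper's argument), the stated comparison says that for a log-concave variable on the integers the Shannon entropy exceeds the min entropy by at most log e, i.e. at most 1 nat. The geometric distribution is log-concave, so the gap can be computed directly:

```python
import math

def entropy_gap(q, tail=10_000):
    """Shannon entropy minus min entropy (in nats) of a geometric
    law p_k = (1-q) q^k, k = 0, 1, 2, ... -- log-concave on the integers.
    The comparison above bounds this gap by log e = 1 nat."""
    probs = [(1 - q) * q**k for k in range(tail)]  # truncated tail is negligible
    shannon = -sum(p * math.log(p) for p in probs if p > 0)
    min_entropy = -math.log(max(probs))  # -log of the largest point mass
    return shannon - min_entropy

for q in (0.2, 0.5, 0.9, 0.99):
    assert 0.0 <= entropy_gap(q) <= 1.0  # within log e of Shannon entropy
```

As q approaches 1 the gap approaches 1 nat, suggesting the bound is tight along this family.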
On R\'enyi Entropy Power Inequalities
This paper gives improved R\'{e}nyi entropy power inequalities (R-EPIs).
Consider a sum $S_n = \sum_{k=1}^n X_k$ of $n$ independent continuous random
vectors taking values on $\mathbb{R}^d$, and let $\alpha \in [1, \infty]$. An
R-EPI provides a lower bound on the order-$\alpha$ R\'enyi entropy power of $S_n$
that, up to a multiplicative constant (which may depend in general on $n$,
$\alpha$ and $d$), is equal to the sum of the order-$\alpha$ R\'enyi entropy powers
of the $n$ random vectors $X_k$. For $\alpha = 1$, the R-EPI
coincides with the well-known entropy power inequality by Shannon. The first
improved R-EPI is obtained by tightening the recent R-EPI by Bobkov and
Chistyakov which relies on the sharpened Young's inequality. A further
improvement of the R-EPI also relies on convex optimization and results on
rank-one modification of a real-valued diagonal matrix.
Comment: Revised version of a submission to the IEEE Trans. on Information
Theory. Presented in part at ISIT 2016, Barcelona, July 2016.
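For the $\alpha = 1$ case mentioned above, Shannon's entropy power inequality states $N(X+Y) \ge N(X) + N(Y)$ for independent $X$ and $Y$, with equality for Gaussians. A minimal numerical sketch (not from the paper) verifies the equality case in one dimension, where the entropy power of a Gaussian equals its variance:

```python
import math

def entropy_power(sigma2):
    """Entropy power N = exp(2 h) / (2*pi*e) of a 1-D Gaussian with
    variance sigma2; for a Gaussian this recovers the variance itself."""
    h = 0.5 * math.log(2 * math.pi * math.e * sigma2)  # differential entropy
    return math.exp(2 * h) / (2 * math.pi * math.e)

sx, sy = 2.0, 3.0
lhs = entropy_power(sx + sy)                      # X + Y Gaussian, variance sx + sy
rhs = entropy_power(sx) + entropy_power(sy)
assert abs(lhs - rhs) < 1e-9  # Gaussians saturate Shannon's EPI
```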
Forward and Reverse Entropy Power Inequalities in Convex Geometry
The entropy power inequality, which plays a fundamental role in information
theory and probability, may be seen as an analogue of the Brunn-Minkowski
inequality. Motivated by this connection to Convex Geometry, we survey various
recent developments on forward and reverse entropy power inequalities not just
for the Shannon-Boltzmann entropy but also more generally for R\'enyi entropy.
In the process, we discuss connections between the so-called functional (or
integral) and probabilistic (or entropic) analogues of some classical
inequalities in geometric functional analysis.
Comment: 54 pages. Changes in v2: improved organization, cleaned up
exposition, and numerous references added.
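The Brunn-Minkowski inequality invoked in this analogy states $|A+B|^{1/n} \ge |A|^{1/n} + |B|^{1/n}$ for the Minkowski sum of convex bodies in $\mathbb{R}^n$, mirroring $N(X+Y) \ge N(X) + N(Y)$ for entropy powers. A minimal sketch (illustration only, not from the survey) checks it for axis-aligned boxes in the plane, where the Minkowski sum is again a box with summed side lengths:

```python
import math

def vol(box):
    """Area of an axis-aligned box (w, h) in R^2."""
    return box[0] * box[1]

A, B = (1.0, 4.0), (3.0, 2.0)
sum_box = (A[0] + B[0], A[1] + B[1])  # Minkowski sum of boxes: sides add

# Brunn-Minkowski in R^2: |A+B|^(1/2) >= |A|^(1/2) + |B|^(1/2)
assert math.sqrt(vol(sum_box)) >= math.sqrt(vol(A)) + math.sqrt(vol(B))
```

For boxes the inequality reduces to $\sqrt{(a_1+b_1)(a_2+b_2)} \ge \sqrt{a_1 a_2} + \sqrt{b_1 b_2}$, an instance of the Cauchy-Schwarz inequality, with equality exactly when the boxes are homothetic.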