3 research outputs found

    Reversals of Rényi Entropy Inequalities under Log-Concavity

    We establish a discrete analog of the Rényi entropy comparison due to Bobkov and Madiman: for log-concave variables on the integers, the min entropy is within log e of the usual Shannon entropy. Additionally, we investigate the entropic Rogers-Shephard inequality studied by Madiman and Kontoyiannis, and establish a sharp Rényi version for certain parameters in both the continuous and discrete cases.
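The claimed gap between Shannon entropy and min entropy can be checked numerically. A minimal sketch (not from the paper) using the geometric distribution, which is log-concave on the nonnegative integers; the function name and truncation level are illustrative choices:

```python
import math

def entropies(q, n_terms=10_000):
    """Shannon entropy H and min entropy H_inf (both in nats) of
    Geometric(q): p_k = (1 - q) * q**k for k = 0, 1, 2, ...
    The support is truncated at n_terms, which loses negligible mass
    for the values of q used below."""
    probs = [(1 - q) * q**k for k in range(n_terms)]
    H = -sum(p * math.log(p) for p in probs if p > 0)
    H_inf = -math.log(max(probs))  # min entropy: -log of the largest mass
    return H, H_inf

# The abstract's bound suggests H - H_inf <= log e = 1 nat
# for log-concave integer-valued variables.
for q in (0.1, 0.5, 0.9, 0.99):
    H, H_inf = entropies(q)
    assert 0.0 <= H - H_inf <= 1.0 + 1e-9
```

For the geometric family the gap works out to -q log q / (1 - q), which approaches 1 nat as q tends to 1, so the bound is approached but not exceeded.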

    On Rényi Entropy Power Inequalities

    This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum $S_n = \sum_{k=1}^n X_k$ of $n$ independent continuous random vectors taking values in $\mathbb{R}^d$, and let $\alpha \in [1, \infty]$. An R-EPI provides a lower bound on the order-$\alpha$ Rényi entropy power of $S_n$ that, up to a multiplicative constant (which may depend in general on $n$, $\alpha$, $d$), is equal to the sum of the order-$\alpha$ Rényi entropy powers of the $n$ random vectors $\{X_k\}_{k=1}^n$. For $\alpha = 1$, the R-EPI coincides with the well-known entropy power inequality by Shannon. The first improved R-EPI is obtained by tightening the recent R-EPI by Bobkov and Chistyakov, which relies on the sharpened Young's inequality. A further improvement of the R-EPI also relies on convex optimization and results on the rank-one modification of a real-valued diagonal matrix.
    Comment: Revised version of a submission to the IEEE Trans. on Information Theory. Presented in part at ISIT 2016, Barcelona, July 2016.
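The $\alpha = 1$ (Shannon) case mentioned above is easy to illustrate numerically: the entropy power $N(X) = e^{2h(X)/d} / (2\pi e)$ of a Gaussian equals its per-coordinate variance, and the EPI $N(S_n) \ge \sum_k N(X_k)$ holds with equality for sums of independent Gaussians. A small sketch under those standard definitions (the function name is illustrative, not from the paper):

```python
import math

def entropy_power_gaussian(var, d=1):
    """Entropy power of a d-dimensional isotropic Gaussian with
    per-coordinate variance `var`. Its differential entropy is
    h = (d/2) * log(2*pi*e*var), so N = exp(2h/d)/(2*pi*e) = var."""
    h = 0.5 * d * math.log(2 * math.pi * math.e * var)
    return math.exp(2 * h / d) / (2 * math.pi * math.e)

# Sum of independent Gaussians: variances add, and the Shannon EPI
# N(S_n) >= sum_k N(X_k) is met with equality.
variances = [1.0, 2.5, 4.0]
lhs = entropy_power_gaussian(sum(variances))
rhs = sum(entropy_power_gaussian(v) for v in variances)
assert math.isclose(lhs, rhs)
```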

    Forward and Reverse Entropy Power Inequalities in Convex Geometry

    The entropy power inequality, which plays a fundamental role in information theory and probability, may be seen as an analogue of the Brunn-Minkowski inequality. Motivated by this connection to Convex Geometry, we survey various recent developments on forward and reverse entropy power inequalities, not just for the Shannon-Boltzmann entropy but also more generally for Rényi entropy. In the process, we discuss connections between the so-called functional (or integral) and probabilistic (or entropic) analogues of some classical inequalities in geometric functional analysis.
    Comment: 54 pages. Changes in v2: improved organization, cleaned up exposition, and numerous references added.
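The analogy the survey builds on can be stated side by side. With the standard definitions (volume $|\cdot|$ of compact sets in $\mathbb{R}^d$, entropy power $N$ built from differential entropy $h$), the two inequalities take the same shape:

```latex
% Brunn--Minkowski for compact sets A, B in R^d, and the Shannon EPI
% for independent random vectors X, Y in R^d:
\[
  \lvert A + B \rvert^{1/d} \;\ge\; \lvert A \rvert^{1/d} + \lvert B \rvert^{1/d},
  \qquad
  N(X + Y) \;\ge\; N(X) + N(Y),
\]
% where the entropy power is defined from the differential entropy h by
\[
  N(X) \;=\; \frac{1}{2\pi e}\, e^{2 h(X)/d}.
\]
```

In both statements the $d$-th-root (or exponential) normalization makes the functional additive under the relevant notion of "sum", which is the parallel the survey develops for Rényi entropies as well.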