
    Two remarks on generalized entropy power inequalities

    This note contributes to the understanding of generalized entropy power inequalities. Our main goal is to construct a counter-example regarding monotonicity and entropy comparison of weighted sums of independent identically distributed log-concave random variables. We also present a complex analogue of a recent dependent entropy power inequality of Hao and Jog, and give a very simple proof.
    Comment: arXiv:1811.00345 is split into 2 papers, with this being one
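For reference, the classical Shannon entropy power inequality (EPI) that these generalized inequalities extend can be stated as follows (standard notation, not taken from the abstract):

```latex
% Entropy power of a random vector $X$ in $\mathbb{R}^n$ with density $f$:
\begin{align*}
  N(X) &= \frac{1}{2\pi e}\, e^{2h(X)/n},
  \qquad h(X) = -\int_{\mathbb{R}^n} f \log f .
\end{align*}
% Shannon's EPI: for independent $X$, $Y$ in $\mathbb{R}^n$,
\begin{align*}
  N(X+Y) \;\ge\; N(X) + N(Y),
\end{align*}
% with equality iff $X$ and $Y$ are Gaussian with proportional covariances.
```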

    Conditional Rényi entropy and the relationships between Rényi capacities

    The analogues of Arimoto's definition of conditional Rényi entropy and Rényi mutual information are explored for abstract alphabets. These quantities, although dependent on the reference measure, have some useful properties similar to those known in the discrete setting. In addition to laying out some such basic properties and the relations to Rényi divergences, the relationships between the families of mutual informations defined by Sibson, Augustin-Csiszár, and Lapidoth-Pfister, as well as the corresponding capacities, are explored.
    Comment: 17 pages, 1 figure
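As a concrete illustration of the order-α family underlying these quantities, here is a toy sketch (my own, not from the paper) of the unconditional Rényi entropy for a discrete distribution, including the Shannon case as the α → 1 limit:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha, in nats, for a discrete distribution p."""
    if alpha == 1.0:
        # Shannon entropy, the limiting case as alpha -> 1.
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

# For the uniform distribution, the Rényi entropy equals log(n) for every order.
uniform = [0.25] * 4
print(renyi_entropy(uniform, 0.5))  # log 4 ≈ 1.3863
print(renyi_entropy(uniform, 2.0))  # log 4 ≈ 1.3863

# The Rényi entropy is non-increasing in the order alpha.
p = [0.7, 0.2, 0.1]
print(renyi_entropy(p, 0.5) >= renyi_entropy(p, 1.0) >= renyi_entropy(p, 2.0))
```

The abstract concerns the harder conditional and abstract-alphabet versions, where the choice of reference measure matters; the discrete, unconditional case above only fixes the notation.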

    Rényi Entropy Power Inequalities via Normal Transport and Rotation

    Following a recent proof of Shannon's entropy power inequality (EPI), a comprehensive framework for deriving various EPIs for the Rényi entropy is presented that uses transport arguments from normal densities and a change of variable by rotation. Simple arguments are given to recover the previously known Rényi EPIs and derive new ones, by unifying a multiplicative form with constant c and a modification with exponent α of previous works. In particular, for log-concave densities, we obtain a simple transportation proof of a sharp varentropy bound.
    Comment: 17 pages. Entropy Journal, to appear
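Schematically, the two families of Rényi EPIs that the abstract says are unified can be written as follows (the exact constants c and exponents λ depend on the order α and on the specific prior works; the display below is my own shorthand, not a statement of the paper's results):

```latex
% Rényi entropy and Rényi entropy power of order $\alpha$ for a density $f$ on $\mathbb{R}^n$:
\begin{align*}
  h_\alpha(X) &= \frac{1}{1-\alpha}\,\log \int_{\mathbb{R}^n} f^\alpha ,
  \qquad N_\alpha(X) = e^{2 h_\alpha(X)/n} .
\end{align*}
% Multiplicative form (constant $c = c(\alpha) \le 1$) for independent $X$, $Y$:
\begin{align*}
  N_\alpha(X+Y) \;\ge\; c \,\bigl( N_\alpha(X) + N_\alpha(Y) \bigr),
\end{align*}
% and modified form with an exponent $\lambda = \lambda(\alpha)$:
\begin{align*}
  N_\alpha(X+Y)^{\lambda} \;\ge\; N_\alpha(X)^{\lambda} + N_\alpha(Y)^{\lambda} .
\end{align*}
```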