
    Two remarks on generalized entropy power inequalities

    This note contributes to the understanding of generalized entropy power inequalities. Our main goal is to construct a counter-example regarding monotonicity and entropy comparison of weighted sums of independent identically distributed log-concave random variables. We also present a complex analogue of a recent dependent entropy power inequality of Hao and Jog, and give a very simple proof. Comment: arXiv:1811.00345 is split into 2 papers, with this being on
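As a numerical aside, the classical (Shannon) entropy power inequality behind this line of work states that $N(X+Y) \ge N(X) + N(Y)$ for independent random variables, where $N(X) = e^{2h(X)}/(2\pi e)$ and $h$ is differential entropy. The sketch below, which uses only standard textbook formulas and is not taken from the paper, checks the Gaussian equality case, where entropy power coincides with variance:

```python
import math

def gaussian_entropy(var):
    """Differential entropy of N(0, var): h = 0.5 * ln(2*pi*e*var), in nats."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def entropy_power(h):
    """Entropy power N(X) = exp(2h) / (2*pi*e) for a scalar random variable."""
    return math.exp(2 * h) / (2 * math.pi * math.e)

# For independent Gaussians, variances add under convolution, and the
# entropy power inequality N(X+Y) >= N(X) + N(Y) holds with equality.
var_x, var_y = 2.0, 3.0
n_x = entropy_power(gaussian_entropy(var_x))            # equals var_x
n_y = entropy_power(gaussian_entropy(var_y))            # equals var_y
n_sum = entropy_power(gaussian_entropy(var_x + var_y))  # equals var_x + var_y
print(n_x, n_y, n_sum)
```

For non-Gaussian log-concave summands the inequality is strict in general; the paper's counter-example concerns finer monotonicity questions for weighted sums, which this toy check does not address.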

    Improved bounds for Hadwiger's covering problem via thin-shell estimates

    A central problem in discrete geometry, known as Hadwiger's covering problem, asks for the smallest natural number $N(n)$ such that every convex body in ${\mathbb R}^{n}$ can be covered by a union of the interiors of at most $N(n)$ of its translates. Despite continuous efforts, the best general upper bound known for this number remains as it was more than sixty years ago, of the order of ${2n \choose n} n \ln n$. In this note, we improve this bound by a sub-exponential factor; that is, we prove a bound of the order of ${2n \choose n} e^{-c\sqrt{n}}$ for some universal constant $c>0$. Our approach combines ideas from previous work by Artstein-Avidan and the second named author with tools from Asymptotic Geometric Analysis. One of the key steps is proving a new lower bound for the maximum volume of the intersection of a convex body $K$ with a translate of $-K$; in fact, we get the same lower bound for the volume of the intersection of $K$ and $-K$ when they both have barycenter at the origin. To do so, we make use of measure concentration, and in particular of thin-shell estimates for isotropic log-concave measures. Using the same ideas, we establish an exponentially better bound for $N(n)$ when restricting our attention to convex bodies that are $\psi_{2}$. By a slightly different approach, an exponential improvement is also established for classes of convex bodies with positive modulus of convexity.
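To get a feel for the size of the sub-exponential improvement, one can compare the two bound orders directly. The snippet below is purely illustrative: the universal constant $c$ is not specified in the abstract, so taking $c = 1$ is an assumption made only to show how the ratio $n \ln n \cdot e^{c\sqrt{n}}$ between the old and new orders grows:

```python
import math

def old_bound(n):
    """Order of the classical bound: binom(2n, n) * n * ln(n)."""
    return math.comb(2 * n, n) * n * math.log(n)

def new_bound(n, c=1.0):
    """Order of the improved bound: binom(2n, n) * exp(-c*sqrt(n)).
    The value c = 1 is an illustrative assumption; the paper only
    asserts some universal constant c > 0."""
    return math.comb(2 * n, n) * math.exp(-c * math.sqrt(n))

# The binomial factor cancels in the ratio, leaving n*ln(n)*e^{c*sqrt(n)},
# which grows without bound: the improvement is sub-exponential in n.
for n in (10, 100, 400):
    print(n, old_bound(n) / new_bound(n))
```

Note that both expressions are bound *orders*, not exact covering numbers; the snippet says nothing about the geometry, only about the asymptotic gap.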

    Conditional Rényi entropy and the relationships between Rényi capacities

    The analogues of Arimoto's definition of conditional Rényi entropy and Rényi mutual information are explored for abstract alphabets. These quantities, although dependent on the reference measure, have some useful properties similar to those known in the discrete setting. In addition to laying out such basic properties and their relations to Rényi divergences, the relationships between the families of mutual informations defined by Sibson, Augustin-Csiszár, and Lapidoth-Pfister, as well as the corresponding capacities, are explored. Comment: 17 pages, 1 figure
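For readers new to the Rényi family, the unconditional discrete case is easy to compute: $H_\alpha(p) = \frac{1}{1-\alpha}\ln\sum_i p_i^\alpha$, with the Shannon entropy recovered in the limit $\alpha \to 1$. The sketch below covers only this elementary discrete setting; Arimoto's conditional version and the abstract-alphabet, reference-measure-dependent quantities studied in the paper are beyond it:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in nats) of a discrete distribution p.
    For alpha != 1: H_alpha = ln(sum_i p_i^alpha) / (1 - alpha).
    The limit alpha -> 1 is the Shannon entropy -sum_i p_i * ln(p_i)."""
    if alpha == 1:
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 0.5))     # min-entropy-like orders give larger values
print(renyi_entropy(p, 1))       # Shannon entropy
print(renyi_entropy(p, 1.0001))  # close to the alpha = 1 value
print(renyi_entropy(p, 2))       # collision entropy
```

Two standard sanity checks: for the uniform distribution on $n$ points, $H_\alpha = \ln n$ for every $\alpha$, and $H_\alpha$ is nonincreasing in $\alpha$.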