
    Polarization of the Rényi Information Dimension with Applications to Compressed Sensing

    In this paper, we show that the Hadamard matrix acts as an extractor over the reals of the Rényi information dimension (RID), in an analogous way to how it acts as an extractor of the discrete entropy over finite fields. More precisely, we prove that the RID of an i.i.d. sequence of mixture random variables polarizes to the extremal values of 0 and 1 (corresponding to discrete and continuous distributions) when transformed by a Hadamard matrix. Further, we prove that the polarization pattern of the RID admits a closed-form expression and follows exactly the Binary Erasure Channel (BEC) polarization pattern in the discrete setting. We also extend the results from the single- to the multi-terminal setting, obtaining a Slepian-Wolf counterpart of the RID polarization. We discuss applications of the RID polarization to Compressed Sensing of i.i.d. sources. In particular, we use the RID polarization to construct a family of deterministic ±1-valued sensing matrices for Compressed Sensing. We run numerical simulations to compare the performance of the resulting matrices with that of random Gaussian and random Hadamard matrices. The results indicate that the proposed matrices offer competitive performance while being explicitly constructed. Comment: 12 pages, 2 figures.
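    The deterministic sensing matrices described above are built from rows of a Hadamard matrix, with the rows chosen according to the RID polarization pattern. As a minimal sketch only: the Sylvester construction below and the caller-supplied row set are generic placeholders, not the paper's actual selection rule.

    ```python
    import numpy as np

    def sylvester_hadamard(n: int) -> np.ndarray:
        """Build an n x n Sylvester-type Hadamard matrix (n a power of two)."""
        assert n > 0 and (n & (n - 1)) == 0, "n must be a power of two"
        H = np.array([[1]])
        while H.shape[0] < n:
            H = np.block([[H, H], [H, -H]])  # Sylvester doubling step
        return H

    def sensing_matrix(n: int, rows) -> np.ndarray:
        """Form an m x n ±1 sensing matrix by subsampling Hadamard rows.

        In the paper the rows would be chosen via the RID polarization
        pattern; here `rows` is a caller-supplied placeholder.
        """
        return sylvester_hadamard(n)[list(rows)]

    # Example: a 4 x 8 deterministic ±1 matrix from rows 1, 3, 5, 7.
    A = sensing_matrix(8, [1, 3, 5, 7])
    ```

    Any such row subsampling inherits the ±1 entries and the row orthogonality of the full Hadamard matrix (H Hᵀ = n I), which is what makes the construction explicit rather than random.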

    An information-theoretic proof of Nash's inequality

    We show that an information-theoretic property of Shannon's entropy power, known as concavity of entropy power, can be fruitfully employed to prove inequalities in sharp form. In particular, the concavity of entropy power implies both the logarithmic Sobolev inequality and Nash's inequality with the sharp constant.
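    For reference, the objects in play can be written out explicitly; the normalizations below are the standard ones and are stated as background, not quoted from the paper.

    ```latex
    % Shannon entropy power of X with density f on \mathbb{R}^n:
    N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
    \qquad h(X) = -\int_{\mathbb{R}^n} f(x)\log f(x)\,dx .

    % Costa's concavity of entropy power: for Z a standard Gaussian
    % independent of X, the map
    t \;\longmapsto\; N\!\left(X + \sqrt{t}\,Z\right) \quad\text{is concave.}

    % Nash's inequality with sharp constant C_n:
    \|f\|_{L^2(\mathbb{R}^n)}^{\,2+4/n}
    \;\le\; C_n\, \|\nabla f\|_{L^2(\mathbb{R}^n)}^{2}\,
    \|f\|_{L^1(\mathbb{R}^n)}^{4/n} .
    ```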

    The concavity of Rényi entropy power

    We associate to the p-th Rényi entropy a definition of entropy power, which is the natural extension of Shannon's entropy power and exhibits a nice behaviour along solutions to the p-nonlinear heat equation in R^n. We show that the Rényi entropy power of general probability densities solving such equations is always a concave function of time, whereas it behaves linearly in correspondence to the Barenblatt source-type solutions. This result extends Costa's concavity inequality for Shannon's entropy power to Rényi entropies.
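    Spelled out, the p-th Rényi entropy and one common normalization of its entropy power read as follows; the exponent in the definition of N_p is an assumption of this sketch and should be checked against the paper.

    ```latex
    % p-th R\'enyi entropy of a density f on \mathbb{R}^n (p > 0, p \neq 1):
    h_p(X) = \frac{1}{1-p}\,\log \int_{\mathbb{R}^n} f(x)^p \, dx .

    % A R\'enyi entropy power (exponent assumed here, not quoted):
    N_p(X) = \exp\!\left( \left(\tfrac{2}{n} + p - 1\right) h_p(X) \right).

    % As p \to 1, h_p(X) \to h(X) and the exponent tends to 2/n,
    % recovering Shannon's entropy power up to the constant 1/(2\pi e).
    ```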

    A multivariate generalization of Costa's entropy power inequality

    A simple multivariate version of Costa's entropy power inequality is proved. In particular, it is shown that if independent white Gaussian noise is added to an arbitrary multivariate signal, the entropy power of the resulting random variable is a multidimensional concave function of the individual variances of the components of the signal. As a side result, we also give an expression for the Hessian matrix of the entropy and entropy power functions with respect to the variances of the signal components, which is an interesting result in its own right. Comment: Proceedings of the 2008 IEEE International Symposium on Information Theory, Toronto, ON, Canada, July 6–11, 2008.
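    In symbols, the statement above says the following (a direct restatement of the abstract, with notation introduced here for illustration):

    ```latex
    % Let X be the signal and Z an independent Gaussian vector with
    % independent components of variances t_1, \dots, t_n. Then
    (t_1, \dots, t_n) \;\longmapsto\; N(X + Z)
    \quad\text{is concave on } (0,\infty)^n ,
    % i.e. its Hessian in the variances is negative semidefinite:
    \nabla^2_{t}\, N(X + Z) \preceq 0 .
    % Costa's original inequality is the special case t_1 = \dots = t_n = t.
    ```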

    At Every Corner: Determining Corner Points of Two-User Gaussian Interference Channels

    The corner points of the capacity region of the two-user Gaussian interference channel under strong or weak interference are determined using the notions of almost Gaussian random vectors, almost lossless addition of random vectors, and almost linearly dependent random vectors. In particular, the "missing" corner point problem is solved in a manner that differs from previous works in that it avoids the use of integration over a continuum of SNR values or of Monge-Kantorovitch transportation problems.