101 research outputs found

    A conditional Entropy Power Inequality for dependent variables

    We provide a condition under which a version of Shannon's Entropy Power Inequality holds for dependent variables, and we provide information inequalities extending those found in the independent case. Comment: 6 pages
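For context, the classical (unconditional) Shannon entropy power inequality for independent random vectors $X$, $Y$ in $\mathbb{R}^n$ is the statement being extended here; the abstract does not give the exact conditional form, so only the standard inequality is reproduced:

```latex
N(X) \;=\; \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)}, \qquad
N(X + Y) \;\ge\; N(X) + N(Y),
```

where $h$ is the differential entropy, with equality if and only if $X$ and $Y$ are Gaussian with proportional covariance matrices.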

    Volumes of Restricted Minkowski Sums and the Free Analogue of the Entropy Power Inequality

    In noncommutative probability theory, independence can be based on free products instead of tensor products. This yields a highly noncommutative theory: free probability. Here we show that the classical Shannon entropy power inequality has a counterpart for the free analogue of entropy. The free entropy (introduced recently by the second-named author), consistent with Boltzmann's formula $S = k\log W$, was defined via volumes of matricial microstates. Proving the free entropy power inequality thus becomes a geometric question. Restricting the Minkowski sum of two sets means specifying the set of pairs of points that will be added. The relevant inequality, which holds when the set of "addable" points is sufficiently large, differs from the Brunn-Minkowski inequality in having the exponent $1/n$ replaced by $2/n$. Its proof uses the rearrangement inequality of Brascamp-Lieb-L\"uttinger.
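A sketch of the geometric statement described above; the notation $A +_\Theta B$ for the restricted sum over a set of "addable" pairs $\Theta \subseteq A \times B$ is an assumed notation for illustration, not taken from the paper:

```latex
% Classical Brunn-Minkowski inequality:
\operatorname{vol}(A + B)^{1/n} \;\ge\; \operatorname{vol}(A)^{1/n} + \operatorname{vol}(B)^{1/n}.
% Restricted analogue indicated by the abstract, for sufficiently large Theta:
A +_\Theta B \;=\; \{\, a + b : (a, b) \in \Theta \,\}, \qquad
\operatorname{vol}(A +_\Theta B)^{2/n} \;\ge\; \operatorname{vol}(A)^{2/n} + \operatorname{vol}(B)^{2/n}.
```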

    An information-theoretic proof of Nash's inequality

    We show that an information-theoretic property of Shannon's entropy power, known as concavity of entropy power, can be fruitfully employed to prove inequalities in sharp form. In particular, the concavity of entropy power implies the logarithmic Sobolev inequality, and Nash's inequality with the sharp constant
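For reference, Nash's inequality on $\mathbb{R}^n$, whose sharp constant $C_n$ is the target of the information-theoretic argument described, reads:

```latex
\|f\|_{2}^{\,2 + 4/n} \;\le\; C_n\, \|\nabla f\|_{2}^{2}\, \|f\|_{1}^{4/n},
\qquad f \in L^1(\mathbb{R}^n) \cap H^1(\mathbb{R}^n).
```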

    The concavity of R\'enyi entropy power

    We associate to the $p$-th R\'enyi entropy a definition of entropy power, which is the natural extension of Shannon's entropy power and exhibits a nice behaviour along solutions to the $p$-nonlinear heat equation in $\mathbb{R}^n$. We show that the R\'enyi entropy power of general probability densities solving such equations is always a concave function of time, whereas it behaves linearly for the Barenblatt source-type solutions. This result extends Costa's concavity inequality for Shannon's entropy power to R\'enyi entropies.
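As a small illustration of the classical (Shannon, $p = 1$) entropy power inequality that these results generalize, the following sketch checks it against closed-form differential entropies; the function name is illustrative, not from the paper:

```python
import math

def entropy_power(h, n=1):
    """Shannon entropy power N = exp(2*h/n) / (2*pi*e), for differential entropy h in nats."""
    return math.exp(2.0 * h / n) / (2.0 * math.pi * math.e)

# X, Y independent Uniform(0, 1): differential entropy h = ln(1) = 0 nats each.
h_uniform = 0.0
# X + Y has the triangular density on [0, 2], whose differential entropy is 1/2 nat.
h_triangular = 0.5

lhs = entropy_power(h_triangular)    # N(X + Y) = 1 / (2*pi), about 0.159
rhs = 2 * entropy_power(h_uniform)   # N(X) + N(Y) = 1 / (pi*e), about 0.117
print(lhs >= rhs)                    # EPI holds, strictly here since X, Y are not Gaussian
```

The inequality is strict because equality in the EPI requires Gaussian summands.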

    A multivariate generalization of Costa's entropy power inequality

    A simple multivariate version of Costa's entropy power inequality is proved. In particular, it is shown that if independent white Gaussian noise is added to an arbitrary multivariate signal, the entropy power of the resulting random variable is a multidimensional concave function of the individual variances of the components of the signal. As a side result, we also give an expression for the Hessian matrix of the entropy and entropy power functions with respect to the variances of the signal components, which is an interesting result in its own right. Comment: Proceedings of the 2008 IEEE International Symposium on Information Theory, Toronto, ON, Canada, July 6-11, 2008
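The scalar statement being generalized is Costa's concavity of entropy power: with $Z$ a standard Gaussian independent of $X$,

```latex
\frac{d^2}{dt^2}\, N\!\left(X + \sqrt{t}\, Z\right) \;\le\; 0, \qquad t > 0,
```

so $t \mapsto N(X + \sqrt{t}\,Z)$ is concave; the multivariate version above concerns concavity in the vector of individual component variances rather than a single scalar variance.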

    On the Noisy Feedback Capacity of Gaussian Broadcast Channels

    It is well known that, in general, feedback may enlarge the capacity region of Gaussian broadcast channels. This has been demonstrated even when the feedback is noisy (or partial-but-perfect) and available only from one of the receivers. The only case in which feedback is known not to enlarge the capacity region is when the channel is physically degraded (El Gamal 1978, 1981). In this paper, we show that for a class of two-user Gaussian broadcast channels (not necessarily physically degraded), passively feeding back the stronger user's signal over a link corrupted by Gaussian noise does not enlarge the capacity region if the variance of the feedback noise is above a certain threshold. Comment: 5 pages, 3 figures, to appear in IEEE Information Theory Workshop 2015, Jerusalem

    On some special cases of the Entropy Photon-Number Inequality

    We show that the Entropy Photon-Number Inequality (EPnI) holds when one of the input states is the vacuum state, for several candidates of the other input state: states whose eigenvectors are the number states and which either have only two non-zero eigenvalues or have an arbitrary number of non-zero eigenvalues but high entropy. We also discuss conditions which, if satisfied, would lead to an extension of these results. Comment: 12 pages, no figures