
    On Divergence-Power Inequalities

    Expressions for Divergence-Power Inequalities (DPI), of the Shannon entropy-power inequality (EPI) type, are given for two cases of stationary random processes: time-discrete and band-limited time-continuous. The new expressions connect the divergence rate of the sum of independent processes, the individual divergence rate of each process, and their power spectral densities. All divergences are between a process and a Gaussian process with the same second-order statistics, and are assumed to be finite. A new proof of the Shannon entropy-power inequality (EPI), based on the relationship between divergence and causal minimum mean-square error (CMMSE) in Gaussian channels with large signal-to-noise ratio, is also shown. Comment: Submitted to IEEE Transactions on Information Theory
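The Gaussian case makes the entropy-power inequality concrete: the entropy power of a Gaussian equals its variance, so for independent Gaussians the EPI holds with equality. A minimal numerical sketch (not from the paper; the variances are arbitrary illustrative values):

```python
import math

def entropy_power(variance):
    # Entropy power N(X) = exp(2 h(X)) / (2*pi*e). For a Gaussian with
    # variance s^2, h(X) = 0.5 * log(2*pi*e*s^2), so N(X) = s^2.
    h = 0.5 * math.log(2 * math.pi * math.e * variance)
    return math.exp(2 * h) / (2 * math.pi * math.e)

# EPI: N(X + Y) >= N(X) + N(Y) for independent X, Y;
# equality holds when X and Y are independent Gaussians.
vx, vy = 2.0, 3.0
lhs = entropy_power(vx + vy)                    # variance of the independent sum
rhs = entropy_power(vx) + entropy_power(vy)
print(lhs, rhs)
```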

    New information inequalities on new generalized f-divergence and applications

    In this work, we introduce new information inequalities for a new generalized f-divergence in terms of the well-known Chi-square divergence. As an application of the new inequalities, we obtain relations to other standard divergences using the logarithmic power mean and the identric mean, together with numerical verification using two discrete probability distributions: Binomial and Poisson.
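As a rough illustration of the kind of numerical check the abstract mentions, the chi-square divergence between a Binomial(n, p) and a Poisson(np) distribution can be computed directly from the two pmfs. Truncating the Poisson support to 0..n is an assumption of this sketch, not the paper's procedure, and the parameter values are arbitrary:

```python
import math

def chi_square_divergence(p, q):
    # chi^2(P || Q) = sum_i (p_i - q_i)^2 / q_i over the common support
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q) if qi > 0)

def binomial_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

# Compare Binomial(n, p) with Poisson(lam = n*p) on the points 0..n
# (the Poisson tail beyond n is ignored in this rough check).
n, p = 10, 0.3
P = [binomial_pmf(n, p, k) for k in range(n + 1)]
Q = [poisson_pmf(n * p, k) for k in range(n + 1)]
print(chi_square_divergence(P, Q))
```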

    Conditional Rényi entropy and the relationships between Rényi capacities

    The analogues of Arimoto's definitions of conditional Rényi entropy and Rényi mutual information are explored for abstract alphabets. These quantities, although dependent on the reference measure, have some useful properties similar to those known in the discrete setting. In addition to laying out some such basic properties and the relations to Rényi divergences, the relationships between the families of mutual informations defined by Sibson, Augustin-Csiszár, and Lapidoth-Pfister, as well as the corresponding capacities, are explored. Comment: 17 pages, 1 figure
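On a discrete alphabet, the Rényi divergence underlying these quantities has a simple closed form, D_alpha(P||Q) = (1/(alpha-1)) log(sum_i p_i^alpha q_i^(1-alpha)), and is non-decreasing in alpha. A minimal sketch (the distributions P and Q are arbitrary illustrative values, not from the paper):

```python
import math

def renyi_divergence(p, q, alpha):
    # D_alpha(P || Q) = (1 / (alpha - 1)) * log( sum_i p_i^alpha * q_i^(1 - alpha) )
    # for alpha > 0, alpha != 1; the alpha -> 1 limit recovers KL divergence.
    assert alpha > 0 and alpha != 1
    s = sum(pi**alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]
# D_alpha(P || Q) is non-decreasing in alpha (a standard property).
print(renyi_divergence(P, Q, 0.5), renyi_divergence(P, Q, 2.0))
```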

    Mixtures and products in two graphical models

    We compare two statistical models of three binary random variables. One is a mixture model and the other is a product of mixtures model called a restricted Boltzmann machine. Although the two models look different in their parametrizations, we show that they represent the same set of distributions on the interior of the probability simplex and are equal up to closure. We give a semi-algebraic description of the model in terms of six binomial inequalities and obtain closed-form expressions for the maximum likelihood estimates. We briefly discuss extensions to larger models. Comment: 18 pages, 7 figures
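As a hypothetical illustration of the restricted-Boltzmann-machine parametrization, an RBM with a single hidden unit over three visible binary units has a visible marginal that is a mixture of two product distributions. The weights below are arbitrary, and this one-hidden-unit model is only a small instance of the kind of model the paper compares:

```python
import itertools
import math

def rbm_visible_dist(W, b, c):
    # Marginal over 3 visible units of an RBM with 1 hidden unit h in {0, 1}:
    #   p(v) proportional to sum_h exp(b.v + c*h + h * W.v)
    #        = exp(b.v) * (1 + exp(c + W.v)),
    # i.e. a mixture of two product (independent-bit) distributions.
    states = list(itertools.product([0, 1], repeat=3))
    weights = []
    for v in states:
        bias = sum(bi * vi for bi, vi in zip(b, v))
        weights.append(math.exp(bias) * (1 + math.exp(c + sum(wi * vi for wi, vi in zip(W, v)))))
    Z = sum(weights)  # partition function
    return [w / Z for w in weights]

# Arbitrary illustrative parameters.
dist = rbm_visible_dist(W=[1.0, -0.5, 0.2], b=[0.1, 0.0, -0.3], c=0.5)
print(dist)  # a strictly positive distribution on the 8 states of {0,1}^3
```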