
    Closures of exponential families

    The variation distance closure of an exponential family with a convex set of canonical parameters is described, assuming no regularity conditions. The tools are the concepts of the convex core of a measure and the extension of an exponential family, introduced previously by the authors, and a new concept of accessible faces of a convex set. Two other closures, related to the information divergence, are also characterized.
    Comment: Published at http://dx.doi.org/10.1214/009117904000000766 in the Annals of Probability (http://www.imstat.org/aop/) by the Institute of Mathematical Statistics (http://www.imstat.org)
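    For orientation, a minimal sketch of the setup (notation mine, under the standard definition): the exponential family generated by a measure $\mu$ on $\mathbb{R}^d$ consists of the probability measures
    \[
      p_\theta(\mathrm{d}x) \;=\; e^{\langle\theta,\,x\rangle-\Lambda(\theta)}\,\mu(\mathrm{d}x),
      \qquad
      \Lambda(\theta) \;=\; \log\int e^{\langle\theta,\,x\rangle}\,\mu(\mathrm{d}x) \;<\; \infty,
    \]
    and the variation distance closure consists of all total-variation limits $\lim_n p_{\theta_n}$; the paper characterizes these limits via the convex core of $\mu$ and the accessible faces of the parameter set.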

    A Bit of Information Theory, and the Data Augmentation Algorithm Converges

    The data augmentation (DA) algorithm is a simple and powerful tool in statistical computing. In this note, basic information theory is used to prove a nontrivial convergence theorem for the DA algorithm.
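    To make the object of the theorem concrete, here is a minimal two-block DA sampler, a toy example of my own (the note's result concerns the general algorithm, not this specific target): the Student-t distribution arises by augmenting a normal variable with a gamma-distributed precision, and DA alternates the two conditional draws.

        import numpy as np

        def da_student_t(nu, n_iter=10000, seed=0):
            # Joint model: x | lam ~ N(0, 1/lam), lam ~ Gamma(nu/2, rate=nu/2),
            # so the x-marginal is Student-t with nu degrees of freedom.
            rng = np.random.default_rng(seed)
            x = 0.0
            samples = np.empty(n_iter)
            for i in range(n_iter):
                # I-step: draw the augmented variable lam | x.
                lam = rng.gamma(shape=(nu + 1.0) / 2.0, scale=2.0 / (nu + x * x))
                # P-step: draw x | lam.
                x = rng.normal(0.0, 1.0 / np.sqrt(lam))
                samples[i] = x
            return samples

        draws = da_student_t(nu=4.0)
        print(draws.var())  # should approach nu/(nu-2) = 2 as the chain converges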

    Greedy Algorithms for Optimal Distribution Approximation

    The approximation of a discrete probability distribution $\mathbf{t}$ by an $M$-type distribution $\mathbf{p}$ is considered. The approximation error is measured by the informational divergence $\mathbb{D}(\mathbf{t}\Vert\mathbf{p})$, which is an appropriate measure, e.g., in the context of data compression. Properties of the optimal approximation are derived and bounds on the approximation error are presented, which are asymptotically tight. It is shown that $M$-type approximations that minimize either $\mathbb{D}(\mathbf{t}\Vert\mathbf{p})$, or $\mathbb{D}(\mathbf{p}\Vert\mathbf{t})$, or the variational distance $\Vert\mathbf{p}-\mathbf{t}\Vert_1$ can all be found by using specific instances of the same general greedy algorithm.
    Comment: 5 pages
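    As an illustration, here is a sketch of the $\mathbb{D}(\mathbf{t}\Vert\mathbf{p})$-minimizing instance; the initialization and tie-breaking below are my own choices, so treat this as a sketch of the greedy idea rather than the paper's exact algorithm.

        import numpy as np

        def greedy_m_type(t, M):
            # Find integer counts c_i >= 0 with sum(c) = M so that p = c / M
            # approximates t in informational divergence D(t || p).
            # Since D(t||p) = const - sum_i t_i log(c_i) + log(M), granting one
            # extra unit to symbol j lowers it by t_j * log(1 + 1/c_j); these
            # gains diminish as c_j grows, so incremental greedy is optimal.
            t = np.asarray(t, dtype=float)
            support = np.flatnonzero(t)
            if M < support.size:
                raise ValueError("need M >= |support(t)| for a finite divergence")
            c = np.zeros(t.shape, dtype=int)
            c[support] = 1  # one unit per support symbol keeps D(t || p) finite
            for _ in range(M - support.size):
                gains = t * np.log1p(1.0 / np.maximum(c, 1))  # zero off the support
                c[np.argmax(gains)] += 1
            return c / M

        t = np.array([0.55, 0.25, 0.15, 0.05])
        p = greedy_m_type(t, M=16)
        print(p, (t * np.log(t / p)).sum())  # approximation and its divergence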

    Further Results on Geometric Properties of a Family of Relative Entropies

    This paper extends some geometric properties of a one-parameter family of relative entropies. These arise as redundancies when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the Kullback-Leibler divergence. They satisfy the Pythagorean property and behave like squared distances. This property, which was known for finite alphabet spaces, is now extended to general measure spaces. The existence of projections onto convex and certain closed sets is also established. Our results may have applications in the Rényi entropy maximization rule of statistical physics.
    Comment: 7 pages, Prop. 5 modified, in Proceedings of the 2011 IEEE International Symposium on Information Theory
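    For orientation (notation mine; the paper's family arises from codelength cumulants and need not coincide with the plain Rényi divergence), the prototype one-parameter generalization of Kullback-Leibler is
    \[
      D_\alpha(p\Vert q) \;=\; \frac{1}{\alpha-1}\,\log\sum_i p_i^{\alpha}\,q_i^{1-\alpha},
      \qquad \alpha>0,\ \alpha\neq 1,
    \]
    which recovers $D(p\Vert q)$ as $\alpha\to 1$, and the Pythagorean property asserts that a projection $p^{*}=\arg\min_{r\in C} D(r\Vert q)$ onto a convex set $C$ satisfies
    \[
      D(r\Vert q) \;\geq\; D(r\Vert p^{*}) + D(p^{*}\Vert q)
      \qquad \text{for all } r\in C,
    \]
    exactly as a squared Euclidean distance would.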

    On mutual information, likelihood-ratios and estimation error for the additive Gaussian channel

    This paper considers the model of an arbitrarily distributed signal x observed in additive independent white Gaussian noise w, that is, y = x + w. New relations between the minimal mean square error of the non-causal estimator and the likelihood ratio between y and w are derived. This is followed by an extended version of a recently derived relation between the mutual information I(x;y) and the minimal mean square error. These results are applied to derive infinite-dimensional versions of the Fisher information and the de Bruijn identity. The derivation of the results is based on the Malliavin calculus.
    Comment: 21 pages, to appear in the IEEE Transactions on Information Theory
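    The finite-dimensional prototype of the extended relation is the scalar I-MMSE formula of Guo, Shamai and Verdú: for $y=\sqrt{\mathrm{snr}}\,x+w$ with $w$ standard Gaussian and independent of $x$,
    \[
      \frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I\big(x;\sqrt{\mathrm{snr}}\,x+w\big)
      \;=\; \tfrac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
      \qquad
      \mathrm{mmse}(\mathrm{snr}) \;=\; \mathbb{E}\big[(x-\mathbb{E}[x\mid y])^{2}\big],
    \]
    and the paper's extension is the infinite-dimensional analogue obtained via the Malliavin calculus.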

    Rational approximations of spectral densities based on the Alpha divergence

    We approximate a given rational spectral density by one that is consistent with prescribed second-order statistics. Such an approximation is obtained by minimizing a suitable distance from the given spectrum under the constraints corresponding to imposing the given second-order statistics. Here, we consider the Alpha divergence family as the distance measure. We show that the corresponding approximation problem leads to a family of rational solutions. Moreover, this family contains the solution which generalizes the Kullback-Leibler solution proposed by Georgiou and Lindquist in 2003. Finally, numerical simulations suggest that this family contains solutions close to the non-rational solution given by the principle of minimum discrimination information.
    Comment: to appear in Mathematics of Control, Signals, and Systems
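    Schematically (my notation; the paper's exact formulation may differ in detail), the approximation problem is a constrained divergence minimization over spectral densities $\Phi$,
    \[
      \min_{\Phi}\; D_\alpha(\Phi\Vert\Psi)
      \quad\text{subject to}\quad
      \int_{-\pi}^{\pi} G(e^{i\vartheta})\,\Phi(e^{i\vartheta})\,G(e^{i\vartheta})^{*}\,\frac{\mathrm{d}\vartheta}{2\pi} \;=\; \Sigma,
    \]
    where $\Psi$ is the given spectrum and the constraint encodes the prescribed second-order statistics. One common form of the Alpha divergence family for densities is
    \[
      D_\alpha(p\Vert q) \;=\; \frac{1}{\alpha(1-\alpha)}\int\!\Big(\alpha\,p+(1-\alpha)\,q-p^{\alpha}q^{1-\alpha}\Big)\,\mathrm{d}\mu,
    \]
    which recovers the Kullback-Leibler divergence as $\alpha\to 1$.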

    Maximizing the divergence from a hierarchical model of quantum states

    We study many-party correlations quantified in terms of the Umegaki relative entropy (divergence) from a Gibbs family known as a hierarchical model. We derive these quantities from the maximum-entropy principle, which was used earlier to define the closely related irreducible correlation. We point out differences between quantum states and probability vectors that exist in hierarchical models, in the divergence from a hierarchical model, and in local maximizers of this divergence. The differences are, respectively, missing factorization, discontinuity, and reduction of uncertainty. We discuss global maximizers of the mutual information of separable qubit states.
    Comment: 18 pages, 1 figure, v2: improved exposition, v3: fewer typos
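    For reference (standard definitions, not specific to the paper): the Umegaki relative entropy of quantum states $\rho,\sigma$ and the divergence from a family $\mathcal{E}$ are
    \[
      D(\rho\Vert\sigma) \;=\; \operatorname{Tr}\rho\,(\log\rho-\log\sigma),
      \qquad
      D(\rho\Vert\mathcal{E}) \;=\; \inf_{\sigma\in\mathcal{E}} D(\rho\Vert\sigma),
    \]
    and the paper studies (local) maximizers of $\rho\mapsto D(\rho\Vert\mathcal{E})$ when $\mathcal{E}$ is a hierarchical (Gibbs) family.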