    The information-theoretic meaning of Gagliardo--Nirenberg type inequalities

    Gagliardo--Nirenberg inequalities are interpolation inequalities that were proved independently by Gagliardo and Nirenberg in the late fifties. In recent years, their connections with theoretical aspects of information theory and with nonlinear diffusion equations have made it possible to obtain some of them in optimal form, recovering both the sharp constants and the explicit form of the optimizers. In this note, in light of this recent research, we review the main connections between Shannon-type entropies, diffusion equations, and a class of these inequalities.
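
    For orientation, one commonly quoted member of this family (the note may treat a different parametrization) is the interpolation estimate

        \[
            \|u\|_{L^p(\mathbb{R}^n)} \;\le\; C\,\|\nabla u\|_{L^r(\mathbb{R}^n)}^{\theta}\,\|u\|_{L^q(\mathbb{R}^n)}^{1-\theta},
            \qquad \frac{1}{p} \;=\; \theta\Bigl(\frac{1}{r}-\frac{1}{n}\Bigr) + \frac{1-\theta}{q}, \quad 0 \le \theta \le 1,
        \]

    valid for suitable exponents; the sharp constant C and the explicit optimizers are known only for particular subfamilies, which is where the entropy and diffusion arguments described in the abstract enter.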

    Divergence Measures

    Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that readers will find this Special Issue of interest and that it will stimulate further research into the mathematical foundations and applications of divergence measures.
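
    For reference, the two generalizations named above take the following standard forms for discrete probability measures P and Q, and both recover the relative entropy D(P\|Q) as a special or limiting case:

        \[
            D_\alpha(P\|Q) \;=\; \frac{1}{\alpha-1}\,\log \sum_{x} P(x)^{\alpha}\,Q(x)^{1-\alpha},
            \qquad \alpha \in (0,1)\cup(1,\infty),
        \]
        \[
            D_f(P\|Q) \;=\; \sum_{x} Q(x)\, f\!\Bigl(\tfrac{P(x)}{Q(x)}\Bigr),
            \qquad f \text{ convex with } f(1)=0,
        \]

    with D_\alpha(P\|Q) \to D(P\|Q) as \alpha \to 1, and D_f(P\|Q) = D(P\|Q) for f(t) = t\log t.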

    Two Measures of Dependence

    Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding. Comment: 40 pages; 1 figure; published in Entropy.
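
    As a minimal illustration of the α → 1 limit (a sketch only: it uses the plain Rényi divergence between a joint pmf and the product of its marginals, not the paper's exact measures, and the names below are chosen for this example):

        import numpy as np

        def renyi_divergence(p, q, alpha):
            """Renyi divergence D_alpha(p || q) between discrete pmfs (natural log)."""
            p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
            if np.isclose(alpha, 1.0):
                # alpha -> 1 limit: the Kullback-Leibler divergence
                mask = p > 0
                return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
            return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

        def renyi_dependence(p_xy, alpha):
            """D_alpha between a joint pmf and the product of its marginals
            (illustrative only; the paper's two measures are defined differently)."""
            p_xy = np.asarray(p_xy, dtype=float)
            p_x = p_xy.sum(axis=1, keepdims=True)
            p_y = p_xy.sum(axis=0, keepdims=True)
            return renyi_divergence(p_xy.ravel(), (p_x * p_y).ravel(), alpha)

        # A correlated binary pair: the value tends to the mutual information
        # I(X;Y) as alpha approaches one.
        p_xy = np.array([[0.4, 0.1],
                         [0.1, 0.4]])
        for alpha in (0.5, 0.99, 1.0, 2.0):
            print(f"alpha = {alpha}: {renyi_dependence(p_xy, alpha):.4f}")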

    Distributed Task Encoding

    The rate region of the task-encoding problem for two correlated sources is characterized using a novel parametric family of dependence measures. The converse uses a new expression for the ρ-th moment of the list size, which is derived using the relative α-entropy. Comment: 5 pages; accepted at ISIT 201

    Ensemble estimation of multivariate f-divergence

    f-divergence estimation is an important problem in the fields of information theory, machine learning, and statistics. While several divergence estimators exist, relatively few of their convergence rates are known. We derive the MSE convergence rate for a density plug-in estimator of f-divergence. Then, by applying the theory of optimally weighted ensemble estimation, we derive a divergence estimator with a convergence rate of O(1/T) that is simple to implement and performs well in high dimensions. We validate our theoretical results with experiments. Comment: 14 pages, 6 figures; a condensed version of this paper was accepted to ISIT 2014. Version 2: moved the proofs of the theorems from the main body to appendices at the end.
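
    As a minimal illustration of the density plug-in idea (a sketch only: it uses Gaussian kernel density estimates and no ensemble weighting, so it does not attain the O(1/T) rate of the paper's estimator; the function names are chosen for this example):

        import numpy as np
        from scipy.stats import gaussian_kde

        def plugin_f_divergence(x, y, f):
            """Naive plug-in estimate of D_f(P||Q) = E_Q[f(dP/dQ)]:
            estimate both densities, then average f(p_hat/q_hat) over the Q-sample."""
            p_hat = gaussian_kde(x.T)          # density estimate for P from its sample
            q_hat = gaussian_kde(y.T)          # density estimate for Q from its sample
            ratio = p_hat(y.T) / q_hat(y.T)    # estimated likelihood ratio at the Q-sample
            return float(np.mean(f(ratio)))

        rng = np.random.default_rng(0)
        x = rng.normal(0.0, 1.0, size=(2000, 2))   # sample from P (standard Gaussian)
        y = rng.normal(0.5, 1.0, size=(2000, 2))   # sample from Q (shifted Gaussian)
        # f(t) = t log t turns D_f into the Kullback-Leibler divergence KL(P || Q).
        kl_estimate = plugin_f_divergence(x, y, lambda t: t * np.log(t))
        print("plug-in KL estimate:", kl_estimate)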