    Income Distributions and Decomposable Divergence Measures

    Inequality indices (i) evaluate the divergence between the income distribution and the hypothetical situation where all individuals have the mean income and (ii) are unambiguously reduced by a Pigou-Dalton progressive transfer. This paper proposes a new approach to evaluate the divergence between any two income distributions, where the second one can be a reference distribution for the first. In the case where the reference distribution is perfectly egalitarian, and uniquely in this case, we assume (i) that any progressive transfer reduces the divergence and (ii) that the divergence can be additively separated into inequality and efficiency loss. We characterize the unique class of decomposable divergence measures consistent with these views, and we derive the associated relative (resp. absolute) subclasses, which express constant relative (resp. absolute) inequality aversion. This approach extends the generalized entropy studied in inequality measurement.
    Keywords: inequality measures, progressive transfers, generalized entropy, information theory, Bregman divergences
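
    As a point of reference, the generalized entropy family that this paper extends has a standard closed form. Below is a minimal sketch (in Python, assuming strictly positive incomes; the example data are illustrative) of GE(alpha), whose alpha -> 1 and alpha -> 0 limits recover the Theil index and the mean log deviation. It covers only the classical egalitarian-reference case, not the paper's general two-distribution divergences.

        import numpy as np

        def generalized_entropy(x, alpha):
            """Generalized entropy inequality index:
            GE(alpha) = mean((x_i/mu)^alpha - 1) / (alpha*(alpha-1)),
            with the Theil index (alpha=1) and the mean log
            deviation (alpha=0) recovered as limits."""
            x = np.asarray(x, dtype=float)
            r = x / x.mean()
            if np.isclose(alpha, 1.0):          # Theil index
                return float(np.mean(r * np.log(r)))
            if np.isclose(alpha, 0.0):          # mean log deviation
                return float(-np.mean(np.log(r)))
            return float(np.mean(r**alpha - 1.0) / (alpha * (alpha - 1.0)))

        incomes = [10.0, 20.0, 30.0, 40.0]      # illustrative data
        print(generalized_entropy(incomes, 2.0))  # half the squared coefficient of variation
        print(generalized_entropy(incomes, 1.0))  # Theil index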

    Entropy: The Markov Ordering Approach

    The focus of this article is on entropy and Markov processes. We study the properties of functionals which are invariant with respect to monotonic transformations and analyze two invariant "additivity" properties: (i) existence of a monotonic transformation which makes the functional additive with respect to the joining of independent systems and (ii) existence of a monotonic transformation which makes the functional additive with respect to the partitioning of the space of states. All Lyapunov functionals for Markov chains which have properties (i) and (ii) are derived. We describe the most general ordering of the distribution space with respect to which all continuous-time Markov processes are monotonic (the Markov order). The solution differs significantly from the ordering given by the inequality of entropy growth. For inference, this approach results in a convex compact set of conditionally "most random" distributions.
    Comment: 50 pages, 4 figures, postprint version. A more detailed discussion of the various entropy additivity properties and of the separation of variables for independent subsystems in the MaxEnt problem has been added in Section 4.2. The bibliography has been extended.
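
    A concrete member of the Lyapunov families discussed here is the relative entropy D(p||pi) to the stationary distribution pi, which is nonincreasing along any Markov chain. A minimal numerical check (in Python; the transition matrix and initial state are illustrative choices, not taken from the paper):

        import numpy as np

        def kl(p, q):
            """Discrete Kullback-Leibler divergence D(p||q)."""
            mask = p > 0
            return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

        # Row-stochastic transition matrix; evolution is p_{t+1} = p_t P.
        P = np.array([[0.9, 0.1, 0.0],
                      [0.2, 0.6, 0.2],
                      [0.0, 0.3, 0.7]])

        # Stationary distribution: left eigenvector of P for eigenvalue 1.
        w, v = np.linalg.eig(P.T)
        pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
        pi /= pi.sum()

        p = np.array([1.0, 0.0, 0.0])   # start far from equilibrium
        for t in range(6):
            print(t, kl(p, pi))         # monotonically decreasing
            p = p @ P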

    Escort Evolutionary Game Theory

    A family of replicator-like dynamics, called the escort replicator equation, is constructed using information-geometric concepts and the generalized information entropies and divergences of statistical thermodynamics. Lyapunov functions and escort generalizations of basic concepts and constructions in evolutionary game theory are given, such as an escort version of Fisher's fundamental theorem and generalizations of the Shahshahani geometry.
    Comment: Minor typo correction
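
    For orientation, the baseline being generalized is the standard replicator equation dx_i/dt = x_i ((Ax)_i - x'Ax); the escort variant reweights the x_i factor through an escort distribution. A minimal sketch of the baseline under explicit Euler integration (in Python; the payoff matrix and step size are illustrative assumptions, not the paper's escort dynamic):

        import numpy as np

        def replicator_step(x, A, dt=0.01):
            """One Euler step of dx_i/dt = x_i*((Ax)_i - x.A.x).
            An escort variant would reweight the x factor below
            by an escort distribution phi(x)."""
            f = A @ x            # fitness of each strategy
            avg = x @ f          # population-average fitness
            return x + dt * x * (f - avg)

        # Rock-paper-scissors payoff matrix (illustrative).
        A = np.array([[ 0.0, -1.0,  1.0],
                      [ 1.0,  0.0, -1.0],
                      [-1.0,  1.0,  0.0]])

        x = np.array([0.5, 0.3, 0.2])
        for _ in range(1000):
            x = replicator_step(x, A)
        print(x, x.sum())        # the simplex constraint sum(x) = 1 is preserved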

    On a generalization of the Jensen-Shannon divergence and the JS-symmetrization of distances relying on abstract means

    The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using generalized statistical mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen-Shannon divergence between probability densities of the same exponential family and (ii) the geometric JS-symmetrization of the reverse Kullback-Leibler divergence. As a second illustrating example, we show that the harmonic mean is well-suited for scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen-Shannon divergence between scale Cauchy distributions. We also define generalized Jensen-Shannon divergences between matrices (e.g., quantum Jensen-Shannon divergences) and consider clustering with respect to these novel Jensen-Shannon divergences.
    Comment: 30 pages
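
    The construction is easy to state for discrete distributions: replace the arithmetic mixture of the classical JSD by a normalized abstract mean. A minimal sketch (in Python; the skew weights are fixed at 1/2, and the two mean functions are the examples named in the abstract):

        import numpy as np

        def kl(p, q):
            """Discrete Kullback-Leibler divergence D(p||q)."""
            mask = p > 0
            return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

        def js_symmetrization(p, q, mean):
            """0.5*KL(p||m) + 0.5*KL(q||m), where m is the normalized
            pointwise abstract mean of p and q (the generalized mixture)."""
            m = mean(p, q)
            m = m / m.sum()
            return 0.5 * kl(p, m) + 0.5 * kl(q, m)

        arithmetic = lambda p, q: 0.5 * (p + q)    # recovers the classical JSD
        geometric  = lambda p, q: np.sqrt(p * q)   # geometric JS divergence

        p = np.array([0.5, 0.3, 0.2])
        q = np.array([0.1, 0.4, 0.5])
        print(js_symmetrization(p, q, arithmetic))
        print(js_symmetrization(p, q, geometric))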

    Jensen-Shannon divergence as a measure of distinguishability between mixed quantum states

    We discuss an alternative to relative entropy as a measure of distance between mixed quantum states. The proposed quantity is an extension to the realm of quantum theory of the Jensen-Shannon divergence (JSD) between probability distributions. The JSD has several interesting properties. It arises in information theory and, unlike the Kullback-Leibler divergence, it is symmetric, always well defined and bounded. We show that the quantum JSD (QJSD) shares with the relative entropy most of the physically relevant properties, in particular those required of a "good" quantum distinguishability measure. We relate it to other known quantum distances and suggest possible applications in the field of quantum information theory.
    Comment: 14 pages, corrected equation 1
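
    In terms of the von Neumann entropy S, the quantity takes the compact form QJSD(rho, sigma) = S((rho+sigma)/2) - (S(rho)+S(sigma))/2. A minimal sketch (in Python; the two single-qubit density matrices are illustrative):

        import numpy as np

        def von_neumann_entropy(rho):
            """S(rho) = -tr(rho log rho), in nats."""
            ev = np.linalg.eigvalsh(rho)
            ev = ev[ev > 1e-12]     # discard numerically zero eigenvalues
            return float(-np.sum(ev * np.log(ev)))

        def qjsd(rho, sigma):
            """Quantum Jensen-Shannon divergence."""
            s_mix = von_neumann_entropy(0.5 * (rho + sigma))
            return s_mix - 0.5 * (von_neumann_entropy(rho)
                                  + von_neumann_entropy(sigma))

        rho   = np.array([[0.8, 0.0], [0.0, 0.2]])   # diagonal mixed state
        sigma = np.array([[0.5, 0.4], [0.4, 0.5]])   # coherent mixed state
        print(qjsd(rho, sigma))   # symmetric and bounded by log(2)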