16,798 research outputs found

    Entropies from coarse-graining: convex polytopes vs. ellipsoids

    We examine the Boltzmann/Gibbs/Shannon entropy $\mathcal{S}_{BGS}$, the non-additive Havrda-Charvát/Daróczy/Cressie-Read/Tsallis entropy $\mathcal{S}_q$, and the Kaniadakis $\kappa$-entropy $\mathcal{S}_\kappa$ from the viewpoint of coarse-graining, symplectic capacities and convexity. We argue that the functional form of such entropies can be ascribed to a discordance in phase-space coarse-graining between two generally different approaches: the Euclidean/Riemannian metric one, which reflects independence and picks cubes as the fundamental cells, and the symplectic/canonical one, which picks spheres/ellipsoids for this role. Our discussion is motivated by, and confined to, the behaviour of Hamiltonian systems of many degrees of freedom. We see that Dvoretzky's theorem provides asymptotic estimates for the minimal dimension beyond which these two approaches are close to each other. We state and speculate about the role that dualities may play in this viewpoint.
    Comment: 63 pages. No figures. Standard LaTeX.
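    For reference, the standard discrete forms of the three entropies named above (not spelled out in the abstract; the normalisations below are the usual conventions, with $k_B = 1$) are
        \[
        \mathcal{S}_{BGS}[p] = -\sum_i p_i \ln p_i, \qquad
        \mathcal{S}_q[p] = \frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad
        \mathcal{S}_\kappa[p] = -\sum_i p_i \ln_\kappa p_i, \quad
        \ln_\kappa x = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa},
        \]
    and both generalised forms reduce to $\mathcal{S}_{BGS}$ in the limits $q \to 1$ and $\kappa \to 0$.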

    Unique additive information measures - Boltzmann-Gibbs-Shannon, Fisher and beyond

    It is proved that the only additive and isotropic information measure that can depend on the probability distribution and also on its first derivative is a linear combination of the Boltzmann-Gibbs-Shannon and Fisher information measures. Power-law equilibrium distributions are found as a result of the interaction of the two terms. The case of second-order derivative dependence is investigated and a corresponding additive information measure is given.
    Comment: 10 pages, 1 figure, shortened.
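    A sketch of the resulting form, written out under the standard definitions (the notation and normalisations here are assumptions for illustration, not taken from the abstract): for a probability density $p(x)$, the admissible measure is
        \[
        I[p] = \lambda_1 \left( -\int p \ln p \, \mathrm{d}x \right) + \lambda_2 \int \frac{|\nabla p|^2}{p} \, \mathrm{d}x ,
        \]
    with constants $\lambda_1, \lambda_2$; the first term is the Boltzmann-Gibbs-Shannon entropy and the second is the Fisher information measure.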

    Some functional equations related to the characterizations of information measures and their stability

    The main purpose of this paper is to investigate the stability problem of some functional equations that appear in the characterization problem of information measures.
    Comment: 36 pages. arXiv admin note: text overlap with arXiv:1307.0657, arXiv:1307.0631, arXiv:1307.0664, arXiv:1307.065
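    As an illustration of the genre (a classical example, not necessarily one of the equations treated in this paper): the fundamental equation of information,
        \[
        f(x) + (1 - x)\, f\!\left(\frac{y}{1 - x}\right) = f(y) + (1 - y)\, f\!\left(\frac{x}{1 - y}\right), \qquad x, y \in [0, 1),\ x + y \le 1,
        \]
    has regular solutions proportional to the Shannon entropy function $f(x) = -x \ln x - (1 - x) \ln(1 - x)$; stability asks whether functions satisfying such an equation only approximately must lie close to an exact solution.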

    Reduced perplexity: Uncertainty measures without entropy

    Conference paper presented at Recent Advances in Info-Metrics, Washington, DC, 2014. Under review for a book chapter in "Recent innovations in info-metrics: a cross-disciplinary perspective on information and information processing" by Oxford University Press.
    A simple, intuitive approach to the assessment of probabilistic inferences is introduced. The Shannon information metrics are translated to the probability domain. The translation shows that the negative logarithmic score and the geometric mean are equivalent measures of the accuracy of a probabilistic inference. Thus there is both a quantitative reduction in perplexity, as good inference algorithms reduce the uncertainty, and a qualitative reduction, owing to the increased clarity between the original set of inferences and their average, the geometric mean. Further insight is provided by showing that the Rényi and Tsallis entropy functions, translated to the probability domain, are both the weighted generalized mean of the distribution. The generalized mean of probabilistic inferences forms a Risk Profile of the performance. The arithmetic mean is used to measure the decisiveness, while the -2/3 mean is used to measure the robustness.
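    A minimal sketch of this translation in Python (function and variable names are illustrative assumptions, not taken from the paper): the geometric mean of the probabilities assigned to the observed outcomes equals the exponential of the negative mean logarithmic score, and the weighted generalized mean with exponent r gives the Risk Profile, with r = 1 (arithmetic mean) probing decisiveness and r = -2/3 probing robustness.

        import numpy as np

        def generalized_mean(probs, r, weights=None):
            """Weighted power mean of the probabilities assigned to observed outcomes."""
            probs = np.asarray(probs, dtype=float)
            if weights is None:
                weights = np.full(probs.shape, 1.0 / probs.size)
            if r == 0:
                # The r -> 0 limit of the power mean is the weighted geometric mean.
                return float(np.exp(np.sum(weights * np.log(probs))))
            return float(np.sum(weights * probs ** r) ** (1.0 / r))

        # Probabilities a forecaster assigned to the outcomes that actually occurred.
        p = [0.7, 0.4, 0.9, 0.2]

        neg_log_score = -np.mean(np.log(p))            # mean negative logarithmic score
        geo_mean = generalized_mean(p, 0)              # geometric mean (r -> 0 limit)
        print(np.isclose(np.exp(-neg_log_score), geo_mean))   # True: the two measures agree

        decisiveness = generalized_mean(p, 1.0)        # arithmetic mean rewards decisive forecasts
        robustness = generalized_mean(p, -2.0 / 3.0)   # r = -2/3 penalises overconfident misses
        print(decisiveness, robustness)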