431 research outputs found

    On some interrelations of generalized q-entropies and a generalized Fisher information, including a Cramér-Rao inequality

    In this communication, we describe some interrelations between generalized q-entropies and a generalized version of the Fisher information. In information theory, the de Bruijn identity links the Fisher information and the derivative of the entropy. We show that this identity can be extended to generalized versions of entropy and Fisher information. More precisely, a generalized Fisher information naturally appears in the expression of the derivative of the Tsallis entropy. This generalized Fisher information also arises as a special case of a generalized Fisher information for estimation problems. Indeed, we derive here a new Cramér-Rao inequality for the estimation of a parameter, which involves a generalized form of Fisher information. This generalized Fisher information reduces to the standard Fisher information as a particular case. In the case of a translation parameter, the general Cramér-Rao inequality leads to an inequality for distributions which is saturated by generalized q-Gaussian distributions. These generalized q-Gaussians are important in several areas of physics and mathematics. They are known to maximize the q-entropies subject to a moment constraint. The Cramér-Rao inequality shows that the generalized q-Gaussians also minimize the generalized Fisher information among distributions with a fixed moment. Similarly, the generalized q-Gaussians also minimize the generalized Fisher information among distributions with a given q-entropy.
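
    For orientation (standard conventions assumed here, not quoted from the paper), the Tsallis q-entropy, the generalized q-Gaussian that saturates such inequalities, and the classical de Bruijn identity being extended can be written as

    \[
      S_q[f] = \frac{1}{q-1}\Bigl(1 - \int f(x)^q\,dx\Bigr), \qquad
      G_q(x) \propto \bigl(1 - (1-q)\,\beta x^2\bigr)_+^{\frac{1}{1-q}},
    \]
    \[
      \frac{d}{dt}\,H\bigl(X + \sqrt{t}\,Z\bigr) = \tfrac{1}{2}\,I\bigl(X + \sqrt{t}\,Z\bigr),
      \qquad Z \sim \mathcal{N}(0,1),
    \]
    where \beta is an inverse-width parameter, (\cdot)_+ denotes the positive part, H is the Shannon differential entropy and I the standard Fisher information; all three expressions recover the usual Gaussian case as q \to 1.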

    Source coding with escort distributions and Rényi entropy bounds

    We discuss the interest of escort distributions and the Rényi entropy in the context of source coding. We first recall a source coding theorem by Campbell relating a generalized measure of length to the Rényi-Tsallis entropy. We show that the associated optimal codes can be obtained using considerations on escort distributions. We propose a new family of length measures involving escort distributions and we show that these generalized lengths are also bounded below by the Rényi entropy. Furthermore, we obtain that the standard Shannon code lengths are optimal for the new generalized length measures, whatever the entropic index. Finally, we show that there exists in this setting an interplay between standard and escort distributions.
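
    For orientation (Campbell's classical setting, stated here in the usual base-2 convention and not quoted from the paper), the exponential length measure and its Rényi entropy bound read

    \[
      L(t) = \frac{1}{t}\,\log_2 \sum_i p_i\, 2^{\,t\,n_i}
      \;\ge\; H_\alpha(p) = \frac{1}{1-\alpha}\,\log_2 \sum_i p_i^{\,\alpha},
      \qquad \alpha = \frac{1}{1+t},
    \]
    with equality for the ideal (non-integer) lengths n_i = -\log_2 P_i built on the escort distribution P_i = p_i^{\alpha} / \sum_j p_j^{\alpha}. A small numerical check of this bound is sketched after the final abstract below.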

    An amended MaxEnt formulation for deriving Tsallis factors, and associated issues

    An amended MaxEnt formulation for systems displaced from the conventional MaxEnt equilibrium is proposed. This formulation involves the minimization of the Kullback-Leibler divergence to a reference Q (or maximization of the Shannon Q-entropy), subject to a constraint that involves a second reference distribution P_1 and tunes the new equilibrium. In this setting, the equilibrium distribution is the generalized escort distribution associated with P_1 and Q. Taking into account an additional constraint, an observable given by a statistical mean, leads to the maximization of the Rényi/Tsallis Q-entropy subject to that constraint. Two natural scenarios for this observation constraint are considered, and the classical and generalized constraints of nonextensive statistics are recovered. The solutions to the maximization of the Rényi Q-entropy subject to the two types of constraints are derived. These optimum distributions, which are Lévy-like distributions, are self-referential. We then propose two 'alternate' (but effectively computable) dual functions, whose maximization enables identification of the optimum parameters. Finally, a duality between the solutions and the underlying Legendre structure is presented. Comment: Presented at MaxEnt2006, Paris, France, July 10-13, 2006.
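
    As context (an assumption on my part, since the abstract does not spell out the definitions), a generalized escort distribution associated with two references P_1 and Q is usually the normalized geometric mixture, and the divergence being minimized is the standard Kullback-Leibler divergence:

    \[
      p_\theta(x) = \frac{P_1(x)^{\theta}\, Q(x)^{1-\theta}}{\int P_1(u)^{\theta}\, Q(u)^{1-\theta}\,du},
      \qquad
      D(p\,\|\,Q) = \int p(x)\,\log\frac{p(x)}{Q(x)}\,dx .
    \]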

    A simple probabilistic construction yielding generalized entropies and divergences, escort distributions and q-Gaussians

    We give a simple probabilistic description of a transition between two states which leads to a generalized escort distribution. When the parameter of the distribution varies, it defines a parametric curve that we call an escort-path. The Rényi divergence appears as a natural by-product of the setting. We study the dynamics of the Fisher information on this path, and show in particular that the thermodynamic divergence is proportional to Jeffreys' divergence. Next, we consider the problem of inferring a distribution on the escort-path, subject to generalized moment constraints. We show that our setting naturally induces a rationale for the minimization of the Rényi information divergence. Then, we derive the optimum distribution as a generalized q-Gaussian distribution.
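
    As background (standard definitions, assumed rather than quoted from the paper), the Rényi divergence of order \alpha and the Jeffreys divergence mentioned here are commonly written as

    \[
      D_\alpha(p\,\|\,q) = \frac{1}{\alpha-1}\,\log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\,dx,
      \qquad
      J(p,q) = D(p\,\|\,q) + D(q\,\|\,p),
    \]
    and, in one usual convention, an escort-path is the geometric interpolation p_\theta(x) \propto p(x)^{\theta}\, q(x)^{1-\theta} between the two states.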

    Quelques inégalités caractérisant les gaussiennes généralisées [Some inequalities characterizing generalized Gaussians]

    In this paper, we propose to characterize a class of generalized Gaussian distributions as the densities that saturate several extensions of classical information theoretic inequalities. The results for the standard Gaussian are recovered as a particular case.

    On escort distributions, q-gaussians and Fisher information

    Escort distributions are a simple one-parameter deformation of an original distribution p. In Tsallis extended thermostatistics, the escort-averages, defined with respect to an escort distribution, have proved useful for obtaining analytical results and variational equations, with in particular the equilibrium distributions obtained as maxima of the Rényi-Tsallis entropy subject to constraints in the form of a q-average. A central example is the q-Gaussian, which is a generalization of the standard Gaussian distribution. In this contribution, we show that escort distributions emerge naturally as a maximum entropy trade-off between the distribution p(x) and the uniform distribution. This setting may typically describe a phase transition between two states. But escort distributions also appear in the fields of multifractal analysis, quantization and coding, with interesting consequences. For the problem of coding, we recall a source coding theorem by Campbell relating a generalized measure of length to the Rényi-Tsallis entropy, and exhibit the links with escort distributions together with practical implications. That q-Gaussians arise from the maximization of the Rényi-Tsallis entropy subject to a q-variance constraint is a known fact. We show here that the (squared) q-Gaussian also appears as a minimum of the Fisher information subject to the same q-variance constraint.
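
    A minimal numerical sketch (my own illustration, not code from the paper) of a q-Gaussian profile and of its q-variance computed as an escort average; the value q = 1.5, the inverse-width beta and the integration grid are arbitrary choices:

        import numpy as np

        def q_exponential(u, q):
            # exp_q(u) = (1 + (1 - q) u)_+^(1 / (1 - q)); tends to exp(u) as q -> 1
            if np.isclose(q, 1.0):
                return np.exp(u)
            return np.maximum(1.0 + (1.0 - q) * u, 0.0) ** (1.0 / (1.0 - q))

        q, beta = 1.5, 1.0                          # illustrative values, not from the paper
        x = np.linspace(-10.0, 10.0, 20001)
        dx = x[1] - x[0]
        f = q_exponential(-beta * x ** 2, q)        # q-Gaussian profile (unnormalized)
        f = f / (f.sum() * dx)                      # numerical normalization on the grid
        w = f ** q                                  # escort weighting of order q
        q_variance = (x ** 2 * w).sum() / w.sum()   # q-variance: escort-average of x^2
        print(q_variance)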

    Escort entropies and divergences and related canonical distribution

    arXiv: 1109.3311. We discuss two families of two-parameter entropies and divergences, derived from the standard Rényi and Tsallis entropies and divergences. These divergences and entropies are found as divergences or entropies of escort distributions. Exploiting the nonnegativity of the divergences, we derive the expression of the canonical distribution associated with the new entropies and an observable given as an escort-mean value. We show that this canonical distribution extends, and smoothly connects, the results obtained in nonextensive thermodynamics for the standard and generalized mean value constraints.
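
    For reference (standard conventions, assumed rather than quoted from the paper), an escort-mean value constraint and the Tsallis divergence underlying such canonical distributions are usually written as

    \[
      \langle A \rangle_q = \frac{\sum_i p_i^{\,q} A_i}{\sum_j p_j^{\,q}},
      \qquad
      D_q(p\,\|\,r) = \frac{1}{q-1}\Bigl(\sum_i p_i^{\,q}\, r_i^{\,1-q} - 1\Bigr),
    \]
    the latter reducing to the Kullback-Leibler divergence as q \to 1.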

    Entropies et critères entropiques [Entropies and entropic criteria]

    This chapter focuses on the notions of entropies and of maximum-entropy distributions, which are characterized from several angles. Beyond the links with applications in engineering and then in physics, we show that it is possible to build regularizing functionals based on a maximum-entropy technique, which can then possibly be used as ad hoc potentials in data-inversion problems. The chapter begins with an overview of the main properties of information measures and with the introduction of various notions and definitions. In particular, we define the Rényi divergence, present the notion of escort distribution, and comment on the maximum entropy principle used in the sequel. We then present a classical engineering problem, the source coding problem, and show the interest of using length measures different from the standard one, in particular an exponential measure, which leads to a source coding theorem whose lower bound is a Rényi entropy. We also show that the optimal codes can easily be computed from the escort distributions. In Section 1.4, we introduce and study a simple model of state transition. This model leads to an equilibrium distribution defined as a generalized escort distribution and, as a by-product, again to a Rényi entropy. We study the flow of Fisher information along the curve defined by the generalized escort distribution, and obtain connections with the Jeffreys divergence. Finally, we obtain several arguments which, in this framework, lead to an inference method based on the minimization of the Rényi entropy under a generalized mean constraint, i.e. taken with respect to the escort distribution. From Section 1.5.3 onward, we then turn to the minimization of the Rényi divergence under a constraint
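
    A small self-contained numerical sketch (my own, not from the chapter) of the coding result recalled above: integer Shannon-type code lengths built from the escort distribution of order 1/(1+t) satisfy the Kraft inequality and stay within one bit of the Rényi lower bound. The toy source p and the parameter t are arbitrary choices:

        import numpy as np

        def escort(p, q):
            # escort distribution P_i = p_i^q / sum_j p_j^q
            w = p ** q
            return w / w.sum()

        def renyi_entropy(p, alpha):
            # Rényi entropy of order alpha (in bits), alpha != 1
            return np.log2((p ** alpha).sum()) / (1.0 - alpha)

        def campbell_length(p, lengths, t):
            # Campbell's exponential mean codeword length (base 2)
            return np.log2((p * 2.0 ** (t * lengths)).sum()) / t

        p = np.array([0.4, 0.3, 0.2, 0.1])          # toy source, arbitrary
        t = 1.0
        alpha = 1.0 / (1.0 + t)
        n = np.ceil(-np.log2(escort(p, alpha)))     # integer code lengths from the escort
        assert (2.0 ** (-n)).sum() <= 1.0           # Kraft inequality: a prefix code exists
        print(renyi_entropy(p, alpha), campbell_length(p, n, t))  # bound <= length < bound + 1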