An amended MaxEnt formulation for deriving Tsallis factors, and associated issues
An amended MaxEnt formulation for systems displaced from the conventional
MaxEnt equilibrium is proposed. This formulation involves the minimization of
the Kullback-Leibler divergence to a reference distribution (or, equivalently,
the maximization of the Shannon entropy relative to that reference), subject to
a constraint that involves a second reference distribution and tunes the new
equilibrium. In this setting, the equilibrium distribution is the generalized
escort distribution associated with the two references. Taking into account an
additional constraint, an observable given by a statistical mean, leads to the
maximization of R\'{e}nyi/Tsallis q-entropy subject to that constraint. Two
natural scenarios for this observation constraint are considered, and the
classical and generalized constraints of nonextensive statistics are recovered.
The solutions to the maximization of R\'{e}nyi q-entropy subject to the two
types of constraints are derived. These optimum distributions, which are
L\'evy-like, are self-referential. We then propose two `alternate' (but
effectively computable) dual functions, whose maximization identifies the
optimum parameters. Finally, the duality between the solutions and the
underlying Legendre structure is presented.
Comment: Presented at MaxEnt2006, Paris, France, July 10-13, 2006
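The two-reference escort construction mentioned above admits a simple numerical sketch. The function below is our own illustration (the names and the exponent parameter `gamma` are our notation, not the paper's): it forms the generalized escort of two discrete reference distributions and renormalizes.

```python
import numpy as np

def generalized_escort(p, r, gamma):
    """Generalized escort of two discrete distributions p and r:
    proportional to p**gamma * r**(1 - gamma), renormalized to sum to 1."""
    w = p ** gamma * r ** (1.0 - gamma)
    return w / w.sum()

# Example: interpolating between two reference distributions.
p = np.array([0.7, 0.2, 0.1])
r = np.array([0.2, 0.3, 0.5])
escort = generalized_escort(p, r, 0.5)
```

At `gamma = 1` the escort collapses to the first reference, at `gamma = 0` to the second, so the parameter indeed "tunes" the equilibrium between the two references.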
k-MLE: A fast algorithm for learning statistical mixture models
We describe k-MLE, a fast and efficient local search algorithm for learning
finite statistical mixtures of exponential families such as Gaussian mixture
models. Mixture models are traditionally learned using the
expectation-maximization (EM) soft clustering technique that monotonically
increases the incomplete (expected complete) likelihood. Given prescribed
mixture weights, the hard clustering k-MLE algorithm iteratively assigns data
to the most likely weighted component and updates the component models using
Maximum Likelihood Estimators (MLEs). Using the duality between exponential
families and Bregman divergences, we prove that the local convergence of the
complete likelihood of k-MLE follows directly from the convergence of a dual
additively weighted Bregman hard clustering. The inner loop of k-MLE can be
implemented using any k-means heuristic, such as the celebrated Lloyd's batched
or Hartigan's greedy swap updates. We then show how to update the mixture
weights by minimizing a cross-entropy criterion, which amounts to setting each
weight to the relative proportion of points in its cluster, and to reiterate
the mixture parameter and mixture weight updates until convergence. Hard EM is
interpreted as a special case of k-MLE in which both the component update and
the weight update are performed successively in the inner loop. To initialize
k-MLE, we propose k-MLE++, a careful initialization of k-MLE guaranteeing
probabilistically a global bound on the best possible complete likelihood.
Comment: 31 pages. Extends a preliminary paper presented at IEEE ICASSP 201
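As a rough illustration of the hard-assignment loop sketched in the abstract (assign each point to its most likely weighted component, refit the component MLEs, set weights to cluster proportions), here is a minimal sketch restricted to unit-variance spherical Gaussians; all names and simplifications are ours, not the paper's implementation.

```python
import numpy as np

def k_mle_spherical(X, k, n_iter=20, rng=None):
    """Hard-assignment k-MLE sketch for a mixture of unit-variance
    spherical Gaussians: assign each point to its most likely weighted
    component, then refit means (MLE) and weights (cluster proportions)."""
    rng = np.random.default_rng(rng)
    means = X[rng.choice(len(X), size=k, replace=False)]
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # Weighted log-density of each point under each component; with
        # unit covariance only the squared-distance term varies.
        sq = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
        scores = np.log(weights)[None, :] - 0.5 * sq
        labels = scores.argmax(axis=1)            # hard assignment
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                means[j] = pts.mean(axis=0)       # component MLE update
                weights[j] = len(pts) / len(X)    # cross-entropy weight update
    return means, weights, labels

# Toy usage: two well-separated clusters in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, size=(50, 2)),
               rng.normal(5.0, 0.1, size=(50, 2))])
means, weights, labels = k_mle_spherical(X, k=2, rng=1)
```

The inner assignment step is exactly a (weighted) k-means-style pass, which is why any k-means heuristic, Lloyd batched or Hartigan swap, can be slotted into the loop.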
Extropy: Complementary Dual of Entropy
This article provides a completion to theories of information based on
entropy, resolving a longstanding question in its axiomatization as proposed by
Shannon and pursued by Jaynes. We show that Shannon's entropy function has a
complementary dual function which we call "extropy." The entropy and the
extropy of a binary distribution are identical. However, the measure bifurcates
into a pair of distinct measures for any quantity that is not merely an event
indicator. As with entropy, the maximum extropy distribution is also the
uniform distribution, and both measures are invariant with respect to
permutations of their mass functions. However, they behave quite differently in
their assessments of the refinement of a distribution, the axiom which
concerned Shannon and Jaynes. Their duality is specified via the relationship
among the entropies and extropies of coarse and fine partitions. We also
analyze the extropy function for densities, showing that relative extropy
constitutes a dual to the Kullback-Leibler divergence, widely recognized as the
continuous entropy measure. These results are unified within the general
structure of Bregman divergences. In this context, they identify half the L2
metric as the extropic dual to the entropic directed distance. We describe a
statistical application to the scoring of sequential forecast distributions
which provoked the discovery.
Comment: Published at http://dx.doi.org/10.1214/14-STS430 in Statistical
Science (http://www.imstat.org/sts/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
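The complementarity described above can be checked directly. A minimal sketch, assuming the standard definitions H(p) = -sum p_i log p_i and J(p) = -sum (1 - p_i) log(1 - p_i): it illustrates the coincidence of the two measures on binary distributions and their bifurcation beyond.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) log(1 - p_i), the complementary
    dual measure built from the complements of the events."""
    p = np.asarray(p, dtype=float)
    return -np.sum((1.0 - p) * np.log(1.0 - p))

binary = [0.3, 0.7]        # entropy and extropy coincide here
ternary = [0.2, 0.3, 0.5]  # the two measures differ once n > 2
```

For a binary distribution the complement of one outcome is the other, so the two sums contain the same terms; for three or more outcomes they separate, while both are still maximized by the uniform distribution.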
On some entropy functionals derived from R\'enyi information divergence
We consider the maximum entropy problems associated with R\'enyi q-entropy,
subject to two kinds of constraints on expected values. The constraints
considered are a constraint on the standard expectation, and a constraint on
the generalized expectation as encountered in nonextensive statistics. The
optimum maximum entropy probability distributions, which can exhibit a
power-law behaviour, are derived and characterized. The R\'enyi entropy of the
optimum distributions can be viewed as a function of the constraint. This
defines two families of entropy functionals in the space of possible expected
values. General properties of these functionals, including nonnegativity, the
location of their minimum, and convexity, are documented. Their relationships
as well as numerical aspects are also discussed. Finally, we work out some
specific cases for the reference measure and recover, in a limit case, some
well-known entropies.
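For concreteness, a small sketch (our own notation, with the order written as `alpha`) of the R\'enyi entropy of a discrete distribution, illustrating numerically that it recovers the Shannon entropy in the limit where the order tends to 1.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha != 1:
    H_alpha(p) = log(sum p_i**alpha) / (1 - alpha)."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def shannon_entropy(p):
    """Shannon entropy, the alpha -> 1 limit of the Renyi entropy."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

p = np.array([0.2, 0.3, 0.5])
```

The Rényi entropy is nonincreasing in the order, which is one of the general monotonicity properties relevant to the functionals studied in the abstract.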
Cusp anomalous dimension and rotating open strings in AdS/CFT
In the context of AdS/CFT we provide analytical support for the proposed
duality between a Wilson loop with a cusp, the cusp anomalous dimension, and
the meson model constructed from a rotating open string with high angular
momentum. This duality was previously studied using numerical tools in [1]. Our
result implies that the minimum of the profile function of the minimal area
surface dual to the Wilson loop is related to the inverse of the bulk
penetration of the dual string that hangs from the quark--anti-quark pair
(meson) in the gauge theory.
Comment: enhanced text, fixed typos, reference added. Same results and
conclusions. arXiv admin note: text overlap with arXiv:1405.7388 by other
authors