Comparing Two Integral Means for Absolutely Continuous Mappings Whose Derivatives are in L∞[a,b] and Applications
Estimates of the difference of two integral means on [a,b] and [c,d], with [c,d] ⊂ [a,b], are given in terms of the sup norm of the derivative, together with applications to probability density functions, special means, Jeffreys' divergence and continuous streams.
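As a hedged sketch of the setting, the two integral means being compared and a bound of the advertised type can be written as follows; the exact constant is the paper's result, and the factor shown is only a natural candidate (it is attained by affine f, e.g. f(t) = t with c = a). For f absolutely continuous on [a,b] with f' \in L_\infty[a,b] and [c,d] \subset [a,b], set
\[
  \mathcal{M}(a,b) = \frac{1}{b-a}\int_a^b f(t)\,dt, \qquad
  \mathcal{M}(c,d) = \frac{1}{d-c}\int_c^d f(t)\,dt ,
\]
and the estimate takes the shape
\[
  \bigl|\mathcal{M}(a,b) - \mathcal{M}(c,d)\bigr|
  \;\le\; \frac{(c-a)+(b-d)}{2}\,\|f'\|_{\infty} .
\]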
A simple probabilistic construction yielding generalized entropies and divergences, escort distributions and q-Gaussians
We give a simple probabilistic description of a transition between two states
which leads to a generalized escort distribution. When the parameter of the
distribution varies, it defines a parametric curve that we call an escort-path.
The Rényi divergence appears as a natural by-product of the setting. We study
the dynamics of the Fisher information on this path, and show in particular
that the thermodynamic divergence is proportional to Jeffreys' divergence.
Next, we consider the problem of inferring a distribution on the escort-path,
subject to generalized moment constraints. We show that our setting naturally
induces a rationale for the minimization of the Rényi information divergence.
Then, we derive the optimum distribution as a generalized q-Gaussian
distribution.
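For reference, the standard objects behind this abstract are the escort-path joining two densities p and q and the Rényi divergence; these are textbook definitions, not a claim about the paper's exact notation. The identity for the log-normalizer shows how the Rényi divergence arises as a by-product:
\[
  p_\lambda(x) \;=\; \frac{p(x)^{1-\lambda}\, q(x)^{\lambda}}{Z_\lambda},
  \qquad
  Z_\lambda = \int p(u)^{1-\lambda}\, q(u)^{\lambda}\, du, \qquad \lambda \in [0,1],
\]
\[
  D_\alpha(p\|q) \;=\; \frac{1}{\alpha-1}\log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx ,
  \qquad
  \log Z_\lambda = (\alpha-1)\, D_\alpha(p\|q) \ \text{with}\ \alpha = 1-\lambda ,
\]
with D_\alpha(p\|q) \to \mathrm{KL}(p\|q) as \alpha \to 1.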
On a generalization of the Jensen-Shannon divergence and the JS-symmetrization of distances relying on abstract means
The Jensen-Shannon divergence is a renowned bounded symmetrization of the
unbounded Kullback-Leibler divergence; it measures the total Kullback-Leibler
divergence to the average mixture distribution. However, the Jensen-Shannon
divergence between Gaussian distributions is not available in closed-form. To
bypass this problem, we present a generalization of the Jensen-Shannon (JS)
divergence using abstract means which yields closed-form expressions when the
mean is chosen according to the parametric family of distributions. More
generally, we define the JS-symmetrizations of any distance using generalized
statistical mixtures derived from abstract means. In particular, we first show
that the geometric mean is well-suited for exponential families, and report two
closed-form formulas for (i) the geometric Jensen-Shannon divergence between
probability densities of the same exponential family, and (ii) the geometric
JS-symmetrization of the reverse Kullback-Leibler divergence. As a second
illustrating example, we show that the harmonic mean is well-suited for the
scale Cauchy distributions, and report a closed-form formula for the harmonic
Jensen-Shannon divergence between scale Cauchy distributions. We also define
generalized Jensen-Shannon divergences between matrices (e.g., quantum
Jensen-Shannon divergences) and consider clustering with respect to these novel
Jensen-Shannon divergences.
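As a minimal numerical sketch of the closed-form claim for exponential families, the snippet below computes a geometric Jensen-Shannon divergence between two univariate Gaussians, assuming the G-JSD is defined as the average KL divergence to the normalized geometric-mean density (for Gaussians this geometric mean is again a Gaussian, with precision equal to the average of the two precisions). Function names are illustrative, not from the paper.

import math

def kl_gauss(mu1, var1, mu2, var2):
    """Closed-form KL(N(mu1, var1) || N(mu2, var2))."""
    return 0.5 * (math.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def geometric_mean_gauss(mu1, var1, mu2, var2, alpha=0.5):
    """Normalized geometric mean p^(1-alpha) q^alpha of two Gaussians.

    In natural parameters the geometric mixture is a convex combination,
    so the precision (1/var) and the precision-weighted mean interpolate linearly.
    """
    prec = (1 - alpha) / var1 + alpha / var2
    var = 1.0 / prec
    mu = var * ((1 - alpha) * mu1 / var1 + alpha * mu2 / var2)
    return mu, var

def geometric_jsd(mu1, var1, mu2, var2):
    """G-JSD as the average KL to the geometric-mean Gaussian (assumed definition)."""
    mu_g, var_g = geometric_mean_gauss(mu1, var1, mu2, var2)
    return 0.5 * (kl_gauss(mu1, var1, mu_g, var_g) + kl_gauss(mu2, var2, mu_g, var_g))

if __name__ == "__main__":
    print(geometric_jsd(0.0, 1.0, 1.0, 2.0))  # finite, symmetric, closed form

Unlike the plain JSD with an arithmetic mixture, every quantity here stays inside the Gaussian family, which is why a closed form is available.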
Tight Bounds for Symmetric Divergence Measures and a New Inequality Relating f-Divergences
Tight bounds for several symmetric divergence measures are introduced, given
in terms of the total variation distance. Each of these bounds is attained by a
pair of 2 or 3-element probability distributions. An application of these
bounds for lossless source coding is provided, refining and improving a certain
bound by Csiszár. A new inequality relating f-divergences is derived, and
its use is exemplified. The last section of this conference paper is not
included in the journal version published in the February 2015 issue of the
IEEE Transactions on Information Theory (see arXiv:1403.7164); the same holds
for some new paragraphs throughout the paper, which are linked to new
references.
Comment: Final version of the conference paper presented at the 2015 IEEE
Information Theory Workshop, Jerusalem, Israel.
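For reference, the f-divergence and total variation distance in which these bounds are expressed are standardly defined, for distributions P and Q with densities p and q and a convex f with f(1) = 0, as
\[
  D_f(P\|Q) \;=\; \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx ,
\]
and the total variation distance is itself an f-divergence, obtained with f(t) = \tfrac12 |t-1|:
\[
  d_{\mathrm{TV}}(P,Q) \;=\; \tfrac12 \int \bigl|p(x)-q(x)\bigr|\, dx .
\]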