On a generalization of the Jensen-Shannon divergence and the JS-symmetrization of distances relying on abstract means
The Jensen-Shannon divergence is a renowned bounded symmetrization of the
unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler
divergence to the average mixture distribution. However, the Jensen-Shannon
divergence between Gaussian distributions is not available in closed form. To
bypass this problem, we present a generalization of the Jensen-Shannon (JS)
divergence using abstract means which yields closed-form expressions when the
mean is chosen according to the parametric family of distributions. More
generally, we define the JS-symmetrizations of any distance using generalized
statistical mixtures derived from abstract means. In particular, we first show
that the geometric mean is well-suited for exponential families, and report two
closed-form formulas for (i) the geometric Jensen-Shannon divergence between
probability densities of the same exponential family, and (ii) the geometric
JS-symmetrization of the reverse Kullback-Leibler divergence. As a second
illustrating example, we show that the harmonic mean is well-suited for the
scale Cauchy distributions, and report a closed-form formula for the harmonic
Jensen-Shannon divergence between scale Cauchy distributions. We also define
generalized Jensen-Shannon divergences between matrices (e.g., quantum
Jensen-Shannon divergences) and consider clustering with respect to these novel
Jensen-Shannon divergences.
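The construction described above can be made concrete for univariate Gaussians: the normalized weighted geometric mean of two Gaussian densities is again Gaussian (the natural parameters interpolate linearly), so the two Kullback-Leibler terms admit closed forms. A minimal sketch, with function names of our own choosing rather than the paper's:

```python
import math

def kl_gauss(m1, s1, m2, s2):
    """Closed-form KL(N(m1, s1^2) || N(m2, s2^2)) between univariate Gaussians."""
    return math.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5

def geometric_mean_gauss(m1, s1, m2, s2, alpha=0.5):
    """Normalized weighted geometric mean of two Gaussian densities.

    The result is again Gaussian: the natural parameters
    (mu / sigma^2, -1 / (2 sigma^2)) interpolate linearly, i.e. precisions
    average and the mean is precision-weighted.
    """
    prec = (1 - alpha) / s1 ** 2 + alpha / s2 ** 2
    var = 1.0 / prec
    mu = var * ((1 - alpha) * m1 / s1 ** 2 + alpha * m2 / s2 ** 2)
    return mu, math.sqrt(var)

def geometric_jsd(m1, s1, m2, s2):
    """Geometric JS divergence: average KL to the normalized geometric mean."""
    mg, sg = geometric_mean_gauss(m1, s1, m2, s2)
    return 0.5 * kl_gauss(m1, s1, mg, sg) + 0.5 * kl_gauss(m2, s2, mg, sg)
```

With alpha = 1/2 the quantity is symmetric in its two arguments and vanishes when the two Gaussians coincide, unlike the ordinary Jensen-Shannon divergence it remains in closed form.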
A note on quantum chaology and gamma approximations to eigenvalue spacings for infinite random matrices
Quantum counterparts of certain simple classical systems can exhibit chaotic
behaviour through the statistics of their energy levels and the irregular
spectra of chaotic systems are modelled by eigenvalues of infinite random
matrices. We use known bounds on the distribution function for eigenvalue
spacings for the Gaussian orthogonal ensemble (GOE) of infinite random real
symmetric matrices and show that gamma distributions, which have an important
uniqueness property, can yield an approximation to the GOE distribution. This
has the advantage that both chaotic and non-chaotic cases fit in the
information geometric framework of the manifold of gamma distributions, which
has been the subject of recent work on neighbourhoods of randomness for general
stochastic systems. Additionally, gamma distributions give approximations to
eigenvalue spacings for the Gaussian unitary ensemble (GUE) of infinite random
hermitian matrices and for the Gaussian symplectic ensemble (GSE) of infinite
random hermitian matrices with real quaternionic elements, except near the
origin. Gamma distributions do not precisely model the various analytic systems
discussed here, but some features may be useful in studies of qualitative
generic properties in applications to data from real systems which manifestly
seem to exhibit behaviour reminiscent of near-random processes.
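One way to see how such a gamma approximation can arise is to moment-match against the Wigner surmise for GOE nearest-neighbour spacings, p(s) = (pi/2) s exp(-pi s^2 / 4), which has mean 1 and variance 4/pi - 1. This is an illustrative fit only; the note itself works from known bounds on the exact GOE distribution function:

```python
import math

# Wigner surmise for GOE spacings has mean 1 and variance 4/pi - 1.
# Match the first two moments of a gamma distribution (shape k, scale theta):
# k * theta = mean, k * theta^2 = variance.
WIGNER_MEAN = 1.0
WIGNER_VAR = 4.0 / math.pi - 1.0

SHAPE = WIGNER_MEAN ** 2 / WIGNER_VAR   # k, roughly 3.66
SCALE = WIGNER_VAR / WIGNER_MEAN        # theta, roughly 0.27

def gamma_pdf(s, k=SHAPE, theta=SCALE):
    """Gamma density with shape k and scale theta."""
    return s ** (k - 1) * math.exp(-s / theta) / (math.gamma(k) * theta ** k)
```

Near the origin the gamma density behaves like s^(k-1), which differs from the exact level-repulsion exponents, consistent with the caveat above that the approximation degrades near s = 0.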
Geodesics on the manifold of multivariate generalized Gaussian distributions with an application to multicomponent texture discrimination
We consider the Rao geodesic distance (GD) based on the Fisher information as a similarity measure on the manifold of zero-mean multivariate generalized Gaussian distributions (MGGD). The MGGD is shown to be an adequate model for the heavy-tailed wavelet statistics in multicomponent images, such as color or multispectral images. We discuss the estimation of MGGD parameters using various methods. We apply the GD between MGGDs to color texture discrimination in several classification experiments, taking into account the correlation structure between the spectral bands in the wavelet domain. We compare the performance, both in terms of texture discrimination capability and computational load, of the GD and the Kullback-Leibler divergence (KLD). Likewise, both uni- and multivariate generalized Gaussian models are evaluated, characterized by a fixed or a variable shape parameter. The modeling of the interband correlation significantly improves classification efficiency, while the GD is shown to consistently outperform the KLD as a similarity measure.
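For the Gaussian special case of the MGGD family (shape parameter beta = 1), the competing KLD similarity measure has a simple closed form between zero-mean densities. A minimal sketch of that baseline (function names ours; this is not the paper's geodesic distance, which generally requires more work):

```python
import numpy as np

def kld_zero_mean_gauss(S1, S2):
    """KL(N(0, S1) || N(0, S2)): the Gaussian (beta = 1) member of the MGGD family."""
    d = S1.shape[0]
    S2_inv = np.linalg.inv(S2)
    _, logdet1 = np.linalg.slogdet(S1)
    _, logdet2 = np.linalg.slogdet(S2)
    return 0.5 * (np.trace(S2_inv @ S1) - d + logdet2 - logdet1)

def jeffreys(S1, S2):
    """Symmetrized (Jeffreys) KLD, usable directly as a similarity measure."""
    return kld_zero_mean_gauss(S1, S2) + kld_zero_mean_gauss(S2, S1)
```

Unlike the KLD, which is asymmetric and is therefore often symmetrized as above, the Rao geodesic distance is a true metric on the parameter manifold.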
A simple probabilistic construction yielding generalized entropies and divergences, escort distributions and q-Gaussians
We give a simple probabilistic description of a transition between two states
which leads to a generalized escort distribution. When the parameter of the
distribution varies, it defines a parametric curve that we call an escort-path.
The Rényi divergence appears as a natural by-product of the setting. We study
the dynamics of the Fisher information on this path, and show in particular
that the thermodynamic divergence is proportional to Jeffreys' divergence.
Next, we consider the problem of inferring a distribution on the escort-path,
subject to generalized moments constraints. We show that our setting naturally
induces a rationale for the minimization of the Rényi information divergence.
Then, we derive the optimum distribution as a generalized q-Gaussian
distribution.
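The two central objects of this abstract, the escort distribution and the Rényi divergence, are straightforward to write down in the discrete case. A minimal sketch with hypothetical function names:

```python
import math

def escort(p, q):
    """Escort distribution of order q: p_i^q / sum_j p_j^q."""
    w = [pi ** q for pi in p]
    z = sum(w)
    return [wi / z for wi in w]

def renyi_divergence(p, r, alpha):
    """Renyi divergence D_alpha(p || r) = log(sum_i p_i^alpha r_i^(1-alpha)) / (alpha - 1)."""
    s = sum(pi ** alpha * ri ** (1 - alpha) for pi, ri in zip(p, r))
    return math.log(s) / (alpha - 1)
```

At q = 1 the escort reduces to the original distribution, and the Rényi divergence vanishes exactly when its two arguments coincide; as alpha tends to 1 it recovers the Kullback-Leibler divergence.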