Guaranteed bounds on the Kullback-Leibler divergence of univariate mixtures using piecewise log-sum-exp inequalities
Information-theoretic measures such as the entropy, the cross-entropy and the
Kullback-Leibler divergence between two mixture models are core primitives in
many signal processing tasks. Since the Kullback-Leibler divergence of mixtures
provably does not admit a closed-form formula, it is in practice either
estimated using costly Monte-Carlo stochastic integration, approximated, or
bounded using various techniques. We present a fast and generic method that
builds algorithmically closed-form lower and upper bounds on the entropy, the
cross-entropy and the Kullback-Leibler divergence of mixtures. We illustrate
the versatile method by reporting on our experiments for approximating the
Kullback-Leibler divergence between univariate exponential mixtures, Gaussian
mixtures, Rayleigh mixtures, and Gamma mixtures.
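As a point of reference for the costly Monte-Carlo baseline mentioned above, here is a minimal sketch (our own illustration, not the paper's bounding method) that estimates KL(p || q) between two univariate Gaussian mixtures by sampling from p; the function names and toy parameters are ours.

```python
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

rng = np.random.default_rng(0)

def mixture_logpdf(x, w, mu, s):
    """Log-density of a univariate Gaussian mixture, computed via log-sum-exp."""
    return logsumexp(np.log(w) + norm.logpdf(np.asarray(x)[:, None], mu, s), axis=1)

def mc_kl(wp, mup, sp, wq, muq, sq, n=100_000):
    """Monte-Carlo estimate of KL(p || q) by sampling from mixture p."""
    comp = rng.choice(len(wp), size=n, p=wp)               # pick mixture components
    x = rng.normal(np.take(mup, comp), np.take(sp, comp))  # sample from p
    return np.mean(mixture_logpdf(x, wp, mup, sp)
                   - mixture_logpdf(x, wq, muq, sq))

# Two toy two-component Gaussian mixtures
print(mc_kl([0.5, 0.5], [-1.0, 1.0], [0.5, 0.5],
            [0.3, 0.7], [-0.5, 1.5], [0.7, 0.4]))
```

The stochastic-integration cost is visible here: the estimate only tightens at the rate 1/sqrt(n), which is what motivates closed-form bounds of the kind the paper constructs.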
A variational approach to moment-closure approximations for the kinetics of biomolecular reaction networks
Approximate solutions of the chemical master equation and the chemical
Fokker-Planck equation are an important tool in the analysis of biomolecular
reaction networks. Previous studies have highlighted a number of problems with
the moment-closure approach used to obtain such approximations, calling it an
ad-hoc method. In this article, we give a new variational derivation of
moment-closure equations which provides us with an intuitive understanding of
their properties and failure modes and allows us to correct some of these
problems. We use mixtures of product-Poisson distributions to obtain a flexible
parametric family which solves the commonly observed problem of divergences at
low system sizes. We also extend the recently introduced entropic matching
approach to arbitrary ansatz distributions and Markov processes, demonstrating
that it is a special case of variational moment closure. This provides us with
a particularly principled approximation method. Finally, we extend the above
approaches to cover the approximation of multi-time joint distributions,
resulting in a viable alternative to process-level approximations which are
often intractable.
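To make the closure step concrete, here is a toy sketch of our own (a single Poisson ansatz for one species, far simpler than the paper's mixtures of product-Poisson distributions): for production 0 -> A at rate b and dimerisation A + A -> 0 at rate k, the first-moment equation involves the second factorial moment, and the Poisson ansatz closes it.

```python
import numpy as np
from scipy.integrate import solve_ivp

b, k = 10.0, 0.1   # toy rate constants: production 0 -> A, dimerisation A + A -> 0

# CME first-moment equation for this network:
#   d<n>/dt = b - k <n(n-1)>      (not closed: it needs the 2nd factorial moment)
# Poisson ansatz n ~ Poisson(lam) gives <n(n-1)> = <n>^2, closing the hierarchy.
def closed_rhs(t, m):
    return b - k * m**2

sol = solve_ivp(closed_rhs, (0.0, 20.0), [0.0])
print("closed mean at t = 20:", sol.y[0, -1])
print("predicted steady state:", np.sqrt(b / k))
```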
Moment Closure - A Brief Review
Moment closure methods appear in myriad scientific disciplines in the
modelling of complex systems. The goal is to obtain a closed form for a large,
usually even infinite, set of coupled differential (or difference) equations.
Each equation describes the evolution of one "moment", a suitable
coarse-grained quantity computable from the full state space. If the system is
too large for analytical and/or numerical methods, then one aims to reduce it
by finding a moment closure relation expressing "higher-order moments" in terms
of "lower-order moments". In this brief review, we focus on highlighting how
moment closure methods occur in different contexts. We also conjecture, via a
geometric explanation, why it has been difficult to rigorously justify many
moment closure approximations, although they work very well in practice.
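As a concrete instance of such a closure relation, the widely used normal (Gaussian) closure sets the third cumulant to zero, which expresses the third moment through the first two. A quick numerical check of this identity (our own illustration, not taken from the review):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(2.0, 1.5, size=1_000_000)   # samples from an arbitrary Gaussian

m1, m2, m3 = x.mean(), (x**2).mean(), (x**3).mean()
# Normal closure: zero third cumulant  =>  <x^3> = 3 <x^2><x> - 2 <x>^3
print(m3, 3 * m2 * m1 - 2 * m1**3)   # the two numbers agree up to sampling error
```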
Estimating Mixture Entropy with Pairwise Distances
Mixture distributions arise in many parametric and non-parametric settings --
for example, in Gaussian mixture models and in non-parametric estimation. It is
often necessary to compute the entropy of a mixture, but, in most cases, this
quantity has no closed-form expression, making some form of approximation
necessary. We propose a family of estimators based on a pairwise distance
function between mixture components, and show that this estimator class has
many attractive properties. For many distributions of interest, the proposed
estimators are efficient to compute, differentiable in the mixture parameters,
and become exact when the mixture components are clustered. We prove that this
family includes lower and upper bounds on the mixture entropy. The Chernoff
α-divergence gives a lower bound when chosen as the distance function,
with the Bhattacharyya distance providing the tightest lower bound for
components that are symmetric and members of a location family. The
Kullback-Leibler divergence gives an upper bound when used as the distance
function. We provide closed-form expressions of these bounds for mixtures of
Gaussians, and discuss their applications to the estimation of mutual
information. Using numeric simulations, we then demonstrate that our bounds
are significantly tighter than well-known existing bounds. This estimator class is
very useful in optimization problems involving maximization/minimization of
entropy and mutual information, such as MaxEnt and rate distortion problems.
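For univariate Gaussian mixtures these bounds are straightforward to code up. The sketch below is our reading of the estimator family, assuming the pairwise form H_D = sum_i w_i H(p_i) - sum_i w_i ln sum_j w_j exp(-D(p_i, p_j)) (consult the paper for its exact conventions); it evaluates the KL-based upper bound and a Bhattacharyya-based lower bound using the closed-form divergences between Gaussians.

```python
import numpy as np

def gauss_entropy(s):                      # differential entropy of N(mu, s^2)
    return 0.5 * np.log(2 * np.pi * np.e * s**2)

def kl(mu_i, s_i, mu_j, s_j):              # KL(N_i || N_j), closed form
    return (np.log(s_j / s_i)
            + (s_i**2 + (mu_i - mu_j)**2) / (2 * s_j**2) - 0.5)

def bhatt(mu_i, s_i, mu_j, s_j):           # Bhattacharyya distance
    return ((mu_i - mu_j)**2 / (4 * (s_i**2 + s_j**2))
            + 0.5 * np.log((s_i**2 + s_j**2) / (2 * s_i * s_j)))

def pairwise_entropy_bound(w, mu, s, dist):
    """H_D = sum_i w_i H(p_i) - sum_i w_i ln sum_j w_j exp(-D(p_i, p_j))."""
    w, mu, s = map(np.asarray, (w, mu, s))
    D = dist(mu[:, None], s[:, None], mu[None, :], s[None, :])
    return (np.sum(w * gauss_entropy(s))
            - np.sum(w * np.log((w * np.exp(-D)).sum(axis=1))))

w, mu, s = [0.5, 0.5], [-1.0, 1.0], [0.5, 0.5]
print("upper bound (KL):           ", pairwise_entropy_bound(w, mu, s, kl))
print("lower bound (Bhattacharyya):", pairwise_entropy_bound(w, mu, s, bhatt))
```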
Probabilistic Framework for Sensor Management
A probabilistic sensor management framework is introduced, which maximizes the utility of sensor systems with many different sensing modalities by dynamically configuring the sensor system in the most beneficial way. For this purpose, techniques from stochastic control and Bayesian estimation are combined so that both the long-term effects of possible sensor configurations and the stochastic uncertainties resulting from noisy measurements can be incorporated into the sensor management decisions.
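As a loose illustration of the flavour of such decisions (our own toy setup, myopic rather than long-term, and not the paper's framework): a scalar state is tracked by a Kalman filter, and at each step one of two sensing modalities is chosen greedily by expected information gain per unit cost.

```python
import numpy as np

A, Q = 1.0, 0.1                            # state transition and process noise
sensors = {"cheap": (1.0, 1.0), "precise": (0.1, 5.0)}   # (noise var R, cost)

def info_per_cost(P, R, cost):
    post = P * R / (P + R)                 # posterior variance after an update
    return 0.5 * np.log(P / post) / cost   # entropy reduction (nats) per cost

P = 1.0                                    # current state covariance
for step in range(5):
    P = A * P * A + Q                      # predict
    name = max(sensors, key=lambda k: info_per_cost(P, *sensors[k]))
    R, _ = sensors[name]
    P = P * R / (P + R)                    # measurement update (chosen sensor)
    print(f"step {step}: chose {name}, posterior variance {P:.3f}")
```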
A variational approach to path estimation and parameter inference of hidden diffusion processes
We consider a hidden Markov model, where the signal process, given by a
diffusion, is only indirectly observed through some noisy measurements. The
article develops a variational method for approximating the hidden states of
the signal process given the full set of observations. This, in particular,
leads to systematic approximations of the smoothing densities of the signal
process. The paper then demonstrates how an efficient inference scheme, based
on this variational approach to the approximation of the hidden states, can be
designed to estimate the unknown parameters of stochastic differential
equations. Two examples at the end illustrate the efficacy and the accuracy of
the presented method.
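As a sanity baseline for what such smoothing densities look like (our own toy example, not the paper's method): for a linear SDE such as an Ornstein-Uhlenbeck process, an Euler-Maruyama discretisation yields a linear-Gaussian state-space model whose smoothing distributions are exact and computable with a Kalman filter plus RTS smoother, the case any variational Gaussian approximation should recover.

```python
import numpy as np

rng = np.random.default_rng(2)

theta, sigma, dt, T = 1.0, 0.5, 0.01, 500   # OU parameters, step size, #steps
a, q = 1 - theta * dt, sigma**2 * dt        # x_{t+1} = a x_t + N(0, q)
r = 0.1                                     # observation noise variance

x = np.zeros(T)
for t in range(1, T):                       # simulate the latent OU signal
    x[t] = a * x[t-1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), T)        # noisy observations

# Kalman filter (forward pass)
mf, Pf = np.zeros(T), np.zeros(T)
m, P = 0.0, 1.0
for t in range(T):
    if t > 0:
        m, P = a * m, a * a * P + q         # predict
    K = P / (P + r)                         # Kalman gain
    m, P = m + K * (y[t] - m), (1 - K) * P  # update
    mf[t], Pf[t] = m, P

# RTS smoother (backward pass): exact smoothing means for this model
ms = mf.copy()
for t in range(T - 2, -1, -1):
    Pp = a * a * Pf[t] + q                  # predicted variance at t+1
    G = a * Pf[t] / Pp                      # smoother gain
    ms[t] = mf[t] + G * (ms[t + 1] - a * mf[t])

print("RMSE filter:  ", np.sqrt(np.mean((mf - x)**2)))
print("RMSE smoother:", np.sqrt(np.mean((ms - x)**2)))
```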