Expectation Propagation for Poisson Data
The Poisson distribution arises naturally when dealing with data involving
counts, and it has found many applications in inverse problems and imaging. In
this work, we develop an approximate Bayesian inference technique based on
expectation propagation for approximating the posterior distribution formed
from the Poisson likelihood function and a Laplace type prior distribution,
e.g., the anisotropic total variation prior. The approach iteratively yields a
Gaussian approximation, and at each iteration, it updates the Gaussian
approximation to one factor of the posterior distribution by moment matching.
We derive explicit update formulas in terms of one-dimensional integrals, and
also discuss stable and efficient quadrature rules for evaluating these
integrals. The method is showcased on two-dimensional PET images.
Comment: 25 pages, to be published in Inverse Problems
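The moment-matching step described above reduces to one-dimensional integrals. Below is a minimal sketch of one such site update, assuming a log-link rate exp(x) and plain Gauss-Hermite quadrature; the paper's actual factor updates and stabilized quadrature rules may differ:

```python
# Sketch (not the paper's code): moment matching of one Poisson-likelihood
# site against a Gaussian cavity, with the 1-D integrals evaluated by
# probabilists' Gauss-Hermite quadrature.  The rate exp(x) is an
# illustrative assumption.
import numpy as np

def match_moments(k, cav_mean, cav_var, n_nodes=40):
    """Mean and variance of the tilted density
    p(x) ∝ N(x; cav_mean, cav_var) * Poisson(k | rate=exp(x))."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
    x = cav_mean + np.sqrt(cav_var) * nodes       # change of variables to the cavity
    log_lik = k * x - np.exp(x)                   # Poisson log-likelihood up to a constant
    w = weights * np.exp(log_lik - log_lik.max()) # rescale for numerical stability
    z = w.sum()                                   # normalizers cancel in the ratios
    mean = (w * x).sum() / z
    var = (w * (x - mean) ** 2).sum() / z
    return mean, var

m, v = match_moments(k=3, cav_mean=0.0, cav_var=1.0)
```

With a count of 3 and a standard-normal cavity, the matched mean moves toward log 3 and the matched variance shrinks below the cavity variance, as the likelihood adds curvature.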
Bounding errors of Expectation-Propagation
Expectation Propagation is a very popular algorithm for variational
inference, but comes with few theoretical guarantees. In this article, we prove
that the approximation errors made by EP can be bounded. Our bounds have an
asymptotic interpretation in the number of datapoints, which allows us to
study EP's convergence with respect to the true posterior. In particular, we
show that EP converges at a rate of O(n^{-2}) for the mean, up to
an order of magnitude faster than the traditional Gaussian approximation at the
mode. We also give similar asymptotic expansions for moments of order 2 to 4,
as well as excess Kullback-Leibler cost (defined as the additional KL cost
incurred by using EP rather than the ideal Gaussian approximation). All these
expansions highlight the superior convergence properties of EP. Our approach
for deriving those results is likely applicable to many similar approximate
inference methods. In addition, we introduce bounds on the moments of
log-concave distributions that may be of independent interest.
Comment: Accepted and published at NIPS 2015
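As a toy numeric illustration of why a Gaussian fit at the mode can lag the true mean (this example is not from the paper), consider a Gamma(a, 1) posterior, which is log-concave and increasingly Gaussian as a grows: its mean is a but its mode is a - 1, so the mode-based estimate of the mean carries a relative error of 1/a:

```python
# Toy check: the relative error of the mode (used by a Gaussian-at-the-mode
# fit) as an estimate of the mean of a Gamma(a, 1) posterior decays like 1/a.
rel_errs = []
for a in (10, 100, 1000):
    true_mean = a           # mean of Gamma(a, 1)
    mode_estimate = a - 1   # mode of Gamma(a, 1)
    rel_errs.append(abs(true_mean - mode_estimate) / true_mean)
# rel_errs decays like 1/a: [0.1, 0.01, 0.001]
```

A method whose mean error decays faster than this 1/a scaling, as the bounds above establish for EP, is asymptotically more accurate than the mode-based fit.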
Compressed sensing reconstruction using Expectation Propagation
Many interesting problems in fields ranging from telecommunications to
computational biology can be formalized in terms of large underdetermined
systems of linear equations with additional constraints or regularizers. One of
the most studied ones, the Compressed Sensing problem (CS), consists in finding
the solution with the smallest number of non-zero components of a given system
of linear equations y = Fx, for a known measurement vector y and a known
sensing matrix F. Here, we
will address the compressed sensing problem within a Bayesian inference
framework where the sparsity constraint is remapped into a singular prior
distribution (called Spike-and-Slab or Bernoulli-Gauss). Solution to the
problem is attempted through the computation of marginal distributions via
Expectation Propagation (EP), an iterative computational scheme originally
developed in Statistical Physics. We will show that this strategy is
comparatively more accurate than the alternatives in solving instances of CS
generated from statistically correlated measurement matrices. For computational
strategies based on the Bayesian framework such as variants of Belief
Propagation, this is to be expected, as they implicitly rely on the hypothesis
of statistical independence among the entries of the sensing matrix. Perhaps
surprisingly, the method also uniformly outperforms all the other
state-of-the-art methods in our tests.
Comment: 20 pages, 6 figures