Self-Averaging Expectation Propagation
We investigate the problem of approximate Bayesian inference for a general
class of observation models by means of the expectation propagation (EP)
framework for large systems under some statistical assumptions. Our approach
tries to overcome the numerical bottleneck of EP caused by the inversion of
large matrices. Assuming that the measurement matrices are realizations of
specific types of ensembles we use the concept of freeness from random matrix
theory to show that the EP cavity variances exhibit an asymptotic
self-averaging property. They can be pre-computed using specific generating
functions, i.e. the R- and/or S-transforms in free probability, which do not
require matrix inversions. Our approach extends the framework of (generalized)
approximate message passing, which assumes zero-mean i.i.d. entries of the
measurement matrix, to a general class of random matrix ensembles. The
generalization requires only a simple formulation of the R- and/or S-transform
of the limiting eigenvalue distribution of the Gramian of the measurement
matrix. We
demonstrate the performance of our approach on a signal recovery problem of
nonlinear compressed sensing and compare it with that of EP.

Comment: 12 pages
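The self-averaging property described above can be illustrated numerically. For a Gaussian i.i.d. measurement ensemble, a normalized trace of a regularized inverse of the Gramian (the kind of quantity that drives the EP cavity variances) concentrates on a deterministic value that is given in closed form by the Stieltjes transform of the Marchenko-Pastur law, with no matrix inversion required. The sketch below is a minimal illustration of this concentration, not the paper's algorithm; the function name and parameter values are chosen for the example.

```python
import numpy as np

def mp_stieltjes(v, y):
    """Stieltjes transform m(-v) of the Marchenko-Pastur law with aspect
    ratio y = p/n, i.e. the large-p limit of (1/p) tr((S + v I)^{-1}),
    obtained from the quadratic self-consistency y*v*m^2 + (1-y+v)*m = 1."""
    a = 1.0 - y + v
    return (np.sqrt(a * a + 4.0 * y * v) - a) / (2.0 * y * v)

p, n = 2000, 4000          # matrix dimensions; aspect ratio y = p/n
y, v = p / n, 0.3          # v plays the role of a regularizer / cavity precision

# Two independent draws of the ensemble: the trace is "self-averaging",
# so both realizations land on (essentially) the same deterministic value.
vals = []
for seed in (1, 2):
    X = np.random.default_rng(seed).standard_normal((p, n))
    S = X @ X.T / n                       # sample Gramian, p x p
    eig = np.linalg.eigvalsh(S)
    vals.append(np.mean(1.0 / (eig + v)))  # (1/p) tr((S + v I)^{-1})

print(vals[0], vals[1], mp_stieltjes(v, y))  # all three nearly identical
```

Both empirical traces agree with the closed-form limit to within O(1/p), which is the sense in which such quantities can be pre-computed from the limiting spectrum rather than by inverting large matrices.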
Tree-AMP: Compositional Inference with Tree Approximate Message Passing
We introduce Tree-AMP, standing for Tree Approximate Message Passing, a
Python package for compositional inference in high-dimensional tree-structured
models. The package provides a unifying framework to study several approximate
message passing algorithms previously derived for a variety of machine learning
tasks such as generalized linear models, inference in multi-layer networks,
matrix factorization, and reconstruction using non-separable penalties. For
some models, the asymptotic performance of the algorithm can be theoretically
predicted by the state evolution, and the entropy of the measurements can be
estimated by the free-entropy formalism. The implementation is modular by
design: each module,
which implements a factor, can be composed at will with other modules to solve
complex inference tasks. The user only needs to declare the factor graph of the
model: the inference algorithm, state evolution and entropy estimation are
fully automated.

Comment: Source code available at https://github.com/sphinxteam/tramp and
documentation at https://sphinxteam.github.io/tramp.doc
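The compositional principle behind Tree-AMP — declare a factor graph, and inference follows by passing messages between modules — can be illustrated with plain sum-product message passing on a tiny tree. The toy below is not the tramp API (Tree-AMP passes approximate Gaussian messages between high-dimensional modules); it only shows why tree structure makes module-local message passing exact, here for discrete variables.

```python
import numpy as np
from itertools import product

# Chain-structured factor graph: x1 -- f12 -- x2 -- f23 -- x3 (binary variables).
# Each factor is a self-contained "module": a table over its two neighbors.
f12 = np.array([[1.0, 0.5],
                [0.5, 2.0]])
f23 = np.array([[2.0, 1.0],
                [0.2, 1.0]])

# Sum-product messages into x2 from each neighboring factor: on a tree,
# each subtree is summarized independently by one message.
msg_from_f12 = f12.sum(axis=0)   # sum over x1
msg_from_f23 = f23.sum(axis=1)   # sum over x3

belief = msg_from_f12 * msg_from_f23
belief /= belief.sum()           # marginal p(x2) from local messages only

# Brute-force check: marginalize the full joint over all 2^3 configurations.
joint = np.zeros(2)
for x1, x2, x3 in product(range(2), repeat=3):
    joint[x2] += f12[x1, x2] * f23[x2, x3]
joint /= joint.sum()

print(np.allclose(belief, joint))  # True: on a tree, BP marginals are exact
```

Because the graph is a tree, composing independently written factor modules and exchanging messages recovers the exact marginals; Tree-AMP applies the same compositional idea with approximate messages in the high-dimensional regime.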