Gaussian Approximation of Collective Graphical Models
The Collective Graphical Model (CGM) models a population of independent and
identically distributed individuals when only collective statistics (i.e.,
counts of individuals) are observed. Exact inference in CGMs is intractable,
and previous work has explored Markov Chain Monte Carlo (MCMC) and MAP
approximations for learning and inference. This paper studies Gaussian
approximations to the CGM. As the population grows large, we show that the CGM
distribution converges to a multivariate Gaussian distribution (GCGM) that
maintains the conditional independence properties of the original CGM. If the
observations are exact marginals of the CGM or marginals that are corrupted by
Gaussian noise, inference in the GCGM approximation can be computed efficiently
in closed form. If the observations follow a different noise model (e.g.,
Poisson), then expectation propagation provides efficient and accurate
approximate inference. The accuracy and speed of GCGM inference are compared to
the MCMC and MAP methods on a simulated bird migration problem. The GCGM
matches or exceeds the accuracy of the MAP method while being significantly
faster.
Comment: Accepted by ICML 2014. 10-page version with appendix.
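The core idea behind the Gaussian approximation can be illustrated with a toy example that is not from the paper itself: counts of n i.i.d. categorical individuals are multinomial, and as the population n grows the count vector is well approximated by a multivariate Gaussian with matching mean and covariance. All names and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not the paper's CGM): n i.i.d. individuals,
# each falling into one of three categories with probabilities p.
p = np.array([0.5, 0.3, 0.2])   # category probabilities (assumed)
n = 100_000                      # population size

# Simulated collective statistics: 20,000 draws of the count vector.
counts = rng.multinomial(n, p, size=20_000)

# Gaussian approximation of the multinomial counts:
# mean n*p, covariance n * (diag(p) - p p^T).
mean_gauss = n * p
cov_gauss = n * (np.diag(p) - np.outer(p, p))

# Compare against the empirical moments of the simulated counts.
emp_mean = counts.mean(axis=0)
emp_cov = np.cov(counts, rowvar=False)

print(np.max(np.abs(emp_mean - mean_gauss) / mean_gauss))
print(np.max(np.abs(emp_cov - cov_gauss)) / n)
```

Both discrepancies shrink as the population grows, which is the regime in which the paper's GCGM approximation becomes accurate.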
Tractability through Exchangeability: A New Perspective on Efficient Probabilistic Inference
Exchangeability is a central notion in statistics and probability theory. The
assumption that an infinite sequence of data points is exchangeable is at the
core of Bayesian statistics. However, finite exchangeability as a statistical
property that renders probabilistic inference tractable is less
well-understood. We develop a theory of finite exchangeability and its relation
to tractable probabilistic inference. The theory is complementary to that of
independence and conditional independence. We show that tractable inference in
probabilistic models with high treewidth and millions of variables can be
understood using the notion of finite (partial) exchangeability. We also show
that existing lifted inference algorithms implicitly utilize a combination of
conditional independence and partial exchangeability.
Comment: In Proceedings of the 28th AAAI Conference on Artificial Intelligence.