5 research outputs found
An Embarrassingly Simple Speed-Up of Belief Propagation with Robust Potentials
We present an exact method of greatly speeding up belief propagation (BP) for
a wide variety of potential functions in pairwise MRFs and other graphical
models. Specifically, our technique applies whenever the pairwise potentials
have been {\em truncated} to a constant value for most pairs of states, as is
commonly done in MRF models with robust potentials (such as stereo) that impose
an upper bound on the penalty assigned to discontinuities; for each of the n
possible states in one node, only a smaller number k of compatible states in
a neighboring node are assigned milder penalties. The computational complexity
of our method is O(nk), compared with O(n^2) for standard BP, and we
emphasize that the method is {\em exact}, in contrast with related techniques
such as pruning; moreover, the method is very simple and easy to implement.
Unlike some previous work on speeding up BP, our method applies both to
sum-product and max-product BP, which makes it useful in any applications where
marginal probabilities are required, such as maximum likelihood estimation. We
demonstrate the technique on a stereo MRF example, confirming that the
technique speeds up BP without altering the solution. Comment: 10 pages, 3 figures
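The constant-truncation trick described above fits in a few lines of code. The sketch below is our own illustration (function and variable names are assumptions, not the paper's code): for a sum-product message where the pairwise potential equals a constant for all but a few compatible state pairs, the full sum over states is taken once, and only the non-truncated entries are corrected, giving O(nk) work per message instead of O(n^2):

```python
import numpy as np

def truncated_message(h, psi, trunc_val, compat):
    """Sum-product message m[j] = sum_i h[i] * psi[i, j], exploiting the
    fact that psi[i, j] == trunc_val for every i not listed in compat[j]."""
    s = h.sum()                          # one O(n) pass over all states
    m = np.full(len(h), trunc_val * s)   # as if every pair were truncated
    for j, idx in enumerate(compat):     # correct only the compatible pairs
        m[j] += h[idx] @ (psi[idx, j] - trunc_val)
    return m
```

For max-product (min-sum) the same idea applies with a minimum in place of the sum: the truncated baseline is trunc_val plus the global minimum of the incoming costs, corrected by the few compatible entries.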
Low-Complexity Stochastic Generalized Belief Propagation
The generalized belief propagation (GBP) algorithm, introduced by Yedidia et
al., is an extension of the belief propagation (BP) algorithm and is widely
used in problems that require computing exact or approximate marginals of
probability distributions. In many problems, the accuracy of GBP has been
observed to be considerably better than that of BP. However, because the
computational complexity of GBP is in general higher than that of BP, its
application is limited in practice.
In this paper, we introduce a stochastic version of GBP called stochastic
generalized belief propagation (SGBP) that can be considered as an extension to
the stochastic BP (SBP) algorithm introduced by Noorshams et al. They have
shown that SBP reduces the complexity per iteration of BP by an order of
magnitude in alphabet size. In contrast to SBP, SGBP can reduce the
computational complexity if certain topological conditions are met by the
region graph associated with a graphical model; moreover, this reduction can
exceed one order of magnitude in alphabet size. In this paper, we characterize
these conditions and the amount of computational gain obtainable with SGBP.
Finally, using proof techniques similar to those employed by Noorshams et al.,
we prove, for general graphical models satisfying contraction conditions, the
asymptotic convergence of SGBP to the unique GBP fixed point, as well as
providing non-asymptotic upper bounds on the mean square error and on the high
probability error. Comment: 18 pages, 11 figures; a shorter version of this paper was accepted at
ISIT'1
Belief Propagation for Continuous State Spaces: Stochastic Message-Passing with Quantitative Guarantees
The sum-product or belief propagation (BP) algorithm is a widely used
message-passing technique for computing approximate marginals in graphical
models. We introduce a new technique, called stochastic orthogonal series
message-passing (SOSMP), for computing the BP fixed point in models with
continuous random variables. It is based on a deterministic approximation of
the messages via orthogonal series expansion, and a stochastic approximation
via Monte Carlo estimates of the integral updates of the basis coefficients. We
prove that the SOSMP iterates converge to a \delta-neighborhood of the unique
BP fixed point for any tree-structured graph, and for any graph with cycles in
which the BP updates satisfy a contractivity condition. In addition, we
demonstrate how to choose the number of basis coefficients as a function of the
desired approximation accuracy \delta and smoothness of the compatibility
functions. We illustrate our theory with both simulated examples and in
application to optical flow estimation. Comment: Portions of the results were presented at the International Symposium
on Information Theory 2012. The results were also submitted to the Journal of
Machine Learning Research on December 16th 201
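In outline, SOSMP combines two approximations, which the toy sketch below illustrates for a single message on [0, 1] (all names, and the placeholder edge_update, are our own assumptions, not the paper's code): the message is represented by its first r coefficients in an orthonormal cosine basis (the deterministic approximation), and the projection integrals of the BP update are replaced by Monte Carlo averages over sampled points (the stochastic approximation):

```python
import numpy as np

def phi(k, x):
    """Orthonormal cosine basis on [0, 1]."""
    return np.ones_like(x) if k == 0 else np.sqrt(2.0) * np.cos(np.pi * k * x)

def sosmp_step(a, edge_update, eta, n_samples, rng):
    """One SOSMP iteration: Monte Carlo estimates of the basis-coefficient
    projections of the BP update, blended in with step size eta."""
    x = rng.uniform(0.0, 1.0, n_samples)
    f = edge_update(x, a)                  # BP integrand at the sample points
    b_hat = np.array([np.mean(f * phi(k, x)) for k in range(len(a))])
    return (1.0 - eta) * a + eta * b_hat   # stochastic approximation step
```

With a Robbins-Monro step size such as eta_t = 1/(t+1), the iterates average out the sampling noise; per the abstract, the residual error is controlled by the number of basis coefficients and the smoothness of the compatibility functions.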
Stochastic Belief Propagation: A Low-Complexity Alternative to the Sum-Product Algorithm
The sum-product or belief propagation (BP) algorithm is a widely-used
message-passing algorithm for computing marginal distributions in graphical
models with discrete variables. At the core of the BP message updates, when
applied to a graphical model with pairwise interactions, lies a matrix-vector
product with complexity that is quadratic in the state dimension d, and each
node must transmit a d-dimensional vector of real numbers (messages) to each of
its neighbors. Since various applications involve very large state dimensions,
such computation and communication complexities can be prohibitive. In this
paper, we propose a low-complexity variant of
BP, referred to as stochastic belief propagation (SBP). As suggested by the
name, it is an adaptively randomized version of the BP message updates in which
each node passes randomly chosen information to each of its neighbors. The SBP
message updates reduce the computational complexity (per iteration) from
quadratic to linear in the state dimension d, without assuming any particular
structure of the potentials, and also reduce the communication complexity
significantly, requiring only O(log d) bits of transmission per edge. Moreover, we establish a
number of theoretical guarantees for the performance of SBP, showing that it
converges almost surely to the BP fixed point for any tree-structured graph,
and for graphs with cycles satisfying a contractivity condition. In addition,
for these graphical models, we provide non-asymptotic upper bounds on the
convergence rate, showing that the norm of the error vector
decays no slower than O(1/sqrt(t)) with the number of iterations t on
trees and the mean square error decays as O(1/t) for general graphs. These
analyses show that SBP can provably yield reductions in computational and
communication complexity for various classes of graphical models. Comment: Portions of the results were initially reported at the Allerton
Conference on Communication, Control, and Computing (September 2011). The
work was also submitted to the IEEE Transactions on Information Theory in November
201
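The core SBP move can be sketched in a few lines. In the toy single-edge version below (our own illustrative names, not the authors' code), the O(d^2) matrix-vector product Gamma @ m is replaced by sampling one index from the current message and reading off a single column of Gamma, i.e. O(d) work per iteration:

```python
import numpy as np

def sbp_step(m, Gamma, eta, rng):
    """One stochastic-BP update: pick index J with probability m[J], then
    blend in column Gamma[:, J] -- an unbiased estimate of Gamma @ m."""
    J = rng.choice(len(m), p=m)
    r = (1.0 - eta) * m + eta * Gamma[:, J]
    return r / r.sum()  # guard against floating-point drift off the simplex
```

If the columns of Gamma are normalized to sum to one, the iterates stay on the probability simplex, and with a decaying step size they approach the deterministic fixed point m* = Gamma m*.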
Dynamic Quantization for Belief Propagation in Sparse Spaces
Graphical models provide an attractive framework for modeling a variety of problems in computer vision. The advent of powerful inference techniques such as belief propagation (BP) has recently made inference with many of these models tractable. Even so, the enormous size of the state spaces required for some applications can create a heavy computational burden. Pruning is a standard technique for reducing this burden, but since pruning is irreversible it carries the risk of greedily deleting important states, which can subsequently result in gross errors in BP. To address this problem, we propose a novel extension of pruning, which we call dynamic quantization (DQ), that allows BP to adaptively add as well as subtract states as needed. We examine DQ in the context of graphical-model-based deformable template matching, in which the state space size is on the order of the number of pixels in an image. The combination of BP and DQ yields deformable templates that are both fast and robust to significant occlusions, without requiring any user initialization. Experimental results are shown on deformable templates of planar shapes. Finally, we argue that DQ is applicable to a variety of graphical models in which the state spaces are sparsely populated.
Key words: belief propagation, graphical models, pruning, deformable templates
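As a toy illustration of the add-as-well-as-subtract idea (the function name and the 1-D integer-state neighborhood are our own assumptions, not the paper's formulation), a DQ-style step keeps only the highest-belief states but then revives the pruned neighbors of the survivors, so a state deleted too greedily can re-enter on a later BP iteration:

```python
def dq_step(belief, active, keep_frac=0.2):
    """Prune the active set down to its top-belief states, then re-add the
    immediate neighbors of the survivors (here: integer states on a line)."""
    ranked = sorted(active, key=lambda s: -belief[s])
    k = max(1, int(keep_frac * len(ranked)))
    survivors = set(ranked[:k])
    revived = {s + d for s in survivors for d in (-1, 1)}  # un-prune neighbors
    return survivors | revived
```

In a real deformable-template model the "neighbors" would be nearby pixel positions in the image rather than adjacent integers, but the reversibility of pruning is the same.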