Generalized sequential tree-reweighted message passing
This paper addresses the problem of approximate MAP-MRF inference in general
graphical models. Following [36], we consider a family of linear programming
relaxations of the problem where each relaxation is specified by a set of
nested pairs of factors for which the marginalization constraint needs to be
enforced. We develop a generalization of the TRW-S algorithm [9] for this
problem, where we use a decomposition into junction chains, monotonic w.r.t.
some ordering on the nodes. This generalizes the monotonic chains in [9] in a
natural way. We also show how to deal with nested factors in an efficient way.
Experiments show an improvement over min-sum diffusion, MPLP and subgradient
ascent algorithms on a number of computer vision and natural language
processing problems.
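The chain decompositions described above build on exact MAP inference along a single chain. As a minimal sketch of that building block (not the generalized TRW-S algorithm itself), the following min-sum dynamic program computes the exact MAP labelling of a chain MRF with arbitrary node and edge costs:

```python
import numpy as np

def chain_map(unary, pairwise):
    """Exact min-sum (Viterbi-style) MAP on a single chain MRF.
    unary: (n, k) node costs; pairwise: (n-1, k, k) edge costs.
    Returns the optimal labelling and its energy."""
    n, k = unary.shape
    msg = np.zeros((n, k))
    back = np.zeros((n, k), dtype=int)
    msg[0] = unary[0]
    for i in range(1, n):
        # total[a, b] = best cost ending in state a at i-1 plus edge (a, b)
        total = msg[i - 1][:, None] + pairwise[i - 1]
        back[i] = total.argmin(axis=0)
        msg[i] = total.min(axis=0) + unary[i]
    # backtrack the optimal labelling
    labels = np.empty(n, dtype=int)
    labels[-1] = msg[-1].argmin()
    for i in range(n - 1, 0, -1):
        labels[i - 1] = back[i, labels[i]]
    return labels, msg[-1].min()
```

A decomposition-based scheme such as TRW-S repeatedly performs passes of this kind over its (monotonic) chains while exchanging reparametrizations between them; the sketch above only shows the per-chain computation.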
Fermions and Loops on Graphs. I. Loop Calculus for Determinant
This paper is the first in the series devoted to evaluation of the partition
function in statistical models on graphs with loops in terms of the
Berezin/fermion integrals. The paper focuses on a representation of the
determinant of a square matrix in terms of a finite series, where each term
corresponds to a loop on the graph. The representation is based on a fermion
version of the Loop Calculus, previously introduced by the authors for
graphical models with finite alphabets. Our construction contains two levels.
First, we represent the determinant in terms of an integral over anti-commuting
Grassmann variables, with some reparametrization/gauge freedom hidden in the
formulation. Second, we show that a special choice of the gauge, called BP
(Bethe-Peierls or Belief Propagation) gauge, yields the desired loop
representation. The set of gauge-fixing BP conditions is equivalent to the
Gaussian BP equations, discussed in the past as efficient (linear scaling)
heuristics for estimating the covariance of a sparse positive matrix.Comment: 11 pages, 1 figure; misprints correcte
Barrier Frank-Wolfe for Marginal Inference
We introduce a globally-convergent algorithm for optimizing the
tree-reweighted (TRW) variational objective over the marginal polytope. The
algorithm is based on the conditional gradient method (Frank-Wolfe) and moves
pseudomarginals within the marginal polytope through repeated maximum a
posteriori (MAP) calls. This modular structure enables us to leverage black-box
MAP solvers (both exact and approximate) for variational inference, and to
obtain more accurate results than tree-reweighted algorithms that optimize over the
local consistency relaxation. Theoretically, we bound the sub-optimality for
the proposed algorithm despite the TRW objective having unbounded gradients at
the boundary of the marginal polytope. Empirically, we demonstrate the
increased quality of results found by tightening the relaxation over the
marginal polytope as well as the spanning tree polytope on synthetic and
real-world instances.
Comment: 25 pages, 12 figures; to appear in Neural Information Processing
Systems (NIPS) 2015; corrected reference and cleaned up bibliography
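The conditional-gradient structure the abstract describes can be illustrated on a toy polytope. This is a hedged sketch, not the paper's algorithm: we minimize a smooth convex objective over the probability simplex, where the linear minimization oracle (returning the best vertex for the current gradient) plays the role of the black-box MAP call over the marginal polytope:

```python
import numpy as np

def frank_wolfe(grad_f, n, iters=1000):
    """Conditional-gradient (Frank-Wolfe) sketch over the probability
    simplex in R^n. Each step calls a linear oracle that returns a
    vertex -- the analogue of a MAP call over the marginal polytope."""
    x = np.full(n, 1.0 / n)  # start at an interior point
    for t in range(iters):
        g = grad_f(x)
        s = np.zeros(n)
        s[np.argmin(g)] = 1.0          # linear oracle: best simplex vertex
        gamma = 2.0 / (t + 2)          # standard diminishing step size
        x = (1 - gamma) * x + gamma * s
    return x

# toy objective f(x) = ||x - b||^2 with b inside the simplex
b = np.array([0.2, 0.5, 0.3])
x = frank_wolfe(lambda x: 2 * (x - b), 3)
```

Every iterate is a convex combination of vertices, so the method stays inside the polytope by construction; the difficulty the paper addresses, gradients of the TRW objective blowing up at the boundary, does not arise for this smooth toy objective.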