Projecting Markov Random Field Parameters for Fast Mixing
Markov chain Monte Carlo (MCMC) algorithms are simple and extremely powerful
techniques to sample from almost arbitrary distributions. The flaw in practice
is that it can take a large and/or unknown amount of time to converge to the
stationary distribution. This paper gives sufficient conditions to guarantee
that univariate Gibbs sampling on Markov Random Fields (MRFs) will be fast
mixing, in a precise sense. Further, an algorithm is given to project onto this
set of fast-mixing parameters in the Euclidean norm. Following recent work, we
give an example use of this to project in various divergence measures,
comparing univariate marginals obtained by sampling after projection to common
variational methods and Gibbs sampling on the original parameters.
Comment: Neural Information Processing Systems 201
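As a concrete illustration of the univariate (single-site) Gibbs sampling the abstract refers to, here is a minimal sketch for a pairwise Ising-type MRF. The model, parameter values, and function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def gibbs_sample_ising(J, h, n_sweeps=1000, rng=None):
    """Univariate Gibbs sampling for a pairwise Ising model with symmetric
    couplings J (zero diagonal) and local fields h. Spins are +/-1; returns
    estimated univariate marginals P(x_i = +1)."""
    rng = np.random.default_rng(rng)
    n = len(h)
    x = rng.choice([-1, 1], size=n)
    counts = np.zeros(n)
    for _ in range(n_sweeps):
        for i in range(n):
            # The conditional of x_i given all other spins depends only on
            # its local field h[i] plus the coupled neighbor contributions.
            field = h[i] + J[i] @ x
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))
            x[i] = 1 if rng.random() < p_plus else -1
        counts += (x == 1)
    return counts / n_sweeps

# Two weakly coupled spins with no external field: by symmetry the true
# marginals are exactly 0.5, and this chain mixes fast.
J = np.array([[0.0, 0.2], [0.2, 0.0]])
h = np.zeros(2)
marginals = gibbs_sample_ising(J, h, n_sweeps=5000, rng=0)
```

When couplings are strong, exactly the slow-mixing regime the abstract addresses, such marginal estimates can remain biased for a very long time.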
Projecting Ising Model Parameters for Fast Mixing
Inference in general Ising models is difficult, due to high treewidth making
tree-based algorithms intractable. Moreover, when interactions are strong,
Gibbs sampling may take exponential time to converge to the stationary
distribution. We present an algorithm to project Ising model parameters onto a
parameter set that is guaranteed to be fast mixing, under several divergences.
We find that Gibbs sampling using the projected parameters is more accurate
than with the original parameters when interaction strengths are strong and
when limited time is available for sampling.
Comment: Advances in Neural Information Processing Systems 201
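Sufficient conditions of the kind these abstracts mention typically bound a norm of the coupling matrix below 1. As a deliberately crude sketch (a rescaling toward such a set, not the Euclidean projection the papers describe; the threshold and function name are illustrative):

```python
import numpy as np

def shrink_to_fast_mixing(J, c=0.99):
    """Hypothetical sketch: shrink Ising couplings until the spectral norm
    of the absolute coupling matrix is below c < 1, a Dobrushin-style
    sufficient condition for rapid mixing of Gibbs sampling. This is a
    simple rescaling, not a true projection in any divergence."""
    norm = np.linalg.norm(np.abs(J), ord=2)  # largest singular value
    if norm <= c:
        return J.copy()
    return J * (c / norm)

# Strong interactions (spectral norm 3.0) get scaled into the fast-mixing set.
J = np.array([[0.0, 3.0], [3.0, 0.0]])
J_proj = shrink_to_fast_mixing(J)
```

Sampling with `J_proj` trades some model fidelity for a guarantee that the chain converges quickly, which is the trade-off the abstract evaluates.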
Maximum Likelihood Learning With Arbitrary Treewidth via Fast-Mixing Parameter Sets
Inference is typically intractable in high-treewidth undirected graphical
models, making maximum likelihood learning a challenge. One way to overcome
this is to restrict parameters to a tractable set, most typically the set of
tree-structured parameters. This paper explores an alternative notion of a
tractable set, namely a set of "fast-mixing parameters" where Markov chain
Monte Carlo (MCMC) inference can be guaranteed to quickly converge to the
stationary distribution. While it is common in practice to approximate the
likelihood gradient using samples obtained from MCMC, such procedures lack
theoretical guarantees. This paper proves that for any exponential family with
bounded sufficient statistics (not just graphical models), when parameters are
constrained to a fast-mixing set, gradient descent with gradients approximated
by sampling will approximate the maximum likelihood solution inside the set
with high probability. When unregularized, finding a solution epsilon-accurate
in log-likelihood requires a total amount of effort cubic in 1/epsilon,
disregarding logarithmic factors. When ridge-regularized, strong convexity
allows a solution epsilon-accurate in parameter distance with effort quadratic
in 1/epsilon. Both of these provide a fully polynomial-time randomized
approximation scheme.
Comment: Advances in Neural Information Processing Systems 201
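The learning scheme described above, gradient ascent with the model expectation estimated by sampling, can be sketched on a toy exponential family where exact sampling stands in for MCMC. Everything here (the Bernoulli family, step size, step count) is an illustrative assumption, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def sampled_model_mean(theta, n_samples=2000):
    """Stand-in for an MCMC estimate of the model expectation E_theta[x]
    for a Bernoulli exponential family (here we can sample exactly)."""
    p = 1.0 / (1.0 + np.exp(-theta))
    return (rng.random(n_samples) < p).mean()

def fit_by_sampled_gradients(data_mean, lr=0.5, n_steps=200):
    """Maximum-likelihood gradient ascent for an exponential family:
    grad log-lik = E_data[phi(x)] - E_theta[phi(x)], with the second
    term replaced by a noisy sampling estimate at every step."""
    theta = 0.0
    for _ in range(n_steps):
        theta += lr * (data_mean - sampled_model_mean(theta))
    return theta

theta = fit_by_sampled_gradients(data_mean=0.7)
# The fitted model mean sigmoid(theta) should land near the data mean 0.7.
fitted_mean = 1.0 / (1.0 + np.exp(-theta))
```

The abstract's contribution is precisely a guarantee that, on a fast-mixing parameter set, this kind of noisy-gradient procedure provably approaches the constrained maximum-likelihood solution.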
Glauber dynamics on trees: Boundary conditions and mixing time
We give the first comprehensive analysis of the effect of boundary conditions
on the mixing time of the Glauber dynamics in the so-called Bethe
approximation. Specifically, we show that the spectral gap and the log-Sobolev
constant of the Glauber dynamics for the Ising model on an n-vertex regular
tree with plus-boundary are bounded below by a constant independent of n at all
temperatures and all external fields. This implies that the mixing time is
O(log n) (in contrast to the free boundary case, where it is not bounded by any
fixed polynomial at low temperatures). In addition, our methods yield simpler
proofs and stronger results for the spectral gap and log-Sobolev constant in
the regime where there are multiple phases but the mixing time is insensitive
to the boundary condition. Our techniques also apply to a much wider class of
models, including those with hard-core constraints like the antiferromagnetic
Potts model at zero temperature (proper colorings) and the hard-core lattice
gas (independent sets).
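A minimal simulation of the dynamics this abstract studies, heat-bath Glauber updates for the Ising model on a tree with the leaves frozen to +1 (a plus boundary), can be sketched as follows; the binary tree, temperature, and sweep count are illustrative choices, not the paper's setting.

```python
import numpy as np

def glauber_on_tree(depth=6, beta=1.0, n_sweeps=200, rng=None):
    """Glauber (heat-bath) dynamics for the Ising model on a complete
    binary tree stored heap-style (children of i are 2i+1, 2i+2), with
    all leaves frozen to +1 as the plus-boundary condition."""
    rng = np.random.default_rng(rng)
    n = 2 ** (depth + 1) - 1
    first_leaf = 2 ** depth - 1
    x = np.ones(n)                     # leaves never updated: boundary stays +1
    for _ in range(n_sweeps):
        for i in range(first_leaf):    # sweep over interior vertices only
            nbrs = [2 * i + 1, 2 * i + 2] + ([(i - 1) // 2] if i > 0 else [])
            field = beta * sum(x[j] for j in nbrs)
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))
            x[i] = 1.0 if rng.random() < p_plus else -1.0
    return x

x = glauber_on_tree(rng=0)
```

The abstract's result is that, under this plus boundary, the chain's spectral gap and log-Sobolev constant are bounded below uniformly in n, so mixing takes only O(log n) time at every temperature.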
Ergodic properties of a model for turbulent dispersion of inertial particles
We study a simple stochastic differential equation that models the dispersion
of close heavy particles moving in a turbulent flow. In one and two dimensions,
the model is closely related to the one-dimensional stationary Schroedinger
equation in a random delta-correlated potential. The ergodic properties of the
dispersion process are investigated by proving that its generator is
hypoelliptic and using control theory.
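A dispersion process of the kind this abstract analyzes can be integrated numerically with the Euler-Maruyama scheme. The toy system below (separation x driven by relative velocity v, with delta-correlated multiplicative noise on v) is an illustrative linear model, not the paper's exact equations, and all coefficients are assumptions.

```python
import numpy as np

def euler_maruyama(x0, v0, gamma=1.0, sigma=0.5, dt=1e-3,
                   n_steps=5000, rng=None):
    """Euler-Maruyama integration of a toy 1-D inertial-dispersion model:
        dx = v dt,   dv = -gamma v dt + sigma x dW
    where dW is a Brownian increment (delta-correlated in time)."""
    rng = np.random.default_rng(rng)
    x, v = x0, v0
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))
        x, v = x + v * dt, v - gamma * v * dt + sigma * x * dW
    return x, v

x, v = euler_maruyama(1.0, 0.0, rng=0)
```

Linear SDEs with random multiplicative forcing of this shape are what links the dispersion model to the one-dimensional Schroedinger equation in a delta-correlated random potential mentioned in the abstract.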
Gossip Algorithms for Distributed Signal Processing
Gossip algorithms are attractive for in-network processing in sensor networks
because they do not require any specialized routing, there is no bottleneck or
single point of failure, and they are robust to unreliable wireless network
conditions. Recently, there has been a surge of activity in the computer
science, control, signal processing, and information theory communities,
developing faster and more robust gossip algorithms and deriving theoretical
performance guarantees. This article presents an overview of recent work in the
area. We describe convergence rate results, which are related to the number of
transmitted messages and thus the amount of energy consumed in the network for
gossiping. We discuss issues related to gossiping over wireless links,
including the effects of quantization and noise, and we illustrate the use of
gossip algorithms for canonical signal processing tasks including distributed
estimation, source localization, and compression.
Comment: Submitted to Proceedings of the IEEE, 29 pages
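The basic primitive surveyed above, randomized pairwise gossip for distributed averaging, can be sketched in a few lines; the ring topology and round count are illustrative assumptions.

```python
import numpy as np

def randomized_gossip(values, edges, n_rounds=5000, rng=None):
    """Randomized pairwise gossip averaging: each round, one edge (i, j)
    is picked uniformly at random and both endpoints replace their values
    with the pairwise average. The sum is conserved at every step, so on
    a connected graph all values converge to the network-wide mean."""
    rng = np.random.default_rng(rng)
    x = np.array(values, dtype=float)
    for _ in range(n_rounds):
        i, j = edges[rng.integers(len(edges))]
        x[i] = x[j] = 0.5 * (x[i] + x[j])
    return x

# Four sensors on a ring; every estimate converges to the mean, 2.5.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
x = randomized_gossip([1.0, 2.0, 3.0, 4.0], edges, rng=0)
```

The convergence-rate results the article describes quantify how many such pairwise exchanges (and hence how much transmission energy) a given topology needs to reach consensus.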