A Faster Approximation Algorithm for the Gibbs Partition Function
We consider the problem of estimating the partition function Z(beta) = sum_x exp(-beta H(x)) of a Gibbs distribution with a Hamiltonian H(.), or more precisely the logarithm of the ratio q = ln(Z(0)/Z(beta)). It has recently been shown how to approximate q with high probability assuming the existence of an oracle that produces samples from the Gibbs distribution for a given parameter value in [0, beta]. The current best known approach due to Huber [9] uses O(q ln n [ln q + ln ln n + epsilon^{-2}]) oracle calls on average, where epsilon is the desired accuracy of approximation and Z(beta) is assumed to lie in [1/n, 1]. We improve the complexity to O(q ln n epsilon^{-2}) oracle calls. We also show that the same complexity can be achieved if exact oracles are replaced with approximate sampling oracles that are within variation distance O(epsilon^2 / (q ln n)) from exact oracles. Finally, we prove a lower bound of Omega(q epsilon^{-2}) oracle calls under a natural model of computation.
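The oracle-based scheme such abstracts refer to can be illustrated with a minimal telescoping-product estimator. This is a generic sketch, not the paper's algorithm or Huber's: the explicit state space STATES and Hamiltonian H are invented here so the sampling oracle can be exact, and q = ln(Z(0)/Z(beta)) is estimated by chaining ratio estimates along a schedule of inverse temperatures.

```python
import math
import random

# Toy explicit state space and Hamiltonian (invented for illustration;
# the papers assume only oracle access to the Gibbs distribution).
STATES = list(range(8))

def H(x):
    return float(x)           # so Z(beta) = sum_x exp(-beta * x)

def Z(beta):
    return sum(math.exp(-beta * H(x)) for x in STATES)

def oracle(beta, rng):
    """Exact sampler from pi_beta(x) proportional to exp(-beta * H(x))."""
    w = [math.exp(-beta * H(x)) for x in STATES]
    return rng.choices(STATES, weights=w, k=1)[0]

def estimate_q(beta_max, stages, samples_per_stage, rng):
    """Telescoping estimate of q = ln(Z(0)/Z(beta_max)).

    Each ratio Z(b1)/Z(b0) equals E_{X ~ pi_b0}[exp(-(b1 - b0) * H(X))],
    estimated by a sample mean of oracle draws; the log-ratios are then
    chained along an evenly spaced schedule of inverse temperatures.
    """
    betas = [beta_max * i / stages for i in range(stages + 1)]
    q = 0.0
    for b0, b1 in zip(betas, betas[1:]):
        mean = sum(math.exp(-(b1 - b0) * H(oracle(b0, rng)))
                   for _ in range(samples_per_stage)) / samples_per_stage
        q -= math.log(mean)   # q = -sum_i ln(Z(b_{i+1}) / Z(b_i))
    return q

rng = random.Random(0)
q_hat = estimate_q(1.0, 10, 2000, rng)
q_true = math.log(Z(0.0) / Z(1.0))
```

The identity behind each stage, E_{X ~ pi_b0}[exp(-(b1 - b0) H(X))] = Z(b1)/Z(b0), follows directly from the definition of pi_b0; the schedule and per-stage sample counts here are arbitrary, whereas choosing them optimally is exactly what the cited works analyze.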
On Sampling from the Gibbs Distribution with Random Maximum A-Posteriori Perturbations
In this paper we describe how MAP inference can be used to sample efficiently
from Gibbs distributions. Specifically, we provide means for drawing either
approximate or unbiased samples from Gibbs distributions by introducing low
dimensional perturbations and solving the corresponding MAP assignments. Our
approach also leads to new ways to derive lower bounds on partition functions.
We demonstrate empirically that our method excels in the typical "high signal -
high coupling" regime. The setting results in ragged energy landscapes that are
challenging for alternative approaches to sampling and/or lower bounds.
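In its simplest exact form, the perturb-and-MAP idea is the Gumbel-max trick: perturb every configuration's negative energy with i.i.d. Gumbel noise and return the MAP of the perturbed model, which is then an exact draw from the Gibbs distribution. A toy sketch follows (the three-state ENERGY table is invented; the paper's point is to make this workable with low-dimensional perturbations and structured MAP solvers rather than enumeration):

```python
import math
import random
from collections import Counter

# Toy energy table (invented); the Gibbs distribution is p(x) ~ exp(-E(x)).
ENERGY = {"a": 0.0, "b": 1.0, "c": 2.0}

def gibbs_probs():
    z = sum(math.exp(-e) for e in ENERGY.values())
    return {x: math.exp(-e) / z for x, e in ENERGY.items()}

def perturb_and_map(rng):
    """Gumbel-max trick: add i.i.d. Gumbel noise to each configuration's
    negative energy; the argmax (MAP of the perturbed model) is an exact
    sample from the Gibbs distribution."""
    def gumbel():
        return -math.log(-math.log(rng.random()))
    return max(ENERGY, key=lambda x: -ENERGY[x] + gumbel())

rng = random.Random(0)
n = 100_000
counts = Counter(perturb_and_map(rng) for _ in range(n))
freqs = {x: counts[x] / n for x in ENERGY}   # close to gibbs_probs()
```

Full independent perturbation of every configuration is exact but costs as much as enumeration; the paper's low-dimensional (per-variable) perturbations trade exactness for tractable MAP calls, which is what yields the approximate samplers and partition-function bounds.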
Monte Carlo Algorithms for the Partition Function and Information Rates of Two-Dimensional Channels
The paper proposes Monte Carlo algorithms for the computation of the
information rate of two-dimensional source/channel models. The focus of the
paper is on binary-input channels with constraints on the allowed input
configurations. The problem of numerically computing the information rate, and
even the noiseless capacity, of such channels has so far remained largely
unsolved. Both problems can be reduced to computing a Monte Carlo estimate of a
partition function. The proposed algorithms use tree-based Gibbs sampling and
multilayer (multitemperature) importance sampling. The viability of the
proposed algorithms is demonstrated by simulation results.
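The reduction of noiseless capacity to a partition-function computation can be sketched with a deliberately crude Monte Carlo estimator (the grid size and hard-square constraint are illustrative; the paper's tree-based Gibbs sampling and multilayer importance sampling exist precisely because this naive uniform-sampling estimator breaks down at realistic sizes):

```python
import itertools
import math
import random

N = 3   # grid side, small enough to verify against exact enumeration

def valid(grid):
    """Hard-square constraint: no two horizontally or vertically adjacent
    1s (an illustrative stand-in for a constrained 2-D channel input)."""
    for i in range(N):
        for j in range(N):
            if grid[i][j]:
                if j + 1 < N and grid[i][j + 1]:
                    return False
                if i + 1 < N and grid[i + 1][j]:
                    return False
    return True

def exact_count():
    """Brute-force partition function: number of valid configurations."""
    total = 0
    for bits in itertools.product((0, 1), repeat=N * N):
        grid = [bits[i * N:(i + 1) * N] for i in range(N)]
        total += valid(grid)
    return total

def mc_count(samples, rng):
    """Crude Monte Carlo: Z = 2^(N*N) * P(a uniform random grid is valid)."""
    hits = sum(valid([[rng.randint(0, 1) for _ in range(N)]
                      for _ in range(N)])
               for _ in range(samples))
    return 2 ** (N * N) * hits / samples

rng = random.Random(1)
z_exact = exact_count()                  # 63 valid 3x3 configurations
z_mc = mc_count(40_000, rng)
capacity = math.log2(z_exact) / (N * N)  # noiseless capacity per symbol
```

As N grows, the fraction of valid uniform samples vanishes exponentially, which is why the paper resorts to structured Gibbs sampling and multitemperature importance sampling instead of this direct estimator.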
Approximation algorithms for the normalizing constant of Gibbs distributions
Consider a family of distributions {pi_beta} where X ~ pi_beta means that P(X = x) = exp(-beta H(x)) / Z(beta). Here Z(beta) is the proper normalizing constant, equal to sum_x exp(-beta H(x)). Then {pi_beta} is known as a Gibbs distribution, and Z(beta) is the partition function. This work presents a new method for approximating the partition function to a specified level of relative accuracy using only O(ln(Z(beta)) ln(ln(Z(beta)))) samples when Z(0) >= 1. This is a sharp improvement over previous, similar approaches that used a much more complicated algorithm, requiring O(ln(Z(beta)) ln(ln(Z(beta)))^5) samples.
Comment: Published in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org); DOI: http://dx.doi.org/10.1214/14-AAP1015.
Fast Algorithms at Low Temperatures via Markov Chains
For spin systems, such as the hard-core model on independent sets weighted by fugacity lambda>0, efficient algorithms for the associated approximate counting/sampling problems typically apply in the high-temperature region, corresponding to low fugacity. Recent work of Jenssen, Keevash and Perkins (2019) yields an FPTAS for approximating the partition function (and an efficient sampling algorithm) on bounded-degree (bipartite) expander graphs for the hard-core model at sufficiently high fugacity, and also the ferromagnetic Potts model at sufficiently low temperatures. Their method is based on using the cluster expansion to obtain a complex zero-free region for the partition function of a polymer model, and then approximating this partition function using the polynomial interpolation method of Barvinok. We present a simple discrete-time Markov chain for abstract polymer models, and present an elementary proof of rapid mixing of this new chain under sufficient decay of the polymer weights. Applying these general polymer results to the hard-core and ferromagnetic Potts models on bounded-degree (bipartite) expander graphs yields fast algorithms with running time O(n log n) for the Potts model and O(n^2 log n) for the hard-core model, in contrast to typical running times of n^{O(log Delta)} for algorithms based on Barvinok's polynomial interpolation method on graphs of maximum degree Delta. In addition, our approach via our polymer model Markov chain is conceptually simpler as it circumvents the zero-free analysis and the generalization to complex parameters. Finally, we combine our results for the hard-core and ferromagnetic Potts models with standard Markov chain comparison tools to obtain polynomial mixing time for the usual spin system Glauber dynamics restricted to even and odd or "red" dominant portions of the respective state spaces.
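For reference, the "usual spin system Glauber dynamics" for the hard-core model that the last sentence invokes is simple to state. A minimal sketch on a tiny path graph (graph and fugacity are illustrative; the paper's new chain acts on polymer configurations, not directly on spins):

```python
import random

# Hard-core Glauber dynamics on a tiny path graph 0 - 1 - 2.
# Stationary distribution: P(I) proportional to lambda^|I| over
# independent sets I (graph and fugacity invented for illustration).
NEIGHBORS = {0: [1], 1: [0, 2], 2: [1]}
LAM = 1.0   # fugacity lambda

def glauber_step(occ, rng):
    """Heat-bath update of a uniformly random vertex: occupy it with
    probability lambda/(1+lambda) if no neighbor is occupied, else
    leave it (or make it) unoccupied."""
    v = rng.randrange(len(occ))
    if any(occ[u] for u in NEIGHBORS[v]):
        occ[v] = 0
    else:
        occ[v] = 1 if rng.random() < LAM / (1 + LAM) else 0

rng = random.Random(0)
occ = [0, 0, 0]
steps = 200_000
hits = 0
for _ in range(steps):
    glauber_step(occ, rng)
    hits += occ[0]
freq = hits / steps   # stationary P(vertex 0 occupied) = 2/5 at lambda = 1
```

On this 3-vertex path at lambda = 1 the chain is uniform over the five independent sets, so the long-run occupation frequency of an endpoint vertex approaches 2/5; at the low temperatures (high fugacity) the paper targets, this plain chain can mix slowly, which motivates running it only on the even- or odd-dominant part of the state space.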
Sequential Monte Carlo for Graphical Models
We propose a new framework for how to use sequential Monte Carlo (SMC)
algorithms for inference in probabilistic graphical models (PGM). Via a
sequential decomposition of the PGM we find a sequence of auxiliary
distributions defined on a monotonically increasing sequence of probability
spaces. By targeting these auxiliary distributions using SMC we are able to
approximate the full joint distribution defined by the PGM. One of the key
merits of the SMC sampler is that it provides an unbiased estimate of the
partition function of the model. We also show how it can be used within a
particle Markov chain Monte Carlo framework in order to construct
high-dimensional block-sampling algorithms for general PGMs.
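The unbiased partition-function estimate that SMC provides can be sketched on a minimal chain-structured model (the potentials and the uniform proposal are invented for illustration; the paper's framework covers general PGM decompositions):

```python
import random

# Tiny chain-structured PGM: binary x1 - x2 - x3 with one pairwise
# potential per edge (values invented for illustration).
def psi(a, b):
    return 2.0 if a == b else 0.5

def exact_Z():
    """Brute-force partition function of the 3-variable chain."""
    return sum(psi(x1, x2) * psi(x2, x3)
               for x1 in (0, 1) for x2 in (0, 1) for x3 in (0, 1))

def smc_Z(n_particles, rng):
    """Bootstrap SMC along the chain decomposition: propose each new
    variable uniformly (q = 1/2), weight by the newly introduced
    potential, resample, and accumulate the product of mean incremental
    weights, which is an unbiased estimate of Z."""
    particles = [None] * n_particles   # only the last variable matters
    z_hat = 1.0
    for t in range(3):
        vals, weights = [], []
        for p in particles:
            x = rng.randint(0, 1)
            w = (1.0 if t == 0 else psi(p, x)) / 0.5
            vals.append(x)
            weights.append(w)
        z_hat *= sum(weights) / n_particles
        # multinomial resampling on the newly added variable's value
        particles = rng.choices(vals, weights=weights, k=n_particles)
    return z_hat

rng = random.Random(0)
z_hat = smc_Z(20_000, rng)   # exact_Z() is 12.5
```

Because the model is a chain, each particle only needs to carry the most recently added variable; for a general PGM the sequential decomposition determines which variables each auxiliary distribution must retain.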