
    A Faster Approximation Algorithm for the Gibbs Partition Function

    We consider the problem of estimating the partition function $Z(\beta)=\sum_x \exp(-\beta H(x))$ of a Gibbs distribution with a Hamiltonian $H(\cdot)$, or more precisely the logarithm of the ratio $q=\ln Z(0)/Z(\beta)$. It has recently been shown how to approximate $q$ with high probability assuming the existence of an oracle that produces samples from the Gibbs distribution for a given parameter value in $[0,\beta]$. The current best known approach due to Huber [9] uses $O(q\ln n\cdot[\ln q + \ln\ln n+\varepsilon^{-2}])$ oracle calls on average, where $\varepsilon$ is the desired accuracy of approximation and $H(\cdot)$ is assumed to lie in $\{0\}\cup[1,n]$. We improve the complexity to $O(q\ln n\cdot\varepsilon^{-2})$ oracle calls. We also show that the same complexity can be achieved if exact oracles are replaced with approximate sampling oracles that are within $O(\frac{\varepsilon^2}{q\ln n})$ variation distance from exact oracles. Finally, we prove a lower bound of $\Omega(q\cdot\varepsilon^{-2})$ oracle calls under a natural model of computation.
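    The telescoping-product idea behind these sample-based estimators can be sketched concretely. The toy below estimates $q=\ln Z(0)/Z(\beta)$ on a tiny explicit state space, standing in for the black-box sampling oracle with exact enumeration; the Hamiltonian, the uniform schedule, and the sample counts are illustrative assumptions, not the paper's refined algorithm.

```python
import math
import random

# Toy setting: a tiny explicit state space, so the "sampling oracle"
# can be played by exact enumeration. In the paper's setting the
# oracle is a black box; everything named here is illustrative.
STATES = range(16)

def H(x):
    return x % 5  # Hamiltonian values in {0} ∪ [1, n]

def gibbs_sample(beta, rng):
    # Exact sample from pi_beta(x) ∝ exp(-beta * H(x)); oracle stand-in.
    weights = [math.exp(-beta * H(x)) for x in STATES]
    r = rng.random() * sum(weights)
    for x, w in zip(STATES, weights):
        r -= w
        if r <= 0:
            return x
    return STATES[-1]

def estimate_q(beta, num_steps=20, samples_per_step=2000, seed=0):
    # Telescoping product over a uniform schedule 0 = b_0 < ... < b_k = beta:
    #   Z(b_{i+1}) / Z(b_i) = E_{X ~ pi_{b_i}}[ exp(-(b_{i+1} - b_i) H(X)) ],
    # so summing the logs of the empirical means estimates ln Z(beta)/Z(0).
    rng = random.Random(seed)
    betas = [beta * i / num_steps for i in range(num_steps + 1)]
    log_ratio = 0.0
    for b0, b1 in zip(betas, betas[1:]):
        mean = sum(math.exp(-(b1 - b0) * H(gibbs_sample(b0, rng)))
                   for _ in range(samples_per_step)) / samples_per_step
        log_ratio += math.log(mean)
    return -log_ratio  # q = ln Z(0)/Z(beta)
```

    The improved bounds in the paper concern how few such oracle calls suffice; the uniform schedule above is only the naive baseline.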

    On Sampling from the Gibbs Distribution with Random Maximum A-Posteriori Perturbations

    In this paper we describe how MAP inference can be used to sample efficiently from Gibbs distributions. Specifically, we provide means for drawing either approximate or unbiased samples from Gibbs distributions by introducing low-dimensional perturbations and solving the corresponding MAP assignments. Our approach also leads to new ways to derive lower bounds on partition functions. We demonstrate empirically that our method excels in the typical "high signal - high coupling" regime, which results in ragged energy landscapes that are challenging for alternative approaches to sampling and/or lower bounds.
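    The full-dimensional base case of perturb-and-MAP is the classical Gumbel-max trick: add an independent Gumbel variable to every configuration's potential and take the argmax. The sketch below shows this exact (but exponentially expensive) version by brute-force enumeration over a tiny model; the paper's contribution is the practical low-dimensional approximate variant, which this does not implement.

```python
import math
import random

def gumbel(rng):
    # Standard Gumbel variate via inverse CDF: -log(-log(U)).
    return -math.log(-math.log(rng.random()))

def perturb_and_map(theta, rng):
    # Gumbel-max trick: perturb every configuration's potential with an
    # independent Gumbel variable and return the MAP of the perturbed
    # model; the argmax is an exact sample from p(x) ∝ exp(theta[x]).
    # "MAP inference" here is brute-force enumeration over a tiny model,
    # which is exactly what low-dimensional perturbations let one avoid.
    return max(range(len(theta)), key=lambda x: theta[x] + gumbel(rng))
```

    Repeated draws have empirical frequencies matching the softmax of the potential vector.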

    Monte Carlo Algorithms for the Partition Function and Information Rates of Two-Dimensional Channels

    The paper proposes Monte Carlo algorithms for the computation of the information rate of two-dimensional source/channel models. The focus of the paper is on binary-input channels with constraints on the allowed input configurations. The problem of numerically computing the information rate, and even the noiseless capacity, of such channels has so far remained largely unsolved. Both problems can be reduced to computing a Monte Carlo estimate of a partition function. The proposed algorithms use tree-based Gibbs sampling and multilayer (multitemperature) importance sampling. The viability of the proposed algorithms is demonstrated by simulation results.
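    For intuition on why such problems reduce to a partition function, consider the noiseless case: the capacity of the hard-square constraint (no two adjacent ones in an n-by-n binary array) is governed by the number Z of valid configurations. The sketch below estimates Z with a textbook sequential importance sampler; this is not the paper's tree-based Gibbs / multitemperature scheme, and the parameters are illustrative.

```python
import random

def sis_count_hard_square(n, num_samples=5000, seed=0):
    # Sequential importance sampling estimate of the number of n-by-n
    # binary arrays with no two horizontally or vertically adjacent 1s.
    # Fill cells in raster order, choosing uniformly among the values
    # allowed by the already-fixed left/upper neighbours; the product of
    # the number of allowed choices along one pass is an unbiased
    # estimator of the total count Z.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        grid = [[0] * n for _ in range(n)]
        weight = 1.0
        for i in range(n):
            for j in range(n):
                blocked = (i > 0 and grid[i - 1][j]) or (j > 0 and grid[i][j - 1])
                choices = [0] if blocked else [0, 1]
                grid[i][j] = rng.choice(choices)
                weight *= len(choices)
        total += weight
    return total / num_samples
```

    The noiseless capacity per symbol is then approximately log2(Z) / n^2 for large n.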

    Approximation algorithms for the normalizing constant of Gibbs distributions

    Consider a family of distributions $\{\pi_{\beta}\}$ where $X\sim\pi_{\beta}$ means that $\mathbb{P}(X=x)=\exp(-\beta H(x))/Z(\beta)$. Here $Z(\beta)$ is the normalizing constant, equal to $\sum_x\exp(-\beta H(x))$. Then $\{\pi_{\beta}\}$ is known as a Gibbs distribution, and $Z(\beta)$ is the partition function. This work presents a new method for approximating the partition function to a specified level of relative accuracy using a number of samples that is $O(\ln(Z(\beta))\ln(\ln(Z(\beta))))$ when $Z(0)\geq1$. This is a sharp improvement over previous, similar approaches, which used a much more complicated algorithm requiring $O(\ln(Z(\beta))\ln(\ln(Z(\beta)))^5)$ samples.

    Comment: Published in the Annals of Applied Probability (http://dx.doi.org/10.1214/14-AAP1015) by the Institute of Mathematical Statistics (http://www.imstat.org).

    Fast Algorithms at Low Temperatures via Markov Chains

    For spin systems, such as the hard-core model on independent sets weighted by fugacity lambda>0, efficient algorithms for the associated approximate counting/sampling problems typically apply in the high-temperature region, corresponding to low fugacity. Recent work of Jenssen, Keevash and Perkins (2019) yields an FPTAS for approximating the partition function (and an efficient sampling algorithm) on bounded-degree (bipartite) expander graphs for the hard-core model at sufficiently high fugacity, and also for the ferromagnetic Potts model at sufficiently low temperatures. Their method is based on using the cluster expansion to obtain a complex zero-free region for the partition function of a polymer model, and then approximating this partition function using the polynomial interpolation method of Barvinok. We present a simple discrete-time Markov chain for abstract polymer models, and give an elementary proof of rapid mixing of this new chain under sufficient decay of the polymer weights. Applying these general polymer results to the hard-core and ferromagnetic Potts models on bounded-degree (bipartite) expander graphs yields fast algorithms with running time O(n log n) for the Potts model and O(n^2 log n) for the hard-core model, in contrast to typical running times of n^{O(log Delta)} for algorithms based on Barvinok's polynomial interpolation method on graphs of maximum degree Delta. In addition, our approach via the polymer-model Markov chain is conceptually simpler, as it circumvents the zero-free analysis and the generalization to complex parameters. Finally, we combine our results for the hard-core and ferromagnetic Potts models with standard Markov chain comparison tools to obtain polynomial mixing time for the usual spin system Glauber dynamics restricted to the even- and odd-dominant (or "red"-dominant) portions of the respective state spaces.
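    For reference, the "usual spin system Glauber dynamics" invoked at the end is a few lines of code for the hard-core model. The sketch below (a standard heat-bath chain on an arbitrary adjacency list, not the paper's polymer-model chain) has stationary distribution proportional to lambda^|I| over independent sets I.

```python
import random

def hardcore_glauber(adj, lam, num_steps, seed=0):
    # Single-site (heat-bath) Glauber dynamics for the hard-core model
    # on a graph given as an adjacency list: pick a uniform random
    # vertex; if some neighbour is occupied it becomes empty, otherwise
    # it becomes occupied with probability lam / (1 + lam).
    rng = random.Random(seed)
    n = len(adj)
    occupied = [False] * n
    for _ in range(num_steps):
        v = rng.randrange(n)
        if any(occupied[u] for u in adj[v]):
            occupied[v] = False
        else:
            occupied[v] = rng.random() < lam / (1 + lam)
    return occupied
```

    By construction every state visited is an independent set; the content of mixing-time results like the above is how fast this chain equilibrates.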

    Sequential Monte Carlo for Graphical Models

    We propose a new framework for using sequential Monte Carlo (SMC) algorithms for inference in probabilistic graphical models (PGMs). Via a sequential decomposition of the PGM we find a sequence of auxiliary distributions defined on a monotonically increasing sequence of probability spaces. By targeting these auxiliary distributions using SMC we are able to approximate the full joint distribution defined by the PGM. One of the key merits of the SMC sampler is that it provides an unbiased estimate of the partition function of the model. We also show how it can be used within a particle Markov chain Monte Carlo framework to construct high-dimensional block-sampling algorithms for general PGMs.
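    The unbiased partition-function estimate mentioned above is the product of the average incremental importance weights. A minimal SMC sketch on a chain-structured toy model illustrates this; the decomposition, uniform proposal, and potential names below are illustrative assumptions, not the paper's general construction.

```python
import math
import random

def smc_partition(T, phi, psi, num_particles=5000, seed=0):
    # Minimal bootstrap-style SMC on a chain of T binary variables with
    # unary potentials phi[t][x] and pairwise potentials psi[a][b].
    # New variables are proposed uniformly; the product of the average
    # incremental importance weights is an unbiased estimate of Z.
    rng = random.Random(seed)
    particles = [None] * num_particles  # value of the latest variable
    log_Z = 0.0
    for t in range(T):
        values, weights = [], []
        for p in particles:
            x = rng.randrange(2)  # uniform proposal, q(x) = 1/2
            w = 2.0 * phi[t][x] * (psi[p][x] if p is not None else 1.0)
            values.append(x)
            weights.append(w)
        log_Z += math.log(sum(weights) / num_particles)
        # Multinomial resampling; particles carry equal weight afterwards.
        particles = rng.choices(values, weights=weights, k=num_particles)
    return math.exp(log_Z)
```

    On a chain it suffices for each particle to store only its latest variable; for general PGMs the sequential decomposition determines what state must be propagated.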