    Coupling with the stationary distribution and improved sampling for colorings and independent sets

    We present an improved coupling technique for analyzing the mixing time of Markov chains. Using our technique, we simplify and extend previous results for sampling colorings and independent sets. Our approach uses properties of the stationary distribution to avoid worst-case configurations which arise in the traditional approach. As an application, we show that for $k/\Delta > 1.764$, the Glauber dynamics on $k$-colorings of a graph on $n$ vertices with maximum degree $\Delta$ converges in $O(n\log n)$ steps, assuming $\Delta = \Omega(\log n)$ and that the graph is triangle-free. Previously, girth $\ge 5$ was needed. As a second application, we give a polynomial-time algorithm for sampling weighted independent sets from the Gibbs distribution of the hard-core lattice gas model at fugacity $\lambda < (1-\epsilon)e/\Delta$, on a regular graph $G$ on $n$ vertices of degree $\Delta = \Omega(\log n)$ and girth $\ge 6$. The best known algorithm for general graphs currently assumes $\lambda < 2/(\Delta - 2)$.
    Comment: Published at http://dx.doi.org/10.1214/105051606000000330 in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org).
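
    The single-site Glauber dynamics referred to above has a very short description, and a minimal Python sketch may help fix ideas; the function and variable names here are ours, not the paper's. Each step picks a uniformly random vertex and recolors it with a uniformly random color not appearing among its neighbors; in the regime $k/\Delta > 1.764$ such a color always exists.

```python
import random

def glauber_coloring_step(graph, coloring, k):
    """One heat-bath Glauber update for k-colorings.

    graph:    dict mapping each vertex to a list of its neighbors (assumed format).
    coloring: dict mapping each vertex to its current color in range(k).
    """
    v = random.choice(list(graph))                     # uniformly random vertex
    neighbor_colors = {coloring[u] for u in graph[v]}  # colors blocked at v
    available = [c for c in range(k) if c not in neighbor_colors]
    if available:                                      # nonempty whenever k > max degree
        coloring[v] = random.choice(available)
    return coloring
```

    The abstract's result is that, under the stated conditions, $O(n\log n)$ such updates suffice to get close to the uniform distribution on proper $k$-colorings; the sketch only pins down the transition rule, not the analysis.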

    Fast Algorithms at Low Temperatures via Markov Chains

    For spin systems, such as the hard-core model on independent sets weighted by fugacity lambda > 0, efficient algorithms for the associated approximate counting/sampling problems typically apply in the high-temperature region, corresponding to low fugacity. Recent work of Jenssen, Keevash and Perkins (2019) yields an FPTAS for approximating the partition function (and an efficient sampling algorithm) on bounded-degree (bipartite) expander graphs for the hard-core model at sufficiently high fugacity, and also for the ferromagnetic Potts model at sufficiently low temperatures. Their method uses the cluster expansion to obtain a complex zero-free region for the partition function of a polymer model, and then approximates this partition function using the polynomial interpolation method of Barvinok. We present a simple discrete-time Markov chain for abstract polymer models, and give an elementary proof of rapid mixing of this new chain under sufficient decay of the polymer weights. Applying these general polymer results to the hard-core and ferromagnetic Potts models on bounded-degree (bipartite) expander graphs yields fast algorithms with running time O(n log n) for the Potts model and O(n^2 log n) for the hard-core model, in contrast to the typical running times of n^{O(log Delta)} for algorithms based on Barvinok's polynomial interpolation method on graphs of maximum degree Delta. In addition, our approach via the polymer model Markov chain is conceptually simpler, as it circumvents the zero-free analysis and the generalization to complex parameters. Finally, we combine our results for the hard-core and ferromagnetic Potts models with standard Markov chain comparison tools to obtain polynomial mixing time for the usual spin-system Glauber dynamics restricted to the even- and odd-dominant or "red"-dominant portions of the respective state spaces.
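
    To make the polymer dynamics concrete, here is a generic Metropolis-style add/remove chain on sets of pairwise-compatible polymers. This is a minimal sketch under our own naming (polymer_chain_step, weight, compatible) and acceptance rule, not necessarily the exact chain from the paper; its stationary distribution is proportional to the product of the weights of the polymers in the configuration.

```python
import random

def polymer_chain_step(polymers, weight, compatible, config):
    """One Metropolis add/remove step on pairwise-compatible polymer sets.

    polymers:   list of all polymers (hashable objects) -- assumed enumerable here.
    weight:     dict mapping each polymer gamma to its weight w(gamma) > 0.
    compatible: function (gamma, delta) -> bool, True if the two polymers may coexist.
    config:     current set of pairwise-compatible polymers (updated in place).
    """
    gamma = random.choice(polymers)
    if gamma in config:
        # Propose removing gamma; accept with probability min(1, 1/w(gamma)).
        if random.random() < min(1.0, 1.0 / weight[gamma]):
            config.remove(gamma)
    else:
        # Propose adding gamma, only if it is compatible with everything present;
        # accept with probability min(1, w(gamma)).
        if all(compatible(gamma, delta) for delta in config):
            if random.random() < min(1.0, weight[gamma]):
                config.add(gamma)
    return config
```

    Enumerating all polymers explicitly, as this toy version does, is only feasible for small instances; the paper's running-time bounds rest on rapid mixing under sufficient decay of the polymer weights together with an efficient implementation of each step.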

    Phase transition for the mixing time of the Glauber dynamics for coloring regular trees

    We prove that the mixing time of the Glauber dynamics for random $k$-colorings of the complete tree with branching factor $b$ undergoes a phase transition at $k = b(1+o_b(1))/\ln b$. Our main result shows nearly sharp bounds on the mixing time of the dynamics on the complete tree with $n$ vertices for $k = Cb/\ln b$ colors with constant $C$. For $C \ge 1$ we prove the mixing time is $O(n^{1+o_b(1)}\ln n)$. On the other side, for $C < 1$ the mixing time experiences a slowing down; in particular, we prove it is $O(n^{1/C+o_b(1)}\ln n)$ and $\Omega(n^{1/C-o_b(1)})$. The critical point $C = 1$ is interesting since it coincides (at least up to first order) with the so-called reconstruction threshold, which was recently established by Sly. The reconstruction threshold has been of considerable interest recently since it appears to have close connections to the efficiency of certain local algorithms, and this work was inspired by our attempt to understand these connections in this particular setting.
    Comment: Published at http://dx.doi.org/10.1214/11-AAP833 in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org).
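
    For concreteness, the sketch below builds the complete tree with a given branching factor and depth and runs the same single-site dynamics on $k$-colorings; the names and the greedy initialization are ours, and the code is only meant to make the dynamics on the tree explicit, not to reproduce the bounds above.

```python
import random

def complete_tree(branching, depth):
    """Adjacency lists of the complete tree with the given branching factor and depth."""
    adj = {0: []}
    frontier, next_id = [0], 1
    for _ in range(depth):
        new_frontier = []
        for parent in frontier:
            for _ in range(branching):
                adj[parent].append(next_id)
                adj[next_id] = [parent]
                new_frontier.append(next_id)
                next_id += 1
        frontier = new_frontier
    return adj

def run_glauber(adj, k, steps, seed=0):
    """Run single-site Glauber dynamics for k-colorings on the tree (assumes k >= 2)."""
    rng = random.Random(seed)
    vertices = list(adj)
    coloring = {}
    for v in vertices:  # greedy proper start: each vertex only sees its colored parent
        used = {coloring[u] for u in adj[v] if u in coloring}
        coloring[v] = min(c for c in range(k) if c not in used)
    for _ in range(steps):
        v = rng.choice(vertices)
        used = {coloring[u] for u in adj[v]}
        free = [c for c in range(k) if c not in used]
        if free:  # may be empty when k is at most the degree of v
            coloring[v] = rng.choice(free)
    return coloring
```

    A toy invocation such as run_glauber(complete_tree(5, 4), k=8, steps=100_000) keeps the state a proper coloring throughout, since no update ever assigns a color already used by a neighbor.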