13 research outputs found

    On the switch Markov chain for perfect matchings

    We study a simple Markov chain, the switch chain, on the set of all perfect matchings in a bipartite graph. This Markov chain was proposed by Diaconis, Graham and Holmes as a possible approach to a sampling problem arising in Statistics. We ask: for which hereditary classes of graphs is the Markov chain ergodic, and for which is it rapidly mixing? We provide a precise answer to the ergodicity question and close bounds on the mixing question. We show for the first time that the mixing time of the switch chain is polynomial in the case of monotone graphs, a class that includes examples of interest in the statistical setting.
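    The abstract does not spell out the transition rule, but in the literature the switch move on a perfect matching of a bipartite graph exchanges the partners of two matched pairs. The following is a minimal Python sketch under that reading; the function and variable names (and the lazy self-loop probability) are illustrative choices, not taken from the paper.

```python
import random

def switch_chain_step(graph_edges, matching):
    """One lazy step of the switch chain on perfect matchings of a bipartite graph.

    graph_edges : set of frozenset({u, w}) edges of the bipartite graph
    matching    : list of (u, w) pairs forming a perfect matching
    Returns the next state, which may equal the current one.
    """
    # Lazy chain: stay put with probability 1/2 (and if there are < 2 matched edges).
    if len(matching) < 2 or random.random() < 0.5:
        return matching
    i, j = random.sample(range(len(matching)), 2)   # two matched edges, chosen u.a.r.
    (u1, w1), (u2, w2) = matching[i], matching[j]
    # Perform the switch only if both "crossed" edges are present in the graph.
    if frozenset((u1, w2)) in graph_edges and frozenset((u2, w1)) in graph_edges:
        new_matching = list(matching)
        new_matching[i], new_matching[j] = (u1, w2), (u2, w1)
        return new_matching
    return matching
```

    Since the move is symmetric, repeating this step gives a chain whose stationary distribution is uniform over the perfect matchings it can reach; whether it reaches them all (ergodicity) and how quickly it mixes are exactly the questions the paper studies.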

    Enumeration of maximum matchings of graphs

    Counting maximum matchings in a graph is of great interest in statistical mechanics, solid-state chemistry, theoretical computer science, and mathematics, among other disciplines. However, explicitly determining the number of maximum matchings of a general graph is a challenging problem. In this paper, using the Gallai-Edmonds structure theorem, we derive a formula for the number of maximum matchings in a graph. From this formula, we obtain an algorithm to enumerate the maximum matchings of a graph. In particular, the formula implies that computing the number of maximum matchings of a graph reduces to computing the number of perfect matchings of certain induced subgraphs of the graph. As an application, we calculate the number of maximum matchings of opt trees. The result extends a conclusion obtained by Heuberger and Wagner [C. Heuberger, S. Wagner, The number of maximum matchings in a tree, Discrete Math. 311 (2011) 2512--2542].
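    The abstract does not reproduce the Gallai-Edmonds-based formula, so as a point of reference here is a hedged brute-force sketch of the quantity being counted; it is usable only on very small graphs and is not the paper's method.

```python
from itertools import combinations

def count_maximum_matchings(edges):
    """Count matchings of maximum cardinality in a small simple graph.

    edges : list of 2-tuples (u, v).  Exhaustive search, exponential time;
    this illustrates the quantity itself, not the Gallai-Edmonds formula.
    """
    def is_matching(subset):
        seen = set()
        for u, v in subset:
            if u in seen or v in seen:
                return False
            seen.update((u, v))
        return True

    size, count = 0, 1                   # the empty matching
    for k in range(1, len(edges) + 1):
        found = sum(1 for sub in combinations(edges, k) if is_matching(sub))
        if found == 0:                   # no matching of size k, hence none larger
            break
        size, count = k, found
    return size, count
```

    For example, count_maximum_matchings([(1, 2), (2, 3), (3, 4)]) returns (2, 1): the path on four vertices has a single maximum matching, namely its two end edges.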

    Counting Perfect Matchings and the Switch Chain

    We examine the problem of exactly or approximately counting all perfect matchings in hereditary classes of nonbipartite graphs. In particular, we consider the switch Markov chain of Diaconis, Graham, and Holmes. We determine the largest hereditary class for which the chain is ergodic, and define a large new hereditary class of graphs for which it is rapidly mixing. We go on to show that the chain has exponential mixing time for a slightly larger class. We also examine the question of ergodicity of the switch chain in an arbitrary graph. Finally, we give exact counting algorithms for three classes.

    Rapid mixing of the switch Markov chain for strongly stable degree sequences

    The switch Markov chain has been extensively studied as the most natural Markov chain Monte Carlo approach for sampling graphs with prescribed degree sequences. We show that the switch chain for sampling simple undirected graphs with a given degree sequence is rapidly mixing when the degree sequence is so-called strongly stable. Strong stability is satisfied by all degree sequences for which the switch chain was known to be rapidly mixing based on Sinclair's multicommodity flow method up until a recent manuscript of Erdős and coworkers in 2019. Our approach relies on an embedding argument, involving a Markov chain defined by Jerrum and Sinclair in 1990. This results in a much shorter proof that unifies (almost) all the rapid mixing results for the switch chain in the literature, and extends them up to sharp characterizations of P-stable degree sequences. In particular, our work resolves an open problem posed by Greenhill and Sfragara in 2017.
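    In the degree sequence setting the switch move is the familiar double edge swap, which preserves every vertex degree. A short Python sketch, assuming that reading (the names are illustrative, not taken from the paper):

```python
import random

def switch_step(edges):
    """One step of the switch (double edge swap) chain on simple graphs.

    edges : set of frozenset({u, v}).  The move replaces {a, b}, {c, d} by
    {a, c}, {b, d} or {a, d}, {b, c}; if the result would not be a simple
    graph, the chain stays put, so the degree sequence never changes.
    """
    e1, e2 = random.sample(list(edges), 2)     # two distinct edges, chosen u.a.r.
    a, b = tuple(e1)
    c, d = tuple(e2)
    if len({a, b, c, d}) < 4:                  # shared endpoint: no valid swap
        return edges
    if random.random() < 0.5:                  # pick one of the two rewirings u.a.r.
        f1, f2 = frozenset((a, c)), frozenset((b, d))
    else:
        f1, f2 = frozenset((a, d)), frozenset((b, c))
    if f1 in edges or f2 in edges:             # would create a parallel edge
        return edges
    return (edges - {e1, e2}) | {f1, f2}
```

    Every graph reachable by such swaps has the prescribed degree sequence; rapid mixing, which the paper establishes under strong stability, means the chain gets close to the uniform distribution on those graphs in polynomially many steps.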

    Zeros of Holant problems: locations and algorithms

    We present fully polynomial-time (deterministic or randomised) approximation schemes for Holant problems defined by a non-negative constraint function satisfying a generalised second order recurrence, modulo a couple of exceptional cases. As a consequence, any non-negative Holant problem on cubic graphs has an efficient approximation algorithm unless the problem is equivalent to approximately counting perfect matchings, a central open problem in the area. This is in sharp contrast to the computational phase transition shown by 2-state spin systems on cubic graphs. Our main technique is the recently established connection between zeros of graph polynomials and approximate counting. We also use the "winding" technique to deduce the second result on cubic graphs.

    Rapid mixing of the switch Markov chain for strongly stable degree sequences and 2-class joint degree matrices

    The switch Markov chain has been extensively studied as the most natural Markov chain Monte Carlo approach for sampling graphs with prescribed degree sequences. We use comparison arguments with other, less natural but simpler to analyze, Markov chains to show that the switch chain mixes rapidly in two different settings. We first study the classic problem of uniformly sampling simple undirected, as well as bipartite, graphs with a given degree sequence. We apply an embedding argument, involving a Markov chain defined by Jerrum and Sinclair (TCS, 1990) for sampling graphs that almost have a given degree sequence, to show rapid mixing for degree sequences satisfying strong stability, a notion closely related to P-stability. This results in a much shorter proof that unifies the currently known rapid mixing results for the switch chain and extends them up to sharp characterizations of P-stability. In particular, our work resolves an open problem posed by Greenhill (SODA, 2015). Secondly, in order to illustrate the power of our approach, we study the problem of uniformly sampling graphs for which, in addition to the degree sequence, a joint degree distribution is given. Although the problem was formalized over a decade ago, and despite its practical significance in generating synthetic network topologies, little progress has been made on the random sampling of such graphs. The case of a single degree class reduces to sampling regular graphs, but beyond this almost nothing is known. We fully resolve the case of two degree classes by showing that the switch Markov chain is always rapidly mixing. Again, we first analyze an auxiliary chain for strongly stable instances on an augmented state space and then use an embedding argument.