
    Fastest mixing Markov chain on graphs with symmetries

    We show how to exploit symmetries of a graph to efficiently compute the fastest mixing Markov chain on the graph (i.e., find the transition probabilities on the edges to minimize the second-largest eigenvalue modulus of the transition probability matrix). Exploiting symmetry can lead to a significant reduction in both the number of variables and the size of matrices in the corresponding semidefinite program, thus enabling numerical solution of large-scale instances that are otherwise computationally infeasible. We obtain analytic or semi-analytic results for particular classes of graphs, such as edge-transitive and distance-transitive graphs. We describe two general approaches for symmetry exploitation, based on orbit theory and block-diagonalization, respectively. We also establish the connection between these two approaches. Comment: 39 pages, 15 figures
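
    A minimal sketch of the underlying optimization (not the authors' code): the plain, symmetry-unaware fastest-mixing-Markov-chain SDP in cvxpy for an example 5-cycle, using the standard fact that for a symmetric stochastic matrix the second-largest eigenvalue modulus equals the spectral norm of P - (1/n)11^T. The orbit-theory and block-diagonalization reductions described in the paper are omitted.

        import numpy as np
        import cvxpy as cp

        # Example graph: a 5-cycle; any undirected adjacency matrix works here.
        n = 5
        A = np.zeros((n, n))
        for i in range(n):
            A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

        P = cp.Variable((n, n), symmetric=True)   # symmetric => uniform equilibrium
        mask = 1 - A - np.eye(n)                  # 1 exactly where there is no edge/self-loop
        constraints = [
            P >= 0,                               # entrywise nonnegative
            cp.sum(P, axis=1) == 1,               # rows sum to one (stochastic)
            cp.multiply(P, mask) == 0,            # no probability mass off the graph
        ]
        # SLEM of a symmetric stochastic P equals the spectral norm of P - (1/n) 11^T
        J = np.ones((n, n)) / n
        prob = cp.Problem(cp.Minimize(cp.norm(P - J, 2)), constraints)
        prob.solve()
        print("optimal SLEM:", prob.value)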

    Improved mixing rates of directed cycles by added connection

    We investigate the mixing rate of a Markov chain in which a combination of long-distance edges and non-reversibility is introduced. As a first step, we focus on the following graphs: starting from the cycle graph, we select random nodes and add all edges connecting them. We prove a square-factor improvement of the mixing rate compared to the reversible version of the Markov chain.
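
    A small numerical probe, under assumptions the abstract does not fix (the exact non-reversible chain is not specified there): build the cycle-plus-random-clique graph from the abstract, then compare the second-largest eigenvalue modulus of a lazy reversible random walk with that of a lazy walk whose cycle edges are traversed only clockwise. The clockwise orientation is an illustrative choice, not the paper's construction.

        import numpy as np

        rng = np.random.default_rng(0)
        n, k = 60, 6
        chosen = set(rng.choice(n, size=k, replace=False).tolist())

        # Cycle graph plus all edges among the k chosen nodes
        A = np.zeros((n, n))
        for i in range(n):
            A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
        for i in chosen:
            for j in chosen:
                if i != j:
                    A[i, j] = 1

        def slem(P):
            """Second-largest eigenvalue modulus of a transition matrix."""
            return np.sort(np.abs(np.linalg.eigvals(P)))[-2]

        # (a) lazy reversible simple random walk on the augmented graph
        P_rev = 0.5 * np.eye(n) + 0.5 * A / A.sum(axis=1, keepdims=True)

        # (b) lazy non-reversible walk: cycle edges clockwise only, chords bidirectional
        D = np.zeros((n, n))
        for i in range(n):
            D[i, (i + 1) % n] = 1
            if i in chosen:
                for j in chosen:
                    if j != i:
                        D[i, j] = 1
        P_dir = 0.5 * np.eye(n) + 0.5 * D / D.sum(axis=1, keepdims=True)

        print("SLEM, reversible walk    :", slem(P_rev))
        print("SLEM, non-reversible walk:", slem(P_dir))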

    Accelerating Consensus by Spectral Clustering and Polynomial Filters

    It is known that polynomial filtering can accelerate the convergence towards average consensus on an undirected network. In this paper, the gain of second-order filtering is investigated. A set of graphs is determined for which consensus can be attained in finite time, and a preconditioner is proposed to adapt the undirected weights of any given graph to achieve fastest convergence with the polynomial filter. The corresponding cost function differs from the traditional spectral gap, as it favors grouping the eigenvalues in two clusters. A possible loss of robustness of the polynomial filter is also highlighted.
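
    A hedged sketch of the idea, assuming "second-order filtering" means applying a fixed degree-2 polynomial p(W) with p(1) = 1 in place of W at each iteration (the paper's preconditioner and exact cost function are not reproduced): for fixed Metropolis weights W on a small path graph, solve a linear program for the polynomial that minimizes the worst-case modulus |p(lambda_i)| over the non-unit eigenvalues, and compare it with the plain spectral contraction of W.

        import numpy as np
        import cvxpy as cp

        # Metropolis weights on a 6-node path graph (example choice)
        n = 6
        A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
        deg = A.sum(axis=1)
        W = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if A[i, j]:
                    W[i, j] = 1.0 / (1 + max(deg[i], deg[j]))
            W[i, i] = 1 - W[i].sum()

        lam = np.linalg.eigvalsh(W)      # ascending; lam[-1] is the consensus eigenvalue 1
        lam_rest = lam[:-1]

        a = cp.Variable(3)               # coefficients of p(s) = a0 + a1*s + a2*s^2
        t = cp.Variable()
        V = np.vander(lam_rest, 3, increasing=True)
        prob = cp.Problem(cp.Minimize(t),
                          [cp.abs(V @ a) <= t, cp.sum(a) == 1])  # p(1) = 1 preserves the average
        prob.solve()

        print("plain contraction per step :", np.max(np.abs(lam_rest)))
        print("degree-2 filter contraction:", t.value)

    The optimal quadratic has two roots, so its worst-case value is smallest when the non-unit eigenvalues sit near those roots in two groups, which matches the abstract's remark that the cost function favors grouping the eigenvalues in two clusters.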

    Lifted Probabilistic Inference: An MCMC Perspective

    The general consensus seems to be that lifted inference is concerned with exploiting model symmetries and grouping indistinguishable objects at inference time. Since first-order probabilistic formalisms are essentially template languages providing a more compact representation of a corresponding ground model, lifted inference tends to work especially well in these models. We show that the notion of indistinguishability manifests itself on several different levels: the level of constants, the level of ground atoms (variables), the level of formulas (features), and the level of assignments (possible worlds). We discuss existing work in the MCMC literature on exploiting symmetries on the level of variable assignments and relate it to novel results in lifted MCMC.
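
    A toy illustration (an assumption, not the paper's algorithms) of symmetry at the level of assignments: in a fully exchangeable model over n binary variables with weight w(x) = exp(theta * sum(x)), all assignments with the same count k are indistinguishable, so a lifted Metropolis chain can walk over the orbit representative k with stationary weight C(n, k) * exp(theta * k) instead of over the 2^n ground states.

        import math
        import random

        n, theta = 20, 0.3
        random.seed(1)

        def orbit_weight(k):
            # orbit size (number of indistinguishable ground assignments) times the shared weight
            return math.comb(n, k) * math.exp(theta * k)

        # Metropolis over k = 0..n with a symmetric +/-1 proposal
        k, samples = n // 2, []
        for _ in range(20000):
            kp = k + random.choice((-1, 1))
            if 0 <= kp <= n and random.random() < min(1.0, orbit_weight(kp) / orbit_weight(k)):
                k = kp
            samples.append(k)

        # Check the estimate of E[sum(x)] against the exact value
        Z = sum(orbit_weight(j) for j in range(n + 1))
        exact = sum(j * orbit_weight(j) for j in range(n + 1)) / Z
        print("lifted MCMC estimate:", sum(samples) / len(samples))
        print("exact E[sum(x)]     :", exact)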