    Traversals of Infinite Graphs with Random Local Orientations

    We introduce the notion of a "random basic walk" on an infinite graph, give numerous examples, list potential applications, and provide detailed comparisons between the random basic walk and existing generalizations of simple random walks. We define analogues, in the setting of random basic walks, of the notions of recurrence and transience from the theory of simple random walks, and we study which graphs have a cycling random basic walk and which a transient one. We prove that cycles of arbitrary length are possible in any regular graph, but that they are unlikely. We give upper bounds on the expected number of vertices a random basic walk will visit on the infinite graphs studied and on their finite analogues of sufficiently large size. We then study random basic walks on complete graphs, and prove that random basic walks on the class of complete graphs asymptotically visit a constant fraction of the nodes. We end with numerous conjectures and problems for future study, as well as ideas for how to approach them.
    Comment: This is my master's thesis from Wesleyan University. My advisor and I are currently selecting a journal to which we will submit a shorter version. We plan to split this work into two papers: one for the case of infinite graphs and one for the finite case (which is not fully treated here).
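    The abstract does not spell out the walk's definition, but one plausible reading of "random local orientations" is that each vertex independently fixes a uniformly random cyclic ordering of its incident edges, and a walker entering a vertex along one edge leaves along that edge's successor in the ordering. A minimal sketch under that assumption (the function name and toy graph are illustrative, not from the thesis):

    ```python
    import random

    def random_basic_walk(adj, start, steps, seed=0):
        """Hypothetical random basic walk: each vertex fixes a uniformly
        random cyclic ordering of its neighbours; a walker arriving at v
        along edge (prev, v) departs along that edge's successor in v's
        ordering. One plausible reading of the abstract, not the thesis's
        exact definition."""
        rng = random.Random(seed)
        # Random local orientation: one random cyclic order per vertex.
        rotation = {v: rng.sample(list(nbrs), len(nbrs)) for v, nbrs in adj.items()}
        visited = {start}
        prev, cur = None, start
        for _ in range(steps):
            order = rotation[cur]
            if prev is None:
                nxt = order[0]                      # arbitrary first departure
            else:
                i = order.index(prev)               # edge we arrived along
                nxt = order[(i + 1) % len(order)]   # its successor in the rotation
            prev, cur = cur, nxt
            visited.add(cur)
        return visited

    # Toy finite example: a 6-cycle.
    adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
    seen = random_basic_walk(adj, start=0, steps=20)
    ```

    Tracking `visited` mirrors the quantity the thesis bounds: the expected number of distinct vertices the walk reaches.
    
    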

    Combinatorial Structures in Hypercubes


    Rapid Mixing of Gibbs Sampling on Graphs that are Sparse on Average

    In this work we show that for every d < \infty and the Ising model defined on G(n, d/n), there exists a \beta_d > 0 such that for all \beta < \beta_d, with probability going to 1 as n \to \infty, the mixing time of the dynamics on G(n, d/n) is polynomial in n. Our results are the first polynomial-time mixing results proven for a natural model on G(n, d/n) for d > 1 where the parameters of the model do not depend on n. They also provide a rare example where one can prove polynomial-time mixing of a Gibbs sampler in a situation where the actual mixing time is slower than n \cdot \mathrm{polylog}(n). Our proof exploits in novel ways the local tree-like structure of Erd\H{o}s-R\'enyi random graphs, comparison and block-dynamics arguments, and a recent result of Weitz. Our results extend to much more general families of graphs which are sparse in some average sense, and to much more general interactions. In particular, they apply to any graph for which every vertex v has a neighborhood N(v) of radius O(\log n) in which the induced subgraph is a tree plus at most O(\log n) additional edges, and where for each simple path in N(v) the sum of the vertex degrees along the path is O(\log n). Moreover, our results also apply in the case of arbitrary external fields, and provide the first FPRAS for sampling the Ising distribution in this case. We finally present a non-Markov-chain algorithm for sampling the distribution which is effective for a wider range of parameters. In particular, for G(n, d/n) it applies for all external fields and \beta < \beta_d, where d \tanh(\beta_d) = 1 is the critical point for decay of correlations for the Ising model on G(n, d/n).
    Comment: Corrected proof of Lemma 2.
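    The dynamics whose mixing time the paper analyzes is single-site Gibbs sampling (Glauber/heat-bath dynamics): repeatedly pick a uniformly random vertex and resample its spin from its conditional distribution given its neighbours. A minimal self-contained sketch on a sampled G(n, d/n) (parameter choices here are illustrative, not from the paper):

    ```python
    import math
    import random

    def erdos_renyi(n, p, rng):
        """Sample G(n, p) as an adjacency list."""
        adj = {v: [] for v in range(n)}
        for u in range(n):
            for v in range(u + 1, n):
                if rng.random() < p:
                    adj[u].append(v)
                    adj[v].append(u)
        return adj

    def glauber_ising(adj, beta, sweeps, rng, h=0.0):
        """Heat-bath Glauber dynamics for the Ising model: each step
        resamples one uniformly random spin from its conditional law
        P(spin_v = +1 | neighbours) = 1 / (1 + exp(-2*beta*S)),
        where S is the local field at v. A sketch of the kind of chain
        the paper studies, not the paper's own code."""
        n = len(adj)
        spin = {v: rng.choice([-1, 1]) for v in range(n)}
        for _ in range(sweeps * n):
            v = rng.randrange(n)
            s = sum(spin[u] for u in adj[v]) + h   # local field at v
            p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
            spin[v] = 1 if rng.random() < p_plus else -1
        return spin

    rng = random.Random(42)
    n, d = 100, 3.0
    adj = erdos_renyi(n, d / n, rng)
    # Pick beta safely below the threshold d * tanh(beta_d) = 1.
    beta = 0.5 * math.atanh(1.0 / d)
    spins = glauber_ising(adj, beta, sweeps=10, rng=rng)
    ```

    The `beta` choice reflects the abstract's critical point d tanh(beta_d) = 1; the number of sweeps needed for actual mixing is exactly what the paper's polynomial bound concerns.
    
    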

    Solving Multiple Inference in Graphical Models

    For inference problems in graphical models, much effort has been directed at algorithms for obtaining a single optimal prediction. In practice, the data is often noisy or incomplete, which makes a single optimal solution unreliable. To address this problem, multiple inference is proposed to find the several best solutions (M-Best), where multiple hypotheses are preferred for advanced reasoning. Oracle accuracy is commonly used as the evaluation criterion, under the expectation that at least one of the solutions agrees closely with the ground truth. It has been shown to be beneficial for the top solutions to be diverse. Approaches for solving diverse multiple inference have been proposed, such as Diverse M-Best and M-Modes; they rely on hyper-parameters to enforce diversity. Subsequent work improves the efficiency of solving difficult M-Modes problems by using heuristic search on tree decompositions. The most recent approach, Min-Loss M-Best, introduces a parameter-free method that directly minimizes the expected loss to find the set of top solutions simultaneously.
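    The Diverse M-Best idea mentioned above can be sketched on a toy model: greedily pick the i-th solution to maximize its score plus a hyper-parameter `lam` times its minimum Hamming distance to the solutions already chosen. Brute-force enumeration stands in for real MAP inference here; the scoring function and names are illustrative, not from the cited works:

    ```python
    import itertools

    def diverse_m_best(score, states, m, lam):
        """Greedy Diverse-M-Best-style selection by enumeration: the i-th
        solution maximizes score(y) + lam * min Hamming distance to the
        already-chosen solutions. `lam` is the diversity hyper-parameter
        the abstract refers to. Toy sketch, not the authors' algorithm."""
        chosen = []
        for _ in range(m):
            def objective(y):
                if not chosen:
                    return score(y)
                div = min(sum(a != b for a, b in zip(y, c)) for c in chosen)
                return score(y) + lam * div
            best = max((y for y in states if y not in chosen), key=objective)
            chosen.append(best)
        return chosen

    # Toy chain model on 4 binary variables: unary preference for 1s,
    # pairwise preference for neighbouring agreement.
    def score(y):
        unary = sum(0.6 if yi == 1 else 0.4 for yi in y)
        pairwise = sum(0.5 for a, b in zip(y, y[1:]) if a == b)
        return unary + pairwise

    states = list(itertools.product([0, 1], repeat=4))
    sols = diverse_m_best(score, states, m=3, lam=0.3)
    ```

    With `lam = 0` this degenerates to plain M-Best (the m highest-scoring, possibly near-duplicate, assignments), which is exactly the failure mode the diversity term is meant to avoid.
    
    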