
    Decay of Correlations for the Hardcore Model on the d-regular Random Graph

    A key insight from statistical physics about spin systems on random graphs is the central role played by Gibbs measures on trees. We determine the local weak limit of the hardcore model on random regular graphs asymptotically up to just below its condensation threshold, showing that it converges in probability locally, in a strong sense, to the free boundary condition Gibbs measure on the tree. As a consequence we show that the reconstruction threshold on the random graph, indicative of the onset of point-to-set spatial correlations, is equal to the reconstruction threshold on the d-regular tree, for which we determine precise asymptotics. We expect that our methods will generalize to a wide range of spin systems for which the second moment method holds. Comment: 39 pages, 5 figures. arXiv admin note: text overlap with arXiv:1004.353
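    A minimal numeric sketch (our own illustration, not the paper's method): the free boundary condition Gibbs measure of the hardcore model on the d-regular tree is governed by the ratio R of occupied to unoccupied root probabilities in a (d-1)-ary subtree, which satisfies the standard recursion R = lam / (1 + R)^(d-1), where lam is the fugacity. Damped fixed-point iteration finds R:

    ```python
    # Sketch, assuming the standard hardcore tree recursion:
    #   R = lam / (1 + R)**(d - 1)
    # where lam is the fugacity and R the occupied/unoccupied ratio at the
    # root of a (d-1)-ary subtree. Damping keeps the iteration from
    # oscillating at larger fugacities.
    def hardcore_root_ratio(lam, d, iters=500):
        R = lam
        for _ in range(iters):
            R = 0.5 * R + 0.5 * lam / (1.0 + R) ** (d - 1)
        return R
    ```

    For example, `hardcore_root_ratio(1.0, 3)` returns the unique positive root of R(1 + R)^2 = 1.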

    Fast Recognition of Partial Star Products and Quasi Cartesian Products

    This paper is concerned with the fast computation of a relation R on the edge set of connected graphs that plays a decisive role in the recognition of approximate Cartesian products, the weak reconstruction of Cartesian products, and the recognition of Cartesian graph bundles with a triangle-free basis. A special case of R is the relation δ*, whose convex closure yields the product relation σ that induces the prime factor decomposition of connected graphs with respect to the Cartesian product. For the construction of R, so-called Partial Star Products are of particular interest. Several special data structures are used that allow computing Partial Star Products in constant time. These computations are tuned to the recognition of approximate graph products, but also lead to a linear-time algorithm for the computation of δ* for graphs with bounded maximum degree. Furthermore, we define quasi Cartesian products as graphs with non-trivial δ*. We provide several examples, and show that quasi Cartesian products can be recognized in linear time for graphs with bounded maximum degree. Finally, we note that quasi products can be recognized in sublinear time with a parallelized algorithm.
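    For readers unfamiliar with the object being factored, here is an illustrative construction of the Cartesian product G1 □ G2 itself (all identifiers are our own, not from the paper). Vertices are pairs (u, v), encoded as u * n2 + v:

    ```python
    # Illustrative sketch of the Cartesian product G1 □ G2, the operation
    # whose prime factor decomposition the relations R, delta*, and sigma
    # concern. Graphs are given as edge lists plus vertex counts.
    def cartesian_product(edges1, n1, edges2, n2):
        edges = []
        for (u, w) in edges1:               # a G1-edge in every G2-fiber
            for v in range(n2):
                edges.append((u * n2 + v, w * n2 + v))
        for (v, x) in edges2:               # a G2-edge in every G1-fiber
            for u in range(n1):
                edges.append((u * n2 + v, u * n2 + x))
        return edges
    ```

    For instance, K2 □ K2 yields the 4-cycle, the smallest non-trivial Cartesian product.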

    Broadcasting with Random Matrices

    Motivated by the theory of spin-glasses in physics, we study the so-called reconstruction problem for the related distributions on the tree and on the sparse random graph G(n, d/n). Both cases reduce naturally to studying broadcasting models on the tree, where each edge has its own broadcasting matrix, and this matrix is drawn independently from a predefined distribution. In this context, we study the effect of the configuration at the root on the configuration of the vertices at distance h, as h → ∞. We establish the reconstruction threshold for the cases where the broadcasting matrices give rise to symmetric, 2-spin Gibbs distributions. This threshold seems to be a natural extension of the well-known Kesten-Stigum bound which arises in the classic version of the reconstruction problem. Our results imply, as a special case, the reconstruction threshold for the well-known Edwards-Anderson model of spin-glasses on the tree. Also, we extend our analysis to the setting of the Galton-Watson tree and the random graph G(n, d/n), where we establish the corresponding thresholds. Interestingly, for the Edwards-Anderson model on the random graph, we show that the replica symmetry breaking phase transition, established in [Guerra and Toninelli:2004], coincides with the reconstruction threshold. Compared to the classical Gibbs distributions, the spin-glasses have many unique features. In that respect, their study calls for new ideas, e.g., we introduce novel estimators for the reconstruction problem. Furthermore, note that the main technical challenge in the analysis is the presence of (too) many levels of randomness. We manage to circumvent this problem by utilising recently proposed tools coming from the analysis of Markov chains.
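    The broadcasting model described above can be sketched as a toy simulation (our own illustration, with assumed names and interface): each edge of a d-ary tree draws its own symmetric 2-spin channel, here parametrised by an i.i.d. flip probability per edge.

    ```python
    import random

    # Toy sketch of broadcasting on a tree with per-edge random channels:
    # each edge independently samples a flip probability eta from
    # eta_sampler, then the child copies the parent spin with probability
    # 1 - eta and flips it with probability eta. Interface is our assumption.
    def broadcast_leaves(depth, arity, eta_sampler, rng=None):
        rng = rng or random.Random()
        spins = [1]                       # spin at the root
        for _ in range(depth):
            nxt = []
            for s in spins:
                for _ in range(arity):
                    eta = eta_sampler(rng)        # fresh channel per edge
                    nxt.append(-s if rng.random() < eta else s)
            spins = nxt
        return spins
    ```

    For a fixed flip probability eta, the classical Kesten-Stigum bound says reconstruction is possible when arity * (1 - 2*eta)^2 > 1; the threshold in the abstract extends this picture to randomly drawn channels.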

    Unified 2D and 3D Pre-Training of Molecular Representations

    Molecular representation learning has attracted much attention recently. A molecule can be viewed as a 2D graph with nodes/atoms connected by edges/bonds, and can also be represented by a 3D conformation with 3-dimensional coordinates of all atoms. We note that most previous work handles 2D and 3D information separately, while jointly leveraging these two sources may foster a more informative representation. In this work, we explore this appealing idea and propose a new representation learning method based on unified 2D and 3D pre-training. Atom coordinates and interatomic distances are encoded and then fused with atomic representations through graph neural networks. The model is pre-trained on three tasks: reconstruction of masked atoms and coordinates, 3D conformation generation conditioned on the 2D graph, and 2D graph generation conditioned on the 3D conformation. We evaluate our method on 11 downstream molecular property prediction tasks: 7 with 2D information only and 4 with both 2D and 3D information. Our method achieves state-of-the-art results on 10 tasks, and the average improvement on 2D-only tasks is 8.3%. Our method also achieves significant improvement on two 3D conformation generation tasks. Comment: KDD-202
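    The masked-atom pretraining objective mentioned above can be sketched as follows (a minimal illustration; the mask token and masking ratio are our assumptions, not the paper's configuration): hide a fraction of atom types and keep the originals as reconstruction targets.

    ```python
    import random

    # Minimal sketch of a masked-atom objective: replace a random subset
    # of atom types with a MASK token and record the hidden originals,
    # which a model would then be trained to predict back.
    MASK = -1

    def mask_atoms(atom_types, ratio=0.15, rng=None):
        rng = rng or random.Random()
        masked, targets = list(atom_types), {}
        for i in range(len(masked)):
            if rng.random() < ratio:
                targets[i] = masked[i]
                masked[i] = MASK
        return masked, targets
    ```

    A model trained on this objective receives `masked` as input and is scored on how well it recovers `targets`.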