27 research outputs found

    Upper bounds for number of removed edges in the Erased Configuration Model

    Full text link
    Models for generating simple graphs are important in the study of real-world complex networks. A well-established example of such a model is the erased configuration model, where each node receives a number of half-edges that are connected to half-edges of other nodes at random, after which self-loops are removed and multiple edges are merged to make the graph simple. Although asymptotic results for many properties of this model, such as the limiting degree distribution, are known, the exact speed of convergence in terms of the graph size remains an open question. We provide a first answer by analyzing the size dependence of the average number of removed edges in the erased configuration model. By combining known upper bounds with a Tauberian theorem, we obtain upper bounds for the number of removed edges in terms of the size of the graph. Remarkably, when the degree distribution follows a power law, we observe three scaling regimes, depending on the power-law exponent. Our results provide a strong theoretical basis for evaluating finite-size effects in networks.
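    As an illustration of the model being analyzed, the sketch below (Python; not code from the paper, function and parameter names are ours) pairs half-edges uniformly at random and then erases self-loops and merges multiple edges, returning the simple graph together with the number of removed edges, i.e. the quantity whose size dependence the abstract bounds.

        import random
        from collections import Counter

        def erased_configuration_model(degrees, rng=random):
            # Pair half-edges (stubs) uniformly at random, then make the graph
            # simple by erasing self-loops and merging multiple edges.
            stubs = [v for v, d in enumerate(degrees) for _ in range(d)]
            if len(stubs) % 2:
                raise ValueError("the degree sum must be even")
            rng.shuffle(stubs)
            multi_edges = Counter()
            for i in range(0, len(stubs), 2):
                u, v = stubs[i], stubs[i + 1]
                multi_edges[(min(u, v), max(u, v))] += 1
            simple_edges = {e for e in multi_edges if e[0] != e[1]}
            removed = sum(multi_edges.values()) - len(simple_edges)
            return simple_edges, removed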

    Spectral density of random graphs with topological constraints

    Full text link
    The spectral density of random graphs with topological constraints is analysed using the replica method. We consider graph ensembles featuring generalised degree-degree correlations, as well as those with a community structure. In each case an exact solution is found for the spectral density in the form of consistency equations depending on the statistical properties of the graph ensemble in question. We highlight the effect of these topological constraints on the resulting spectral density. Comment: 24 pages, 6 figures.
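    The replica calculation itself is analytical, but its output, the ensemble-averaged spectral density, can be checked numerically. The sketch below (Python with NumPy; sampler name and parameters are illustrative, not from the paper) estimates the spectral density of any user-supplied graph ensemble by direct diagonalisation of sampled adjacency matrices.

        import numpy as np

        def empirical_spectral_density(sample_adjacency, n_samples=100, bins=60):
            # Brute-force Monte Carlo estimate of the ensemble-averaged spectral
            # density: diagonalise sampled adjacency matrices and histogram the
            # eigenvalues. Useful as a check on a consistency-equation solution.
            eigenvalues = []
            for _ in range(n_samples):
                eigenvalues.extend(np.linalg.eigvalsh(sample_adjacency()))
            density, bin_edges = np.histogram(eigenvalues, bins=bins, density=True)
            return 0.5 * (bin_edges[:-1] + bin_edges[1:]), density

        # Example ensemble: sparse ErdƑs-RĂ©nyi graphs with mean degree 4.
        rng = np.random.default_rng(0)
        def sample_er(n=200, c=4.0):
            upper = np.triu(rng.random((n, n)) < c / n, k=1).astype(float)
            return upper + upper.T

        lambdas, rho = empirical_spectral_density(sample_er)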

    Construction and Random Generation of Hypergraphs with Prescribed Degree and Dimension Sequences

    Full text link
    We propose algorithms for construction and random generation of hypergraphs without loops and with prescribed degree and dimension sequences. The objective is to provide a starting point for, as well as an alternative to, Markov chain Monte Carlo approaches. Our algorithms leverage the transposition, to hypergraphs, of properties and algorithms devised for zero-one matrices with prescribed row and column sums. The construction algorithm extends the applicability of Markov chain Monte Carlo approaches when the initial hypergraph is not provided. The random generation algorithm allows the development of a self-normalised importance sampling estimator for hypergraph properties such as the average clustering coefficient. We prove the correctness of the proposed algorithms. We also prove that the random generation algorithm generates any hypergraph following the prescribed degree and dimension sequences with a non-zero probability. We empirically and comparatively evaluate the effectiveness and efficiency of the random generation algorithm. Experiments show that the random generation algorithm provides stable and accurate estimates of the average clustering coefficient and achieves a better effective sample size than the Markov chain Monte Carlo approaches. Comment: 21 pages, 3 figures.
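    To make the incidence-matrix viewpoint concrete, the following sketch (Python; not the authors' algorithm) fills a vertex-by-hyperedge incidence matrix column by column in the greedy, Gale-Ryser style the abstract alludes to: each hyperedge of prescribed dimension is placed on the vertices with the largest remaining degrees, yielding one loopless hypergraph rather than a random sample.

        def greedy_hypergraph(degrees, dimensions):
            # Deterministic greedy construction of a loopless hypergraph with the
            # prescribed vertex degrees and hyperedge dimensions, via column-by-
            # column filling of the incidence matrix.
            if sum(degrees) != sum(dimensions):
                raise ValueError("degree and dimension sequences must have equal sums")
            remaining = list(degrees)
            hyperedges = []
            for d in sorted(dimensions, reverse=True):
                if not 1 <= d <= len(remaining):
                    raise ValueError("hyperedge dimension out of range")
                # Place the hyperedge on the d vertices that still need the most incidences.
                chosen = sorted(range(len(remaining)), key=lambda v: -remaining[v])[:d]
                if remaining[chosen[-1]] == 0:
                    raise ValueError("greedy filling failed for these sequences")
                for v in chosen:
                    remaining[v] -= 1
                hyperedges.append(tuple(sorted(chosen)))
            return hyperedges

    For example, greedy_hypergraph([2, 2, 1, 1], [3, 2, 1]) returns [(0, 1, 2), (0, 1), (3,)], matching both the degree and the dimension sequence.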

    Permutation glass

    No full text

    Exact and Efficient Generation of Geometric Random Variates and Random Graphs

    No full text
    The standard algorithm for fast generation of ErdƑs-RĂ©nyi random graphs only works in the Real RAM model. The critical point is the generation of geometric random variates Geo(p), for which there is no algorithm that is both exact and efficient in any bounded precision machine model. For a RAM model with word size w = Ω(log log(1/p)), we show that this is possible and present an exact algorithm for sampling Geo(p) in optimal expected time O(1 + log(1/p)/w). We also give an exact algorithm for sampling min{n, Geo(p)} in optimal expected time O(1 + log(min{1/p, n})/w). This yields a new exact algorithm for sampling ErdƑs-RĂ©nyi and Chung-Lu random graphs of n vertices and m (expected) edges in optimal expected runtime O(n + m) on a RAM with word size w = Θ(log n).
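    For reference, the "standard algorithm" the abstract contrasts itself with is the geometric-skip generator of Batagelj and Brandes: it jumps over non-edges with Geo(p)-distributed skip lengths, but relies on floating-point logarithms and is therefore exact only in the Real RAM model. A minimal Python sketch (function and parameter names are ours):

        import math
        import random

        def gnp_geometric_skips(n, p, rng=random):
            # Batagelj-Brandes style G(n, p) generator: skip over non-edges using
            # geometrically distributed jump lengths, expected time O(n + m).
            # Uses floating-point log, hence "Real RAM" rather than exact word RAM.
            if p <= 0.0:
                return []
            if p >= 1.0:
                return [(i, j) for j in range(n) for i in range(j)]
            edges = []
            log_q = math.log1p(-p)
            v, w = 1, -1
            while v < n:
                # Geo(p)-distributed number of potential edges to pass over.
                w += 1 + int(math.log(1.0 - rng.random()) / log_q)
                while w >= v and v < n:
                    w -= v
                    v += 1
                if v < n:
                    edges.append((w, v))
            return edges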