
    The probability that a random multigraph is simple

    Consider a random multigraph G* with given vertex degrees d_1,...,d_n, constructed by the configuration model. We show that, asymptotically for a sequence of such multigraphs with the number of edges (d_1+...+d_n)/2 tending to infinity, the probability that the multigraph is simple stays away from 0 if and only if \sum d_i^2=O(\sum d_i). This was previously known only under extra assumptions on the maximum degree. We also give an asymptotic formula for this probability, extending previous results by several authors. Comment: 24 pages.
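    As a purely illustrative sketch (not the paper's construction or proof), the snippet below builds a multigraph by pairing half-edges uniformly at random, as in the configuration model described above, and estimates the probability that the result is simple by repeated sampling; the helper names configuration_model and is_simple and the 3-regular example sequence are our own choices.

```python
import random

def configuration_model(degrees, seed=None):
    """Pair up half-edges (stubs) uniformly at random; returns a multigraph
    as a list of edges, possibly containing loops and multiple edges."""
    assert sum(degrees) % 2 == 0, "total degree must be even"
    rng = random.Random(seed)
    stubs = [v for v, d in enumerate(degrees) for _ in range(d)]
    rng.shuffle(stubs)
    return [(stubs[i], stubs[i + 1]) for i in range(0, len(stubs), 2)]

def is_simple(edges):
    """A multigraph is simple if it has no loops and no repeated edges."""
    seen = set()
    for u, v in edges:
        if u == v:
            return False
        key = (min(u, v), max(u, v))
        if key in seen:
            return False
        seen.add(key)
    return True

# Estimate P(simple) for a fixed degree sequence by repeated sampling.
degrees = [3] * 100                      # 3-regular sequence on 100 vertices
trials = 2000
hits = sum(is_simple(configuration_model(degrees, seed=t)) for t in range(trials))
print("estimated P(simple):", hits / trials)
```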

    Mixing times of random walks on dynamic configuration models

    The mixing time of a random walk, with or without backtracking, on a random graph generated according to the configuration model on n vertices, is known to be of order \log n. In this paper we investigate what happens when the random graph becomes {\em dynamic}, namely, at each unit of time a fraction \alpha_n of the edges is randomly rewired. Under mild conditions on the degree sequence, guaranteeing that the graph is locally tree-like, we show that for every \varepsilon\in(0,1) the \varepsilon-mixing time of random walk without backtracking grows like \sqrt{2\log(1/\varepsilon)/\log(1/(1-\alpha_n))} as n \to \infty, provided that \lim_{n\to\infty} \alpha_n(\log n)^2=\infty. The latter condition corresponds to a regime of fast enough graph dynamics. Our proof is based on a randomised stopping time argument, in combination with coupling techniques and combinatorial estimates. The stopping time of interest is the first time that the walk moves along an edge that was rewired before, which turns out to be close to a strong stationary time. Comment: 23 pages, 6 figures. Previous version contained a mistake in one of the proofs. In this version we look at non-backtracking random walk instead of simple random walk.
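    As a small numerical illustration (ours, not from the paper) of the growth formula quoted above, the snippet below evaluates \sqrt{2\log(1/\varepsilon)/\log(1/(1-\alpha_n))} for a few rewiring fractions; it only shows the leading-order scaling, not an exact mixing time, and the parameter values are arbitrary.

```python
import math

def predicted_mixing_time(eps, alpha_n):
    """Leading-order growth of the eps-mixing time quoted in the abstract:
    sqrt(2*log(1/eps) / log(1/(1 - alpha_n)))."""
    return math.sqrt(2 * math.log(1 / eps) / math.log(1 / (1 - alpha_n)))

# Faster rewiring (larger alpha_n) shortens the predicted mixing time.
for alpha in (0.001, 0.01, 0.1):
    print(alpha, round(predicted_mixing_time(eps=0.25, alpha_n=alpha), 2))
```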

    On Bootstrap Percolation in Living Neural Networks

    Recent experimental studies of living neural networks reveal that their global activation induced by electrical stimulation can be explained using the concept of bootstrap percolation on a directed random network. The experiment consists of externally activating an initial random fraction of the neurons and observing the firing process until it reaches equilibrium. The final fraction of active neurons depends in a nonlinear way on the initial fraction. The main result of this paper is a theorem which enables us to find the asymptotics of the final proportion of fired neurons when a random directed graph with given node degrees is taken as the model for the interacting network. This gives a rigorous mathematical proof of a phenomenon observed by physicists in neural networks.
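    A minimal sketch of the activation process described above, assuming the simplest rule that a neuron fires once at least `threshold` of its in-neighbours have fired; the function name, the threshold parameter, and the toy graph are our own, and the paper's actual model (thresholds, degree distribution) may differ.

```python
from collections import deque

def bootstrap_percolation(out_neighbors, initially_active, threshold=1):
    """Fire the initially active neurons, then propagate: a neuron fires once
    it has received input from at least `threshold` fired in-neighbours."""
    n = len(out_neighbors)
    active = [False] * n
    hits = [0] * n                       # fired in-neighbours seen so far
    queue = deque()
    for v in initially_active:
        if not active[v]:
            active[v] = True
            queue.append(v)
    while queue:
        u = queue.popleft()
        for w in out_neighbors[u]:
            hits[w] += 1
            if not active[w] and hits[w] >= threshold:
                active[w] = True
                queue.append(w)
    return sum(active) / n               # final proportion of fired neurons

# Toy directed cycle 0 -> 1 -> 2 -> 0: seeding neuron 0 eventually fires all.
print(bootstrap_percolation([[1], [2], [0]], initially_active=[0]))
```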