4 research outputs found

    Properties of stochastic Kronecker graphs

    The stochastic Kronecker graph model introduced by Leskovec et al. is a random graph with vertex set $\mathbb{Z}_2^n$, where two vertices $u$ and $v$ are connected with probability $\alpha^{u\cdot v}\,\gamma^{(1-u)\cdot(1-v)}\,\beta^{\,n-u\cdot v-(1-u)\cdot(1-v)}$, independently of the presence or absence of any other edge, for fixed parameters $0<\alpha,\beta,\gamma<1$. They have shown empirically that the degree sequence resembles a power law degree distribution. In this paper we show that the stochastic Kronecker graph a.a.s. does not feature a power law degree distribution for any parameters $0<\alpha,\beta,\gamma<1$. In addition, we analyze the number of subgraphs present in the stochastic Kronecker graph and study the typical neighborhood of any given vertex. Comment: 37 pages, 2 figures.
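
    The edge probability above is simple enough to sample directly for small $n$. The following minimal Python sketch is not code from the paper: the choice of $n$, the parameter values, and the function names are illustrative assumptions. It builds one realization of the model and tallies vertex degrees.

```python
# Minimal sketch (illustrative, not from the paper): sample a stochastic
# Kronecker graph on the vertex set {0,1}^n with edge probability
# alpha^(u.v) * gamma^((1-u).(1-v)) * beta^(n - u.v - (1-u).(1-v)).
import itertools
import random

def edge_probability(u, v, alpha, beta, gamma, n):
    ones = sum(ui & vi for ui, vi in zip(u, v))                # u . v
    zeros = sum((1 - ui) & (1 - vi) for ui, vi in zip(u, v))   # (1-u).(1-v)
    return alpha**ones * gamma**zeros * beta**(n - ones - zeros)

def sample_graph(n, alpha, beta, gamma, seed=0):
    rng = random.Random(seed)
    vertices = list(itertools.product((0, 1), repeat=n))
    edges = []
    for i, u in enumerate(vertices):
        for v in vertices[i + 1:]:
            # Each edge is present independently with the probability above.
            if rng.random() < edge_probability(u, v, alpha, beta, gamma, n):
                edges.append((u, v))
    return vertices, edges

if __name__ == "__main__":
    # Illustrative parameters with 0 < alpha, beta, gamma < 1.
    vertices, edges = sample_graph(n=8, alpha=0.9, beta=0.5, gamma=0.2)
    degrees = {v: 0 for v in vertices}
    for u, v in edges:
        degrees[u] += 1
        degrees[v] += 1
    print(len(vertices), "vertices,", len(edges), "edges,",
          "max degree", max(degrees.values()))
```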

    Analysis and Approximate Inference of Large Random Kronecker Graphs

    Random graph models are playing an increasingly important role in fields ranging from social networks and telecommunication systems to physiological and biological networks. Within this landscape, the random Kronecker graph model emerges as a prominent framework for scrutinizing intricate real-world networks. In this paper, we investigate large random Kronecker graphs, i.e., graphs whose number of vertices $N$ is large. Built upon recent advances in random matrix theory (RMT) and high-dimensional statistics, we prove that the adjacency matrix of a large random Kronecker graph can be decomposed, in a spectral norm sense, into two parts: a small-rank (of rank $O(\log N)$) signal matrix that is linear in the graph parameters and a zero-mean random noise matrix. Based on this result, we propose a "denoise-and-solve" approach to infer the key graph parameters with significantly reduced computational complexity. Experiments on both graph inference and classification are presented to evaluate the proposed method. In both tasks, it yields performance comparable to or better than widely used graph inference (e.g., KronFit) and graph neural network baselines, at a time cost that scales linearly with the graph size $N$. Comment: 27 pages, 5 figures, 2 tables.
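
    The decomposition result concerns the adjacency matrix of a graph drawn from a Kronecker-power probability matrix. The Python sketch below is illustrative only and is not the paper's denoise-and-solve procedure; the 2x2 initiator matrix and the number of Kronecker powers are assumptions. It samples one such graph and prints the leading (roughly $\log_2 N$) eigenvalues of the adjacency matrix next to those of its expectation, the regime where a low-rank-signal-plus-noise analysis would operate.

```python
# Illustrative sketch only -- not the paper's "denoise-and-solve" procedure.
# Build a random Kronecker graph from an assumed 2x2 initiator, sample its
# adjacency matrix A, and inspect the leading part of its spectrum.
import numpy as np

rng = np.random.default_rng(0)

# Expected adjacency: k-fold Kronecker power of a 2x2 initiator matrix,
# giving N = 2**k vertices.  The initiator entries are assumptions.
P1 = np.array([[0.99, 0.55],
               [0.55, 0.35]])
k = 10
P = P1.copy()
for _ in range(k - 1):
    P = np.kron(P, P1)
N = P.shape[0]

# Sample a symmetric 0/1 adjacency matrix with independent edges, P(A_ij=1)=P_ij.
U = np.triu(rng.random((N, N)) < P, 1)
A = (U | U.T).astype(float)

# Leading eigenvalues of A versus its expectation, and the spectral norm of the
# centred fluctuation A - E[A].  The abstract's signal term has rank O(log N),
# so only about log2(N) = k directions are of interest here.
top = k
eig_A = np.sort(np.linalg.eigvalsh(A))[::-1][:top]
eig_P = np.sort(np.linalg.eigvalsh(P))[::-1][:top]
print("log2(N) =", k)
print("leading eigenvalues of A   :", np.round(eig_A, 2))
print("leading eigenvalues of E[A]:", np.round(eig_P, 2))
print("||A - E[A]||_2 =", round(np.linalg.norm(A - P, 2), 2))
```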

    Giant components in Kronecker graphs

    No full text