43 research outputs found

    Regular decomposition of large graphs and other structures: scalability and robustness towards missing data

    A method for compressing large graphs and matrices to a block structure is developed further. Szemerédi's regularity lemma is used as a generic motivation for the significance of stochastic block models. Another ingredient of the method is Rissanen's minimum description length (MDL) principle. We continue our previous work on the subject, considering cases of missing data and the scaling of the algorithms to extremely large graphs. In this way it becomes possible to uncover the large-scale structure of huge graphs of a certain type using only a tiny part of the graph information, and to obtain a compact representation of such graphs that is useful in computations and visualization.
    Comment: Accepted for publication in: Fourth International Workshop on High Performance Big Graph Data Management, Analysis, and Mining, December 11, 2017, Boston, U.S.
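
    The abstract only outlines the approach; as a rough, hedged illustration of the general idea of block-model fitting that is robust to missing data, the sketch below (plain Python with numpy; the function name, iteration scheme, and parameters are assumptions for illustration, not the authors' algorithm) greedily assigns nodes to blocks by maximizing a Bernoulli log-likelihood computed over the observed adjacency entries only.

        import numpy as np

        def fit_blocks(A, mask, k, n_iter=20, seed=0):
            # Greedy fit of a k-block stochastic block model to a 0/1 adjacency
            # matrix A, using only the entries where mask == 1 (observed data).
            # A sketch of the general idea, not the authors' algorithm.
            rng = np.random.default_rng(seed)
            n = A.shape[0]
            z = rng.integers(k, size=n)          # random initial block labels
            eps = 1e-9
            for _ in range(n_iter):
                # Estimate the link density between every pair of blocks from
                # the observed entries only.
                P = np.full((k, k), 0.5)
                for a in range(k):
                    for b in range(k):
                        m = mask[np.ix_(z == a, z == b)]
                        if m.sum() > 0:
                            P[a, b] = (A[np.ix_(z == a, z == b)] * m).sum() / m.sum()
                logP, log1P = np.log(P + eps), np.log(1 - P + eps)
                # Move each node to the block that maximizes the log-likelihood
                # of its observed row.
                for i in range(n):
                    obs = mask[i] == 1
                    row, labels = A[i, obs], z[obs]
                    scores = [(row * logP[a, labels]
                               + (1 - row) * log1P[a, labels]).sum()
                              for a in range(k)]
                    z[i] = int(np.argmax(scores))
            return z, P

        # Hypothetical usage: a planted 2-block graph with 30 % of the
        # adjacency entries treated as missing.
        rng = np.random.default_rng(1)
        z_true = np.repeat([0, 1], 100)
        probs = np.array([[0.30, 0.05], [0.05, 0.30]])
        A = (rng.random((200, 200)) < probs[np.ix_(z_true, z_true)]).astype(int)
        mask = (rng.random((200, 200)) < 0.7).astype(int)   # 1 = observed entry
        z_hat, P_hat = fit_blocks(A, mask, k=2)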

    Towards analyzing large graphs with quantum annealing

    On the stability of two-chunk file-sharing systems

    We consider five different peer-to-peer file-sharing systems with two chunks, with the aim of finding chunk-selection algorithms that have provably stable performance under any input rate, assuming non-altruistic peers who leave the system immediately after downloading the second chunk. We show that many algorithms that at first look promising lead to unstable or oscillating behavior. However, we end up with a system that has the desired properties. Most of our rigorous results concern the corresponding deterministic large-system limits, but in the two simplest cases we also provide proofs for the stochastic systems.
    Comment: 19 pages, 7 figures
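
    As a concrete, hedged illustration of the kind of system studied here, the sketch below simulates a toy two-chunk network with one permanent seed, non-altruistic peers, and a rarest-first chunk-selection rule; the dynamics and parameters are assumptions made for illustration, not the paper's exact systems or proofs.

        import numpy as np

        def simulate(arrival_rate=2.0, steps=3000, seed=1):
            # Toy two-chunk file-sharing simulation.  One permanent seed holds
            # both chunks, peers arrive holding nothing, and a peer leaves as
            # soon as it holds both chunks (non-altruistic peers).  Each step,
            # every peer contacts one uniformly random other node and downloads
            # one chunk it is missing, if the contacted node has one, preferring
            # the chunk that is currently rarer among peers ("rarest first").
            rng = np.random.default_rng(seed)
            seed_node = {0, 1}
            peers = []                         # each entry: set of chunks held
            population = []                    # number of peers over time
            for _ in range(steps):
                peers.extend(set() for _ in range(rng.poisson(arrival_rate)))
                nodes = peers + [seed_node]
                counts = [sum(1 for p in peers if c in p) for c in (0, 1)]
                for i, p in enumerate(peers):
                    j = rng.integers(len(nodes) - 1)
                    if j >= i:                 # skip self, keep the draw uniform
                        j += 1
                    missing = list(nodes[j] - p)
                    if missing:
                        chunk = min(missing, key=lambda c: counts[c])
                        p.add(chunk)
                        counts[chunk] += 1
                peers = [p for p in peers if len(p) < 2]   # finished peers leave
                population.append(len(peers))
            return population

        # Hypothetical usage: a rough check of whether the peer population
        # settles or keeps growing under this particular selection rule.
        pop = simulate()
        print("mean population, last 1000 steps:", sum(pop[-1000:]) / 1000)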

    Regular Decomposition of Large Graphs: Foundation of a Sampling Approach to Stochastic Block Model Fitting

    We analyze the performance of regular decomposition, a method for compression of large and dense graphs. This method is inspired by Szemerédi's regularity lemma (SRL), a generic structural result for large and dense graphs. In our method, a stochastic block model (SBM) is used in maximum likelihood fitting to find a regular structure similar to the one predicted by SRL. Another ingredient of our method is Rissanen's minimum description length (MDL) principle. We consider scaling the algorithms to extremely large graphs by sampling a small subgraph. We continue our previous work on the subject by proving some experimentally found claims. Our theoretical setting does not assume that the graph is generated by an SBM; the task is to find an SBM that is optimal for modeling the given graph in the MDL sense. This setting matches real-life situations in which no random generative model is appropriate. Our aim is to show that regular decomposition is a viable and robust method for large graphs emerging, for instance, in the Big Data area.
    Peer reviewed
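
    To make the MDL ingredient concrete, the following hedged sketch computes a two-part description length for a given block labelling: an assumed BIC-style cost for the block densities and labels plus the Bernoulli code length of the adjacency entries. The coding scheme and constants are illustrative assumptions, not the formula used in the paper; fitting labellings for a range of k (for instance with a sketch like the one above, with all entries marked observed) and keeping the k with the smallest value is one way the MDL principle can select the number of blocks.

        import numpy as np

        def sbm_description_length(A, z, k):
            # Two-part code length (in nats) for describing the 0/1 adjacency
            # matrix A with a k-block SBM under integer labelling z:
            #   model cost ~ 0.5 * log(n^2) nats per block-pair density plus
            #                log(k) nats per node label (a BIC-style guess),
            #   data cost  = Bernoulli code length of the entries given the
            #                empirical block densities.
            z = np.asarray(z)
            n = A.shape[0]
            eps = 1e-12
            cost_model = k * k * 0.5 * np.log(n * n) + n * np.log(k)
            cost_data = 0.0
            for a in range(k):
                for b in range(k):
                    block = A[np.ix_(z == a, z == b)]
                    if block.size == 0:
                        continue
                    p = block.mean()
                    ones = block.sum()
                    cost_data -= (ones * np.log(p + eps)
                                  + (block.size - ones) * np.log(1 - p + eps))
            return cost_model + cost_data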

    A stable random-contact algorithm for peer-to-peer file sharing
