
    The Physics of Communicability in Complex Networks

    A fundamental problem in the study of complex networks is to provide quantitative measures of correlation and information flow between different parts of a system. To this end, several notions of communicability have been introduced and applied to a wide variety of real-world networks in recent years. Several such communicability functions are reviewed in this paper. It is emphasized that communication and correlation in networks can take place through many more routes than the shortest paths, a fact that may not have been sufficiently appreciated in previously proposed correlation measures. In contrast to these, the communicability measures reviewed in this paper are defined by taking into account all possible routes between two nodes, assigning smaller weights to longer ones. This point of view naturally leads to the definition of communicability in terms of matrix functions, such as the exponential, resolvent, and hyperbolic functions, in which the matrix argument is either the adjacency matrix or the graph Laplacian associated with the network. Considerable insight into communicability can be gained by modeling a network as a system of oscillators and deriving physical interpretations, both classical and quantum-mechanical, of various communicability functions. Applications of communicability measures to the analysis of complex systems are illustrated on a variety of biological, physical and social networks. The last part of the paper is devoted to a review of the notion of locality in complex networks and to computational aspects that, by exploiting sparsity, can greatly reduce the computational effort for the calculation of communicability functions for large networks. Comment: Review Article. 90 pages, 14 figures. Contents: Introduction; Communicability in Networks; Physical Analogies; Comparing Communicability Functions; Communicability and the Analysis of Networks; Communicability and Localization in Complex Networks; Computability of Communicability Functions; Conclusions and Perspective
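    For concreteness, the best known of these measures, the exponential communicability $G_{pq} = (e^A)_{pq}$, is a matrix function of the adjacency matrix and can be computed directly with dense linear algebra on small networks. The sketch below is a minimal illustration in Python (the example graph is an arbitrary choice, not taken from the paper) of how all walks contribute, with longer ones down-weighted by a factorial penalty.

```python
# Minimal sketch of exponential communicability on a small network.
# The karate-club graph is an illustrative choice, not from the paper.
import networkx as nx
from scipy.linalg import expm

G = nx.karate_club_graph()
A = nx.to_numpy_array(G)  # adjacency matrix

# Communicability matrix: C[p, q] = sum_k (A^k)[p, q] / k!,
# i.e. walks of length k between p and q are down-weighted by 1/k!.
C = expm(A)

p, q = 0, 33
print(f"communicability G({p},{q}) = {C[p, q]:.3f}")
print(f"subgraph centrality of node {p} = {C[p, p]:.3f}")  # diagonal entry
```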

    GLB: Lifeline-based Global Load Balancing library in X10

    We present GLB, a programming model and an associated implementation that can handle a wide range of irregular parallel programming problems running over large-scale distributed systems. GLB is applicable both to problems that are easily load-balanced via static scheduling and to problems that are hard to statically load balance. GLB hides the intricate synchronizations (e.g., inter-node communication, initialization and startup, load balancing, termination and result collection) from the users. GLB internally uses a version of the lifeline-graph-based work-stealing algorithm proposed by Saraswat et al. Users of GLB are simply required to write several pieces of sequential code that comply with the GLB interface. GLB then schedules and orchestrates the parallel execution of the code correctly and efficiently at scale. We have applied GLB to two representative benchmarks: Betweenness Centrality (BC) and Unbalanced Tree Search (UTS). Among them, BC can be statically load-balanced whereas UTS cannot. In either case, GLB scales well, achieving nearly linear speedup on different computer architectures (Power, Blue Gene/Q, and K) up to 16K cores.
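    The division of labor can be pictured as a small interface: the user supplies sequential hooks, and the library supplies distribution, stealing over the lifeline graph, and termination. The Python sketch below is a hedged illustration of that shape only; the method names are stand-ins, not the actual X10 GLB signatures.

```python
# Hedged sketch of the kind of sequential hooks a GLB user supplies.
# The real library is written in X10; these names are illustrative.
from abc import ABC, abstractmethod

class TaskQueue(ABC):
    """User-supplied sequential code. The runtime (GLB, in the paper)
    handles inter-node communication, work stealing, and termination."""

    @abstractmethod
    def process(self, n: int) -> bool:
        """Process up to n local tasks; return False when the queue is empty."""

    @abstractmethod
    def split(self):
        """Hand a fraction of local tasks to a thief that ran dry."""

    @abstractmethod
    def merge(self, loot) -> None:
        """Absorb tasks stolen from a victim into the local queue."""

    @abstractmethod
    def result(self):
        """Local partial result, reduced across places at termination."""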

    Average Distance Queries through Weighted Samples in Graphs and Metric Spaces: High Scalability with Tight Statistical Guarantees

    The average distance from a node to all other nodes in a graph, or from a query point in a metric space to a set of points, is a fundamental quantity in data analysis. The inverse of the average distance, known as the (classic) closeness centrality of a node, is a popular importance measure in the study of social networks. We develop novel structural insights on the sparsifiability of the distance relation via weighted sampling. Based on that, we present highly practical algorithms with strong statistical guarantees for fundamental problems. We show that the average distance (and hence the centrality) for all nodes in a graph can be estimated using $O(\epsilon^{-2})$ single-source distance computations. For a set $V$ of $n$ points in a metric space, we show that after preprocessing which uses $O(n)$ distance computations we can compute a weighted sample $S \subset V$ of size $O(\epsilon^{-2})$ such that the average distance from any query point $v$ to $V$ can be estimated from the distances from $v$ to $S$. Finally, we show that for a set of points $V$ in a metric space, we can estimate the average pairwise distance using $O(n+\epsilon^{-2})$ distance computations. The estimate is based on a weighted sample of $O(\epsilon^{-2})$ pairs of points, which is computed using $O(n)$ distance computations. Our estimates are unbiased with normalized root mean square error (NRMSE) of at most $\epsilon$. Increasing the sample size by an $O(\log n)$ factor ensures that the probability that the relative error exceeds $\epsilon$ is polynomially small. Comment: 21 pages, will appear in the Proceedings of RANDOM 201
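    To make the shape of such estimators concrete, the sketch below estimates the average all-pairs distance from a handful of single-source computations. It samples sources uniformly for simplicity; the paper's contribution is the weighted sampling scheme that yields the stated $O(\epsilon^{-2})$ guarantees, which this toy version does not reproduce.

```python
# Toy estimator: average distance from a few sampled single-source
# BFS runs. Uniform source sampling only, as a simplified stand-in
# for the paper's weighted sampling scheme.
import random
import networkx as nx

def estimate_avg_distance(G, num_sources=16, seed=0):
    rng = random.Random(seed)
    total, count = 0.0, 0
    for s in rng.sample(list(G.nodes), num_sources):
        dist = nx.single_source_shortest_path_length(G, s)
        total += sum(dist.values())   # one full row of the distance matrix
        count += len(dist)            # includes the zero self-distance
    return total / count

G = nx.connected_watts_strogatz_graph(500, 6, 0.1, seed=1)  # example graph
print(f"estimated average distance: {estimate_avg_distance(G):.3f}")
```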

    Communicability across evolving networks

    Many natural and technological applications generate time-ordered sequences of networks, defined over a fixed set of nodes; for example, time-stamped information about ‘who phoned who’ or ‘who came into contact with who’ arises naturally in studies of communication and the spread of disease. Concepts and algorithms for static networks do not immediately carry through to this dynamic setting. For example, suppose A and B interact in the morning, and then B and C interact in the afternoon. Information, or disease, may then pass from A to C, but not vice versa. This subtlety is lost if we simply summarize using the daily aggregate network given by the chain A-B-C. However, using a natural definition of a walk on an evolving network, we show that classic centrality measures from the static setting can be extended in a computationally convenient manner. In particular, communicability indices can be computed to summarize the ability of each node to broadcast and receive information. The computations involve basic operations in linear algebra, and the asymmetry caused by time’s arrow is captured naturally through the non-commutativity of matrix-matrix multiplication. Illustrative examples are given for both synthetic and real-world communication data sets. We also discuss the use of the new centrality measures for real-time monitoring and prediction.
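    This line of work builds a dynamic communicability matrix by multiplying a resolvent of each time step's adjacency matrix in time order, so that only time-respecting walks are counted. The sketch below (Python; the two-step A-B then B-C toy sequence and the parameter value are illustrative choices) shows the asymmetry directly. The downweighting parameter must stay below the reciprocal of each step's spectral radius for the resolvents to exist.

```python
# Toy sketch: ordered product of per-step resolvents, so A->B followed
# by B->C counts as a walk while the time-reversed route does not.
import numpy as np

def dynamic_communicability(adjacencies, a=0.2):
    # a must be smaller than 1 / spectral_radius(A) for every step
    n = adjacencies[0].shape[0]
    Q = np.eye(n)
    for A in adjacencies:
        Q = Q @ np.linalg.inv(np.eye(n) - a * A)  # matrix order = time order
    return Q

# Morning: A-B interact; afternoon: B-C interact (nodes 0, 1, 2).
A1 = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)
A2 = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 0]], dtype=float)

Q = dynamic_communicability([A1, A2])
print("A -> C weight:", Q[0, 2])  # nonzero: a time-respecting route exists
print("C -> A weight:", Q[2, 0])  # zero: time's arrow blocks it

broadcast = Q.sum(axis=1)  # each node's ability to send information
receive = Q.sum(axis=0)    # each node's ability to receive it
```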

    Generating realistic scaled complex networks

    Research on generative models is a central project in the emerging field of network science, and it studies how statistical patterns found in real networks could be generated by formal rules. Output from these generative models is then the basis for designing and evaluating computational methods on networks, and for verification and simulation studies. During the last two decades, a variety of models have been proposed with the ultimate goal of achieving comprehensive realism for the generated networks. In this study, we (a) introduce a new generator, termed ReCoN; (b) explore how ReCoN and some existing models can be fitted to an original network to produce a structurally similar replica; (c) use ReCoN to produce networks much larger than the original exemplar; and finally (d) discuss open problems and promising research directions. In a comparative experimental study, we find that ReCoN is often superior to many other state-of-the-art network generation methods. We argue that ReCoN is a scalable and effective tool for modeling a given network while preserving important properties at both micro- and macroscopic scales, and for scaling the exemplar data by orders of magnitude in size. Comment: 26 pages, 13 figures, extended version; a preliminary version of the paper was presented at the 5th International Workshop on Complex Networks and their Applications
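    As a hedged illustration of the fit-and-scale workflow only, not of ReCoN's actual algorithm, the sketch below fits the simplest structural model, a degree-sequence configuration model, to an exemplar and generates a replica four times its size.

```python
# Stand-in for the fit-and-scale workflow: fit a trivial structural
# model (the degree sequence) and generate a scaled replica. This
# illustrates the pattern, not ReCoN itself.
import networkx as nx

original = nx.barabasi_albert_graph(1000, 3, seed=42)  # example exemplar

scale = 4                                     # target: 4x the original size
degrees = [d for _, d in original.degree] * scale
replica = nx.configuration_model(degrees, seed=42)
replica = nx.Graph(replica)                   # collapse parallel edges
replica.remove_edges_from(nx.selfloop_edges(replica))

print(len(original), "->", len(replica), "nodes")
```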

    Efficient Exact and Approximate Algorithms for Computing Betweenness Centrality in Directed Graphs

    Graphs are an important tool for modeling data in different domains, including social networks, bioinformatics and the world wide web. Most of the networks formed in these domains are directed graphs, where all the edges have a direction and are not symmetric. Betweenness centrality is an important index widely used to analyze networks. In this paper, first, given a directed network $G$ and a vertex $r \in V(G)$, we propose a new exact algorithm to compute the betweenness score of $r$. Our algorithm pre-computes a set $\mathcal{RV}(r)$, which is used to prune a huge amount of computations that do not contribute to the betweenness score of $r$. The time complexity of our exact algorithm depends on $|\mathcal{RV}(r)|$: it is $\Theta(|\mathcal{RV}(r)| \cdot |E(G)|)$ for unweighted graphs and $\Theta(|\mathcal{RV}(r)| \cdot |E(G)| + |\mathcal{RV}(r)| \cdot |V(G)| \log |V(G)|)$ for weighted graphs with positive weights. $|\mathcal{RV}(r)|$ is bounded from above by $|V(G)|-1$ and in most cases it is a small constant. Then, for the cases where $\mathcal{RV}(r)$ is large, we present a simple randomized algorithm that samples from $\mathcal{RV}(r)$ and performs computations for only the sampled elements. We show that this algorithm provides an $(\epsilon,\delta)$-approximation of the betweenness score of $r$. Finally, we perform extensive experiments over several real-world datasets from different domains, for several randomly chosen vertices as well as for the vertices with the highest betweenness scores. Our experiments reveal that in most cases, our algorithm significantly outperforms the most efficient existing randomized algorithms, in terms of both running time and accuracy. Our experiments also show that our proposed algorithm computes the betweenness scores of all vertices in sets of size 5, 10 and 15 much faster and more accurately than the most efficient existing algorithms. Comment: arXiv admin note: text overlap with arXiv:1704.0735
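    The exact-versus-sampled trade-off that motivates the paper can be seen with stock library routines. The sketch below uses NetworkX's Brandes implementation and its built-in source-sampling approximation as generic stand-ins, not the authors' $\mathcal{RV}(r)$-based algorithm, which is not part of NetworkX.

```python
# Exact betweenness (Brandes, all sources) versus a sampled
# approximation on a random directed graph. Generic illustration only.
import networkx as nx

G = nx.gnp_random_graph(300, 0.02, directed=True, seed=7)

exact = nx.betweenness_centrality(G)                  # all single-source runs
approx = nx.betweenness_centrality(G, k=50, seed=7)   # 50 sampled sources

r = max(exact, key=exact.get)  # vertex with the highest exact score
print(f"node {r}: exact {exact[r]:.4f}, sampled {approx[r]:.4f}")
```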