
    Scale-free network growth by ranking

    Network growth is currently explained through mechanisms that rely on node prestige measures, such as degree or fitness. In many real networks, those who create and connect nodes do not know the prestige values of existing nodes, but only their ranking by prestige. We propose a criterion of network growth that explicitly relies on the ranking of the nodes according to any prestige measure, be it topological or not. The resulting network has a scale-free degree distribution when the probability of linking to a target node is any power-law function of its rank, even when one has only partial information about node ranks. Our criterion may explain the frequency and robustness of scale-free degree distributions in real networks, as illustrated by the special case of the Web graph. Comment: 4 pages, 2 figures. We extended the model to account for ranking by arbitrarily distributed fitness. Final version to appear in Physical Review Letters.
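
    A minimal sketch of the rank-based growth rule described above, in Python. The degree-based prestige, the attachment exponent alpha, and the seed size are illustrative assumptions rather than the paper's exact choices; the point is only to show attachment by rank instead of by prestige value.

        import random

        def grow_by_ranking(n_nodes, alpha=1.0, seed_size=2):
            """Grow a network by rank-based attachment.

            Existing nodes are ranked by a prestige score (here, their degree);
            each new node links to an existing node chosen with probability
            proportional to rank**(-alpha).
            """
            degree = {i: 1 for i in range(seed_size)}   # toy seed degrees
            edges = [(i, j) for i in range(seed_size) for j in range(i + 1, seed_size)]
            for new in range(seed_size, n_nodes):
                # rank 1 = most prestigious node
                ranked = sorted(degree, key=degree.get, reverse=True)
                weights = [(r + 1) ** (-alpha) for r in range(len(ranked))]
                target = random.choices(ranked, weights=weights, k=1)[0]
                edges.append((new, target))
                degree[new] = 1
                degree[target] += 1
            return edges

        edges = grow_by_ranking(10_000)  # sweep alpha to see how the degree distribution changes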

    Information filtering in complex weighted networks

    Many systems in nature, society and technology can be described as networks, where the vertices are the system's elements and edges between vertices indicate the interactions between the corresponding elements. Edges may be weighted if the interaction strength is measurable. However, the full network information is often redundant: tools and techniques from network analysis do not work, or become very inefficient, if the network is too dense, and some weights may simply reflect measurement errors and should be discarded. Moreover, since weight distributions in many complex weighted networks are broad, most of the weight is concentrated among a small fraction of all edges. It is then crucial to properly detect relevant edges. Simple thresholding would leave only the largest weights, disrupting the multiscale structure of the system, which is at the basis of the structure of complex networks and ought to be kept. In this paper we propose a weight filtering technique based on a global null model (GloSS filter), which preserves both the weight distribution and the full topological structure of the network. The method correctly quantifies the statistical significance of weights assigned independently to the edges from a given distribution. Applications to real networks reveal that the GloSS filter is indeed able to identify relevant connections between vertices. Comment: 9 pages, 7 figures, 1 table. The GloSS filter is implemented in a freely downloadable software (http://filrad.homelinux.org/resources).
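
    As a rough illustration of the filtering idea, the sketch below scores each edge by how unlikely its weight is under a global null in which weights are drawn independently from the empirical weight distribution. The one-sided test and the threshold alpha are illustrative choices, not the published GloSS statistic.

        import numpy as np

        def gloss_like_filter(edge_weights, alpha=0.05):
            """Toy significance filter inspired by a global null model.

            The p-value of an edge is the empirical probability that a weight
            drawn from the global weight distribution is at least as large as
            the edge's own weight; edges with p <= alpha are kept.
            """
            w = np.asarray(edge_weights, dtype=float)
            p_values = np.array([(w >= wi).mean() for wi in w])  # empirical tail probability
            keep = p_values <= alpha
            return keep, p_values

    The actual GloSS filter refines this by defining significance with respect to a null model that preserves both the weight distribution and the full topology, as stated in the abstract.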

    Stable and Efficient Structures for the Content Production and Consumption in Information Communities

    Real-world information communities exhibit inherent structures that characterize a system that is stable and efficient for content production and consumption. In this paper, we study such structures through mathematical modelling and analysis. We formulate a generic model of a community in which each member decides how to allocate their time between content production and consumption with the objective of maximizing their individual reward. We define the community system as "stable and efficient" when a Nash equilibrium is reached while the social welfare of the community is maximized. We investigate the conditions for forming a stable and efficient community under two variations of the model, representing different internal relational structures of the community. Our analysis shows that a structure with "a small core of celebrity producers" is optimally stable and efficient for a community. These results provide possible explanations for sociological observations such as "the Law of the Few" and also provide insights into how to effectively build and maintain the structure of information communities. Comment: 21 pages.
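
    A toy numerical illustration of the kind of time-allocation game described above. The reward terms, the follower weights, and the gradient-style adjustment below are hypothetical, intended only to show the structure of members splitting time between producing and consuming; they are not the paper's model.

        import numpy as np

        def time_allocation_dynamics(n_members=50, steps=200, lr=0.1, seed=0):
            """Hypothetical content production/consumption game.

            Each member i spends a fraction x[i] of their time producing and
            1 - x[i] consuming.  The toy reward is attention received from
            followers plus a concave benefit of consuming; members repeatedly
            adjust x in the direction of higher individual reward.
            """
            rng = np.random.default_rng(seed)
            audience = rng.random((n_members, n_members))  # audience[i, j]: attention i pays to j
            np.fill_diagonal(audience, 0.0)
            x = np.full(n_members, 0.5)                    # initial production shares
            for _ in range(steps):
                consumption = 1.0 - x
                gain_produce = audience.T @ consumption / n_members          # attention received
                gain_consume = 0.5 / np.sqrt(np.maximum(consumption, 1e-9))  # diminishing returns
                x = np.clip(x + lr * (gain_produce - gain_consume), 0.0, 1.0)
            return x  # approximate equilibrium production shares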

    Center clusters in the Yang-Mills vacuum

    Properties of local Polyakov loops for SU(2) and SU(3) lattice gauge theory at finite temperature are analyzed. We show that spatial clusters can be identified where the local Polyakov loops have values close to the same center element. For a suitable definition of these clusters, the deconfinement transition can be characterized by the onset of percolation in one of the center sectors. The analysis is repeated for different resolution scales of the lattice, and we argue that the center clusters have a continuum limit. Comment: Table added. Final version to appear in JHEP.
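
    A schematic of the clustering construction, assuming one already has the phases of the local Polyakov loops on a spatial lattice. Assigning every site to its nearest center phase and labelling connected equal-sector regions is only the simplest version of the idea; the paper's "suitable definition" of the clusters is more refined.

        import numpy as np
        from scipy.ndimage import label

        def center_sector_clusters(polyakov_phase, n_center=3):
            """Group lattice sites into clusters of equal center sector.

            polyakov_phase: array of local Polyakov loop phases on the spatial
            lattice.  Each site is assigned to the Z(n_center) sector whose
            phase is closest to the local phase; connected same-sector regions
            are labelled, and the largest cluster size serves as a crude
            percolation proxy.
            """
            sectors = np.angle(np.exp(2j * np.pi * np.arange(n_center) / n_center))
            # phase difference to each center element, mapped into (-pi, pi]
            diff = np.angle(np.exp(1j * (polyakov_phase[..., None] - sectors)))
            sector_of_site = np.argmin(np.abs(diff), axis=-1)
            largest = 0
            for k in range(n_center):
                labels, n_clusters = label(sector_of_site == k)
                if n_clusters:
                    largest = max(largest, np.bincount(labels.ravel())[1:].max())
            return sector_of_site, largest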

    Continuum discretized BCS approach for weakly bound nuclei

    The Bardeen-Cooper-Schrieffer (BCS) formalism is extended by including the single-particle continuum in order to analyse the evolution of pairing in an isotopic chain from stability up to the drip line. We propose a continuum-discretized generalized BCS based on single-particle pseudostates (PS). These PS are generated from the diagonalization of the single-particle Hamiltonian within a Transformed Harmonic Oscillator (THO) basis. The consistency of the results with respect to the size of the basis is studied. The method is applied to neutron-rich oxygen and carbon isotopes and compared with similar previous works and with available experimental data. We make use of the flexibility of the proposed model to study the evolution of the occupation of the low-energy continuum when the system becomes weakly bound. We find that the influence of the non-resonant continuum grows as the Fermi level approaches zero. Comment: 20 pages, 16 figures, to be submitted.
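
    For orientation, a minimal constant-pairing BCS solver over a set of discretized single-particle energies, such as the pseudostate energies mentioned above. The constant pairing strength G and the simple relaxation of the chemical potential are textbook simplifications, not the interaction or numerical scheme of the paper.

        import numpy as np

        def solve_bcs(eps, n_particles, G=0.5, tol=1e-8, max_iter=5000):
            """Solve the constant-pairing BCS gap and number equations.

            eps: discretized single-particle energies.  Iterates the gap
            equation Delta = G * sum(u_k v_k) while relaxing the chemical
            potential lambda so that 2 * sum(v_k^2) = n_particles.
            """
            eps = np.asarray(eps, dtype=float)
            delta, lam = 1.0, float(np.median(eps))
            for _ in range(max_iter):
                E = np.sqrt((eps - lam) ** 2 + delta ** 2)   # quasiparticle energies
                v2 = 0.5 * (1.0 - (eps - lam) / E)           # occupation probabilities
                uv = 0.5 * delta / E                         # u_k * v_k
                new_delta = G * uv.sum()                     # gap equation
                number_error = 2.0 * v2.sum() - n_particles
                lam -= 0.1 * number_error                    # simple relaxation of lambda
                if abs(new_delta - delta) < tol and abs(number_error) < 1e-6:
                    delta = new_delta
                    break
                delta = new_delta
            return delta, lam, v2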

    Distributed Graph Clustering using Modularity and Map Equation

    We study large-scale, distributed graph clustering. Given an undirected graph, our objective is to partition the nodes into disjoint sets called clusters. A cluster should contain many internal edges while being sparsely connected to other clusters. In the context of a social network, a cluster could be a group of friends. Modularity and the map equation are established formalizations of this internally-dense-externally-sparse principle. We present two versions of a simple distributed algorithm to optimize both measures. They are based on Thrill, a distributed big data processing framework that implements an extended MapReduce model. The algorithms for the two measures, DSLM-Mod and DSLM-Map, differ only slightly. Adapting them to similar quality measures is straightforward. We conduct an extensive experimental study on real-world graphs and on synthetic benchmark graphs with up to 68 billion edges. Our algorithms are fast while detecting clusterings similar to those detected by other sequential, parallel and distributed clustering algorithms. Compared to the distributed GossipMap algorithm, DSLM-Map needs less memory, is up to an order of magnitude faster and achieves better quality. Comment: 14 pages, 3 figures; v3: camera ready for Euro-Par 2018, more details, more results; v2: extended experiments to include comparison with competing algorithms, shortened for submission to Euro-Par 2018.
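
    For reference, the sketch below shows the standard sequential modularity computation and one local-moving sweep, the kind of step that synchronous local-moving algorithms distribute; the Thrill-based implementation of the paper is, of course, far more involved, and the map equation is not covered here.

        from collections import defaultdict

        def modularity(adj, community, m):
            """Newman modularity Q.  adj: node -> {neighbour: weight} (symmetric),
            community: node -> community id, m: total edge weight."""
            degree = {u: sum(adj[u].values()) for u in adj}
            internal = defaultdict(float)   # twice the weight of intra-community edges
            total = defaultdict(float)      # sum of degrees per community
            for u in adj:
                total[community[u]] += degree[u]
                for v, w in adj[u].items():
                    if community[u] == community[v]:
                        internal[community[u]] += w
            return sum(internal[c] / (2.0 * m) - (total[c] / (2.0 * m)) ** 2 for c in total)

        def local_moving_pass(adj, community, m):
            """One sweep: move each node to the neighbouring community that most
            improves modularity.  Deliberately naive (full recomputation per
            candidate move) to keep the logic explicit."""
            for u in adj:
                best, best_q = community[u], modularity(adj, community, m)
                for c in {community[v] for v in adj[u]}:
                    community[u] = c
                    q = modularity(adj, community, m)
                    if q > best_q:
                        best, best_q = c, q
                community[u] = best
            return community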

    Modularity functions maximization with nonnegative relaxation facilitates community detection in networks

    We show here that the problem of maximizing a family of quantitative functions, encompassing both the modularity (Q-measure) and the modularity density (D-measure), for community detection can be uniformly understood as a combinatorial optimization involving the trace of a matrix called the modularity Laplacian. Instead of using the traditional spectral relaxation, we impose an additional nonnegativity constraint on this graph clustering problem and design efficient algorithms to optimize the new objective. With the explicit nonnegativity constraint, our solutions are very close to the ideal community indicator matrix and can directly assign nodes to communities. The near-orthogonal columns of the solution can be interpreted as the posterior probabilities of the corresponding nodes belonging to each community. Therefore, the proposed method can be exploited to identify fuzzy or overlapping communities and thus facilitates the understanding of the intrinsic structure of networks. Experimental results show that our new algorithm consistently, and sometimes significantly, outperforms the traditional spectral relaxation approaches.
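
    A rough sketch of the nonnegative-relaxation idea: projected gradient ascent of the trace objective over a nonnegative membership matrix, using the standard Newman modularity matrix as a stand-in for the modularity Laplacian family of the paper. The update rule, step size, and row normalization are illustrative choices, not the authors' algorithm.

        import numpy as np

        def nonnegative_modularity_relaxation(A, n_communities, steps=500, lr=1e-2, seed=0):
            """Maximize trace(H^T B H) over H >= 0 by projected gradient ascent.

            A: symmetric adjacency matrix; B = A - k k^T / (2m) is the Newman
            modularity matrix.  Rows of H are normalized so they can be read
            as soft (fuzzy/overlapping) community memberships.
            """
            rng = np.random.default_rng(seed)
            A = np.asarray(A, dtype=float)
            k = A.sum(axis=1)
            B = A - np.outer(k, k) / k.sum()
            H = rng.random((A.shape[0], n_communities))
            for _ in range(steps):
                H = np.maximum(H + lr * (B @ H), 0.0)   # ascent step, then project onto H >= 0
                H /= np.maximum(H.sum(axis=1, keepdims=True), 1e-12)  # rows as soft memberships
            return H, H.argmax(axis=1)                  # soft scores and hard assignment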