191 research outputs found

    Sustainable growth in complex networks

    Full text link
    Based on the empirical analysis of the dependency network in 18 Java projects, we develop a novel model of network growth which considers both an attachment mechanism and the addition of new nodes with a heterogeneous distribution of their initial degree $k_0$. Empirically, we find that the cumulative degree distributions of initial degrees and of the final network follow power-law behaviors: $P(k_0) \propto k_0^{1-\alpha}$ and $P(k) \propto k^{1-\gamma}$, respectively. For the total number of links as a function of the network size, we find empirically $K(N) \propto N^{\beta}$, where $\beta$ is (at the beginning of the network evolution) between 1.25 and 2, while converging to $\sim 1$ for large $N$. This indicates a transition from a growth regime with increasing network density towards a sustainable regime, which prevents a collapse because of ever-increasing dependencies. Our theoretical framework is able to predict relations between the exponents $\alpha$, $\beta$, $\gamma$, which also link issues of software engineering and developer activity. These relations are verified by means of computer simulations and empirical investigations. They indicate that the growth of real Open Source Software networks occurs on the edge between two regimes, which are dominated either by the initial degree distribution of added nodes or by the preferential attachment mechanism. Hence, the heterogeneous degree distribution of newly added nodes, found empirically, is essential to describe the laws of sustainable growth in networks. Comment: 5 pages, 2 figures, 1 table
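    For intuition, the growth mechanism described above can be sketched as a simulation in which every new node arrives with an initial degree $k_0$ drawn from a heavy-tailed distribution and attaches its links preferentially, after which an effective exponent $\beta$ in $K(N) \propto N^{\beta}$ is estimated. The function names, the exponent value and the cut-offs below are illustrative assumptions, not taken from the paper.

```python
import math
import random

def sample_power_law_k0(alpha, k_min=1, k_max=100):
    """Draw an initial degree k0 with P(k0) ~ k0^(-alpha) (simple discrete sampler)."""
    ks = range(k_min, k_max + 1)
    weights = [k ** (-alpha) for k in ks]
    return random.choices(ks, weights=weights)[0]

def grow_network(n_final, alpha, seed_size=5):
    """Grow a network: each new node arrives with k0 links attached preferentially."""
    # start from a small fully connected seed
    edges = [(i, j) for i in range(seed_size) for j in range(i + 1, seed_size)]
    targets = [v for e in edges for v in e]      # nodes repeated proportionally to degree
    sizes, link_counts = [], []
    for new in range(seed_size, n_final):
        k0 = min(sample_power_law_k0(alpha), new)    # cannot exceed existing nodes
        chosen = set()
        while len(chosen) < k0:                      # preferential attachment
            chosen.add(random.choice(targets))
        for t in chosen:
            edges.append((new, t))
            targets.extend([new, t])
        sizes.append(new + 1)
        link_counts.append(len(edges))
    return sizes, link_counts

if __name__ == "__main__":
    N, K = grow_network(5000, alpha=2.5)
    half = len(N) // 2
    # crude estimate of beta over the second half of the growth, from K(N) ~ N^beta
    beta = (math.log(K[-1]) - math.log(K[half])) / (math.log(N[-1]) - math.log(N[half]))
    print(f"effective beta over the second half of growth: {beta:.2f}")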

    A dissemination strategy for immunizing scale-free networks

    Full text link
    We consider the problem of distributing a vaccine for immunizing a scale-free network against a given virus or worm. We introduce a new method, based on vaccine dissemination, that seems to reflect more accurately what is expected to occur in real-world networks. Also, since the dissemination is performed using only local information, the method can be easily employed in practice. Using a random-graph framework, we analyze our method both mathematically and by means of simulations. We demonstrate its efficacy regarding the trade-off between the expected number of nodes that receive the vaccine and the network's resulting vulnerability to developing an epidemic as the virus or worm attempts to infect one of its nodes. For some scenarios, the new method is seen to render the network practically invulnerable to attacks while requiring only a small fraction of the nodes to receive the vaccine.
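    A minimal simulation of the general idea, local and probabilistic forwarding of the vaccine from a few seed nodes, might look as follows. The forwarding rule, the `forward_prob` parameter and the Barabási-Albert test graph are assumptions for illustration, not the paper's exact dissemination protocol.

```python
import random
import networkx as nx

def local_dissemination(G, n_seeds=5, forward_prob=0.25, rng=random.Random(0)):
    """Spread a vaccine from a few seeds using only local (neighbour) information:
    each vaccinated node forwards the vaccine to each neighbour with some probability.
    Simplified stand-in for the paper's dissemination rule."""
    vaccinated = set(rng.sample(list(G.nodes()), n_seeds))
    frontier = list(vaccinated)
    while frontier:
        node = frontier.pop()
        for nb in G.neighbors(node):
            if nb not in vaccinated and rng.random() < forward_prob:
                vaccinated.add(nb)
                frontier.append(nb)
    return vaccinated

if __name__ == "__main__":
    G = nx.barabasi_albert_graph(10_000, m=3, seed=1)
    vacc = local_dissemination(G, n_seeds=10)
    hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:100]
    covered_hubs = sum(1 for n, _ in hubs if n in vacc)
    print(f"vaccinated fraction: {len(vacc) / G.number_of_nodes():.2f}, "
          f"top-100 hubs covered: {covered_hubs}")
```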

    Behavior of susceptible-infected-susceptible epidemics on heterogeneous networks with saturation

    Full text link
    We investigate saturation effects in susceptible-infected-susceptible (SIS) models of the spread of epidemics in heterogeneous populations. The structure of interactions in the population is represented by networks with connectivity distribution $P(k)$, including scale-free (SF) networks with power-law distributions $P(k) \sim k^{-\gamma}$. Considering cases where the transmission of infection between nodes depends on their connectivity, we introduce a saturation function $C(k)$ which reduces the infection transmission rate $\lambda$ across an edge going from a node with high connectivity $k$. A mean-field approximation with the neglect of degree-degree correlations then leads to a finite threshold $\lambda_c > 0$ for SF networks with $2 < \gamma \leq 3$. We also find, in this approximation, the fraction of infected individuals among those with degree $k$ for $\lambda$ close to $\lambda_c$. We investigate via computer simulation the contact process on a heterogeneous regular lattice and compare the results with those obtained from mean-field theory with and without neglect of degree-degree correlations. Comment: 6 figures
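    The degree-based mean-field picture can be sketched as a fixed-point iteration over degree classes, using the standard steady-state balance $\rho_k = \lambda k \Theta / (1 + \lambda k \Theta)$. The specific saturation form `C(k) = 1/(1 + k/50)` and the way it enters the edge term $\Theta$ below are illustrative assumptions; the paper's exact equations may differ.

```python
import numpy as np

def sis_mean_field(gamma=2.5, k_max=1000, lam=0.2,
                   c=lambda k: 1.0 / (1.0 + k / 50.0),
                   tol=1e-10, max_iter=100_000):
    """Degree-based mean-field SIS with a saturation factor C(k) on transmission
    from degree-k nodes (illustrative form, not the paper's exact equations)."""
    k = np.arange(1, k_max + 1, dtype=float)
    pk = k ** (-gamma)
    pk /= pk.sum()                      # P(k) ~ k^(-gamma)
    mean_k = (k * pk).sum()
    rho = np.full_like(k, 0.5)          # initial infected fraction per degree class
    for _ in range(max_iter):
        # theta: probability that an edge points to an infected node, damped by C(k)
        theta = (c(k) * k * pk * rho).sum() / mean_k
        rho_new = lam * k * theta / (1.0 + lam * k * theta)   # steady-state balance
        if np.max(np.abs(rho_new - rho)) < tol:
            rho = rho_new
            break
        rho = rho_new
    prevalence = (pk * rho).sum()
    return prevalence, rho

if __name__ == "__main__":
    for lam in (0.02, 0.05, 0.1, 0.2, 0.4):
        prev, _ = sis_mean_field(lam=lam)
        print(f"lambda={lam:.2f}  steady-state prevalence={prev:.4f}")
```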

    Content-Based Image Retrieval By Relevance Feedback

    Full text link

    From Cooperative Scans to Predictive Buffer Management

    Get PDF
    In analytical applications, database systems often need to sustain workloads with multiple concurrent scans hitting the same table. The Cooperative Scans (CScans) framework, which introduces an Active Buffer Manager (ABM) component into the database architecture, has been the most effective and elaborate response to this problem, and was initially developed in the X100 research prototype. We now report on the experiences of integrating Cooperative Scans into its industrial-strength successor, the Vectorwise database product. During this implementation we invented a simpler optimization of concurrent scan buffer management, called Predictive Buffer Management (PBM). PBM is based on the observation that in a workload with long-running scans, the buffer manager has quite a bit of information on the workload in the immediate future, such that an approximation of the ideal OPT algorithm becomes feasible. In an evaluation on both synthetic benchmarks and a TPC-H throughput run, we compare the benefits of naive buffer management (LRU) versus CScans, PBM and OPT, showing that PBM achieves benefits close to Cooperative Scans while incurring much lower architectural impact. Comment: VLDB201
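    The core intuition behind PBM, namely that long-running scans make future page accesses predictable enough to approximate Belady's OPT, can be caricatured with a toy buffer manager that evicts the resident page with the most distant predicted reuse. The class and method names below are invented for illustration and do not reflect the Vectorwise code.

```python
class PredictiveBufferManager:
    """Toy sketch of PBM-style eviction: with long-running scans registered, the
    next access of each page can be estimated, so the page with the most distant
    predicted reuse is evicted (Belady-like). Illustrative structure only."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = {}          # page_id -> data
        self.scans = []           # list of [cursor_position, [page_id, ...]]

    def register_scan(self, pages):
        self.scans.append([0, pages])

    def advance(self, scan_index):
        self.scans[scan_index][0] += 1

    def _next_use(self, page_id):
        """Smallest number of steps until some registered scan reaches page_id."""
        best = float("inf")
        for cursor, pages in self.scans:
            for offset in range(cursor, len(pages)):
                if pages[offset] == page_id:
                    best = min(best, offset - cursor)
                    break
        return best

    def access(self, page_id, load_page):
        if page_id not in self.buffer:
            if len(self.buffer) >= self.capacity:
                victim = max(self.buffer, key=self._next_use)   # furthest predicted reuse
                del self.buffer[victim]
            self.buffer[page_id] = load_page(page_id)
        return self.buffer[page_id]

if __name__ == "__main__":
    pbm = PredictiveBufferManager(capacity=2)
    pbm.register_scan([1, 2, 3, 1, 2, 3])
    for page in (1, 2, 3):
        pbm.access(page, load_page=lambda p: f"data-{p}")
        pbm.advance(0)
    print(sorted(pbm.buffer))   # pages kept after the warm-up accesses
```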

    Using schema transformation pathways for data lineage tracing

    Get PDF
    With the increasing amount and diversity of information available on the Internet, there has been a huge growth in information systems that need to integrate data from distributed, heterogeneous data sources. Tracing the lineage of the integrated data is one of the problems being addressed in data warehousing research. This paper presents a data lineage tracing approach based on schema transformation pathways. Our approach is not limited to one specific data model or query language, and would be useful in any data transformation/integration framework based on sequences of primitive schema transformations.
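    A minimal sketch of the general idea, tracing an item backwards through a sequence of primitive transformations where each step exposes its own lineage function, is shown below. The interface is hypothetical and far simpler than the schema-transformation pathways used in the paper.

```python
from dataclasses import dataclass
from typing import Any, Callable, List

@dataclass
class PrimitiveTransformation:
    """One step in a pathway, with a trace function mapping an item in the step's
    output back to the input items that derived it (hypothetical interface)."""
    name: str
    forward: Callable[[List[Any]], List[Any]]
    trace: Callable[[Any, List[Any]], List[Any]]

def trace_lineage(pathway: List[PrimitiveTransformation], source: List[Any], item: Any):
    """Replay the pathway to materialise intermediate results, then walk it backwards,
    expanding the traced item into its contributing source items at each step."""
    intermediates = [source]
    for step in pathway:
        intermediates.append(step.forward(intermediates[-1]))
    traced = [item]
    for step, inp in zip(reversed(pathway), reversed(intermediates[:-1])):
        traced = [src for t in traced for src in step.trace(t, inp)]
    return traced

if __name__ == "__main__":
    select_even = PrimitiveTransformation(
        "select_even",
        forward=lambda rows: [r for r in rows if r % 2 == 0],
        trace=lambda out_row, in_rows: [r for r in in_rows if r == out_row])
    double = PrimitiveTransformation(
        "double",
        forward=lambda rows: [2 * r for r in rows],
        trace=lambda out_row, in_rows: [r for r in in_rows if 2 * r == out_row])
    print(trace_lineage([select_even, double], [1, 2, 3, 4], 8))   # -> [4]
```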

    Optimal network topologies for local search with congestion

    Get PDF
    The problem of searchability in decentralized complex networks is of great importance in computer science, economics and sociology. We present a formalism that is able to cope simultaneously with the problem of search and the congestion effects that arise when parallel searches are performed, and obtain expressions for the average search cost--written in terms of the search algorithm and the topological properties of the network--both in the presence and absence of congestion. This formalism is used to obtain optimal network structures for a system using a local search algorithm. It is found that only two classes of networks can be optimal: star-like configurations, when the number of parallel searches is small, and homogeneous-isotropic configurations, when the number of parallel searches is large. Comment: 4 pages. Final version accepted in PR
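    A toy simulation of local search under a one-packet-per-node-per-step congestion constraint, comparing a star-like and a homogeneous topology, could look like the sketch below. The arrival rate, the random-neighbour routing rule and the test topologies are illustrative assumptions rather than the paper's analytical formalism.

```python
import random
from collections import deque
import networkx as nx

def simulate_local_search(G, arrival_rate=0.02, steps=2000, rng=random.Random(0)):
    """Local (random-neighbour) search with congestion: each node forwards at most
    one packet per time step, so queues build up when many searches run in parallel."""
    queues = {v: deque() for v in G.nodes()}
    delivered_times = []
    for t in range(steps):
        # new searches arrive as (target_node, start_time)
        for v in G.nodes():
            if rng.random() < arrival_rate:
                queues[v].append((rng.choice(list(G.nodes())), t))
        # each node forwards at most one packet per step (congestion constraint)
        moves = []
        for v in G.nodes():
            if queues[v]:
                target, start = queues[v].popleft()
                if v == target:
                    delivered_times.append(t - start)
                else:
                    nxt = rng.choice(list(G.neighbors(v)))   # local search step
                    moves.append((nxt, target, start))
        for nxt, target, start in moves:
            queues[nxt].append((target, start))
    return sum(delivered_times) / len(delivered_times) if delivered_times else float("inf")

if __name__ == "__main__":
    topologies = {"star": nx.star_graph(99), "ring": nx.cycle_graph(100)}
    for name, G in topologies.items():
        print(name, "average search time:", round(simulate_local_search(G), 1))
```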

    Scale free networks of earthquakes and aftershocks

    Full text link
    We propose a new metric to quantify the correlation between any two earthquakes. The metric consists of a product involving the time interval and spatial distance between two events, as well as the magnitude of the first one. According to this metric, events typically are strongly correlated to only one or a few preceding ones. Thus a classification of events as foreshocks, main shocks or aftershocks emerges automatically without imposing predefined space-time windows. To construct a network, each earthquake receives an incoming link from its most correlated predecessor. The number of aftershocks for any event, identified by its outgoing links, is found to be scale free with exponent $\gamma = 2.0(1)$. The original Omori law with $p = 1$ emerges as a robust feature of seismicity, holding up to years even for aftershock sequences initiated by intermediate-magnitude events. The measured fat-tailed distribution of distances between earthquakes and their aftershocks suggests that aftershock collection with fixed space windows is not appropriate. Comment: 7 pages and 7 figures. Submitted
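    A small sketch of how such a correlation metric and the resulting aftershock network could be computed is given below. The functional form and the exponents `b` and `d_f` are assumptions in the spirit of the abstract (a product of time interval, distance and a magnitude factor), not necessarily the authors' exact definition.

```python
import math
from dataclasses import dataclass

@dataclass
class Quake:
    t: float      # occurrence time (days)
    x: float      # epicentre coordinates (km)
    y: float
    mag: float    # magnitude

def correlation_metric(parent: Quake, child: Quake, b=1.0, d_f=1.6):
    """Smaller value = stronger correlation. Built from the time interval, the
    spatial distance and the magnitude of the earlier event; exponents are
    illustrative assumptions."""
    dt = child.t - parent.t
    if dt <= 0:
        return math.inf
    dr = math.hypot(child.x - parent.x, child.y - parent.y)
    return dt * (dr ** d_f) * (10.0 ** (-b * parent.mag))

def build_aftershock_network(events):
    """Each event gets one incoming link from its most correlated predecessor."""
    links = {}
    for i, child in enumerate(events):
        best, best_metric = None, math.inf
        for j, parent in enumerate(events[:i]):
            m = correlation_metric(parent, child)
            if m < best_metric:
                best, best_metric = j, m
        if best is not None:
            links[i] = best
    return links   # child index -> parent index

if __name__ == "__main__":
    catalogue = [Quake(0.0, 0.0, 0.0, 6.5), Quake(0.5, 2.0, 1.0, 3.2),
                 Quake(1.0, 50.0, -30.0, 4.0), Quake(1.2, 2.5, 0.5, 2.8)]
    print(build_aftershock_network(catalogue))
```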

    Halting viruses in scale-free networks

    Full text link
    The vanishing epidemic threshold for viruses spreading on scale-free networks indicates that traditional methods, which aim to decrease a virus' spreading rate, cannot succeed in eradicating an epidemic. We demonstrate that policies that discriminate between the nodes, curing mostly the highly connected nodes, can restore a finite epidemic threshold and potentially eradicate a virus. We find that the more biased a policy is towards the hubs, the better its chances of raising the epidemic threshold above the virus' spreading rate. Furthermore, such biased policies are more cost-effective, requiring fewer cures to eradicate the virus.
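    A hub-biased curing policy can be mimicked in a toy SIS simulation where the per-step curing probability grows with node degree. The functional form `cure_base * k**bias` and all parameter values are illustrative assumptions, not the policy analysed in the paper; `bias=0` recovers uniform, unbiased curing for comparison.

```python
import random
import networkx as nx

def sis_with_biased_curing(G, lam=0.1, cure_base=0.05, bias=1.0,
                           steps=500, rng=random.Random(0)):
    """SIS dynamics where a node's curing probability grows with its degree,
    p_cure(k) ~ cure_base * k**bias (capped at 1): a simple stand-in for a
    hub-biased treatment policy."""
    infected = set(rng.sample(list(G.nodes()), 50))
    deg = dict(G.degree())
    for _ in range(steps):
        new_inf, cured = set(), set()
        for v in infected:
            for nb in G.neighbors(v):                   # infection attempts
                if nb not in infected and rng.random() < lam:
                    new_inf.add(nb)
            if rng.random() < min(1.0, cure_base * deg[v] ** bias):
                cured.add(v)                            # hub-biased curing
        infected = (infected | new_inf) - cured
        if not infected:
            break
    return len(infected) / G.number_of_nodes()

if __name__ == "__main__":
    G = nx.barabasi_albert_graph(5000, m=3, seed=2)
    for bias in (0.0, 0.5, 1.0):
        frac = sis_with_biased_curing(G, bias=bias)
        print(f"bias={bias}: final infected fraction = {frac:.3f}")
```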

    Synchronization in Weighted Uncorrelated Complex Networks in a Noisy Environment: Optimization and Connections with Transport Efficiency

    Full text link
    Motivated by synchronization problems in noisy environments, we study the Edwards-Wilkinson process on weighted uncorrelated scale-free networks. We consider a specific form of the weights, where the strength (and the associated cost) of a link is proportional to $(k_i k_j)^{\beta}$, with $k_i$ and $k_j$ being the degrees of the nodes connected by the link. Subject to the constraint that the total network cost is fixed, we find that in the mean-field approximation on uncorrelated scale-free graphs, synchronization is optimal at $\beta^{*} = -1$. Numerical results, based on exact numerical diagonalization of the corresponding network Laplacian, confirm the mean-field results, with small corrections to the optimal value of $\beta^{*}$. Employing our recent connections between the Edwards-Wilkinson process and resistor networks, and some well-known connections between random walks and resistor networks, we also pursue a naturally related problem of optimizing performance in queue-limited communication networks utilizing local weighted routing schemes. Comment: Papers on related research can be found at http://www.rpi.edu/~korniss/Research
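    For intuition, a synchronization measure for the EW process on a weighted graph can be estimated from the spectrum of the weighted Laplacian, scanning $\beta$ at fixed total link cost. The cost normalisation and the width proxy used below (sum of inverse nonzero Laplacian eigenvalues, divided by $N$) are simplifying assumptions of this sketch, not the paper's exact numerical setup.

```python
import numpy as np
import networkx as nx

def ew_width_for_beta(G, beta):
    """Width proxy for the EW process: sum of inverse nonzero eigenvalues of the
    weighted Laplacian, with link weights (k_i*k_j)**beta normalised so that the
    total link cost is the same for every beta."""
    deg = dict(G.degree())
    n = G.number_of_nodes()
    W = np.zeros((n, n))
    for i, j in G.edges():
        W[i, j] = W[j, i] = (deg[i] * deg[j]) ** beta
    W *= G.number_of_edges() / (W.sum() / 2.0)          # fix total cost
    L = np.diag(W.sum(axis=1)) - W                      # weighted Laplacian
    eigvals = np.linalg.eigvalsh(L)
    nonzero = eigvals[eigvals > 1e-9]
    return np.sum(1.0 / nonzero) / n

if __name__ == "__main__":
    G = nx.barabasi_albert_graph(400, m=3, seed=3)
    for beta in (-2.0, -1.5, -1.0, -0.5, 0.0, 0.5):
        print(f"beta={beta:+.1f}  width ~ {ew_width_for_beta(G, beta):.3f}")
```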