Sustainable growth in complex networks
Based on the empirical analysis of the dependency network in 18 Java
projects, we develop a novel model of network growth which considers both
an attachment mechanism and the addition of new nodes with a heterogeneous
distribution of their initial degree. Empirically, we find that the
cumulative degree distributions of the initial degrees and of the final
network both follow power-law behaviors. For the total number of links as a
function of the network size, we empirically find a power-law dependence
whose exponent is between 1.25 and 2 at the beginning of the network
evolution, while converging to 1 for large networks. This indicates a
transition from a growth regime with increasing network density towards a
sustainable regime, which prevents a collapse because of ever increasing
dependencies. Our
theoretical framework predicts relations between these exponents, which
also link issues of software engineering and developer activity. These
relations are verified by means of computer simulations and empirical
investigations. They indicate that the growth of real Open Source Software
networks occurs on the edge between two regimes, dominated either by the
initial degree distribution of added nodes or by the preferential
attachment mechanism. Hence, the heterogeneous degree distribution of newly
added nodes, found empirically, is essential to describe the laws of
sustainable growth in networks.
Comment: 5 pages, 2 figures, 1 table
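The growth mechanism described in this abstract can be sketched in a few lines: each new node arrives with an initial degree drawn from a heavy-tailed distribution and attaches its links preferentially to high-degree nodes. The exponent and cutoff used here are illustrative assumptions, not the paper's fitted values.

```python
import random

def sample_initial_degree(alpha=2.0, k_max=20):
    # Draw an initial degree from a truncated power law P(k) ~ k**-alpha.
    # (alpha and k_max are illustrative, not the paper's fitted exponents.)
    weights = [k ** -alpha for k in range(1, k_max + 1)]
    r, acc = random.random() * sum(weights), 0.0
    for k, w in enumerate(weights, start=1):
        acc += w
        if r <= acc:
            return k
    return k_max

def grow_network(n_nodes, seed=7):
    random.seed(seed)
    degree = {0: 1, 1: 1}        # two seed nodes joined by one link
    edges = [(0, 1)]
    targets = [0, 1]             # degree-weighted multiset => preferential attachment
    for new in range(2, n_nodes):
        k0 = min(sample_initial_degree(), len(degree))  # heterogeneous initial degree
        degree[new] = 0
        chosen = set()
        while len(chosen) < k0:
            chosen.add(random.choice(targets))          # P(target) ~ its degree
        for t in chosen:
            edges.append((new, t))
            degree[new] += 1
            degree[t] += 1
            targets.extend([new, t])
    return degree, edges
```

Measuring how the number of links scales with the number of nodes in such a simulation is what lets the model's exponents be compared with the empirical ones.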
A dissemination strategy for immunizing scale-free networks
We consider the problem of distributing a vaccine for immunizing a scale-free
network against a given virus or worm. We introduce a new method, based on
vaccine dissemination, that seems to reflect more accurately what is expected
to occur in real-world networks. Also, since the dissemination is performed
using only local information, the method can be easily employed in practice.
Using a random-graph framework, we analyze our method both mathematically and
by means of simulations. We demonstrate its efficacy regarding the trade-off
between the expected number of nodes that receive the vaccine and the network's
resulting vulnerability to develop an epidemic as the virus or worm attempts to
infect one of its nodes. For some scenarios, the new method is seen to render
the network practically invulnerable to attacks while requiring only a small
fraction of the nodes to receive the vaccine.
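The abstract does not spell out the dissemination rule, so the following is only a hypothetical sketch of a local-information scheme in its spirit: each vaccinated node forwards the vaccine to each neighbour independently with some probability, using nothing but its neighbour list.

```python
import random
from collections import deque

def disseminate(adj, start, forward_prob=0.5, seed=1):
    # Hypothetical local dissemination: a vaccinated node passes the vaccine
    # to each neighbour independently with probability forward_prob.
    # Only local (neighbour-list) information is used, as in the abstract.
    rng = random.Random(seed)
    vaccinated = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in vaccinated and rng.random() < forward_prob:
                vaccinated.add(v)
                queue.append(v)
    return vaccinated

# Toy star-like network: hub 0 connected to leaves 1..9, plus a short chain.
adj = {0: list(range(1, 10))}
for v in range(1, 10):
    adj[v] = [0]
adj[9].append(10)
adj[10] = [9]
covered = disseminate(adj, start=0, forward_prob=1.0)
```

With `forward_prob < 1` the trade-off in the abstract appears directly: fewer nodes receive the vaccine, at the price of leaving some of the network unprotected.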
Behavior of susceptible-infected-susceptible epidemics on heterogeneous networks with saturation
We investigate saturation effects in susceptible-infected-susceptible (SIS)
models of the spread of epidemics in heterogeneous populations. The structure
of interactions in the population is represented by networks with connectivity
distribution P(k), including scale-free (SF) networks with power-law
distributions P(k) ∝ k^(-γ). Considering cases where the transmission
of infection between nodes depends on their connectivity, we introduce a
saturation function which reduces the infection transmission rate across
edges originating from nodes with high connectivity. A mean-field
approximation neglecting degree-degree correlations then leads to a finite
epidemic threshold for SF networks. We also find, in this approximation, the
fraction of infected individuals among those of a given degree, for spreading
rates close to the threshold. We investigate via
computer simulation the contact process on a heterogeneous regular lattice and
compare the results with those obtained from mean field theory with and without
neglect of degree-degree correlations.
Comment: 6 figures
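The degree-based mean-field iteration described here can be sketched as follows. The saturation function g(k) = 1/(1 + k/ks) is an assumed illustrative choice (the paper's exact form is not given in the abstract), as are the degree range and exponent of the toy distribution.

```python
def sis_mean_field(P, lam, ks=10.0, steps=5000, dt=0.02):
    # Degree-based mean-field SIS dynamics with an assumed saturation factor
    # g(k) reducing transmission out of high-degree nodes. P maps degree k to
    # its probability; degree-degree correlations are neglected.
    degrees = sorted(P)
    mean_k = sum(k * P[k] for k in degrees)
    g = lambda k: 1.0 / (1.0 + k / ks)           # illustrative saturation function
    rho = {k: 0.1 for k in degrees}              # infected fraction per degree class
    for _ in range(steps):
        theta = sum(k * P[k] * g(k) * rho[k] for k in degrees) / mean_k
        rho = {k: rho[k] + dt * (-rho[k] + lam * k * (1.0 - rho[k]) * theta)
               for k in degrees}
    return sum(P[k] * rho[k] for k in degrees)   # overall prevalence

# Toy scale-free degree distribution P(k) ~ k**-2.5 on k = 1..50.
raw = {k: k ** -2.5 for k in range(1, 51)}
Z = sum(raw.values())
P = {k: w / Z for k, w in raw.items()}
```

Running this for spreading rates below and above the (now finite) threshold shows the epidemic dying out in the first case and settling at a nonzero prevalence in the second.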
From Cooperative Scans to Predictive Buffer Management
In analytical applications, database systems often need to sustain workloads
with multiple concurrent scans hitting the same table. The Cooperative Scans
(CScans) framework, which introduces an Active Buffer Manager (ABM) component
into the database architecture, has been the most effective and elaborate
response to this problem, and was initially developed in the X100 research
prototype. We now report on the experiences of integrating Cooperative
Scans into its industrial-strength successor, the Vectorwise database product.
During this implementation we invented a simpler optimization of concurrent
scan buffer management, called Predictive Buffer Management (PBM). PBM is based
on the observation that in a workload with long-running scans, the buffer
manager has quite a bit of information on the workload in the immediate future,
such that an approximation of the ideal OPT algorithm becomes feasible. In the
evaluation on both synthetic benchmarks as well as a TPC-H throughput run we
compare the benefits of naive buffer management (LRU) versus CScans, PBM and
OPT; showing that PBM achieves benefits close to Cooperative Scans, while
incurring much lower architectural impact.
Comment: VLDB201
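The ideal OPT algorithm that PBM approximates is Belady's rule: evict the resident page whose next access lies farthest in the future. With long-running scans the future access order is largely known, which is exactly the observation the abstract makes. The sketch below is a minimal illustration of that rule, not the Vectorwise implementation.

```python
from collections import defaultdict, deque

def opt_misses(accesses, capacity):
    # Belady's OPT eviction: on a miss with a full buffer, evict the page
    # whose next access is farthest away (or that is never used again).
    pos = defaultdict(deque)
    for i, p in enumerate(accesses):
        pos[p].append(i)                  # future positions of each page
    resident, misses = set(), 0
    for p in accesses:
        pos[p].popleft()                  # consume the current access
        if p in resident:
            continue
        misses += 1
        if len(resident) >= capacity:
            victim = max(resident,
                         key=lambda q: pos[q][0] if pos[q] else float('inf'))
            resident.remove(victim)
        resident.add(p)
    return misses
```

On a cyclic scan of 4 pages with a 3-page buffer, LRU misses on every access, whereas the OPT-style policy keeps part of the working set resident, which is the kind of gap PBM aims to close.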
Using schema transformation pathways for data lineage tracing
With the increasing amount and diversity of information available on the Internet, there has been a huge growth in information systems that need to integrate data from distributed, heterogeneous data sources. Tracing the lineage of the integrated data is one of the problems being addressed in data warehousing research. This paper presents a data lineage tracing approach based on schema transformation pathways. Our approach is not limited to one specific data model or query language, and would be useful in any data transformation/integration framework based on sequences of primitive schema transformations.
Optimal network topologies for local search with congestion
The problem of searchability in decentralized complex networks is of great
importance in computer science, economy and sociology. We present a formalism
that is able to cope simultaneously with the problem of search and the
congestion effects that arise when parallel searches are performed, and obtain
expressions for the average search cost--written in terms of the search
algorithm and the topological properties of the network--both in presence and
absence of congestion. This formalism is used to obtain optimal network
structures for a system using a local search algorithm. It is found that only
two classes of networks can be optimal: star-like configurations, when the
number of parallel searches is small, and homogeneous-isotropic configurations,
when the number of parallel searches is large.
Comment: 4 pages. Final version accepted in PR
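The search-cost side of the formalism can be illustrated with a simple local search: a walker that jumps straight to the target if it is a direct neighbour and otherwise moves to a random neighbour. Congestion is ignored in this sketch, and the two toy topologies are assumptions chosen only to show that a star can beat a homogeneous ring for such a search.

```python
import random

def local_search_cost(adj, trials=4000, seed=3):
    # Average hop count of a local search: move to the target if it is a
    # neighbour, otherwise to a uniformly random neighbour. Congestion
    # effects from parallel searches are deliberately left out.
    rng = random.Random(seed)
    nodes = list(adj)
    total = 0
    for _ in range(trials):
        src, dst = rng.sample(nodes, 2)
        cur, steps = src, 0
        while cur != dst:
            cur = dst if dst in adj[cur] else rng.choice(adj[cur])
            steps += 1
        total += steps
    return total / trials

# Star: hub 0 with four leaves. Ring: homogeneous cycle of five nodes.
star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
ring = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
```

For these five-node examples the star yields a lower average search cost, matching the regime in which few parallel searches make star-like configurations optimal.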
Scale free networks of earthquakes and aftershocks
We propose a new metric to quantify the correlation between any two
earthquakes. The metric consists of a product involving the time interval and
spatial distance between two events, as well as the magnitude of the first one.
According to this metric, events typically are strongly correlated to only one
or a few preceding ones. Thus a classification of events as foreshocks, main
shocks or aftershocks emerges automatically without imposing predefined
space-time windows. To construct a network, each earthquake receives an
incoming link from its most correlated predecessor. The number of aftershocks
for any event, identified by its outgoing links, is found to be scale free.
The original Omori law emerges as a
robust feature of seismicity, holding up to years even for aftershock sequences
initiated by intermediate magnitude events. The measured fat-tailed
distribution of distances between earthquakes and their aftershocks suggests
that aftershock collection with fixed space windows is not appropriate.
Comment: 7 pages and 7 figures. Submitted
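The network construction described here can be sketched as follows. The exact correlation metric is not given in the abstract, so the schematic form n_ij = t_ij * r_ij**d * 10**(-b * m_i) (smaller value = stronger correlation, with assumed exponents b and d) is used purely for illustration; only the wiring rule, one incoming link from the most correlated predecessor, is taken from the text.

```python
import math

def build_aftershock_network(events, b=1.0, d=2.0):
    # events: time-ordered list of (time, x, y, magnitude).
    # Schematic correlation metric between predecessor i and event j
    # (assumed form; smaller = more correlated):
    #   n_ij = t_ij * r_ij**d * 10**(-b * m_i)
    # Each event receives one incoming link from its most correlated predecessor,
    # so foreshock/mainshock/aftershock roles emerge without space-time windows.
    links = {}
    for j in range(1, len(events)):
        tj, xj, yj, _ = events[j]
        best, best_n = None, float('inf')
        for i in range(j):
            ti, xi, yi, mi = events[i]
            dt = tj - ti
            r = math.hypot(xj - xi, yj - yi)
            n = dt * max(r, 1e-9) ** d * 10.0 ** (-b * mi)
            if n < best_n:
                best, best_n = i, n
        links[j] = best
    return links  # j -> index of the predecessor it links to
```

Counting each event's outgoing links in the resulting tree then gives the aftershock numbers whose distribution the abstract reports as scale free.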
Halting viruses in scale-free networks
The vanishing epidemic threshold for viruses spreading on scale-free networks
indicates that traditional methods, aiming to decrease a virus' spreading rate,
cannot succeed in eradicating an epidemic. We demonstrate that policies that
discriminate between the nodes, curing mostly the highly connected nodes, can
restore a finite epidemic threshold and potentially eradicate a virus. We find
that the more biased a policy is towards the hubs, the more chance it has to
bring the epidemic threshold above the virus' spreading rate. Furthermore, such
biased policies are more cost-effective, requiring fewer cures to eradicate the
virus.
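A hub-biased curing policy of the kind discussed here can be sketched by curing nodes with probability proportional to k**alpha. The parameterisation by alpha is an assumption for illustration: alpha = 0 recovers uniform random curing, and larger alpha biases the policy ever more strongly towards the hubs.

```python
import random

def choose_cured(degrees, n_cures, alpha, seed=0):
    # Biased policy: pick nodes to cure with probability proportional to
    # degree**alpha (alpha is the bias knob; 0 means uniform random curing).
    rng = random.Random(seed)
    nodes = list(degrees)
    weights = [degrees[v] ** alpha for v in nodes]
    cured = set()
    while len(cured) < min(n_cures, len(nodes)):
        cured.add(rng.choices(nodes, weights=weights)[0])
    return cured
```

On a network with one hub of degree 100 among degree-1 nodes, a strongly biased policy finds the hub almost immediately, which is why fewer cures suffice to push the epidemic threshold above the spreading rate.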
Urban Gravity: a Model for Intercity Telecommunication Flows
We analyze the anonymous communication patterns of 2.5 million customers of a
Belgian mobile phone operator. Grouping customers by billing address, we build
a social network of cities that consists of communications between 571 cities
in Belgium. We show that inter-city communication intensity is characterized by
a gravity model: the communication intensity between two cities is proportional
to the product of their sizes divided by the square of their distance.
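The gravity model stated in this abstract reduces to a one-line formula. The overall scale constant G is fixed by fitting to the data; its value here is illustrative.

```python
def gravity_intensity(pop_i, pop_j, distance, G=1.0):
    # Gravity model for inter-city communication intensity:
    #   I_ij = G * P_i * P_j / d_ij**2
    # where P_i, P_j are city sizes and d_ij their distance.
    # G is an overall scale fixed by the data (the value here is illustrative).
    return G * pop_i * pop_j / distance ** 2
```

The inverse-square dependence means that doubling the distance between two cities quarters the predicted communication intensity.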