Bicomponents and the robustness of networks to failure
A common definition of a robust connection between two nodes in a network
such as a communication network is that there should be at least two
independent paths connecting them, so that the failure of no single node in the
network causes them to become disconnected. This definition leads us naturally
to consider bicomponents, subnetworks in which every node has a robust
connection of this kind to every other. Here we study bicomponents in both real
and model networks using a combination of exact analytic techniques and
numerical methods. We show that standard network models predict there to be
essentially no small bicomponents in most networks, but there may be a giant
bicomponent, whose presence coincides with the presence of the ordinary giant
component, and we find that real networks seem by and large to follow this
pattern, although there are some interesting exceptions. We study the size of
the giant bicomponent as nodes in the network fail, using a specially developed
computer algorithm based on data trees, and find in some cases that our
networks are quite robust to failure, with large bicomponents persisting until
almost all vertices have been removed.
Comment: 5 pages, 1 figure, 1 table
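As an illustration of the underlying graph computation (not the authors' tree-based algorithm, which the abstract only names), the bicomponents of a small graph can be found with the standard Hopcroft-Tarjan depth-first search over an edge stack; this is a minimal sketch for tiny graphs:

```python
def biconnected_components(adj):
    """Biconnected components (Hopcroft-Tarjan DFS) of an undirected graph
    given as {node: set_of_neighbours}. Recursive; fine for small graphs."""
    disc, low, comps = {}, {}, []
    edge_stack, timer = [], [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        for v in adj[u]:
            if v == parent:
                continue
            if v not in disc:                 # tree edge: recurse
                edge_stack.append((u, v))
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if low[v] >= disc[u]:         # u separates v's subtree
                    comp = set()
                    while True:
                        e = edge_stack.pop()
                        comp.update(e)
                        if e == (u, v):
                            break
                    comps.append(comp)
            elif disc[v] < disc[u]:           # back edge
                edge_stack.append((u, v))
                low[u] = min(low[u], disc[v])

    for s in adj:
        if s not in disc:
            dfs(s, None)
    return comps

# Two triangles sharing node 2: node 2 is a cut vertex, so the graph
# splits into two bicomponents rather than one.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3, 4}, 3: {2, 4}, 4: {2, 3}}
components = biconnected_components(adj)
```

In the abstract's sense, components with at least three nodes provide the robust (two-independent-paths) connections; a bridge edge yields only a trivial two-node component.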
Super edge-connectivity and matching preclusion of data center networks
Edge-connectivity is a classic measure for reliability of a network in the
presence of edge failures. -restricted edge-connectivity is one of the
refined indicators for fault tolerance of large networks. Matching preclusion
and conditional matching preclusion are two important measures for the
robustness of networks in edge-fault scenarios. In this paper, we show that the
DCell network is super- for and ,
super- for and , or and , and
super- for and . Moreover, as an application of
-restricted edge-connectivity, we study the matching preclusion number and
conditional matching preclusion number, and characterize the corresponding
optimal solutions of . In particular, we have shown that is
isomorphic to the -star graph for .
Comment: 20 pages, 1 figure
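The abstract's matching-preclusion results are stated for DCell networks, whose parameters were lost in extraction. As a generic illustration of the concept itself, the matching preclusion number (the minimum number of edge deletions leaving the graph with no perfect matching) can be computed by brute force on a tiny graph; this sketch is my own construction, not the paper's method:

```python
from itertools import combinations

def has_perfect_matching(nodes, edges):
    """Brute-force perfect-matching check (exponential; tiny graphs only)."""
    if not nodes:
        return True
    u = min(nodes)
    for (a, b) in edges:
        if u in (a, b):
            v = b if a == u else a
            rest_nodes = nodes - {u, v}
            rest_edges = [e for e in edges if u not in e and v not in e]
            if has_perfect_matching(rest_nodes, rest_edges):
                return True
    return False

def matching_preclusion_number(nodes, edges):
    """Smallest k such that deleting some k edges destroys all perfect
    matchings; tries every subset of size k in increasing order."""
    for k in range(len(edges) + 1):
        for removed in combinations(edges, k):
            left = [e for e in edges if e not in removed]
            if not has_perfect_matching(nodes, left):
                return k
    return None

# The 4-cycle has two perfect matchings; any one deletion leaves the other,
# but deleting two adjacent edges isolates a vertex.
c4_nodes = {0, 1, 2, 3}
c4_edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(matching_preclusion_number(c4_nodes, c4_edges))  # 2
```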
Dynamic Effects Increasing Network Vulnerability to Cascading Failures
We study cascading failures in networks using a dynamical flow model based on
simple conservation and distribution laws to investigate the impact of
transient dynamics caused by the rebalancing of loads after an initial network
failure (triggering event). It is found that considering the flow dynamics may
imply reduced network robustness compared to previous static overload failure
models. This is due to the transient oscillations or overshooting in the loads,
when the flow dynamics adjusts to the new (remaining) network structure. We
obtain {\em upper} and {\em lower} limits to network robustness, and it is
shown that {\it two} time scales and , defined by the network
dynamics, must be considered before network robustness or vulnerability can
be accurately assessed. The robustness of networks showing cascading
failures is generally determined by a complex interplay between the network
topology and flow dynamics, where the ratio determines the
relative role of the two.
Comment: 4 pages, LaTeX, 4 figures
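A minimal toy sketch (my own construction, not the paper's conservation-law flow model) of why transient dynamics matter: if the load on a surviving element relaxes toward its new static equilibrium with inertia, the transient overshoots that equilibrium, so a capacity check against the static value alone underestimates failures:

```python
def transient_load(L0, L_eq, damping, omega, dt=0.01, steps=2000):
    """Underdamped relaxation of a load from L0 toward L_eq, integrated
    with semi-implicit Euler; returns the peak load reached in transit."""
    x, v = L0 - L_eq, 0.0          # displacement from the new equilibrium
    peak = L0
    for _ in range(steps):
        a = -2 * damping * v - omega ** 2 * x   # damped harmonic oscillator
        v += a * dt
        x += v * dt
        peak = max(peak, x + L_eq)
    return peak

# After a failure the static load jumps from 1.0 to 1.5; with weak damping
# the dynamic peak exceeds 1.5, so a capacity of, say, 1.7 fails a node
# that a static model would call safe.
peak = transient_load(L0=1.0, L_eq=1.5, damping=0.5, omega=5.0)
```

The damping rate and oscillation frequency play the role of the abstract's two time scales: their ratio sets how far the transient overshoots the static limit.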
Threshold for the Outbreak of Cascading Failures in Degree-degree Uncorrelated Networks
In complex networks, the failure of one or very few nodes may cause cascading
failures. When this dynamical process stops in steady state, the size of the
giant component formed by remaining un-failed nodes can be used to measure the
severity of cascading failures, which is critically important for estimating
the robustness of networks. In this paper, we propose a cascading overload
failure model with a local load-sharing mechanism, and explore the threshold
of node capacity at which large-scale cascading failures occur and the
un-failed nodes in the steady state can no longer connect to form a large
connected sub-network. We derive this threshold theoretically for
degree-degree uncorrelated networks, and validate the derivation by
simulation. This threshold provides guidance for improving network
robustness under limited capacity resources when building a network and
assigning loads, and is therefore useful for analyzing the robustness of
networks.
Comment: 11 pages, 4 figures
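A minimal simulation sketch of a local load-sharing cascade of the kind the abstract describes (uniform initial loads, equal sharing among surviving neighbours, and capacity = (1 + alpha) x initial load are assumptions made here for illustration):

```python
from collections import deque

def cascade(adj, load, alpha, seed):
    """Fail `seed`, then split each failed node's load equally among its
    surviving neighbours; a node fails when its load exceeds capacity."""
    cap = {v: (1 + alpha) * load[v] for v in adj}
    load = dict(load)                  # don't mutate the caller's dict
    failed, queue = {seed}, deque([seed])
    while queue:
        u = queue.popleft()
        alive = [v for v in adj[u] if v not in failed]
        if not alive:
            continue                   # load is simply lost
        share = load[u] / len(alive)
        for v in alive:
            load[v] += share
            if load[v] > cap[v]:
                failed.add(v)
                queue.append(v)
    return failed

def giant_component_size(adj, failed):
    """Largest connected component among surviving nodes (cascade severity)."""
    seen, best = set(), 0
    for s in adj:
        if s in failed or s in seen:
            continue
        comp, stack = 0, [s]
        seen.add(s)
        while stack:
            u = stack.pop()
            comp += 1
            for v in adj[u]:
                if v not in failed and v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, comp)
    return best

# Ring of 6 nodes, unit loads: above the capacity threshold only the seed
# fails; below it the whole ring collapses.
ring = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
loads = {i: 1.0 for i in range(6)}
failed_hi = cascade(ring, loads, alpha=0.6, seed=0)   # contained
failed_lo = cascade(ring, loads, alpha=0.4, seed=0)   # full collapse
```

The size of the giant component of un-failed nodes, as in the abstract, measures severity: 5 of 6 nodes survive connected above the threshold, none below it.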
Optimization of Robustness of Complex Networks
Networks with a given degree distribution may be very resilient to one type
of failure or attack but not to another. The goal of this work is to determine
network design guidelines which maximize the robustness of networks to both
random failure and intentional attack while keeping the cost of the network
(which we take to be the average number of links per node) constant. We find
optimal parameters for: (i) scale free networks having degree distributions
with a single power-law regime, (ii) networks having degree distributions with
two power-law regimes, and (iii) networks described by degree distributions
containing two peaks. Of these various kinds of distributions we find that the
optimal network design is one in which all but one of the nodes have the same
degree, (close to the average number of links per node), and one node is
of very large degree, , where is the number of nodes in
the network.
Comment: Accepted for publication in European Physical Journal
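A back-of-envelope illustration (my own, using the standard Molloy-Reed/Cohen percolation criterion rather than anything from the paper; the hub degree ~ N^(2/3) is an assumption chosen for the example) of why one very-high-degree node improves robustness to random failure at nearly constant cost:

```python
# kappa = <k^2>/<k> controls the random-failure percolation threshold
# f_c = 1 - 1/(kappa - 1): removing a random fraction f < f_c of nodes
# still leaves a giant component.

def kappa(degrees):
    n = len(degrees)
    k1 = sum(degrees) / n
    k2 = sum(d * d for d in degrees) / n
    return k2 / k1

def f_c(degrees):
    return 1 - 1 / (kappa(degrees) - 1)

N = 10_000
homogeneous = [3] * N                    # every node has degree 3
hub_degree = int(N ** (2 / 3))           # assumed hub size, for illustration
bimodal = [3] * (N - 1) + [hub_degree]   # same network plus one big hub

print(f_c(homogeneous))   # 0.5
print(f_c(bimodal))       # much closer to 1
```

The single hub raises the average degree (the cost in the paper's sense) by under two percent, yet moves the random-failure threshold from 0.5 to nearly 0.9; targeted attack on that hub is the price, which is the trade-off the abstract's optimization addresses.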