Graph Annotations in Modeling Complex Network Topologies
The coarsest approximation of the structure of a complex network, such as the
Internet, is a simple undirected unweighted graph. This approximation, however,
loses too much detail. In reality, objects represented by vertices and edges in
such a graph possess some non-trivial internal structure that varies across and
differentiates among distinct types of links or nodes. In this work, we
abstract such additional information as network annotations. We introduce a
network topology modeling framework that treats annotations as an extended
correlation profile of a network. Assuming we have this profile measured for a
given network, we present an algorithm to rescale it in order to construct
networks of varying size that still reproduce the original measured annotation
profile.
Using this methodology, we accurately capture the network properties
essential for realistic simulations of network applications and protocols, or
any other simulations involving complex network topologies, including modeling
and simulation of network evolution. We apply our approach to the Autonomous
System (AS) topology of the Internet annotated with business relationships
between ASs. This topology captures the large-scale structure of the Internet.
An in-depth understanding of this structure, and tools to model it, are cornerstones
of research on future Internet architectures and designs. We find that our
techniques are able to accurately capture the structure of annotation
correlations within this topology, thus reproducing a number of its important
properties in synthetically generated random graphs.
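As a rough illustration of what such an annotation profile can look like, the sketch below builds a toy AS-level graph in Python (networkx) whose edges carry a hypothetical business-relationship label and tabulates the joint distribution of endpoint degree classes and edge annotations. The logarithmic degree binning and the label names ("c2p", "p2p") are assumptions made for illustration, not the paper's exact construction.

```python
import networkx as nx
from collections import Counter
from math import log2

# Hypothetical AS-level topology: each edge is annotated with a business
# relationship, "c2p" (customer-to-provider) or "p2p" (peer-to-peer).
G = nx.Graph()
G.add_edge("AS1", "AS2", rel="c2p")
G.add_edge("AS1", "AS3", rel="c2p")
G.add_edge("AS2", "AS3", rel="p2p")
G.add_edge("AS3", "AS4", rel="c2p")

def degree_class(d):
    """Logarithmic degree bin; an assumed binning, not the authors' exact one."""
    return int(log2(d)) if d > 0 else 0

def annotation_profile(G, rel_key="rel"):
    """Joint distribution of (degree class, degree class, annotation) over edges."""
    counts = Counter()
    for u, v, data in G.edges(data=True):
        cu, cv = sorted((degree_class(G.degree[u]), degree_class(G.degree[v])))
        counts[(cu, cv, data[rel_key])] += 1
    total = sum(counts.values())
    return {key: n / total for key, n in counts.items()}

print(annotation_profile(G))
```

A rescaling step would then map this profile onto degree classes of a larger or smaller target network before generating edges that match it.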
Resilience of the Internet to random breakdowns
A common property of many large networks, including the Internet, is that the
connectivity of the various nodes follows a scale-free power-law distribution,
P(k)=ck^-a. We study the stability of such networks with respect to crashes,
such as random removal of sites. Our approach, based on percolation theory,
leads to a general condition for the critical fraction of nodes, p_c, that need
to be removed before the network disintegrates. We show that for a<=3 the
transition never takes place, unless the network is finite. In the special case
of the Internet (a=2.5), we find that it is impressively robust, where p_c is
approximately 0.99.
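A minimal numerical illustration of this percolation condition, assuming the standard Molloy-Reed criterion p_c = 1 - 1/(kappa_0 - 1) with kappa_0 = <k^2>/<k> measured on the intact network: sampling a power-law degree sequence with a = 2.5 and a finite cutoff shows p_c approaching 1 as the cutoff grows, consistent with the finite-size caveat above. The cutoff and sample size below are illustrative choices.

```python
import numpy as np

def critical_fraction(degrees):
    """Molloy-Reed estimate of the critical fraction of randomly removed nodes.

    A giant component survives while kappa = <k^2>/<k> > 2; random removal of a
    fraction p of nodes destroys it at p_c = 1 - 1/(kappa_0 - 1), where kappa_0
    is measured on the intact network.
    """
    k = np.asarray(degrees, dtype=float)
    kappa0 = (k ** 2).mean() / k.mean()
    return 1.0 - 1.0 / (kappa0 - 1.0)

# Sample a power-law degree sequence P(k) ~ k^-a with a = 2.5 and a finite
# cutoff K; p_c drifts toward 1 as K (i.e. the network size) grows.
rng = np.random.default_rng(0)
a, k_min, K = 2.5, 1, 10_000
ks = np.arange(k_min, K + 1)
pk = ks ** (-a)
pk /= pk.sum()
degrees = rng.choice(ks, size=100_000, p=pk)

print(f"estimated p_c = {critical_fraction(degrees):.3f}")
```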
Error and attack tolerance of complex networks
Many complex systems, such as communication networks, display a surprising
degree of robustness: while key components regularly malfunction, local
failures rarely lead to the loss of the global information-carrying ability of
the network. The stability of these complex systems is often attributed to the
redundant wiring of the functional web defined by the systems' components. In
this paper we demonstrate that error tolerance is not shared by all redundant
systems, but it is displayed only by a class of inhomogeneously wired networks,
called scale-free networks. We find that scale-free networks, describing a
number of systems, such as the World Wide Web, Internet, social networks or a
cell, display an unexpected degree of robustness, the ability of their nodes to
communicate being unaffected by even unrealistically high failure rates.
However, error tolerance comes at a high price: these networks are extremely
vulnerable to attacks, i.e. to the selection and removal of a few nodes that
play the most important role in assuring the network's connectivity.
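The contrast between random failures and targeted attacks can be reproduced in a few lines. The sketch below uses a Barabási-Albert preferential-attachment graph as a stand-in scale-free topology and compares the surviving giant component under random node removal versus highest-degree-first removal; the graph size and removal fractions are illustrative parameters, not those of the paper.

```python
import networkx as nx
import random

def giant_component_fraction(G):
    """Fraction of nodes in the largest connected component."""
    if G.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(G)) / G.number_of_nodes()

def remove_nodes(G, fraction, targeted=False):
    """Remove a fraction of nodes, either at random (errors) or highest-degree
    first (attacks), and report the surviving giant component."""
    H = G.copy()
    n_remove = int(fraction * H.number_of_nodes())
    if targeted:
        by_degree = sorted(H.degree, key=lambda kv: kv[1], reverse=True)
        victims = [node for node, _ in by_degree[:n_remove]]
    else:
        victims = random.sample(list(H.nodes), n_remove)
    H.remove_nodes_from(victims)
    return giant_component_fraction(H)

# A scale-free network via preferential attachment, standing in for the
# real-world webs studied in the paper.
G = nx.barabasi_albert_graph(n=5_000, m=2, seed=42)

for f in (0.01, 0.05, 0.20):
    print(f"remove {f:.0%}: "
          f"random failures -> {remove_nodes(G, f):.2f}, "
          f"targeted attack -> {remove_nodes(G, f, targeted=True):.2f}")
```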
Universal Behavior of Load Distribution in Scale-free Networks
We study a problem of data packet transport in scale-free networks whose
degree distribution follows a power law with the exponent γ. We define the
load at each vertex as the accumulated total number of data packets passing
through that vertex when every pair of vertices sends and receives a data packet
along the shortest path connecting the pair. It is found that the load
distribution follows a power law with the exponent δ ≈ 2.2,
insensitive to different values of γ in the range 2 < γ <= 3
and different mean degrees, which is valid for both undirected and directed
cases. Thus, we conjecture that the load exponent is a universal quantity to
characterize scale-free networks.
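A quick way to probe this numerically is to treat unnormalised betweenness centrality as a proxy for the load of each vertex on a synthetic scale-free graph and estimate the tail exponent of its distribution. The graph model, tail size, and Hill-style maximum-likelihood estimator below are choices made for illustration, not the paper's procedure.

```python
import networkx as nx
import numpy as np

# Betweenness (unnormalised) counts shortest paths through each vertex, which
# up to boundary terms matches the "load" defined above; we use it as a proxy.
G = nx.barabasi_albert_graph(n=2_000, m=2, seed=1)
bc = nx.betweenness_centrality(G, normalized=False)
load = np.array(list(bc.values()))

# Hill estimator for the tail exponent delta of P(load) ~ load^-delta,
# computed on the 200 largest load values.
tail = np.sort(load)[-200:]
delta = 1.0 + len(tail) / np.log(tail / tail[0]).sum()
print(f"estimated load exponent delta ~= {delta:.2f}")
```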
DNA damage in circulating leukocytes measured with the comet assay may predict the risk of death
The comet assay, or single-cell gel electrophoresis, is the most common method used to measure strand breaks and a variety of other DNA lesions in human populations. To estimate the risk of overall mortality, mortality by cause, and cancer incidence associated with DNA damage, a cohort of 2,403 healthy individuals (25,978 person-years), screened in 16 laboratories using the comet assay between 1996 and 2016, was followed up. Kaplan-Meier analysis indicated worse overall survival in the medium and high tertiles of DNA damage (p < 0.001). The effect of DNA damage on survival was modelled with a Cox proportional hazards regression model. The adjusted hazard ratio (HR) was 1.42 (1.06-1.90) for overall mortality and 1.94 (1.04-3.59) for diseases of the circulatory system in subjects in the highest tertile of DNA damage. The findings of this study provide epidemiological evidence encouraging the implementation of the comet assay in preventive strategies for non-communicable diseases.
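For readers unfamiliar with the modelling step, the sketch below fits a Cox proportional hazards model to a purely synthetic cohort with DNA-damage tertile indicators and adjustment covariates, using the lifelines package. The data, covariates, and coefficients are invented solely to show the shape of the analysis, not to reproduce the study's results.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic cohort: DNA-damage tertile (low = reference), age and sex as
# adjustment covariates, exponential event times censored at 20 years.
rng = np.random.default_rng(7)
n = 2_403
tertile = rng.integers(0, 3, n)
df = pd.DataFrame({
    "damage_medium": (tertile == 1).astype(int),
    "damage_high": (tertile == 2).astype(int),
    "age": rng.normal(50, 10, n),
    "sex": rng.integers(0, 2, n),
})
hazard = 0.02 * np.exp(0.15 * df["damage_medium"] + 0.35 * df["damage_high"]
                       + 0.04 * (df["age"] - 50))
df["time"] = rng.exponential(1.0 / hazard)
df["event"] = (df["time"] <= 20).astype(int)   # deaths observed within follow-up
df["time"] = df["time"].clip(upper=20)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # the exp(coef) column gives the adjusted hazard ratios
```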
Re-Shape: A Method to Teach Data Ethics for Data Science Education
Data has become central to the technologies and services that human-computer interaction (HCI) designers make, and the ethical use of data in and through these technologies should be given critical attention throughout the design process. However, there is little research on ethics education in computer science that explicitly addresses data ethics. We present and analyze Re-Shape, a method to teach students about the ethical implications of data collection and use. Re-Shape, as part of an educational environment, builds upon the idea of cultivating care and allows students to collect, process, and visualize their physical movement data in ways that support critical reflection and coordinated classroom activities about data, data privacy, and human-centered systems for data science. We also use a case study of Re-Shape in an undergraduate computer science course to explore the prospects and limitations of instructional designs and educational technology, such as Re-Shape, that leverage personal data to teach data ethics.