Towards a Rigorous Methodology for Measuring Adoption of RPKI Route Validation and Filtering
A proposal to improve routing security---Route Origin Authorization
(ROA)---has been standardized. A ROA specifies which network is allowed to
announce a set of Internet destinations. While some networks now specify ROAs,
little is known about whether other networks check routes they receive against
these ROAs, a process known as Route Origin Validation (ROV). Which networks
blindly accept invalid routes? Which reject them outright? Which de-preference
them if alternatives exist?
Recent analysis attempts to use uncontrolled experiments to characterize ROV
adoption by comparing valid routes and invalid routes. However, we argue that
gaining a solid understanding of ROV adoption is impossible using currently
available data sets and techniques. Our measurements suggest that, although
some ISPs are not observed using invalid routes in uncontrolled experiments,
they are actually using different routes for (non-security) traffic engineering
purposes, without performing ROV. We conclude with a description of a
controlled, verifiable methodology for measuring ROV and present three ASes
that do implement ROV, confirmed by operators.
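The validation step this abstract studies can be made concrete. Below is a minimal sketch of RFC 6811-style Route Origin Validation: an announcement is checked against a set of ROAs, each authorizing an origin AS to announce a prefix up to a maximum length. The `(prefix, max_length, asn)` tuple format and all addresses/AS numbers are illustrative, not from the paper.

```python
import ipaddress

def rov_state(prefix, origin_as, roas):
    """Classify a BGP announcement as 'valid', 'invalid', or 'not-found'.

    roas: iterable of (roa_prefix, max_length, asn) tuples (hypothetical
    format; real ROAs come from RPKI repositories).
    """
    ann = ipaddress.ip_network(prefix)
    covered = False
    for roa_prefix, max_len, asn in roas:
        roa = ipaddress.ip_network(roa_prefix)
        if ann.version == roa.version and ann.subnet_of(roa):
            covered = True  # some ROA covers this prefix
            if ann.prefixlen <= max_len and origin_as == asn:
                return 'valid'
    # covered but never authorized -> invalid; no covering ROA -> not-found
    return 'invalid' if covered else 'not-found'
```

An announcement is 'valid' only if some covering ROA authorizes its origin AS at an acceptable prefix length; 'invalid' means a ROA covers the prefix but none authorizes the announcement; 'not-found' routes are the ones ROV-enforcing networks typically still accept.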
An Adaptive Policy Management Approach to BGP Convergence
The Border Gateway Protocol (BGP) is the current inter-domain routing protocol used to exchange reachability information between Autonomous Systems (ASes) in the Internet. BGP supports policy-based routing, which allows each AS to independently adopt a set of local policies that specify which routes it accepts and advertises from/to other networks, as well as which route it prefers when more than one route becomes available. However, independently chosen local policies may cause global conflicts, which result in protocol divergence. In this paper, we propose a new algorithm, called Adaptive Policy Management Scheme (APMS), to resolve policy conflicts in a distributed manner. Akin to distributed feedback control systems, each AS independently classifies the state of the network as either conflict-free or potentially-conflicting by observing its local history only (namely, route flaps). Based on the degree of measured conflicts (policy conflict-avoidance vs. -control mode), each AS dynamically adjusts its own path preferences, increasing its preference for observably stable paths over flapping paths. APMS also includes a mechanism to distinguish route flaps due to topology changes, so as not to confuse them with those due to policy conflicts. A correctness and convergence analysis of APMS based on the substability property of chosen paths is presented. Implementation in the SSF network simulator is performed, and simulation results for different performance metrics are presented. The metrics capture the dynamic performance (in terms of instantaneous throughput, delay, routing load, etc.) of APMS and other competing solutions, thus exposing the often neglected aspects of performance.
National Science Foundation (ANI-0095988, EIA-0202067, ITR ANI-0205294)
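The classification step described above (conflict-free vs. potentially-conflicting, based only on locally observed route flaps) can be sketched as a sliding-window counter. This is a toy illustration of the idea, not the APMS algorithm itself; the window length and threshold below are hypothetical parameters.

```python
def classify_state(flap_times, now, window=30.0, threshold=3):
    """Toy local-history classifier in the spirit of APMS.

    flap_times: timestamps of observed route flaps for a destination.
    Declares the network state 'potentially-conflicting' when the number
    of flaps inside the sliding window reaches the threshold; otherwise
    'conflict-free'. (Real APMS additionally filters out flaps caused by
    topology changes and drives a preference-adjustment mode switch.)
    """
    recent = [t for t in flap_times if 0 <= now - t <= window]
    return 'potentially-conflicting' if len(recent) >= threshold else 'conflict-free'
```

In the avoidance/control scheme the abstract describes, crossing the threshold would trigger the AS to raise its preference for paths that have remained stable over the window.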
Multilevel MDA-Lite Paris Traceroute
Since its introduction in 2006-2007, Paris Traceroute and its Multipath
Detection Algorithm (MDA) have been used to conduct well over a billion IP
level multipath route traces from platforms such as M-Lab. Unfortunately, the
MDA requires a large number of packets in order to trace an entire topology of
load balanced paths between a source and a destination, which makes it
undesirable for platforms that otherwise deploy Paris Traceroute, such as RIPE
Atlas. In this paper we present a major update to the Paris Traceroute tool.
Our contributions are: (1) MDA-Lite, an alternative to the MDA that
significantly cuts overhead while maintaining a low failure probability; (2)
Fakeroute, a simulator that enables validation of a multipath route tracing
tool's adherence to its claimed failure probability bounds; (3) multilevel
multipath route tracing, with, for the first time, a Traceroute tool that
provides a router-level view of multipath routes; and (4) surveys at both the
IP and router levels of multipath routing in the Internet, showing, among other
things, that load balancing topologies have increased in size well beyond what
has been previously reported as recently as 2016. The data and the software
underlying these results are publicly available.
Comment: Preprint. To appear in Proc. ACM Internet Measurement Conference 201
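The probe cost that makes the MDA "undesirable" for constrained platforms comes from its statistical stopping rule: at each hop, it keeps probing until it can bound the probability of having missed a load-balanced successor. A sketch of the standard rule, assuming uniform per-flow load balancing and using a union bound (epsilon is the per-node failure bound; exact constants in the deployed MDA may differ):

```python
import math

def mda_stop_count(k, eps=0.05):
    """Probes needed before concluding a hop has exactly k successors.

    If there were really k+1 successors under uniform load balancing,
    the chance that n probes all miss some particular one is
    (k/(k+1))**n; a union bound over the k+1 candidates gives
    (k+1) * (k/(k+1))**n <= eps as the stopping condition.
    """
    return math.ceil(math.log(eps / (k + 1)) / math.log(k / (k + 1)))
```

This yields the familiar schedule 6, 11, 16, ... probes to confirm 1, 2, 3, ... successors at a 5% failure bound; the growth of this schedule across a whole topology of load-balanced paths is what MDA-Lite is designed to cut.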
On the Role of Social Identity and Cohesion in Characterizing Online Social Communities
Two prevailing theories for explaining social group or community structure
are cohesion and identity. The social cohesion approach posits that social
groups arise out of an aggregation of individuals that have mutual
interpersonal attraction as they share common characteristics. These
characteristics can range from common interests to kinship ties and from social
values to ethnic backgrounds. In contrast, the social identity approach posits
that an individual is likely to join a group based on an intrinsic
self-evaluation at a cognitive or perceptual level. In other words, group
members typically share an awareness of a common category membership.
In this work we seek to understand the role of these two contrasting theories
in explaining the behavior and stability of social communities in Twitter. A
specific focal point of our work is to understand the role of these theories in
disparate contexts ranging from disaster response to socio-political activism.
We extract social identity and social cohesion features-of-interest for large
scale datasets of five real-world events and examine the effectiveness of such
features in capturing behavioral characteristics and the stability of groups.
We also propose a novel measure of social group sustainability based on the
divergence in group discussion. Our main findings are: 1) Sharing of social
identities (especially physical location) among group members has a positive
impact on group sustainability, 2) Structural cohesion (represented by high
group density and low average shortest path length) is a strong indicator of
group sustainability, and 3) Event characteristics play a role in shaping group
sustainability, as social groups in transient events behave differently from
groups in events that last longer.
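The sustainability measure above is described as "divergence in group discussion"; the paper's exact definition is not given here, but one plausible instantiation is the Jensen-Shannon divergence between the unigram distributions of a group's discussion in two time windows. A sketch under that assumption:

```python
import math

def js_divergence(counts_a, counts_b):
    """Jensen-Shannon divergence (in bits) between two word-count dicts.

    One hypothetical instantiation of a 'divergence in group discussion'
    measure: low divergence between consecutive windows would indicate a
    stable, sustained discussion topic.
    """
    vocab = set(counts_a) | set(counts_b)
    ta, tb = sum(counts_a.values()), sum(counts_b.values())
    p = {w: counts_a.get(w, 0) / ta for w in vocab}
    q = {w: counts_b.get(w, 0) / tb for w in vocab}
    m = {w: 0.5 * (p[w] + q[w]) for w in vocab}  # mixture distribution

    def kl(x, y):
        return sum(x[w] * math.log2(x[w] / y[w]) for w in vocab if x[w] > 0)

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

JS divergence is symmetric and bounded in [0, 1] bits, which makes it convenient for comparing groups of different sizes.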
Non-linear optical frequency conversion crystals for space applications
Reliable, long-term operation of high-power laser systems in Earth orbit is not a straightforward task, as the space environment entails various risks for optical surfaces and bulk materials. The increased operational risk is, among others, due to the presence of high-energy radiation penetrating the metallic shielding of satellites and inducing absorption centers in the bulk of optical components, and vacuum exposure, which can deteriorate coating performance. Comprehensive tests analyzing high-energy radiation effects and mitigation procedures were performed on a set of frequency conversion crystals and are discussed in this paper. In addition to a general resistance to space environmental effects, the frequency conversion crystals were subjected to a comparative analysis on optimum third harmonic efficiency, starting from pulsed 1064 nm laser radiation, with the goal of exceeding a value of 30 %. Concomitant modeling supported the selection of crystal parameters and the definition of crystal dimensions.
Computing the Kullback-Leibler Divergence between two Weibull Distributions
We derive a closed form solution for the Kullback-Leibler divergence between
two Weibull distributions. These notes are meant as reference material and
intended to provide a guided tour towards a result that is often mentioned but
seldom made explicit in the literature.
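The closed form referred to above, as it is usually stated for P = Weibull(shape k1, scale l1) and Q = Weibull(shape k2, scale l2), can be cross-checked against direct numerical integration. A sketch (the formula is reproduced from memory of the standard result, so the numerical check is the safeguard):

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def kl_weibull(k1, l1, k2, l2):
    # Closed-form D(P || Q) between two Weibull distributions,
    # as the result is commonly stated.
    return (math.log(k1 / l1**k1) - math.log(k2 / l2**k2)
            + (k1 - k2) * (math.log(l1) - EULER_GAMMA / k1)
            + (l1 / l2)**k2 * math.gamma(k2 / k1 + 1)
            - 1.0)

def kl_numeric(k1, l1, k2, l2, hi=20.0, n=200_000):
    # Brute-force Riemann sum of the defining integral p(x) ln(p(x)/q(x)).
    def pdf(x, k, l):
        return (k / l) * (x / l)**(k - 1) * math.exp(-(x / l)**k)
    dx = hi / n
    total = 0.0
    for i in range(1, n):
        x = i * dx
        p, q = pdf(x, k1, l1), pdf(x, k2, l2)
        if p > 0:
            total += p * math.log(p / q) * dx
    return total
```

With identical parameters the divergence is exactly zero, and for distinct parameters the closed form agrees with the integral to within the discretization error.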
Network Sampling: From Static to Streaming Graphs
Network sampling is integral to the analysis of social, information, and
biological networks. Since many real-world networks are massive in size,
continuously evolving, and/or distributed in nature, the network structure is
often sampled in order to facilitate study. For these reasons, a more thorough
and complete understanding of network sampling is critical to support the field
of network science. In this paper, we outline a framework for the general
problem of network sampling, by highlighting the different objectives,
population and units of interest, and classes of network sampling methods. In
addition, we propose a spectrum of computational models for network sampling
methods, ranging from the traditionally studied model based on the assumption
of a static domain to a more challenging model that is appropriate for
streaming domains. We design a family of sampling methods based on the concept
of graph induction that generalize across the full spectrum of computational
models (from static to streaming) while efficiently preserving many of the
topological properties of the input graphs. Furthermore, we demonstrate how
traditional static sampling algorithms can be modified for graph streams for
each of the three main classes of sampling methods: node, edge, and
topology-based sampling. Our experimental results indicate that our proposed
family of sampling methods more accurately preserves the underlying properties
of the graph for both static and streaming graphs. Finally, we study the impact
of network sampling algorithms on the parameter estimation and performance
evaluation of relational classification algorithms.
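The combination of edge sampling with graph induction in the streaming model can be sketched as follows. This is a simplified illustration of the idea (a reservoir of edges plus an induction step that admits later edges between already-sampled nodes), not the paper's exact algorithm; in particular, evicted edges here do not shrink the sampled node set.

```python
import random

def streaming_edge_sample(edge_stream, m, seed=0):
    """Sample edges from a stream with a graph-induction step.

    Maintains a size-m reservoir of edges; any later edge whose both
    endpoints already appear in the sampled node set is admitted as an
    induced edge, which helps preserve connectivity and clustering in
    the sampled subgraph.
    """
    rng = random.Random(seed)
    reservoir, induced = [], []
    nodes = set()
    for i, (u, v) in enumerate(edge_stream):
        if u in nodes and v in nodes:
            induced.append((u, v))  # graph induction step
        if len(reservoir) < m:
            reservoir.append((u, v))
            nodes.update((u, v))
        elif rng.random() < m / (i + 1):
            # standard reservoir replacement (simplification: the
            # evicted edge's endpoints stay in the node set)
            reservoir[rng.randrange(m)] = (u, v)
            nodes.update((u, v))
    return reservoir + induced
```

The single pass and O(m) edge state are what make the method applicable to the streaming end of the spectrum described above.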
Efficiency of Scale-Free Networks: Error and Attack Tolerance
The concept of network efficiency, recently proposed to characterize the
properties of small-world networks, is here used to study the effects of errors
and attacks on scale-free networks. Two different kinds of scale-free networks,
i.e., networks with a power-law degree distribution P(k), are considered: 1) scale-free networks with
no local clustering produced by the Barabasi-Albert model and 2) scale-free
networks with high clustering properties as in the model by Klemm and Eguiluz,
and their properties are compared to the properties of random graphs
(exponential graphs). By using as mathematical measures the global and the
local efficiency we investigate the effects of errors and attacks both on the
global and the local properties of the network. We show that the global
efficiency is a better measure than the characteristic path length to describe
the response of complex networks to external factors. We find that, at variance
with random graphs, scale-free networks display, both on a global and on a
local scale, a high degree of error tolerance and an extreme vulnerability to
attacks. In fact, the global and the local efficiency are unaffected by the
failure of some randomly chosen nodes, though they are extremely sensitive to
the removal of the few nodes which play a crucial role in maintaining the
network's connectivity.
Comment: 23 pages, 10 figures
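The global efficiency used above is the average of inverse shortest-path lengths over all node pairs (with 1/infinity = 0 for disconnected pairs). A minimal sketch on a toy hub-and-spoke graph, illustrating the error/attack asymmetry: removing a random leaf barely changes efficiency, while removing the hub destroys it. The graph is illustrative, not one from the paper.

```python
from collections import deque

def global_efficiency(adj):
    """E = (1/(N(N-1))) * sum over ordered pairs i != j of 1/d(i, j)."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for s in nodes:
        dist = {s: 0}
        q = deque([s])
        while q:                      # BFS from s (unweighted distances)
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        # unreachable nodes contribute 0 (1/infinity)
        total += sum(1.0 / d for node, d in dist.items() if node != s)
    return total / (n * (n - 1))

def remove_node(adj, x):
    """Return a copy of the graph with node x (and its edges) deleted."""
    return {u: {v for v in nbrs if v != x} for u, nbrs in adj.items() if u != x}

# Toy hub-and-spoke graph: node 0 is the hub, 1-4 are leaves.
star = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
```

Here a random "error" (deleting leaf 4) leaves efficiency essentially intact, while a targeted "attack" on hub 0 disconnects the graph and drives the global efficiency to zero, mirroring the paper's finding at small scale.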