Compact Routing on Internet-Like Graphs
The Thorup-Zwick (TZ) routing scheme is the first generic stretch-3 routing
scheme delivering a nearly optimal local memory upper bound. Using both direct
analysis and simulation, we calculate the stretch distribution of this routing
scheme on random graphs with power-law node degree distributions,
$P(k) \sim k^{-\gamma}$. We find that the average stretch is very low and
virtually independent of $\gamma$. In particular, for the Internet interdomain
graph, $\gamma \approx 2.1$, the average stretch is around 1.1, with up to 70%
of paths being shortest. As the network grows, the average stretch slowly
decreases. The routing table is very small, too: it is well below its upper
bounds, and its size is around 50 records for $10^4$-node networks.
Furthermore, we find that both the average shortest-path length (i.e.,
distance) $\bar{d}$ and the width $\sigma$ of the distance distribution
observed in the real Internet inter-AS graph have values that are very close
to the minimums of the average stretch in the $\bar{d}$- and
$\sigma$-directions. This leads us to the discovery of a unique critical
quasi-stationary point of the average TZ stretch as a function of $\bar{d}$
and $\sigma$. The Internet distance distribution is located in a close
neighborhood of this point. This observation suggests that the analytical
structure of the average stretch function may be an indirect indicator of some
hidden optimization criteria influencing the evolution of the Internet's
interdomain topology.
Comment: 29 pages, 16 figures
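As an illustration of the kind of stretch simulation described above, the sketch below measures TZ-style stretch on a small random graph. It is a minimal approximation, not the paper's setup: the graph, its size, and the landmark count are arbitrary placeholders rather than power-law Internet topologies, and only the stretch-3 landmark detour of the TZ scheme is modelled.

```python
import random
from collections import deque

def bfs_dist(adj, src):
    """Hop distances from src to every node (unweighted BFS)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Build a small connected random graph (illustrative, not a power-law topology).
random.seed(42)
n = 60
adj = {i: set() for i in range(n)}
for i in range(1, n):                  # random attachment keeps the graph connected
    j = random.randrange(i)
    adj[i].add(j); adj[j].add(i)
for _ in range(60):                    # extra random shortcut edges
    a, b = random.sample(range(n), 2)
    adj[a].add(b); adj[b].add(a)

dist = {u: bfs_dist(adj, u) for u in adj}

# TZ stretch-3 ingredients: a random landmark set and, for every node,
# its closest landmark ("home").
landmarks = random.sample(range(n), 8)
home = {v: min(landmarks, key=lambda l: dist[v][l]) for v in adj}

stretches = []
for u in adj:
    for v in adj:
        if u == v:
            continue
        d = dist[u][v]
        if d < dist[v][home[v]]:
            route = d                  # v is "close": route on a shortest path
        else:                          # otherwise detour via v's closest landmark
            route = dist[u][home[v]] + dist[home[v]][v]
        stretches.append(route / d)

avg_stretch = sum(stretches) / len(stretches)
```

Because a packet either follows a shortest path or detours via the destination's closest landmark (which, by the triangle inequality, costs at most three times the distance), every recorded stretch lies between 1 and 3, mirroring the scheme's worst-case guarantee.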
From BGP to RTT and Beyond: Matching BGP Routing Changes and Network Delay Variations with an Eye on Traceroute Paths
Many organizations have the mission of assessing the quality of broadband
access services offered by Internet Service Providers (ISPs). They deploy
network probes that periodically perform network measurements towards selected
Internet services. By analyzing the data collected by the probes, it is often
possible to gain a reasonable estimate of the bandwidth made available by the
ISP. However, it is much more difficult to use such data to explain who is
responsible for the fluctuations of other network qualities. This is especially
true for latency, which is fundamental to many of today's network services. On
the other hand, there are many publicly accessible BGP routers that collect the
history of routing changes and that are good candidates to be used for
understanding whether latency fluctuations depend on interdomain routing.
In this paper we provide a methodology that, given a probe that is located
inside the network of an ISP and that executes latency measurements, and given
a set
of publicly accessible BGP routers located inside the same ISP, decides which
routers are the best candidates (if any) for studying the relationship between
variations of network performance recorded by the probe and interdomain routing
changes. We validate the methodology with experimental studies based on data
gathered by the RIPE NCC, an organization that is well-known to be independent
and that publishes both BGP data within the Routing Information Service (RIS)
and probe measurement data within the Atlas project.
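A minimal sketch of the matching idea, under the assumption that we have a probe's RTT time series and the timestamps of BGP updates from a candidate router; all names, thresholds, and data below are illustrative placeholders, not the paper's methodology:

```python
# Hypothetical data: (timestamp_s, rtt_ms) probe samples and BGP update times.
rtt_samples = [(0, 20), (60, 21), (120, 80), (180, 79), (240, 22), (300, 21)]
bgp_updates = [115, 400]

WINDOW = 30       # s: max gap between a BGP update and a latency shift
THRESHOLD = 20    # ms: minimum RTT jump that counts as a variation

def latency_shifts(samples, threshold):
    """Timestamps where the RTT jumps by more than `threshold` ms."""
    return [t2 for (t1, r1), (t2, r2) in zip(samples, samples[1:])
            if abs(r2 - r1) > threshold]

def matched(shifts, updates, window):
    """Shifts that have some BGP update within `window` seconds."""
    return [t for t in shifts if any(abs(t - u) <= window for u in updates)]

shifts = latency_shifts(rtt_samples, THRESHOLD)
hits = matched(shifts, bgp_updates, WINDOW)
explained = len(hits) / len(shifts)   # fraction attributable to routing changes
```

A router whose update stream explains a large fraction of the probe's latency shifts would, in this simplified view, be a good candidate for the kind of joint BGP/RTT analysis the paper proposes.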
Distributed Computing with Adaptive Heuristics
We use ideas from distributed computing to study dynamic environments in
which computational nodes, or decision makers, follow adaptive heuristics (Hart
2005), i.e., simple and unsophisticated rules of behavior, e.g., repeatedly
"best replying" to others' actions, and minimizing "regret", that have been
extensively studied in game theory and economics. We explore when convergence
of such simple dynamics to an equilibrium is guaranteed in asynchronous
computational environments, where nodes can act at any time. Our research
agenda, distributed computing with adaptive heuristics, lies on the borderline
of computer science (including distributed computing and learning) and game
theory (including game dynamics and adaptive heuristics). We exhibit a general
non-termination result for a broad class of heuristics with bounded
recall---that is, simple rules of behavior that depend only on recent history
of interaction between nodes. We consider implications of our result across a
wide variety of interesting and timely applications: game theory, circuit
design, social networks, routing and congestion control. We also study the
computational and communication complexity of asynchronous dynamics and present
some basic observations regarding the effects of asynchrony on no-regret
dynamics. We believe that our work opens a new avenue for research in both
distributed computing and game theory.
Comment: 36 pages, four figures. Expands both technical results and discussion
of v1. Revised version will appear in the proceedings of Innovations in
Computer Science 201
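A tiny textbook example of the kind of non-convergence at issue (not a result from the paper): in matching pennies, alternating unilateral best replies cycle forever and never reach a pure equilibrium.

```python
# Matching pennies payoffs: (row, col) utilities indexed by the action pair.
PAYOFF = {(0, 0): (1, -1), (0, 1): (-1, 1),
          (1, 0): (-1, 1), (1, 1): (1, -1)}

def best_reply_row(col_action):
    return max((0, 1), key=lambda a: PAYOFF[(a, col_action)][0])

def best_reply_col(row_action):
    return max((0, 1), key=lambda a: PAYOFF[(row_action, a)][1])

state = (0, 0)
history = [state]
for step in range(8):                 # players take turns best-replying
    r, c = state
    state = (best_reply_row(c), c) if step % 2 == 0 else (r, best_reply_col(r))
    history.append(state)
# The action profile cycles with period 4 and never settles.
```

Because matching pennies has no pure Nash equilibrium, from every profile at least one player wants to deviate, so the dynamics can never terminate; this is the flavour of behaviour that the non-termination result for bounded-recall heuristics generalizes.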
Dynamics on Games: Simulation-Based Techniques and Applications to Routing
We consider multi-player games played on graphs, in which the players aim at fulfilling their own (not necessarily antagonistic) objectives. In the spirit of evolutionary game theory, we suppose that the players may repeatedly update their respective strategies (for instance, to improve the outcome w.r.t. the current strategy profile). This generates a dynamics in the game which may eventually stabilise to an equilibrium. The objective of the present paper is twofold. First, we draw a general framework for reasoning about the termination of such dynamics. In particular, we identify preorders on games (inspired by the classical notion of simulation between transition systems, and by the notion of graph minor) which preserve termination of the dynamics. Second, we show the applicability of this framework to interdomain routing problems.
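By contrast with heuristics that cycle, strategy-update dynamics do terminate in some routing-like games. The sketch below runs round-robin best replies in a toy two-player, two-link congestion game (a potential game, so the dynamics must stabilise); the latency function and parameters are illustrative, not taken from the paper.

```python
# Two players each pick one of two parallel links; latency grows with load.
def latency(load):
    return load * load

def cost(profile, player):
    link = profile[player]
    load = sum(1 for p in profile if p == link)
    return latency(load)

def best_reply(profile, player):
    """The link minimizing this player's cost, others held fixed."""
    return min((0, 1), key=lambda link:
               cost(tuple(link if i == player else l
                          for i, l in enumerate(profile)), player))

profile = (0, 0)             # both players start on link 0
for _ in range(10):          # round-robin updates; stabilises in a potential game
    for player in range(2):
        profile = tuple(best_reply(profile, player) if i == player else l
                        for i, l in enumerate(profile))

equilibrium = all(best_reply(profile, p) == profile[p] for p in range(2))
```

Here the players spread over the two links and no one can improve unilaterally, so the dynamics reach a fixed point; preorders that preserve this termination property are exactly what the framework above studies.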
Scalable Peer-to-Peer Streaming for Live Entertainment Content
We present a system for streaming live entertainment content over the Internet, originating from a single source to a scalable number of consumers, without resorting to centralized or provider-provisioned resources. The system creates a peer-to-peer overlay network that attempts to optimize the use of existing capacity to ensure quality of service, delivering low startup delay and low playout lag for the live content. There are three main aspects to our solution: first, a swarming mechanism that constructs an overlay topology minimizing propagation delays from the source to end consumers; second, a distributed overlay anycast system that uses a location-based search algorithm so that peers can quickly find the closest peers in a given stream; and finally, a novel incentive mechanism that encourages peers to donate capacity even when the user is not actively consuming content.
Macro- and microscopic analysis of the internet economy from network measurements
Thesis by compendium of publications. The growth of the Internet impacts multiple areas of the world economy, and it has become a permanent part of the economic landscape at both the macro- and the microeconomic level. Online traffic and information are currently assets with large business value. Even though the commercial Internet has been part of our lives for more than two decades, its impact on the global, and everyday, economy still holds many unknowns.
In this work we analyse important macro- and microeconomic aspects of the Internet. First, we investigate the characteristics of interdomain traffic, an important part of the macroscopic economy of the Internet. Second, we investigate the microeconomic phenomenon of price discrimination on the Internet.
At the macroscopic level, we describe quantitatively the interdomain traffic matrix (ITM), as seen from the perspective of a large research network. The ITM describes the traffic flowing between autonomous systems (ASes) in the Internet. Because it depicts the traffic between the largest Internet business entities, it has an important impact on the Internet economy. In particular, we analyse the sparsity and statistical distribution of the traffic, and observe that the shape of the statistical distribution of the traffic sourced from an AS might be related to congestion within the network. We also investigate the correlations between rows of the ITM. Finally, we propose a novel method to model interdomain traffic that stems from first principles and recognizes that the traffic is a mixture of different Internet applications and can have regional artifacts. We present and evaluate a tool to generate such matrices from open and available data. Our results show that our first-principles approach is a promising alternative to existing solutions in this area, enabling the investigation of what-if scenarios and their impact on the Internet economy.
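A rough sketch of the mixture idea, not the authors' tool: it generates a toy ITM as a sum of per-application gravity-model matrices driven by heavy-tailed AS sizes. All parameters (mixture shares, Pareto exponent, number of ASes) are invented for illustration.

```python
import random

random.seed(7)
N_AS = 6
APP_MIX = {"web": 0.55, "video": 0.35, "other": 0.10}   # hypothetical shares

# Heavy-tailed AS "sizes": a first-principles stand-in for AS popularity.
sizes = [random.paretovariate(1.2) for _ in range(N_AS)]
total = sum(sizes)

def app_matrix(share):
    """Gravity-style traffic for one application class: T_ij ~ s_i * s_j."""
    return [[share * sizes[i] * sizes[j] / total if i != j else 0.0
             for j in range(N_AS)] for i in range(N_AS)]

# The full ITM is the sum of the per-application matrices (a mixture model).
per_app = {name: app_matrix(share) for name, share in APP_MIX.items()}
itm = [[sum(m[i][j] for m in per_app.values())
        for j in range(N_AS)] for i in range(N_AS)]
```

Regional artifacts or congestion effects could be layered on top by reweighting individual rows or per-application matrices; the point of the sketch is only the decomposition of the matrix into application components.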
At the microscopic level, we investigate the rising phenomenon of price discrimination (PD). We find empirical evidence that Internet users can be subject to price and search discrimination. In particular, we present examples of PD on several e-commerce websites and uncover the information vectors that facilitate PD. We then show that crowd-sourcing is a feasible way to help users infer whether they are subject to PD. We also build and evaluate a system that allows any Internet user to examine whether she is subject to PD. The system has been deployed and used by multiple users worldwide, and has uncovered more examples of PD.
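A hedged illustration of the crowd-sourced inference: the sketch below flags products whose average price differs across client personas by more than a tolerance. The observations, persona names, and threshold are hypothetical, not data or code from the deployed system.

```python
# Hypothetical crowd-sourced observations: (product, persona, price).
observations = [
    ("hotel_a", "affluent_profile", 120.0),
    ("hotel_a", "budget_profile", 95.0),
    ("hotel_a", "budget_profile", 96.0),
    ("flight_b", "affluent_profile", 310.0),
    ("flight_b", "budget_profile", 310.0),
]

def pd_suspects(obs, tolerance=0.05):
    """Products whose mean price spread across personas exceeds `tolerance`."""
    by_product = {}
    for product, persona, price in obs:
        by_product.setdefault(product, {}).setdefault(persona, []).append(price)
    suspects = []
    for product, personas in by_product.items():
        means = [sum(p) / len(p) for p in personas.values()]
        if (max(means) - min(means)) / min(means) > tolerance:
            suspects.append(product)
    return suspects
```

Aggregating observations from many users is what makes the comparison meaningful: a single client cannot tell a personalized price from a temporal price change, but personas observed at the same time can.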
The methods presented in the following papers are backed by thorough data analysis and experiments.