4,958 research outputs found
Characterizing and Improving the Reliability of Broadband Internet Access
In this paper, we empirically demonstrate the growing importance of
reliability by measuring its effect on user behavior. We present an approach
for broadband reliability characterization using data collected by many
emerging national initiatives to study broadband and apply it to the data
gathered by the Federal Communications Commission's Measuring Broadband America
project. Motivated by our findings, we present the design, implementation, and
evaluation of a practical approach for improving the reliability of broadband
Internet access with multihoming.
Comment: 15 pages, 14 figures, 6 tables
QuLa: service selection and forwarding table population in service-centric networking using real-life topologies
The number of services located in the network has drastically increased over the last decade, which is why more and more datacenters are located at the network edge, closer to the users. In the current Internet it is up to the client to select a destination using a resolution service (the Domain Name System, Content Delivery Networks, ...). In the last few years, research on Information-Centric Networking (ICN) has suggested putting this selection responsibility in the network components: routers find the closest copy of a content object using the content name as input.
We extend the principle of ICN to services; service routers forward requests to service instances located in datacenters spread across the network edge. To solve this problem, we first present a service selection algorithm based on both server and network metrics. Next, we describe a method to reduce the state required in service routers while minimizing the performance loss caused by this data reduction. Simulation results based on real-life networks show that we are able to find a near-optimal load distribution with only minimal state required in the service routers.
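The selection step the abstract describes, combining a server metric with a network metric to pick an instance, can be sketched as follows. This is an illustrative toy, not the paper's actual QuLa algorithm; the metric names, min-max normalization, and equal weighting are all assumptions made for the example:

```python
# Toy service selection: score candidate instances by a weighted sum of a
# normalized network metric (latency) and a normalized server metric (load),
# then forward to the cheapest one. Illustrative only; not QuLa itself.

def select_instance(instances, latency_weight=0.5, load_weight=0.5):
    """Return the id of the instance with the lowest weighted cost.

    instances: list of dicts with 'id', 'latency_ms', and 'load' (in [0, 1]).
    """
    def normalize(values):
        lo, hi = min(values), max(values)
        return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

    lat = normalize([i["latency_ms"] for i in instances])
    load = normalize([i["load"] for i in instances])
    costs = [latency_weight * a + load_weight * b for a, b in zip(lat, load)]
    best = min(range(len(instances)), key=costs.__getitem__)
    return instances[best]["id"]

instances = [
    {"id": "dc-edge-1", "latency_ms": 5, "load": 0.9},   # close but busy
    {"id": "dc-edge-2", "latency_ms": 20, "load": 0.1},  # farther but idle
    {"id": "dc-core", "latency_ms": 60, "load": 0.3},    # distant datacenter
]
print(select_instance(instances))  # → dc-edge-2
```

The interesting trade-off the abstract points at is visible even here: the nearest instance is not chosen because its server load dominates the combined cost.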
Regulation and competition in the Turkish telecommunications industry: an update
This chapter provides an overview of the state of liberalization, competition and regulation of major segments of the telecommunications industry in Turkey. It shows that the competitive stance of the regulatory authority and the development of actual competition have been uneven across segments. Specifically, the degree of competition has been higher in the mobile segment relative to fixed telephony or broadband. The chapter also discusses the new Electronic Communications Law and argues that although not perfect, it provides a coherent basis on which the regulatory authority can pursue competitive objectives in a more even manner. However, the actual development of competition will depend heavily on how the law and the ensuing secondary legislation are actually implemented.
Fairness, NGO Activism and the Welfare of Less Developed Countries
In a world where some consumers are not self-interested and the action of non-governmental organizations (NGOs) reveals information, the price of a good produced by a multinational enterprise and the latter's relocation and production decisions depend on labor standards. We study the effect of an increase in NGO activism on labor standards and welfare in less developed countries (LDC). An increase in NGO activism improves labor practices unless consumers like inequity. A priori, activism could either increase or decrease LDC welfare. We give parameter restrictions that determine which way it moves.
Palgol: A High-Level DSL for Vertex-Centric Graph Processing with Remote Data Access
Pregel is a popular distributed computing model for dealing with large-scale
graphs. However, it can be tricky to implement graph algorithms correctly and
efficiently in Pregel's vertex-centric model, especially when the algorithm has
multiple computation stages, complicated data dependencies, or even
communication over dynamic internal data structures. Some domain-specific
languages (DSLs) have been proposed to provide more intuitive ways to implement
graph algorithms, but due to the lack of support for remote access --- reading
or writing attributes of other vertices through references --- they cannot
handle the above-mentioned dynamic communication, making a class of Pregel
algorithms with fast convergence impossible to implement.
To address this problem, we design and implement Palgol, a more declarative
and powerful DSL which supports remote access. In particular, programmers can
use a more declarative syntax called chain access to naturally specify dynamic
communication as if directly reading data on arbitrary remote vertices. By
analyzing the logic patterns of chain access, we provide a novel algorithm for
compiling Palgol programs to efficient Pregel code. We demonstrate the power of
Palgol by using it to implement several practical Pregel algorithms, and the
evaluation result shows that the efficiency of Palgol is comparable with that
of hand-written code.
Comment: 12 pages, 10 figures, extended version of APLAS 2017 paper
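For readers unfamiliar with the vertex-centric model the abstract builds on, a minimal single-machine sketch of Pregel-style computation is shown below. This is not Palgol and is greatly simplified from the distributed system; the superstep loop and the connected-components compute function are standard textbook material, with all names chosen for illustration:

```python
# Minimal Pregel-style superstep loop: every active vertex runs a compute
# function, messages are delivered at the next superstep, and a vertex with
# incoming messages is (re)activated. The example compute function labels
# connected components by propagating the minimum vertex id.

def pregel(graph, compute, init):
    """graph: {vertex: [neighbor, ...]}; runs until no messages flow."""
    values = {v: init(v) for v in graph}
    inbox = {v: [] for v in graph}
    superstep = 0
    active = set(graph)                         # every vertex starts active
    while active:
        outbox = {v: [] for v in graph}
        for v in active:
            new_val, send = compute(v, values[v], inbox[v], superstep)
            values[v] = new_val
            if send:                            # message all neighbors
                for u in graph[v]:
                    outbox[u].append(new_val)
        inbox = outbox
        active = {v for v in graph if inbox[v]} # mail reactivates a vertex
        superstep += 1
    return values

def cc_compute(v, value, messages, superstep):
    # Keep the minimum label seen so far; send only on the first superstep
    # or when the label improved, so the computation converges quickly.
    new_value = min([value] + messages)
    send = superstep == 0 or new_value < value
    return new_value, send

graph = {0: [1], 1: [0, 2], 2: [1], 3: [4], 4: [3]}
labels = pregel(graph, cc_compute, init=lambda v: v)
# vertices in the same connected component end up with the same label
```

The "remote access" the abstract discusses is precisely what this message-passing-only model forbids: a vertex cannot read another vertex's value directly, only learn it from a message sent one superstep earlier.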
Unsupervised String Transformation Learning for Entity Consolidation
Data integration has been a long-standing challenge in data management with
many applications. A key step in data integration is entity consolidation. It
takes a collection of clusters of duplicate records as input and produces a
single "golden record" for each cluster, which contains the canonical value for
each attribute. Truth discovery and data fusion methods, as well as Master Data
Management (MDM) systems, can be used for entity consolidation. However, to
achieve better results, the variant values (i.e., values that are logically the
same with different formats) in the clusters need to be consolidated before
applying these methods.
For this purpose, we propose a data-driven method to standardize the variant
values based on two observations: (1) the variant values usually can be
transformed to the same representation (e.g., "Mary Lee" and "Lee, Mary") and
(2) the same transformation often appears repeatedly across different clusters
(e.g., transpose the first and last name). Our approach first uses an
unsupervised method to generate groups of value pairs that can be transformed
in the same way (i.e., they share a transformation). Then the groups are
presented to a human for verification and the approved ones are used to
standardize the data. In a real-world dataset with 17,497 records, our method
achieved 75% recall and 99.5% precision in standardizing variant values by
asking a human 100 yes/no questions, which completely outperformed a
state-of-the-art data wrangling tool.
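The grouping idea in the abstract, bucketing variant-value pairs that share one transformation so a human can verify a whole group at once, can be illustrated with a toy sketch. This is not the paper's algorithm; the token-level "signature" scheme below (source-token positions plus literal insertions) is an assumption made for the example:

```python
# Toy transformation grouping: describe each (source, target) pair by how the
# target's tokens map back to the source (reordered position or literal
# insertion), then bucket pairs with identical signatures. Pairs that share a
# signature share a transformation, e.g. "transpose first and last name".
import re
from collections import defaultdict

def signature(src, dst):
    src_tokens = re.findall(r"\w+|[^\w\s]", src)
    dst_tokens = re.findall(r"\w+|[^\w\s]", dst)
    sig = []
    for t in dst_tokens:
        if t in src_tokens:
            sig.append(("pos", src_tokens.index(t)))  # reordered source token
        else:
            sig.append(("lit", t))                    # literal insertion
    return tuple(sig)

def group_pairs(pairs):
    groups = defaultdict(list)
    for src, dst in pairs:
        groups[signature(src, dst)].append((src, dst))
    return dict(groups)

pairs = [
    ("Mary Lee", "Lee, Mary"),        # transpose first and last name
    ("John Smith", "Smith, John"),    # same transformation, other cluster
    ("IBM Corp", "IBM Corporation"),  # a different transformation
]
groups = group_pairs(pairs)           # two groups: transposition, expansion
```

A human shown the first group can approve both name transpositions with a single yes/no answer, which is the source of the small question budget the abstract reports.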
The Nature, Timing and Impact of Broadband Policies: a Panel Analysis of 30 OECD Countries
We empirically investigate the impact of a vast array of public policies on wireline broadband penetration through a novel and unique dataset covering 30 OECD countries over 1995-2010. We find that while both supply- and demand-side policies have a positive effect on broadband penetration, their relative impact depends on the actual stage of broadband diffusion. When an advanced stage is reached, only demand-side policies appear to generate a positive and increasing effect. Moreover, both technological and market competition play a positive role, and the effect of the latter shows a non-linear path along the stage of market development. Finally, the relative weight of the service sector in the national economy proves to be crucial for broadband penetration. Our analysis provides new insights into the policy debate and in particular on the rationale of a selective policy design for broadband penetration and, in perspective, for the rollout of next-generation networks.
Keywords: telecommunications policies, broadband penetration, infrastructure investments