
    Characterizing and Improving the Reliability of Broadband Internet Access

    In this paper, we empirically demonstrate the growing importance of reliability by measuring its effect on user behavior. We present an approach for broadband reliability characterization using data collected by the many emerging national initiatives to study broadband, and apply it to the data gathered by the Federal Communications Commission's Measuring Broadband America project. Motivated by our findings, we present the design, implementation, and evaluation of a practical approach for improving the reliability of broadband Internet access with multihoming. Comment: 15 pages, 14 figures, 6 tables.

    Measuring internet activity: a (selective) review of methods and metrics

    Two decades after the birth of the World Wide Web, more than two billion people around the world are Internet users. The digital landscape is littered with hints that the affordances of digital communications are being leveraged to transform life in profound and important ways. The reach and influence of digitally mediated activity grow by the day and touch upon all aspects of life, from health, education, and commerce to religion and governance. This trend demands that we seek answers to the biggest questions about how digitally mediated communication changes society and the role of different policies in helping or hindering the beneficial aspects of these changes. Yet despite the profusion of data the digital age has brought upon us—we now have access to a flood of information about the movements, relationships, purchasing decisions, interests, and intimate thoughts of people around the world—the distance between the great questions of the digital age and our understanding of the impact of digital communications on society remains large. A number of ongoing policy questions have emerged that beg for better empirical data and analyses upon which to base wider and more insightful perspectives on the mechanics of social, economic, and political life online. This paper seeks to describe the conceptual and practical impediments to measuring and understanding digital activity and highlights a sample of the many efforts to fill the gap between our incomplete understanding of digital life and the formidable policy questions related to developing a vibrant and healthy Internet that serves the public interest and contributes to human wellbeing. Our primary focus is on efforts to measure Internet activity, as we believe obtaining robust, accurate data is a necessary and valuable first step that will lead us closer to answering the vitally important questions of the digital realm.
Even this step is challenging: the Internet is difficult to measure and monitor, and there is no simple aggregate measure of Internet activity—no GDP, no HDI. In the following section we present a framework for assessing efforts to document digital activity. The next three sections offer a summary and description of many of the ongoing projects that document digital activity, with two final sections devoted to discussion and conclusions.

    From BGP to RTT and Beyond: Matching BGP Routing Changes and Network Delay Variations with an Eye on Traceroute Paths

    Many organizations have the mission of assessing the quality of broadband access services offered by Internet Service Providers (ISPs). They deploy network probes that periodically perform network measurements towards selected Internet services. By analyzing the data collected by the probes it is often possible to obtain a reasonable estimate of the bandwidth made available by the ISP. However, it is much more difficult to use such data to explain who is responsible for fluctuations in other network qualities. This is especially true for latency, which is fundamental to many of today's network services. On the other hand, there are many publicly accessible BGP routers that collect the history of routing changes and that are good candidates for understanding whether latency fluctuations depend on interdomain routing. In this paper we provide a methodology that, given a probe located inside the network of an ISP that performs latency measurements, and given a set of publicly accessible BGP routers located inside the same ISP, decides which routers are the best candidates (if any) for studying the relationship between variations in network performance recorded by the probe and interdomain routing changes. We validate the methodology with experimental studies based on data gathered by the RIPE NCC, an organization well known to be independent, which publishes both BGP data within the Routing Information Service (RIS) and probe measurement data within the Atlas project.
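    The core matching idea—checking whether latency spikes recorded by a probe coincide in time with routing changes seen at a BGP router—can be sketched as follows. This is an illustrative simplification, not the paper's actual methodology: the data structures, the median-based spike detector, and the time window are all assumptions.

    ```python
    from bisect import bisect_left

    def match_latency_to_routing(rtt_samples, bgp_updates, window=300, spike_factor=2.0):
        """Flag RTT spikes that fall within `window` seconds of a BGP routing
        change observed at a candidate router. rtt_samples: (timestamp, rtt_ms)
        pairs; bgp_updates: sorted timestamps of AS-path changes."""
        rtts = sorted(r for _, r in rtt_samples)
        baseline = rtts[len(rtts) // 2]          # median RTT as a crude baseline
        matches = []
        for ts, rtt in rtt_samples:
            if rtt >= spike_factor * baseline:   # latency spike
                i = bisect_left(bgp_updates, ts)
                # distance to the nearest routing change before or after the spike
                near = min((abs(ts - u) for u in bgp_updates[max(0, i - 1):i + 1]),
                           default=None)
                matches.append((ts, rtt, near is not None and near <= window))
        return matches
    ```

    Each returned triple records a spike and whether a routing change was close enough in time to be a plausible explanation; spikes with no nearby routing change point to causes other than interdomain routing.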

    Broadbanding Brunswick: High-speed broadband and household media ecologies

    New research from the University of Melbourne and Swinburne University has found that 82% of households in the NBN first release site of Brunswick, Victoria, think the NBN is a good idea. The study, Broadbanding Brunswick: High-speed Broadband and Household Media Ecologies, examines the take-up, use, and implications of high-speed broadband for some of its earliest adopters. It looks at how the adoption of high-speed broadband influences household consumption patterns and use of telecommunications services. The survey of 282 Brunswick households found there had been a significant uptake of the NBN during the course of the research. In 2011, 20% of households were connected to the NBN; by 2012 that number had risen to 34%. Families, home owners, higher income earners, and teleworkers were most likely to adopt the NBN. Many NBN users reported paying less for their monthly internet bills, with 49% paying about the same. In many cases those paying more (37%) had elected to do so.

    Survey of End-to-End Mobile Network Measurement Testbeds, Tools, and Services

    Mobile (cellular) networks enable innovation, but can also stifle it and lead to user frustration when network performance falls below expectations. As mobile networks become the predominant method of Internet access, the developer, research, network operator, and regulatory communities have taken an increased interest in measuring end-to-end mobile network performance to, among other goals, minimize negative impact on application responsiveness. In this survey we examine current approaches to end-to-end mobile network performance measurement, diagnosis, and application prototyping. We compare available tools and their shortcomings with respect to the needs of researchers, developers, regulators, and the public. We intend for this survey to provide a comprehensive view of currently active efforts and some auspicious directions for future work in mobile network measurement and mobile application performance evaluation. Comment: Submitted to IEEE Communications Surveys and Tutorials. arXiv does not format the URL references correctly. For a correctly formatted version of this paper go to http://www.cs.montana.edu/mwittie/publications/Goel14Survey.pdf

    The NeuViz Data Visualization Tool for Visualizing Internet-Measurements Data

    In this paper we present NeuViz, a data processing and visualization architecture for network measurement experiments. NeuViz has been tailored to work on the data produced by Neubot (Net Neutrality Bot), an Internet bot that performs periodic, active network performance tests. We show that NeuViz is an effective tool for navigating Neubot data to identify cases (to be investigated with more specific network tests) in which a protocol appears to be discriminated against. We also suggest how the information provided by the NeuViz Web API can help to automatically detect such cases, to raise warnings or trigger more specific tests.
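    The kind of automatic detection the abstract alludes to can be sketched as a simple heuristic: compare each protocol's median throughput against the overall median. This is a deliberately crude assumption of ours, not NeuViz's actual logic; the function name, threshold, and sample format are hypothetical, and a real tool would also control for time of day, server load, and access-link capacity before raising a warning.

    ```python
    def flag_possible_discrimination(samples, ratio_threshold=0.7, min_samples=5):
        """Given (protocol, throughput_kbit_s) measurements, flag protocols
        whose median throughput falls well below the overall median."""
        def median(xs):
            xs = sorted(xs)
            n = len(xs)
            return xs[n // 2] if n % 2 else (xs[n // 2 - 1] + xs[n // 2]) / 2
        by_proto = {}
        for proto, tput in samples:
            by_proto.setdefault(proto, []).append(tput)
        overall = median([t for _, t in samples])
        return [p for p, ts in by_proto.items()
                if len(ts) >= min_samples and median(ts) < ratio_threshold * overall]
    ```

    A flagged protocol is only a candidate for closer inspection, which matches the abstract's workflow of raising warnings that trigger more specific tests.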

    CLOSER: A Collaborative Locality-aware Overlay SERvice

    Current Peer-to-Peer (P2P) file sharing systems consume a considerable share of Internet Service Providers' (ISPs') bandwidth. This paper presents the Collaborative Locality-aware Overlay SERvice (CLOSER), an architecture that aims at lessening the usage of expensive international links by exploiting traffic locality (i.e., a resource is downloaded from within the ISP's network whenever possible). The paper proves the effectiveness of CLOSER by analysis and simulation, also comparing this architecture with existing solutions for traffic locality in P2P systems. While savings on international links can be attractive for ISPs, it is necessary to offer features of interest to users to favor wide adoption of the application. For this reason, CLOSER also introduces a privacy module that may arouse users' interest and encourage them to switch to the new architecture.
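    Traffic locality of this kind reduces, in its simplest form, to a peer-selection preference: try peers inside the same ISP before remote ones. The sketch below is our illustration of that general idea, not CLOSER's actual mechanism; identifying an ISP by AS number and the function signature are assumptions.

    ```python
    def prefer_local_peers(candidates, my_asn, limit=10):
        """Order candidate peers so that those inside the same ISP (identified
        here by AS number) are contacted first, falling back to remote peers.
        candidates: list of (peer_id, asn) pairs."""
        local = [p for p in candidates if p[1] == my_asn]
        remote = [p for p in candidates if p[1] != my_asn]
        return (local + remote)[:limit]
    ```

    Downloads then flow over intra-ISP links whenever a local copy exists, which is exactly the saving on international transit links the abstract describes.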

    Discovering users with similar internet access performance through cluster analysis

    Users typically subscribe to an Internet access service on the basis of a specific download speed, but the actual service may differ. Several projects are actively collecting Internet access performance measurements on a large scale at the end-user location. However, less attention has been devoted to analyzing such data and to informing users about the service they receive. This paper presents MiND, a cluster-based methodology to analyze the characteristics of periodic Internet measurements collected at the end-user location. MiND discovers (i) groups of users with similar Internet access behavior and (ii) the (few) users with anomalous service. User measurements over time have been modeled through histograms and then analyzed through a new two-level clustering strategy. MiND has been evaluated on real data collected by Neubot, an open source tool, voluntarily installed by users, that periodically collects Internet measurements. Experimental results show that the majority of users can be grouped into homogeneous and cohesive clusters according to the Internet access service that they receive in practice, while the few users receiving anomalous services are correctly identified as outliers. Both users and ISPs can benefit from such information: users can constantly monitor the service their ISP delivers, whereas ISPs can quickly identify anomalous behaviors in their offered services and act accordingly.
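    The general idea—represent each user as a speed histogram, cluster the histograms, and flag users far from every cluster as outliers—can be sketched as below. This is a minimal illustration assuming plain k-means with farthest-point seeding and a distance-based outlier rule; MiND's actual two-level strategy and parameters differ.

    ```python
    def dist2(a, b):
        """Squared Euclidean distance between two equal-length vectors."""
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def mean(points):
        n = len(points)
        return tuple(sum(xs) / n for xs in zip(*points))

    def cluster_users(histograms, k=2, iters=20, outlier_factor=3.0):
        """Cluster per-user speed histograms (fixed-length, normalized tuples)
        with k-means, then flag as outliers the users whose distance to the
        nearest centroid greatly exceeds the average distance."""
        # deterministic farthest-point seeding
        centers = [histograms[0]]
        while len(centers) < k:
            centers.append(max(histograms,
                               key=lambda h: min(dist2(h, c) for c in centers)))
        for _ in range(iters):
            groups = [[] for _ in range(k)]
            for h in histograms:
                groups[min(range(k), key=lambda i: dist2(h, centers[i]))].append(h)
            centers = [mean(g) if g else centers[i] for i, g in enumerate(groups)]
        dists = [min(dist2(h, c) for c in centers) for h in histograms]
        avg = sum(dists) / len(dists)
        labels = [min(range(k), key=lambda i: dist2(h, centers[i])) for h in histograms]
        outliers = [i for i, d in enumerate(dists) if d > outlier_factor * avg]
        return labels, outliers
    ```

    Users in the same cluster receive a similar service in practice, while the flagged indices correspond to the anomalous services the abstract mentions.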