
    Network aware P2P multimedia streaming: capacity or locality?

    P2P content providers are motivated to localize traffic within Autonomous Systems and thereby alleviate the tension with ISPs stemming from costly inter-AS traffic generated by geographically distributed P2P users. In this paper, we first present a new three-tier framework to conduct a thorough study of the impact of various capacity-aware or locality-aware neighbor selection and chunk scheduling strategies. Specifically, we propose a novel hybrid neighbor selection strategy with the flexibility to elect neighbors based on either type of network awareness with different probabilities. We find that network awareness in terms of both capacity and locality can degrade system QoS as a whole, and that capacity awareness suffers from effort-based unfairness but enables contribution-based fairness. Extensive simulations show that hybrid neighbor selection can not only promote traffic locality but also lift streaming quality, and that the crux of traffic locality promotion is active overlay construction. Based on this observation, we then propose a fully decentralized network awareness protocol equipped with hybrid neighbor selection. In realistic simulation environments, this protocol reduces inter-AS traffic from 95% to 38%, a locality performance comparable with tracker-side strategies (35%), while preserving high streaming quality. Our performance evaluation results provide valuable insights for both theoretical study of selfish topologies and the design of deployed systems. © 2011 IEEE. Published or final version. The 2011 IEEE International Conference on Peer-to-Peer Computing (P2P 2011), Kyoto, Japan, 31 August-2 September 2011. In Proceedings of P2P, 2011, p. 54-6
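    A minimal sketch of the kind of probabilistic hybrid neighbor selection described above, assuming a simple peer model: with some probability the selector prefers a peer in the local Autonomous System (locality awareness), otherwise it prefers the highest-capacity peer (capacity awareness). The probability value, peer fields, and helper names are illustrative assumptions, not the paper's actual protocol.

    import random

    def select_neighbor(candidates, p_locality=0.5, local_as=None):
        # Locality-aware branch: prefer peers inside the same AS when any exist.
        if random.random() < p_locality:
            same_as = [c for c in candidates if c["as"] == local_as]
            pool = same_as or candidates  # fall back to all candidates if none are local
            return random.choice(pool)
        # Capacity-aware branch: pick the peer with the largest upload capacity.
        return max(candidates, key=lambda c: c["upload_capacity"])

    peers = [{"id": 1, "as": 100, "upload_capacity": 512},
             {"id": 2, "as": 200, "upload_capacity": 2048},
             {"id": 3, "as": 100, "upload_capacity": 1024}]
    chosen = select_neighbor(peers, p_locality=0.7, local_as=100)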

    Net Neutrality in Canada and what it means for libraries

    Net neutrality, the idea that the Internet should be provided to all without discrimination based on content or applications, has been an important policy issue in the last few years. A lack of net neutrality could negatively impact libraries, intellectual freedom, cultural diversity, and the right to privacy. This paper looks at the issues that underlie the net neutrality debate and describes how they are shaped by the different actors concerned with the future of the Internet. Technological issues, such as traffic shaping by Internet Service Providers, and legal issues in the context of Canada's Telecommunications Act, are also addressed. Finally, the paper reviews the recent CRTC policy on Internet Traffic Management Practices.

    Trade & Cap: A Customer-Managed, Market-Based System for Trading Bandwidth Allowances at a Shared Link

    We propose Trade & Cap (T&C), an economics-inspired mechanism that incentivizes users to voluntarily coordinate their consumption of the bandwidth of a shared resource (e.g., a DSLAM link) so as to converge on what they perceive to be an equitable allocation, while ensuring efficient resource utilization. Under T&C, rather than acting as an arbiter, an Internet Service Provider (ISP) acts as an enforcer of what the community of rational users sharing the resource decides is a fair allocation of that resource. Our T&C mechanism proceeds in two phases. In the first, software agents acting on behalf of users engage in a strategic trading game in which each user agent selfishly chooses bandwidth slots to reserve in support of primary, interactive network usage activities. In the second phase, each user is allowed to acquire additional bandwidth slots in support of a presumed open-ended need for fluid bandwidth, catering to secondary applications. The acquisition of this fluid bandwidth is constrained by the remaining "buying power" of each user and by prevailing "market prices" – both of which are determined by the results of the trading phase and a desirable aggregate cap on link utilization. We present analytical results that establish the underpinnings of our T&C mechanism, including game-theoretic results pertaining to the trading phase and pricing of fluid bandwidth allocation pertaining to the capping phase. Using real network traces, we present extensive experimental results that demonstrate the benefits of our scheme, which we also show to be practical by highlighting the salient features of an efficient implementation architecture. National Science Foundation (CCF-0820138, CSR-0720604, EFRI-0735974, CNS-0524477, and CNS-0520166); Universidad Pontificia Bolivariana and COLCIENCIAS – Instituto Colombiano para el Desarrollo de la Ciencia y la Tecnología "Francisco José de Caldas"
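    A hedged illustration of the second (capping) phase described above: allocating "fluid" bandwidth slots subject to each user's remaining buying power and a market price tied to an aggregate utilization cap. The pricing rule, function name, and data layout are assumptions made for illustration only, not the paper's exact mechanism.

    def allocate_fluid_slots(remaining_budget, demanded_slots, cap_slots):
        # Assumed market-clearing price: grows with demand relative to the aggregate cap.
        total_demand = sum(demanded_slots.values())
        price = max(1.0, total_demand / cap_slots)
        allocation = {}
        for user, budget in remaining_budget.items():
            affordable = int(budget // price)  # slots the user's leftover budget can buy
            allocation[user] = min(affordable, demanded_slots[user])
        return price, allocation

    price, alloc = allocate_fluid_slots(
        remaining_budget={"alice": 10.0, "bob": 4.0},
        demanded_slots={"alice": 8, "bob": 6},
        cap_slots=10)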

    Regulatory Lessons for Internet Traffic Management from Japan, the European Union, and the United States: Toward Equity, Neutrality and Transparency

    As network neutrality has been one of the most contentious Internet public policy issues of the past decade, this article provides a comparative overview of events, policies, and legislation surrounding Internet traffic management practices (ITMPs) (e.g., network neutrality) in Japan, the European Union, the United States, and Canada. Using the frame provided by Richard Rose of “hybrid lessons” to create a policy synthesis, the paper details the telecom policy environment, Internet Service Provider competition, legislative jurisdiction, remedies for ITMPs, consumer transparency, and adherence to privacy protection in each country. The analysis focuses on Canada’s first significant regulatory effort to address network neutrality, which came during the Canadian Radio-television and Telecommunications Commission’s 2009 process on Internet traffic management. This paper presents a brief overview of the Canadian regulatory environment and the specific questions which were the subject of the CRTC review. Employing Richard Rose’s methods for comparative public policy analysis, we offer a number of regulatory “lessons” from Japan, the European Union, and the United States based on their experiences with traffic management issues. Applying these lessons to the Canadian context, we make several specific policy recommendations, among them that competition be encouraged within the Internet service provider space, that network management practices be reasonable and limited, and that ISPs provide full disclosure of network management policies and practices.

    A principled approach to network neutrality

    The issue of regulation for mandated network neutrality is currently live in both the United States and the European Union. Traditionally the models applied have been of the command-and-control or market-regulation variety. Both approaches have been extensively criticised and both have suffered setbacks in recent years. This paper suggests it is time to abandon our experiments with traditional business regulation models and move to a principled approach to network neutrality. This principled approach, based upon the rights to privacy, expression, and freedom to carry on a business, identifies the Internet as a public good which must be protected from interference if we are to fully realise its democratic potential. The proposed principled, or rights-based, approach to net neutrality would see regulations for network neutrality based in principles of fundamental rights and not business or market regulation principles. We believe this would be a radical new model for network neutrality regulation.

    Net Neutrality

    This book is available as open access through the Bloomsbury Open Access programme and is available on www.bloomsburycollections.com. ‘Chris Marsden maneuvers through the hype articulated by Network Neutrality advocates and opponents. He offers a clear-headed analysis of the high stakes in this debate about the Internet's future, and fearlessly refutes the misinformation and misconceptions that abound’ – Professor Rob Freiden, Penn State University. Net Neutrality is a very heated and contested policy principle regarding access for content providers to the Internet end-user, and potential discrimination in that access where the end-user's ISP (or another ISP) blocks that access in part or whole. The suggestion has been that the problem can be resolved either by introducing greater competition or by closely policing conditions for vertically integrated services, such as VOIP. However, that is not the whole story, and ISPs as a whole have incentives to discriminate between content for matters such as network management of spam, to secure and maintain customer experience at current levels, and for economic benefit from new Quality of Service standards. This includes offering a ‘priority lane’ on the network for premium content types such as video and voice service. The author considers market developments and policy responses in Europe and the United States, draws conclusions and proposes regulatory recommendations.

    Impact of denial of service solutions on network quality of service

    The Internet has become a universal communication network tool. It has evolved from a platform that supports best-effort traffic to one that now carries different traffic types, including those involving continuous media with quality of service (QoS) requirements. As more services are delivered over the Internet, we face increasing risk to their availability given that malicious attacks on those Internet services continue to increase. Several networks have witnessed denial of service (DoS) and distributed denial of service (DDoS) attacks over the past few years which have disrupted the QoS of network services, thereby violating the Service Level Agreement (SLA) between the client and the Internet Service Provider (ISP). Hence DoS and DDoS attacks are major threats to network QoS. In this paper we survey techniques and solutions that have been deployed to thwart DoS and DDoS attacks, and we evaluate them in terms of their impact on network QoS for Internet services. We also present vulnerabilities in QoS protocols that, if exploited, degrade QoS. In addition, we highlight challenges that still need to be addressed to achieve end-to-end QoS with recently proposed DoS/DDoS solutions.

    Reducing the Download Time in Stochastic P2P Content Delivery Networks by Improving Peer Selection

    Peer-to-peer (P2P) applications have become a popular method for obtaining digital content. Recent research has shown that the amount of time spent downloading from a poorly performing peer affects the total download duration. Current peer selection strategies attempt to limit the amount of time spent downloading from a poorly performing peer, but they do not use both advanced knowledge and service capacity after the connection has been made to aid in peer selection. Advanced knowledge has traditionally been obtained from methods that add additional overhead to the P2P network, such as polling peers for service capacity information, using round-trip-time techniques to calculate the distance between peers, and using tracker peers. This work investigated the creation of a new download strategy that replaced the random selection of peers with a method that selects server peers based on historic service capacity and ISP in order to further reduce the amount of time needed to complete a download session. The results of this new historic-based peer selection strategy show that there are benefits in using advanced knowledge to select peers and in replacing only the worst-performing peers. This new approach showed an average download duration improvement of 16.6% in the single-client simulation and an average cross-ISP traffic reduction of 55.17% when ISPs were participating in cross-ISP throttling. In the multiple-client simulation the new approach showed an average download duration improvement of 53.31% and an average cross-ISP traffic reduction of 88.83% when ISPs were participating in cross-ISP throttling. This new approach also significantly improved the consistency of the download duration between download sessions, allowing for more accurate prediction of download times.
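    A minimal sketch of a history-based peer selection rule like the one evaluated above, under assumed data structures: peers are ranked by previously observed service capacity with a bonus for sharing the client's ISP, and only the worst-performing active peer is replaced. Field names, the bonus factor, and the scoring function are illustrative assumptions, not the strategy's exact details.

    def rank_peers(candidates, history, local_isp):
        # Score peers by historic download rate, favouring peers in the same ISP.
        def score(peer):
            past_rate = history.get(peer["id"], 0)  # bytes/s observed in prior sessions
            same_isp_bonus = 1.2 if peer["isp"] == local_isp else 1.0
            return past_rate * same_isp_bonus
        return sorted(candidates, key=score, reverse=True)

    def replace_worst(active, candidates, history, local_isp):
        # Swap out only the slowest active peer for the best-ranked known candidate.
        worst = min(active, key=lambda p: history.get(p["id"], 0))
        best = rank_peers(candidates, history, local_isp)[0]
        return [p for p in active if p is not worst] + [best]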