
    Protocol Layering and Internet Policy

    An architectural principle known as protocol layering is widely recognized as one of the foundations of the Internet’s success. In addition, some scholars and industry participants have urged using the layers model as a central organizing principle for regulatory policy. Despite its importance as a concept, a comprehensive analysis of protocol layering and its implications for Internet policy has yet to appear in the literature. This Article attempts to correct this omission. It begins with a detailed description of the way the five-layer model developed, introducing protocol layering’s central features, such as the division of functions across layers, information hiding, peer communication, and encapsulation. It then discusses the model’s implications for whether particular functions are performed at the edge or in the core of the network, contrasts the model with the way that layering has been depicted in the legal commentary, and analyzes attempts to use layering as a basis for competition policy. Next the Article identifies certain emerging features of the Internet that are placing pressure on the layered model, including WiFi routers, network-based security, modern routing protocols, and wireless broadband. These developments illustrate how every architecture inevitably limits both functionality and the architecture’s ability to evolve over time in response to changes in the technological and economic environment. Together these considerations support adopting a more dynamic perspective on layering and caution against using layers as a basis for a regulatory mandate, for fear of cementing the existing technology into place in a way that prevents the network from innovating and evolving in response to shifts in the underlying technology and consumer demand.
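    For readers unfamiliar with the layering features the Article describes, the sketch below is a hypothetical Python illustration (not taken from the Article) of encapsulation and information hiding in the five-layer model: each layer wraps the payload handed down by the layer above in its own header and, on receipt, reads only its own header.

```python
# Hypothetical illustration of layered encapsulation. Layer names follow the
# five-layer model; headers here are plain byte strings standing in for real
# protocol headers.

LAYERS = ["application", "transport", "network", "link", "physical"]

def encapsulate(payload: bytes) -> bytes:
    """Wrap a payload in one header per layer, working from the top layer down."""
    for layer in LAYERS:
        header = f"[{layer}-hdr]".encode()
        payload = header + payload  # lower layers treat everything above as opaque data
    return payload

def decapsulate(frame: bytes) -> bytes:
    """Strip headers from the outside in; each layer reads only its own header."""
    for layer in reversed(LAYERS):
        header = f"[{layer}-hdr]".encode()
        assert frame.startswith(header), f"unexpected header at the {layer} layer"
        frame = frame[len(header):]
    return frame

if __name__ == "__main__":
    wire = encapsulate(b"GET /index.html")
    print(wire)               # headers nested from physical (outermost) to application
    print(decapsulate(wire))  # original payload recovered: b'GET /index.html'
```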

    The role of non-state actors in regime formation: Case study on Internet governance.

    Many scholars argue that the Internet is a symbol of globalization and avoidance of state control. The Internet governance negotiations, which aim to establish an international regime for the Internet, are conducted through a multi-stakeholder setting with extensive involvement of non-state actors. This has been viewed as an indicator of a 'diminishing state role' in international relations, particularly in the formation of international regimes. This study indicates that the role of states does not diminish in regime formation. States, especially great powers, are the main actors that set international principles, norms, rules and decision-making procedures. They create regimes in order to regulate international behavior in global sectors, including the Internet. States deliberately enable certain non-state actors to participate in regime formation and the governance of some global sectors, based on a conscious perception of the utility of such participation.

    Virtualization to build large scale networks

    There is not much research concerning network virtualization, even though virtualization has been a hot topic for some time and networks keep growing. Physical routers can be expensive and laborious to set up and manage, not to mention immobile. Network virtualization can be utilized in many ways, such as reducing costs, increasing agility and increasing deployment speed. Virtual routers are easy to create, copy and move. This study examines networks, virtualization solutions and network virtualization. Furthermore, it shows how to build a virtual network consisting of hundreds of nodes, all performing network routing. In addition, the virtual network can be connected to physical routers in the real world to provide benefits, such as performance testing or large-scale deployment. All this is achieved using only commodity hardware.
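    As a rough illustration of the kind of lightweight virtual nodes such a study describes, the sketch below (not taken from the thesis; it assumes a Linux host with the iproute2 `ip` tool and root privileges, and uses an illustrative node count and addressing plan) creates virtual "routers" as network namespaces and chains them together with veth pairs.

```python
# Hypothetical sketch: build a small chain of virtual routers out of Linux
# network namespaces on commodity hardware. Requires iproute2 and root.
import subprocess

def sh(cmd: str) -> None:
    """Run a shell command, raising an error if it fails."""
    subprocess.run(cmd, shell=True, check=True)

NODES = 4  # scale toward "hundreds of nodes" as host resources allow

# One namespace per virtual router.
for i in range(NODES):
    sh(f"ip netns add node{i}")
    sh(f"ip -n node{i} link set lo up")

# Link neighbouring nodes with veth pairs and give each end an address.
for i in range(NODES - 1):
    sh(f"ip link add veth{i}a type veth peer name veth{i}b")
    sh(f"ip link set veth{i}a netns node{i}")
    sh(f"ip link set veth{i}b netns node{i + 1}")
    sh(f"ip -n node{i} addr add 10.0.{i}.1/24 dev veth{i}a")
    sh(f"ip -n node{i + 1} addr add 10.0.{i}.2/24 dev veth{i}b")
    sh(f"ip -n node{i} link set veth{i}a up")
    sh(f"ip -n node{i + 1} link set veth{i}b up")

# Enable IPv4 forwarding so each namespace actually routes packets.
for i in range(NODES):
    sh(f"ip netns exec node{i} sysctl -w net.ipv4.ip_forward=1")
```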

    Leaders for Manufacturing Program electronic mail network

    Thesis (B.S.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1989. Includes bibliographical references. By James Chung-Yo Yao. B.S.

    Bandwidth is political: Reachability in the public internet

    The global public Internet faces a growing but little studied threat from the use of intrusive traffic management practices by both wholesale and retail Internet service providers. Unlike research concerned with bandwidth and traffic growth, this study shifts the risk analysis away from capacity issues to focus on performance standards for interconnection and data reachability. The long-term health of the Internet is framed in terms of “data reachability” – the principle that any end-user can reach any part of the Internet without encountering arbitrary actions on the part of a network operator that might block or degrade transmission. Risks to reachability are framed in terms of both systematic traffic management practices and “de-peering,” a more aggressive tactic practised by Tier-1 network operators to resolve disputes or punish rivals. De-peering is examined as an extension of retail network management practices that include the growing use of deep packet inspection (DPI) technology for traffic-shaping. De-peering can also be viewed as a close relative of Net Neutrality, to the extent that both concepts reflect arbitrary practices that interfere with the reliable flow of data packets across the Internet. In jurisdictional terms, however, de-peering poses a qualitatively different set of risks to stakeholders and end-users, as well as qualitatively different challenges to policymakers. It is argued here that risks of data unreachability represent the next stage in debates about the health and sustainability of the global Internet. The study includes a detailed examination of the development of the Internet’s enabling technologies; the evolution of telecommunications regulation in Canada and the United States, and its impact on Internet governance; and an analysis of the role played by commercialization and privatization in the growth of risks to data reachability.
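    To make the notion of "data reachability" concrete, the sketch below is an illustrative Python probe (not part of the study; the endpoints and ports are placeholders) that simply attempts TCP connections to a set of destinations and reports which ones can be reached from the user's vantage point.

```python
# Illustrative sketch of a crude reachability probe: try a TCP connection to
# each endpoint and report the outcome. A failure only shows that the path is
# blocked or degraded from this vantage point, not why.
import socket

ENDPOINTS = [("example.com", 80), ("example.org", 443)]  # placeholder targets

def reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for host, port in ENDPOINTS:
    status = "reachable" if reachable(host, port) else "unreachable"
    print(f"{host}:{port} is {status}")
```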