    The Political Nature of TCP/IP

    Despite the importance of the Internet in the modern world, many users and even policy makers lack the historical and technical grasp of the technology behind it that informed decisions require. In the spirit of addressing this issue, this thesis attempts to shed light on the historical, political, and technical context of TCP/IP, the Internet protocol suite, a primary piece of Internet architecture with a well-documented history. After a technical overview detailing the main functions of TCP/IP, I examine aspects of the social and developmental record of this technology using STS theoretical approaches such as Hughesian systems theory, the Social Construction of Technology (SCOT), and Langdon Winner’s brand of technological determinism. Key points in TCP/IP’s evolution, when viewed from an STS perspective, illuminate the varied reasons behind decisions in the development of the technology. For example, as detailed in this paper, both technical and political motivations lay behind the architectural politics built into TCP/IP in the 1970s, and similar motivations spurred the rejection of the OSI protocols by Internet developers two decades later. Armed with the resulting contextual understanding of previous TCP/IP developments, a few possible directions (both political and technical) in contemporary and future Internet development are then explored, such as the slow migration to IPv6 and the meaning of network neutrality.

    Protocol Layering and Internet Policy

    An architectural principle known as protocol layering is widely recognized as one of the foundations of the Internet’s success. In addition, some scholars and industry participants have urged using the layers model as a central organizing principle for regulatory policy. Despite its importance as a concept, a comprehensive analysis of protocol layering and its implications for Internet policy has yet to appear in the literature. This Article attempts to correct this omission. It begins with a detailed description of the way the five-layer model developed, introducing protocol layering’s central features, such as the division of functions across layers, information hiding, peer communication, and encapsulation. It then discusses the model’s implications for whether particular functions are performed at the edge or in the core of the network, contrasts the model with the way that layering has been depicted in the legal commentary, and analyzes attempts to use layering as a basis for competition policy. Next, the Article identifies certain emerging features of the Internet that are placing pressure on the layered model, including WiFi routers, network-based security, modern routing protocols, and wireless broadband. These developments illustrate how every architecture inevitably limits both functionality and the architecture’s ability to evolve over time in response to changes in the technological and economic environment. Together these considerations support adopting a more dynamic perspective on layering and caution against using layers as the basis for a regulatory mandate, for fear of cementing the existing technology into place in a way that prevents the network from innovating and evolving in response to shifts in the underlying technology and consumer demand.
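The central features the abstract names (division of functions across layers, information hiding, peer communication, encapsulation) can be made concrete with a small sketch. This is a minimal illustration of my own, not code from the Article; the five layer names follow the common textbook model, and the string "headers" are purely illustrative.

```python
# Each layer prepends its own header on the way down the stack and
# strips only that header on the way up, so a layer communicates with
# its peer without inspecting the headers of the layers above or below.

LAYERS = ["application", "transport", "network", "link", "physical"]

def encapsulate(payload: bytes) -> bytes:
    """Wrap the payload with one header per layer, top layer first."""
    for layer in LAYERS:
        payload = f"[{layer}]".encode() + payload
    return payload  # outermost header belongs to the lowest layer

def decapsulate(frame: bytes) -> bytes:
    """Strip headers bottom-up; each layer reads only its own header."""
    for layer in reversed(LAYERS):
        header = f"[{layer}]".encode()
        if not frame.startswith(header):
            raise ValueError(f"{layer} header missing: peer mismatch")
        frame = frame[len(header):]
    return frame

msg = b"GET /index.html"
assert decapsulate(encapsulate(msg)) == msg
```

The information hiding is visible in `decapsulate`: the network layer, say, never parses the transport header, which is exactly why an individual layer can be redesigned without disturbing its neighbours.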

    Modularity Theory and Internet Regulation

    Modularity is often cited as one of the foundations for the Internet’s success. Unfortunately, academic discussions about modularity appearing in the literature on Internet policy are undertheorized. The persistence of nonmodular architectures for some technologies underscores the need for some theoretical basis for determining when modularity is the preferred approach. Even when modularity is desirable, theory must provide some basis for making key design decisions, such as the number of modules, the location of the interfaces between the modules, and the information included in those interfaces. The literature on innovation indicates that modules should be determined by the nature of task interdependencies and the variety inherent in the external environment. Moreover, modular design structures interfaces to ensure that modules operate independently, hiding within each module all information about processes that adjacent modules should not take into account. These insights in turn offer a number of important implications. They mark a return to a more technological vision of vertical integration that deviates from the transaction-cost-oriented vision that now dominates the literature. They also reveal how modularity necessarily limits the functionality of any particular architecture. In addition, although the independence fostered by modularity remains one of its primary virtues, it can also create coordination problems in which actors operating within each module optimize based on local conditions in ways that can lead to suboptimal outcomes for the system as a whole. Lastly, like any design hierarchy, modular systems can resist technological change. These insights shed new light on unbundling of telecommunications networks, network neutrality, calls for open APIs, and clean-slate redesign proposals.
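The design principle the abstract describes, modules that interact only through a narrow interface while internals stay hidden, can be sketched in a few lines. This is my own illustrative example, not the Article's; the `Codec` interface and both implementations are hypothetical.

```python
# Two modules satisfy the same interface contract with different
# internal processes; a client written against the interface works
# with either, which is the substitutability that modularity buys.

from abc import ABC, abstractmethod

class Codec(ABC):
    """The interface: the only information visible to adjacent modules."""
    @abstractmethod
    def encode(self, text: str) -> bytes: ...
    @abstractmethod
    def decode(self, data: bytes) -> str: ...

class Utf8Codec(Codec):
    def encode(self, text: str) -> bytes:
        return text.encode("utf-8")
    def decode(self, data: bytes) -> str:
        return data.decode("utf-8")

class ReversedCodec(Codec):
    # A different hidden process behind the identical interface.
    def encode(self, text: str) -> bytes:
        return text[::-1].encode("utf-8")
    def decode(self, data: bytes) -> str:
        return data.decode("utf-8")[::-1]

def round_trip(codec: Codec, text: str) -> str:
    """A client module: ignorant of, and unaffected by, codec internals."""
    return codec.decode(codec.encode(text))

assert round_trip(Utf8Codec(), "layering") == "layering"
assert round_trip(ReversedCodec(), "layering") == "layering"
```

The same sketch also hints at the abstract's caution: whatever the interface does not expose (here, the internal byte layout) is unavailable to other modules, so the interface simultaneously enables independence and limits functionality.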

    Cross-layer energy optimisation of routing protocols in wireless sensor networks

    Recent technological developments in embedded systems have led to the emergence of a new class of networks, known as Wireless Sensor Networks (WSNs), where individual nodes cooperate wirelessly with each other with the goal of sensing and interacting with the environment. Many routing protocols have been developed to meet the unique and challenging characteristics of WSNs (notably very limited power resources to sustain an expected lifetime of perhaps years, and the restricted computation, storage and communication capabilities of nodes that are nonetheless required to support large networks and diverse applications). No standards for routing have yet been developed for WSNs, nor has any protocol gained a dominant position among the research community. Routing has a significant influence on the overall WSN lifetime, and providing an energy-efficient routing protocol remains an open problem. This thesis addresses the issue of designing WSN routing methods that feature energy efficiency. A common time reference across nodes is required in most WSN applications. It is needed, for example, to time-stamp sensor samples and for duty cycling of nodes. Also, many routing protocols require that nodes communicate according to some predefined schedule. However, independent distribution of the time information, without considering the routing algorithm schedule or network topology, may lead to a failure of the synchronisation protocol. This was confirmed empirically, and was shown to result in loss of connectivity. This can be avoided by integrating the synchronisation service into the network layer with a so-called cross-layer approach. This approach introduces interactions between the layers of a conventional layered network stack, so that the routing layer may share information with other layers. I explore whether energy efficiency can be enhanced through the use of cross-layer optimisations and present three novel cross-layer routing algorithms.
The first protocol, designed for hierarchical, cluster-based networks and called CLEAR (Cross Layer Efficient Architecture for Routing), uses the routing algorithm to distribute time information which can be used for efficient duty cycling of nodes. The second method, called RISS (Routing Integrated Synchronization Service), integrates time synchronization into the network layer and is designed to work well in flat, non-hierarchical network topologies. The third method, called SCALE (Smart Clustering Adapted LEACH), addresses the influence of the intra-cluster topology on the energy dissipation of nodes. I also investigate the impact of the hop distance on network lifetime and propose a method of determining the optimal location of the relay node (the node through which data is routed in a two-hop network). I also address the problem of predicting the transition region (the zone separating the region where all packets can be received from that where no data can be received) and I describe a way of preventing the forwarding of packets through relays belonging in this transition region. I implemented and tested the performance of these solutions in simulations and also deployed these routing techniques on sensor nodes using TinyOS. I compared the average power consumption of the nodes and the precision of time synchronization with the corresponding parameters of a number of existing algorithms. All proposed schemes extend the network lifetime, and due to their lightweight architecture they are very efficient on WSN nodes with constrained resources. Hence it is recommended that a cross-layer approach should be a feature of any routing algorithm for WSNs.
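The relay-placement question the abstract raises can be illustrated with a simple sketch. This is not the thesis's own formulation: it assumes a generic path-loss model in which transmission energy grows as distance raised to an exponent alpha, and finds the relay position that minimises the total energy of a two-hop route by searching candidate positions along the source-sink line.

```python
# Assumed model: energy for one hop of length d is proportional to
# d ** alpha (alpha is typically 2-4 for radio propagation). The
# source sits at position 0 and the sink at position `dist`.

def two_hop_energy(x: float, dist: float, alpha: float = 2.0) -> float:
    """Total energy: source -> relay at x, then relay -> sink."""
    return x ** alpha + (dist - x) ** alpha

def best_relay_position(dist: float, alpha: float = 2.0,
                        steps: int = 1000) -> float:
    """Grid search over candidate relay positions on the line."""
    candidates = [dist * i / steps for i in range(steps + 1)]
    return min(candidates, key=lambda x: two_hop_energy(x, dist, alpha))

# Under this symmetric model the midpoint is optimal, which is why an
# off-centre relay (or one in the unreliable transition region) costs
# the network lifetime that the thesis's methods try to preserve.
assert abs(best_relay_position(100.0) - 50.0) < 0.2
```

A real deployment would also fold in reception costs and the packet-loss behaviour of the transition region, but the convexity shown here is the core of why relay position matters.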

    Sentence-based information retrieval for biomedical texts

    Master’s in Computer and Telematics Engineering. Modern advances in experimental methods and high-throughput technology in the biomedical domain are driving fast-paced growth in the volume of published scientific literature in the field. While a myriad of structured repositories for biological knowledge have been sprouting over the last decades, Information Retrieval (IR) systems, or search engines, are increasingly replacing them. IR systems are easier to use due to their flexibility and their ability to interpret user needs expressed as queries, typically formed by a few words. Traditional document retrieval systems return entire documents, which may require a lot of subsequent reading to find the specific information sought, even though it is frequently contained in a small passage of only a few sentences. Additionally, IR often fails to find what is wanted because the words used in the query, despite being semantically aligned with relevant sources, are lexically different from the words those sources use. This thesis focuses on the development of sentence-based information retrieval approaches that, for a given user query, seek relevant sentences from the scientific literature that answer the user's information need. The presented work is two-fold. First, exploratory research experiments were conducted to identify features of informative sentences in biomedical texts, using a supervised machine learning method. Second, an information retrieval system for informative sentences was developed. It supports free-text and concept-based queries; search results are enriched with annotations of relevant concepts, and sentences can be ranked using multiple configurable strategies.
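The core idea of sentence-level retrieval can be sketched briefly. This toy example is my own illustration, not the dissertation's system: it splits documents into sentences, weights query terms by a standard inverse-document-frequency heuristic, and returns the highest-scoring sentences; the sample corpus is invented.

```python
# Sentence-based retrieval: rank individual sentences, not whole
# documents, so the user lands directly on the informative passage.

import re
from math import log

def sentences(text: str) -> list[str]:
    """Naive sentence splitter on terminal punctuation."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def tokens(text: str) -> set[str]:
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def rank(query: str, corpus: list[str], top_k: int = 3) -> list[str]:
    sents = [s for doc in corpus for s in sentences(doc)]
    n = len(sents)
    # Rarer terms weigh more (inverse document frequency over sentences).
    idf = {t: log(n / (1 + sum(t in tokens(s) for s in sents)))
           for s in sents for t in tokens(s)}
    q = tokens(query)
    scored = sorted(sents,
                    key=lambda s: -sum(idf.get(t, 0.0) for t in q & tokens(s)))
    return scored[:top_k]

docs = ["BRCA1 mutations raise breast cancer risk. The gene was mapped in 1994.",
        "Aspirin inhibits platelet aggregation. It is widely prescribed."]
assert "BRCA1" in rank("BRCA1 cancer risk", docs, top_k=1)[0]
```

The dissertation's system goes further, with concept annotations and configurable ranking strategies layered on top, but the sentence-as-retrieval-unit decision sketched here is what distinguishes it from traditional document retrieval.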