    The Origins of ccTLD Policymaking

    Extract: A long time ago in a galaxy not so far away, there was a decentralized global network of computers. These computers shared information with each other regardless of how far apart they were and whether there was any direct line of communication between them. In the very beginning, this network was used exclusively by government and military agencies, educational and research institutions, government contractors, scientists, and technology specialists. Instead of the domain names we use today, such as “www.amazon.com,” users typed in numeric addresses, such as “123.45.67.89,” and, later, host names to send information to other computers. This network soon expanded, and domain names became a practical necessity, for at least two reasons. First, alphanumeric text is generally easier for humans to remember than numeric addresses. Second, as Internet traffic increases and computer systems are reconfigured, the computer server used for a particular Web site may change from time to time. In fact, some busy Web sites might use multiple servers that take turns answering requests directed to a single domain name. While the Web site owner (or his or her technical staff) might know internally to which numeric address the Web site corresponds at a particular moment, the general public does not. Domain names are therefore needed for identification purposes.
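
    The indirection described in this extract can be observed directly with a resolver lookup. The following minimal Python sketch (an illustration added here, not part of the extract; the hostname and the use of the standard socket module are arbitrary choices) resolves a single domain name to whatever numeric addresses currently answer for it, which is exactly why the public can keep typing one name while the servers behind it change or rotate.

        # Resolve a domain name to the numeric addresses currently serving it.
        # "example.com" is a stand-in for any site; a busy site may return
        # several addresses that take turns answering requests.
        import socket

        def resolve(hostname: str) -> list[str]:
            """Return the distinct IPv4 addresses behind a hostname right now."""
            infos = socket.getaddrinfo(hostname, 80, family=socket.AF_INET,
                                       type=socket.SOCK_STREAM)
            return sorted({info[4][0] for info in infos})

        if __name__ == "__main__":
            print(resolve("example.com"))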

    SOVEREIGN DOMAINS: A Declaration of Independence of ccTLDs from Foreign Control

    In the year 2000, the Governmental Advisory Committee (“GAC”) of the Internet Corporation for Assigned Names and Numbers (“ICANN”) passed a set of principles that essentially claimed national sovereignty over country code top-level domains (“ccTLDs”) such as .us, .ca, .uk and .au. Shortly thereafter, ICANN redelegated several ccTLDs in accordance with the new GAC principles. Despite the outcry accompanying the passage of these principles and ICANN’s self-professed adherence thereto, the entire exercise could easily be criticized as merely symbolic because of the overriding power of ICANN in the operation of the Domain Name System (“DNS”). Indeed, Stuart Lynn, ICANN’s current president, summed up the lack of power that ccTLDs have within the governance structure of the Internet when he opined that “ICANN could, in theory, recommend that a particular ccTLD be redelegated to a cooperating administrator. If the United States government accepted that recommendation, non-cooperating ccTLD administrators would be replaced.”

    The role of non-state actors in regime formation: Case study on Internet governance.

    Many scholars argue that the Internet is a symbol of globalization and of the avoidance of state control. The Internet governance negotiations, which aim to establish an international regime for the Internet, are conducted in a multi-stakeholder setting with extensive involvement of non-state actors. This has been viewed as an indicator of a ‘diminishing state role’ in international relations, particularly in the formation of international regimes. This study indicates that the role of states does not diminish in regime formation. States, especially great powers, are the main actors that set international principles, norms, rules, and decision-making procedures. They create regimes in order to regulate international behavior in global sectors, including the Internet. States deliberately enable certain non-state actors to participate in regime formation and in the governance of some global sectors, based on a conscious assessment of the utility of such participation.

    Antitrust analysis of dominance in the Internet governance

    Paolo Piacentini, Claudio De Vincenti, Giuseppe De Arcangeli

    Internet Governance in the Global South: History, Theory, and Contemporary Debates


    Negotiating Internet Governance

    What is at stake in how the Internet continues to evolve is the preservation of its integrity as a single network. In practice, its governance is neither centralised nor unitary; it is piecemeal and fragmented, with authoritative decision-making coming from different sources simultaneously: governments, businesses, international organisations, technical and academic experts, and civil society. Historically, the conditions for their interaction were rarely defined beyond basic technical coordination, due at first to the academic freedom granted to the researchers developing the network and, later on, to the sheer impossibility of controlling mushrooming Internet initiatives. Today, the search for global norms and rules for the Internet continues, be it for cybersecurity or artificial intelligence, amid processes fostering the supremacy of national approaches or the vitality of a pluralist environment with various stakeholders represented. This book provides an incisive analysis of the emergence and evolution of global Internet governance, unpacking the complexity of more than 300 governance arrangements, influential debates and political negotiations over four decades. Highly accessible, this book breaks new ground through a wide empirical exploration and a new conceptual approach to governance enactment in global issue domains. A tripartite framework is employed for revealing power dynamics, relying on: a) an extensive database of mechanisms of governance for the Internet at the global and regional level; b) an in-depth analysis of the evolution of actors and priorities over time; and c) a key set of dominant practices observed in the Internet governance communities. It explains continuity and change in Internet-related negotiations, opening up new directions for thinking and acting in this field.

    The Managing Lawmaker in Cyberspace: A Power Model


    A framework for malicious host fingerprinting using distributed network sensors

    Numerous software agents exist and are responsible for the increasing volumes of malicious traffic observed on the Internet today. From a technical perspective, the existing techniques for monitoring malicious agents and traffic were not developed to allow for the interrogation of the source of malicious traffic. This interrogation, or reconnaissance, constitutes active analysis as opposed to the existing, mostly passive analysis. Unlike passive analysis, active techniques are time-sensitive and their results become increasingly inaccurate as the time delta between observation and interrogation increases. In addition, some studies have shown that the geographic separation of hosts on the Internet has resulted in pockets of different malicious agents and traffic targeting victims. As such, it is important to perform any kind of data collection from various sources and across distributed IP address space. The data gathering and exposure capabilities of sensors such as honeypots and network telescopes were extended through the development of near-realtime Distributed Sensor Network modules that allowed for the near-realtime analysis of malicious traffic from distributed, heterogeneous monitoring sensors. In order to utilise the data exposed by the near-realtime Distributed Sensor Network modules, an Automated Reconnaissance Framework (AR-Framework) was created; this framework was tasked with active and passive information collection and analysis of data in near-realtime, and was designed from an adapted Multi Sensor Data Fusion model. The hypothesis was made that if sufficiently different characteristics of a host could be identified, then, combined, they could act as a unique fingerprint for that host, potentially allowing for the re-identification of that host even if its IP address had changed. To this end the concept of Latency Based Multilateration was introduced, acting as an additional metric for remote host fingerprinting. The vast amount of information gathered by the AR-Framework required the development of visualisation tools that could illustrate this data in near-realtime and provide various degrees of interaction to accommodate human interpretation of such data. Ultimately, the data collected through the application of the near-realtime Distributed Sensor Network and the AR-Framework provided a unique perspective of a malicious host demographic, allowing new correlations to be drawn between attributes such as common open ports and operating systems, location, and the inferred intent of these malicious hosts. The results expand our current understanding of malicious hosts on the Internet and enable further research in the area.
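
    As a rough illustration of the Latency Based Multilateration idea described above, the Python sketch below (a simplification added here; the sensor names, the TCP-connect timing, and the similarity weighting are assumptions, not the thesis's implementation) records a round-trip time to a target from several distributed sensors and combines the resulting latency vector with other observed attributes, so that two observations can be compared even when the IP address differs.

        # Illustrative sketch only: per-sensor round-trip times form a latency
        # vector; combined with other attributes (e.g. open ports) the vector
        # acts as an approximate fingerprint for re-identifying a host.
        import math
        import socket
        import time

        def measure_rtt(host: str, port: int = 80, timeout: float = 2.0) -> float:
            """Round-trip time of one TCP connect to the target, in milliseconds."""
            start = time.monotonic()
            with socket.create_connection((host, port), timeout=timeout):
                pass
            return (time.monotonic() - start) * 1000.0

        def fingerprint(latencies_by_sensor: dict[str, float], open_ports: set[int]) -> dict:
            """Bundle the per-sensor latency vector with other host attributes."""
            return {"latencies": latencies_by_sensor, "ports": frozenset(open_ports)}

        def similarity(a: dict, b: dict) -> float:
            """Score two fingerprints: cosine similarity of the latency values
            from sensors present in both observations, averaged with the
            overlap of observed open ports."""
            common = sorted(set(a["latencies"]) & set(b["latencies"]))
            if not common:
                return 0.0
            va = [a["latencies"][s] for s in common]
            vb = [b["latencies"][s] for s in common]
            dot = sum(x * y for x, y in zip(va, vb))
            norm = math.sqrt(sum(x * x for x in va)) * math.sqrt(sum(y * y for y in vb))
            cosine = dot / norm if norm else 0.0
            overlap = len(a["ports"] & b["ports"]) / max(len(a["ports"] | b["ports"]), 1)
            return 0.5 * cosine + 0.5 * overlap

        # In practice each distributed sensor would call measure_rtt() against the
        # same target and report its value, e.g.:
        #   obs = fingerprint({"sensor-eu": 41.2, "sensor-us": 118.5}, {22, 80, 443})

    The intuition behind the latency vector is that round-trip times measured from geographically separated vantage points are tied to the host's physical location and network path, and are therefore harder to change than attributes such as the IP address.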