The Origins of ccTLD Policymaking
Extract:
A long time ago in a galaxy not so far away, there was a decentralized global network of computers. These computers shared information with each other regardless of how far apart they were and whether there was any direct line of communication between them. In the very beginning, this network was used exclusively by government and military agencies, educational and research institutions, government contractors, scientists, and technology specialists. Instead of the domain names we use today, such as "www.amazon.com," users typed in numeric addresses, such as "123.45.67.89," and, later, host names to send information to other computers.
This network soon expanded, and domain names became a practical necessity, for at least two reasons. First, alphanumeric text is generally easier for humans to remember than numeric addresses. Second, as Internet traffic increases and computer systems are reconfigured, the server used for a particular Web site may change from time to time. In fact, some busy Web sites use multiple servers, which take turns answering requests directed to a single domain name. While the Web site owner (or his or her technical staff) might know internally which numeric address the Web site corresponds to at a particular moment, the general public does not. Domain names are therefore needed for identification purposes.
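The name-to-address mapping described above can be observed directly with Python's standard library. This is a minimal sketch, not tied to the article; the domain queried and the addresses returned are illustrative, and a busy site may return several addresses for one name (round-robin DNS), which may vary between queries.

```python
# Minimal sketch: resolving a domain name to its numeric address(es)
# using only the standard library. Results depend on the live DNS.
import socket

def resolve(domain):
    """Return the sorted set of IPv4 addresses currently serving `domain`.

    Sites served by multiple machines may publish several addresses
    for a single name, so repeated calls can yield different results.
    """
    infos = socket.getaddrinfo(domain, 80, family=socket.AF_INET,
                               type=socket.SOCK_STREAM)
    # getaddrinfo returns tuples; element 4 is the (address, port) pair.
    return sorted({info[4][0] for info in infos})

# Example (requires network access; output will vary over time):
# resolve("example.com")
```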
Recommended from our members
Governing Internet Territory: ICANN, Sovereignty Claims, Property Rights and Country Code Top-level Domains
This paper examines the legal and Internet governance controversies over country code top-level domain names (ccTLDs). In recent litigation (Weinstein v. Islamic Republic of Iran and ICANN), terrorism victims argued that ccTLDs are property and attempted to seize Iran's .IR domain for compensation. In refusing to uphold this claim, an appeals court ruled that a court-ordered redelegation would impair the Internet Corporation for Assigned Names and Numbers' (ICANN's) role in global Internet governance. While the .IR case is recent, the underlying tensions between state sovereignty, the role of ICANN and the rights of organizations that have been awarded ccTLDs have been simmering for two decades. Three governance models are in play: a sovereignty-based model, a property rights/market-based model, and a global public trustee model. The legal and political science literature leaves this Internet governance issue unexplored and unsettled, while court rulings on the property status of domains have been mixed or indecisive. Most legal scholars merely assume that states have sovereignty rights over their ccTLDs and do not critically assess the justification for, or the implications of, a sovereignty-based model. Likewise, many legal scholars, governments and Internet governance institutions have resisted recognizing TLD delegations as a property right, but their arguments are often based on misunderstandings of the economics and technology of the domain name system. Drawing on law, economics and sovereignty theories, this paper shows that top-level domain names have all the essential features of a property right. It argues that a governance regime that recognized them as such would be preferable to a regime based on sovereignty claims or a global public trustee model.
SOVEREIGN DOMAINS: A Declaration of Independence of ccTLDs from Foreign Control
In the year 2000, the Government Advisory Committee ("GAC") of the Internet Corporation for Assigned Names and Numbers ("ICANN") passed a set of principles that essentially claimed national sovereignty over country code top-level domains ("ccTLDs") such as .us, .ca, .uk and .au. Shortly thereafter, ICANN redelegated several ccTLDs in accordance with the new GAC principles. Despite the outcry accompanying the passage of these principles and ICANN's self-professed adherence thereto, the entire exercise could easily be criticized as merely symbolic because of the overriding power of ICANN in the operation of the Domain Name System ("DNS"). Indeed, Stuart Lynn, ICANN's current president, summed up the lack of power that ccTLDs have within the governance structure of the Internet when he opined that "ICANN could, in theory, recommend that a particular ccTLD be redelegated to a cooperating administrator. If the United States government accepted that recommendation, non-cooperating ccTLD administrators would be replaced."
The role of non-state actors in regime formation: Case study on Internet governance.
Many scholars argue that the Internet is a symbol of globalization and of the avoidance of state control. The Internet governance negotiations, which aim to establish an international regime for the Internet, are conducted in a multi-stakeholder setting with extensive involvement of non-state actors. This has been viewed as an indicator of a 'diminishing state role' in international relations, particularly in the formation of international regimes. This study indicates that the role of states does not diminish in regime formation. States, especially great powers, are the main actors that set international principles, norms, rules and decision-making procedures. They create regimes in order to regulate international behavior in global sectors, including the Internet. States deliberately enable certain non-state actors to participate in regime formation and in the governance of some global sectors, based on a conscious perception of the utility and usefulness of such participation.
Antitrust analysis of dominance in the Internet governance
Paolo Piacentini, Claudio De Vincenti, Giuseppe De Arcangeli
Negotiating Internet Governance
What is at stake for how the Internet continues to evolve is the preservation of its integrity as a single network. In practice, its governance is neither centralised nor unitary; it is piecemeal and fragmented, with authoritative decision-making coming from different sources simultaneously: governments, businesses, international organisations, technical and academic experts, and civil society. Historically, the conditions for their interaction were rarely defined beyond basic technical coordination, due at first to the academic freedom granted to the researchers developing the network and, later on, to the sheer impossibility of controlling mushrooming Internet initiatives. Today, the search for global norms and rules for the Internet continues, be it for cybersecurity or artificial intelligence, amid processes fostering the supremacy of national approaches or the vitality of a pluralist environment with various stakeholders represented. This book provides an incisive analysis of the emergence and evolution of global Internet governance, unpacking the complexity of more than 300 governance arrangements, influential debates and political negotiations over four decades.
Highly accessible, this book breaks new ground through a wide empirical exploration and a new conceptual approach to governance enactment in global issue domains. A tripartite framework is employed for revealing power dynamics, relying on: a) an extensive database of mechanisms of governance for the Internet at the global and regional level; b) an in-depth analysis of the evolution of actors and priorities over time; and c) a key set of dominant practices observed in the Internet governance communities. It explains continuity and change in Internet-related negotiations, opening up new directions for thinking and acting in this field.
A framework for malicious host fingerprinting using distributed network sensors
Numerous software agents exist and are responsible for the increasing volumes of malicious traffic observed on the Internet today. From a technical perspective, the existing techniques for monitoring malicious agents and traffic were not developed to allow for interrogation of the source of malicious traffic. This interrogation, or reconnaissance, would be considered active analysis, as opposed to the existing, mostly passive, analysis. Unlike passive analysis, active techniques are time-sensitive, and their results become increasingly inaccurate as the time delta between observation and interrogation increases. In addition, some studies have shown that the geographic separation of hosts on the Internet has resulted in pockets of different malicious agents and traffic targeting victims. It is therefore important to perform any kind of data collection from various sources and across distributed IP address space. The data gathering and exposure capabilities of sensors such as honeypots and network telescopes were extended through the development of near-realtime Distributed Sensor Network modules that allowed for the near-realtime analysis of malicious traffic from distributed, heterogeneous monitoring sensors. To utilise the data exposed by the near-realtime Distributed Sensor Network modules, an Automated Reconnaissance Framework was created; this framework was tasked with active and passive information collection and analysis of data in near-realtime, and was designed from an adapted Multi Sensor Data Fusion model. The hypothesis was that if sufficiently different characteristics of a host could be identified, then, combined, they could act as a unique fingerprint for that host, potentially allowing for the re-identification of that host even if its IP address had changed. To this end, the concept of Latency Based Multilateration was introduced, acting as an additional metric for remote host fingerprinting.
The vast amount of information gathered by the AR-Framework required the development of visualisation tools that could illustrate this data in near-realtime and provide various degrees of interaction to accommodate human interpretation of the data. Ultimately, the data collected through the application of the near-realtime Distributed Sensor Network and AR-Framework provided a unique perspective on the malicious host demographic, allowing new correlations to be drawn between attributes such as common open ports, operating systems, location, and the inferred intent of these malicious hosts. The result expands our current understanding of malicious hosts on the Internet and enables further research in the area.
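The fingerprinting hypothesis above can be illustrated with a short sketch: combine several independently observed host characteristics, excluding the IP address, into a single digest, so that the same host re-observed under a new address still yields the same fingerprint. This is an assumption-laden illustration, not the thesis's actual schema: the attribute names, the rounding of latencies, and the example hosts are all hypothetical.

```python
# Hedged sketch of characteristic-based host fingerprinting.
# Attribute names are illustrative, not the framework's real schema.
# The IP address is deliberately excluded so the fingerprint can
# survive an address change; per-sensor latencies stand in for the
# Latency Based Multilateration metric and are coarsely rounded to
# tolerate measurement jitter.
import hashlib

def host_fingerprint(open_ports, os_guess, sensor_latencies_ms):
    """Digest a host's observed traits into a stable hex fingerprint.

    open_ports          -- iterable of open TCP port numbers
    os_guess            -- inferred operating system string
    sensor_latencies_ms -- {sensor_id: round-trip latency in ms}
    """
    parts = [
        ",".join(str(p) for p in sorted(open_ports)),   # order-independent
        os_guess.lower(),                               # case-insensitive
        ",".join(f"{sid}:{round(ms, -1)}"               # rounded to 10 ms
                 for sid, ms in sorted(sensor_latencies_ms.items())),
    ]
    return hashlib.sha256("|".join(parts).encode()).hexdigest()

# The same traits observed later, with jittered latencies and a
# differently ordered port scan, produce the same fingerprint:
fp1 = host_fingerprint({22, 80}, "Linux", {"za": 182.0, "eu": 34.0})
fp2 = host_fingerprint({80, 22}, "linux", {"eu": 31.0, "za": 178.0})
```

Keying the digest on sorted, normalised traits is what makes re-identification plausible; a real system would weight attributes by stability, which this sketch does not attempt.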