An Economic Analysis of Domain Name Policy
One of the most important features of the architecture of the Internet is the Domain Name System (DNS), which is administered by the Internet Corporation for Assigned Names and Numbers (ICANN). Logically, the DNS is organized into Top Level Domains (such as .com), Second Level Domains (such as amazon.com), and third, fourth, and higher level domains (such as www.amazon.com). The physical infrastructure of the DNS consists of name servers, including the Root Server System, which provides the information that directs name queries for each Top Level Domain to the appropriate server. ICANN is responsible for the allocation of the root and the creation or reallocation of Top Level Domains.
The Root Server System and associated name space are scarce resources in the economic sense. The root servers have a finite capacity, and expansion of the system is costly. The name space is scarce because each string (or set of characters) can be allocated to only one Registry (or operator of a Top Level Domain). In addition, name service is not a public good in the economic sense, because it is possible to exclude strings from the DNS and because allocating a string to one firm prevents other firms from using that string. From the economic perspective, therefore, the question arises: what is the most efficient method for allocating the root resource?
There are only five basic options available for allocation of the root: (1) a static root, equivalent to a decision to waste the currently unallocated capacity; (2) public interest hearings (or "beauty contests"); (3) lotteries; (4) a queuing mechanism; or (5) an auction. The fundamental economic question about the Domain Name System is which of these provides the most efficient mechanism for allocating the root resource.
This resource allocation problem is analogous to problems raised in the telecommunications sector, where the Federal Communications Commission has a long history of attempting to allocate broadcast spectrum and the telephone number space. This experience reveals that case-by-case allocation on the basis of ad hoc judgments about the public interest is doomed to failure, and that auctions (as opposed to lotteries or queues) provide the best mechanism for ensuring that such public-trust resources find their highest and best use.
Based on the telecommunications experience, the best method for ICANN to allocate new Top Level Domains would be to conduct an auction. Many auction designs are possible. One proposal is to auction a fixed number of new Top Level Domain slots each year. This proposal would both expand the root resource at a reasonable pace and ensure that the slots went to their highest and best use. Public interest Top Level Domains could be allocated by another mechanism, such as a lottery, and their costs to ICANN could be subsidized by the auction proceeds.
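The proposal of auctioning a fixed number of slots per year can be sketched in a few lines. The following is a minimal illustration, not ICANN's actual procedure or the paper's specific design: a sealed-bid, uniform-price auction in which the top bidders win and each pays the highest losing bid. The registry strings and bid amounts are hypothetical.

```python
def allocate_tld_slots(bids, slots_per_year):
    """Award a fixed number of TLD slots to the highest bidders.

    Each winner pays the highest losing bid (uniform clearing price),
    which makes bidding one's true valuation approximately optimal.
    bids: dict mapping candidate TLD string -> bid amount.
    Returns (list of winning strings, clearing price).
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winners = ranked[:slots_per_year]
    losers = ranked[slots_per_year:]
    clearing_price = losers[0][1] if losers else 0.0
    return [name for name, _ in winners], clearing_price

# Hypothetical bids (in $ millions) for two available slots:
winners, price = allocate_tld_slots(
    {".shop": 9.0, ".web": 7.5, ".blog": 6.0, ".app": 4.0},
    slots_per_year=2,
)
```

Here the two highest bidders win and each pays 6.0, the best losing bid; capping `slots_per_year` is what paces the expansion of the root.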
The composition and role of convergent technological repertoires in audiovisual media consumption
This mixed-method research focuses on the growing appropriation of multiple screen devices for audiovisual media consumption. Based on survey measures, we distinguish three patterns: (a) maintaining the status quo, by mainly drawing upon television; (b) broadening the repertoire, by extending television with computers and mobile devices; or (c) replacing television with a computer altogether. Next, we draw upon insights from niche theory, rationalising media choices in terms of competing gratifications. This perspective is, however, too one-sided: our results indicate that habit is a much stronger explanatory variable, especially when a broad range of devices is appropriated. In a follow-up qualitative study, based on Q-methodology, we found that orientations towards what people seek in audiovisual technologies are only mildly contingent on specific technology appropriation. This problematises the very substance of niches in the audiovisual domain: as technologies become capable of delivering the same benefits, their discriminating power declines. Hence, future applications of niche theory should take into account the gratifications and habits of communication modes (what people do with media technologies), rather than media as tied to a specific technology. Niche theory's core remains, but its applications should be updated with theoretical insights matching the evolving media environment.
“Our Relationship? It’s the Odd Mucky Weekend, Not a One Night Stand”: Journalists and aid agencies in the UK, and the current challenges to sourcing in humanitarian disasters
In humanitarian crises, the sources that journalists employ have always helped determine which stories achieve a high media profile, as well as playing a part in framing the story. In particular, aid agencies have acted as powerful gatekeepers to disaster zones, providing flights, transport, fixers and translators to journalists, and more recently, text, images and resources for the social web. Questions have been raised around transparency and objectivity in such reporting as a result. This paper draws on 40 semi-structured qualitative interviews with UK national journalists (broadcast, print and online) and aid agencies belonging to the UK's Disasters Emergency Committee. It builds on journalism studies of boundary (re)negotiations in journalism and the source-media relationship to show the current patterns in what has been described as a "mutually exploitative" relationship. It compares and contrasts the assistance journalists say they accept from aid agencies with what aid agencies report providing. It examines how both sides are often unwilling to acknowledge the close association. It also looks at how the increasing professionalisation of NGO operations, including the employment of former journalists and the production of their own content, may be affecting the power dynamics. Finally, it asks whether the slow emergence of scandals means this relationship has affected not only the stories that are covered but also those that are not.
Female Perpetrators in Internal Child Trafficking in China: An empirical study
Through an empirical study, this article explores the overall profile of female traffickers of children in China and their role and performance in the trafficking process. Its contribution to the human-trafficking literature lies in its particular focus on female perpetrators. The article provides an overview of the international literature on female traffickers as well as contemporary knowledge about internal child trafficking in China. Empirical data from incarcerated traffickers suggest that portraying female traffickers as active players in criminal networks obscures the structural problems affecting female child traffickers. The short-term result is that the problems of female offenders are ignored, and the long-term impact is policy making that is disconnected from the lived experiences of an important population. From a gender perspective, this study suggests that female child traffickers are offenders as well as victims of social and gender inequalities in China's reform era. The study also proposes that internal child trafficking in China should be brought into the international and Anglo-American debates surrounding human trafficking.
Friends and Enemies Within: The Roles of Subgroups, Imbalance, and Isolates in Geographically Dispersed Teams
Research regarding geographically dispersed teams (GDTs) is increasingly common and has yielded many insights into how spatio-temporal and socio-demographic factors affect GDT functioning and performance. Largely missing, however, is research on the effects of the basic geographic configuration of GDTs. In this study, we explore the impact of GDT configuration (i.e., the relative number of team members at different sites, independent of the characteristics of those members or the spatial and temporal distances among them) on GDT dynamics. In a quasi-experimental setting, we examine the effects of configuration using a sample of 62 six-person teams in four different one- and two-site configurations. As predicted, we find that configuration significantly affects team dynamics, independent of spatio-temporal distance and socio-demographic factors. More specifically, we find that teams with geographically-based subgroups (defined as two or more members per site) have significantly less shared team identity, less effective transactive memory, more conflict, and more coordination issues. Furthermore, in teams with subgroups, imbalance (i.e., the uneven distribution of members across sites) exacerbates these effects; subgroups with a numerical minority of team members report significantly poorer scores on the same four outcomes. In contrast, teams with geographically isolated members (i.e., members who have no teammates at their site) outperform both balanced and imbalanced configurations.
Long term records of erosional change from marine ferromanganese crusts
Ferromanganese crusts from the Atlantic, Indian and Pacific Oceans record the Nd and Pb isotope compositions of the water masses from which they form as hydrogenous precipitates. The 10Be/9Be-calibrated time series for crusts are compared to estimates based on Co contents, from which the equatorial Pacific crusts studied are inferred to have recorded ca. 60 Ma of Pacific deep water history. Time series of ɛNd show that the oceans have maintained a strong provinciality in Nd isotopic composition, determined by terrigenous inputs, over periods of up to 60 Ma. Superimposed on the distinct basin-specific signatures are variations in the Nd and Pb isotope time series which have been particularly marked over the last 5 Ma.
It is shown that changes in erosional inputs, particularly those associated with Himalayan uplift and northern hemisphere glaciation, have influenced Indian and Atlantic Ocean deep water isotopic compositions, respectively. There is no evidence so far for an imprint of the final closure of the Panama Isthmus on the Pb and Nd isotopic composition of either Atlantic or Pacific deep water masses.
Meiosis in Mice without a Synaptonemal Complex
The synaptonemal complex (SC) promotes pairing of the homologous chromosomes (synapsis) and crossover recombination events during meiosis. The SC displays extensive structural conservation between species; however, a few organisms lack an SC and execute the meiotic process in an SC-independent manner. To clarify the function of the SC in mammals, we generated a mutant mouse strain (Sycp1−/−Sycp3−/−, here called SC-null) in which all known SC proteins have been displaced from the meiotic chromosomes. While transmission electron microscopy failed to identify any remnants of the SC in SC-null spermatocytes, neither formation of the cohesion axes nor attachment of the chromosomes to the nuclear membrane was perturbed. Furthermore, the meiotic chromosomes in SC-null meiocytes achieved pre-synaptic pairing, underwent early homologous recombination events and sustained residual crossover formation. In contrast, in SC-null meiocytes synapsis and MLH1-MLH3-dependent crossover maturation were abolished, and the structural integrity of the chromosomes was drastically impaired. The variable consequences that SC inactivation has on the meiotic process in different organisms, together with the absence of the SC in some unrelated species, imply that the SC could have originated independently in different taxonomic groups.
Correction for Johansson et al., An open challenge to advance probabilistic forecasting for dengue epidemics.
Correction for “An open challenge to advance probabilistic forecasting for dengue epidemics,” by Michael A. Johansson, Karyn M. Apfeldorf, Scott Dobson, Jason Devita, Anna L. Buczak, Benjamin Baugher, Linda J. Moniz, Thomas Bagley, Steven M. Babin, Erhan Guven, Teresa K. Yamana, Jeffrey Shaman, Terry Moschou, Nick Lothian, Aaron Lane, Grant Osborne, Gao Jiang, Logan C. Brooks, David C. Farrow, Sangwon Hyun, Ryan J. Tibshirani, Roni Rosenfeld, Justin Lessler, Nicholas G. Reich, Derek A. T. Cummings, Stephen A. Lauer, Sean M. Moore, Hannah E. Clapham, Rachel Lowe, Trevor C. Bailey, Markel García-Díez, Marilia Sá Carvalho, Xavier Rodó, Tridip Sardar, Richard Paul, Evan L. Ray, Krzysztof Sakrejda, Alexandria C. Brown, Xi Meng, Osonde Osoba, Raffaele Vardavas, David Manheim, Melinda Moore, Dhananjai M. Rao, Travis C. Porco, Sarah Ackley, Fengchen Liu, Lee Worden, Matteo Convertino, Yang Liu, Abraham Reddy, Eloy Ortiz, Jorge Rivero, Humberto Brito, Alicia Juarrero, Leah R. Johnson, Robert B. Gramacy, Jeremy M. Cohen, Erin A. Mordecai, Courtney C. Murdock, Jason R. Rohr, Sadie J. Ryan, Anna M. Stewart-Ibarra, Daniel P. Weikel, Antarpreet Jutla, Rakibul Khan, Marissa Poultney, Rita R. Colwell, Brenda Rivera-García, Christopher M. Barker, Jesse E. Bell, Matthew Biggerstaff, David Swerdlow, Luis Mier-y-Teran-Romero, Brett M. Forshey, Juli Trtanj, Jason Asher, Matt Clay, Harold S. Margolis, Andrew M. Hebbeler, Dylan George, and Jean-Paul Chretien, which was first published November 11, 2019; 10.1073/pnas.1909865116. The authors note that the affiliation for Xavier Rodó should instead appear as Catalan Institution for Research and Advanced Studies (ICREA) and Climate and Health Program, Barcelona Institute for Global Health (ISGlobal). The corrected author and affiliation lines appear below. The online version has been corrected
Pathway-Based Analysis of a Melanoma Genome-Wide Association Study: Analysis of Genes Related to Tumour-Immunosuppression
Systemic immunosuppression is a risk factor for melanoma, and sunburn-induced immunosuppression is thought to be causal. Genes in immunosuppression pathways are therefore candidate melanoma-susceptibility genes. If variants within these genes individually have a small effect on disease risk, the association may go undetected in genome-wide association (GWA) studies due to low power to reach a high significance level. Pathway-based approaches have been suggested as a method of incorporating a priori knowledge into the analysis of GWA studies. In this study, the association of 1113 single nucleotide polymorphisms (SNPs) in 43 genes (39 genomic regions) related to immunosuppression was analysed using a gene-set approach in 1539 melanoma cases and 3917 controls from the GenoMEL consortium GWA study. The association between melanoma susceptibility and the whole set of tumour-immunosuppression genes, as well as predefined functional subgroups of genes, was considered. The analysis was based on a measure formed by summing the evidence from the most significant SNP in each gene, and significance was evaluated empirically by case-control label permutation. An association was found between melanoma and the complete set of genes (p_emp = 0.002), as well as the subgroups related to the generation of tolerogenic dendritic cells (p_emp = 0.006) and secretion of suppressive factors (p_emp = 0.0004), thus providing preliminary evidence of involvement of tumour-immunosuppression gene polymorphisms in melanoma susceptibility. The analysis was repeated on a second phase of the GenoMEL study, which showed no evidence of an association. As one of the first attempts to replicate a pathway-level association, our results suggest that low power and heterogeneity may present challenges.
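The gene-set statistic described above, summing the best per-gene SNP evidence and assessing it by permuting case/control labels, can be sketched as follows. This is a hedged illustration of the general technique, not the authors' code: the per-SNP statistic used here (a simple trend-style chi-square) and all data shapes are assumptions for the example.

```python
import numpy as np

def gene_set_statistic(genotypes, labels, gene_snp_index):
    """Sum over genes of the maximum per-SNP association statistic.

    genotypes: (n_subjects, n_snps) array of allele counts (0/1/2)
    labels: (n_subjects,) array, 0 = control, 1 = case
    gene_snp_index: list of index arrays, one per gene, giving that
        gene's SNP columns
    """
    # Trend-style statistic: n * r^2 between each SNP and case status,
    # approximately chi-square(1) under the null of no association.
    x = genotypes - genotypes.mean(axis=0)
    y = labels - labels.mean()
    num = (x * y[:, None]).sum(axis=0) ** 2
    den = (x ** 2).sum(axis=0) * (y ** 2).sum() + 1e-12
    snp_stat = len(labels) * num / den
    # Take the most significant SNP per gene, then sum across the set.
    return sum(snp_stat[idx].max() for idx in gene_snp_index)

def permutation_p(genotypes, labels, gene_snp_index, n_perm=1000, seed=0):
    """Empirical p-value by case-control label permutation."""
    rng = np.random.default_rng(seed)
    observed = gene_set_statistic(genotypes, labels, gene_snp_index)
    hits = 0
    for _ in range(n_perm):
        # Shuffling labels breaks any genotype-phenotype link while
        # preserving the correlation structure among SNPs.
        perm = rng.permutation(labels)
        if gene_set_statistic(genotypes, perm, gene_snp_index) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction avoids p = 0
```

Because the maximum is taken within each gene before permutation, genes with many SNPs do not automatically dominate, and the permutation null accounts for linkage disequilibrium among the SNPs without any distributional assumptions.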