4,023 research outputs found

    Evaluating Wireless Carrier Consolidation Using Semiparametric Demand Estimation

    The US mobile phone service industry has dramatically consolidated over the last two decades. One justification for consolidation is that merged firms can provide consumers with larger coverage areas at lower costs. We estimate the willingness to pay for national coverage to evaluate this motivation for past consolidation. As market-level quantity data are not publicly available, we devise an econometric procedure that allows us to estimate the willingness to pay using market share ranks collected from a popular online retailer, Amazon. Our semiparametric maximum score estimator controls for consumers' heterogeneous preferences for carriers, handsets and minutes of calling time. We find that national coverage is strongly valued by consumers, providing an efficiency justification for across-market mergers. The methods we propose can estimate demand for other products using data from Amazon or other online retailers where quantities are not observed, but product ranks are observed. Since Amazon data can easily be gathered by researchers, these methods may be useful for the analysis of other product markets where high quality data are not publicly available.
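    The core idea of a rank-based maximum score approach can be sketched in a few lines: choose coefficients so that the implied utility ordering of products agrees with as many observed rank comparisons as possible. The sketch below illustrates that pairwise rank-order objective on simulated data; it is not the paper's estimator (which is semiparametric and controls for heterogeneous preferences), and the characteristics, scale normalization and grid search are assumptions made for the example.

        import numpy as np
        from itertools import combinations

        def rank_order_score(beta, X, ranks):
            """Fraction of product pairs whose predicted-utility ordering matches
            the observed sales-rank ordering (a lower rank means more units sold)."""
            u = X @ beta  # predicted mean utilities
            agree = total = 0
            for j, k in combinations(range(len(ranks)), 2):
                if ranks[j] == ranks[k]:
                    continue
                total += 1
                agree += (u[j] > u[k]) == (ranks[j] < ranks[k])
            return agree / total

        # Illustrative use: simulate ranks, then grid-search one normalized coefficient.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(50, 2))                      # e.g. [national coverage, price]
        true_beta = np.array([1.0, -0.5])
        ranks = (-(X @ true_beta + rng.normal(scale=0.1, size=50))).argsort().argsort() + 1
        grid = [np.array([1.0, b]) for b in np.linspace(-2.0, 0.0, 41)]  # first coefficient fixed to 1
        best = max(grid, key=lambda b: rank_order_score(b, X, ranks))
        print("best coefficient on the second characteristic:", best[1])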

    The Right of the Physically and Mentally Handicapped: Amendments Necessary to Guarantee Protection Through the Civil Rights Act of 1964

    SINGLE STROKES of the government's pen can seldom alone accomplish social goals. To insure vitality, legislation requires review, revision and amendment. Though worthy of praise for initial and continuing contributions towards social betterment, the Civil Rights Act of 1964 falls into this classification. Its scope is too narrow because it fails to include a significant group of persons sorely in need of its protection. This legislation needs the depth evoked by its title rather than the limitations of its present language. Amendment is required to protect the rights of the physically and mentally handicapped.

    Debris Slides and Flows on Anakeesta Ridge within the Great Smoky Mountains National Park, Tennessee, U.S.A

    Debris slides and flows along Anakeesta Ridge in the GSMNP have been investigated utilizing dendrochronology, aerial photography, erosion stations, precipitation data and rockslope engineering techniques. Based on the slides for which the specific dates of occurrence are known, the corresponding precipitation that was at least in part responsible for triggering the mass-wasting events varies from 1.5 inches of rain per day to 4 inches of rain over a 6 hour period. Unfortunately, intensity records are not available to indicate the rate at which the precipitation was delivered. Based on TVA precipitation records it was determined that 1273 storms (1 or more inches of rain per 24 hour period) occurred during the period of 1951-1987 in the general area of the Great Smoky Mountains National Park. An abundance of moisture is available to drive erosional processes. Transportational processes operating on Anakeesta Ridge include creep, overland flow, and debris sliding. Additional slope modifiers include needle ice, slaking, and bank slumping. Slope retreat is primarily accomplished through sheet wash, mass movement and tree throw. Appreciable amounts of fine sediment are moved downslope by slope wash. Tree throw continues to operate proximally to the Anakeesta Ridge slide scars. Additionally, tree throw is present at every breach of a ridge crest in the study area and is common along the unfailed slopes of Anakeesta Ridge. In terms of biogenic transport, the importance of tree throw is clearly expressed by the microtopography created by decaying tree-throw mounds. Recent slide scar retreat has been nonexistent in some areas, primarily side-slopes, whereas retreat has been as high as 37.2 cm over a 7 month period in scar head areas. Log jams act as effective debris dams in slide-track constrictions until the logs are structurally weakened by decay and can be overcome by a new debris torrent. The logs then become part of the ensuing debris slide and contribute to vegetal debris in the fan. Periodic aerial reconnaissance of areas of interest may be sufficient to detect incipient slide development. The Anakeesta Ridge slides have developed through headward erosion; this was easily tracked through sequential aerial photographs. Incipient slides are followed by additional, ongoing sliding. The compound slide scars on Anakeesta Ridge have increased in area and volume from 4,300 m2 and 1,790 m3 in 1953 to 128,000 m2 and 84,100 m3 in 1987. The scar head and upper slide track areas are the primary debris volume contributors. Anakeesta Ridge is in a stage of accumulation in the mid-slope regions of the slide scars. The scar heads continue to erode headwardly, supplying material to the lower regions of the scar. Aerial imagery for Anakeesta Ridge indicates that slope failure is initiated in the mid-slope region. The upper slope segments average 43.4°, the mid-slope segments average 33.6° and the lower slope segments average 25.6°. Anakeesta phyllite has an abundance of release surfaces in the form of cleavage planes, joints, faults and bedding planes. The chute morphology is characterized by wedge failure planes formed by the intersection of cleavage/bedding and joints. Direct shear testing of Anakeesta phyllite yielded an internal friction angle (phi) of 58.2° and a cohesion (c) of 6,134 pounds per square foot. Utilizing these values, and the slope and failure plane orientations, a factor-of-safety range of 1.19 to 2.53 was generated.
    A high percentage of the draws about Anakeesta Ridge are associated with debris sliding activity. Debris fans along the ridge vary in approximate minimum dates of origin from 1749 to 1971, as determined from dendrochronological data. Tree-coring data yield intervals between events ranging from 6 to 87 years, with an average of 13.8 years. This length of time may represent the average amount of time required for slope ripening (weathering, accumulation) to occur, the time between major precipitation events, or a combination of the two. Anakeesta Ridge slope instability is due to the coincidence of weathering, regolith accumulation, tree levering against shallow root networks, and high precipitation events. Slides will continue to cause problems along U.S. 441; therefore, a need exists for locational and temporal predictors and a precise accounting of slide-localization factors. Debris sliding cannot be prevented; however, hazardous areas can be delineated and risks assessed. In this way, landslides will not be studied simply for forensic purposes. A debris slide in a particular area is not a one-time event. Portions of Anakeesta Ridge have failed in the past and, under the present climatic regime, will continue to do so. Landslide potential is limited only by the availability of excess water, steep slopes and material.
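    For readers unfamiliar with the rock-slope engineering terms above, a standard planar limit-equilibrium form of the factor of safety (assuming a dry slope with no water pressure; the thesis's exact wedge-failure formulation may differ) is

        FS = \frac{c\,A + W \cos\psi_p \tan\phi}{W \sin\psi_p}

    where c is the cohesion, A the area of the failure plane, W the weight of the sliding block, \psi_p the dip of the failure plane and \phi the internal friction angle. Ignoring cohesion, FS \approx \tan\phi / \tan\psi_p; with the measured \phi = 58.2° and a failure plane dipping near the mid-slope average of 33.6°, this frictional term alone gives roughly 2.4, near the upper end of the reported 1.19 to 2.53 range.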

    Periods, Organized (PeriodO): A gazetteer of period assertions for linking and visualizing periodized data

    The PeriodO project seeks to create an online gazetteer of authoritative assertions about the chronological and geographic extent of historical and archaeological periods. Starting with a trial dataset related to Classical antiquity, this gazetteer will combine period thesauri used by museums and cultural heritage bodies with published assertions about the dates and locations of periods in authoritative print sources. These assertions will be modeled in a Linked Data format (JSON-LD, a serialization of RDF). They will be given Uniform Resource Identifiers (URIs) and served from a public GitHub repository, where they can act as a shared reference point to describe data in datasets with periodized information. We will also create a search and visualization tool to view the temporal and geographic extent of an assertion and compare it with others. Authoritative users will be able to add their own period assertions.
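    To make the data model concrete, the sketch below shows the general shape of a period assertion serialized as JSON-LD, built as a plain Python dictionary. The field names, URIs and values are illustrative assumptions for this example, not PeriodO's actual published context or schema.

        import json

        # Hypothetical period assertion; field names and URIs are placeholders,
        # not the project's published JSON-LD context.
        assertion = {
            "@context": "https://example.org/periodo-context.json",
            "@id": "https://example.org/periods/p0abc123",   # URI identifying the assertion
            "label": "Late Bronze Age",
            "spatialCoverage": [{"label": "Greece"}],
            "start": {"label": "1600 BC", "year": -1599},
            "stop": {"label": "1200 BC", "year": -1199},
            "source": {"citation": "An authoritative print source, p. 42"},
        }
        print(json.dumps(assertion, indent=2))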

    Rapid Targeted Gene Disruption in Bacillus Anthracis

    Anthrax is a zoonotic disease recognized to affect herbivores since Biblical times and has the widest range of susceptible host species of any known pathogen. The ease with which the bacterium can be weaponized and its recent deliberate use as an agent of terror have highlighted the importance of gaining a deeper understanding of, and effective countermeasures against, this important pathogen. High-quality sequence data has opened the possibility of systematic dissection of how genes distributed on both the bacterial chromosome and associated plasmids have made it such a successful pathogen. However, low transformation efficiency and relatively few genetic tools for chromosomal manipulation have hampered full interrogation of its genome. Results: Group II introns have been developed into an efficient tool for site-specific gene inactivation in several organisms. We have adapted group II intron targeting technology for application in Bacillus anthracis and generated vectors that permit gene inactivation through group II intron insertion. The vectors developed permit screening for the desired insertion through PCR or direct selection of intron insertions using a selection scheme that activates a kanamycin resistance marker upon successful intron insertion. Conclusions: The design and vector construction described here provides a useful tool for high throughput experimental interrogation of the Bacillus anthracis genome and will benefit efforts to develop improved vaccines and therapeutics. Funding: Chem-Bio Diagnostics program from the Department of Defense Chemical and Biological Defense program through the Defense Threat Reduction Agency (DTRA) B102387M; NIH GM037949; Welch Foundation F-1607; Cellular and Molecular Biolog

    A Simple Nonparametric Estimator for the Distribution of Random Coefficients

    We propose a simple nonparametric mixtures estimator for recovering the joint distribution of parameter heterogeneity in economic models, such as the random coefficients logit. The estimator is based on linear regression subject to linear inequality constraints, and is robust, easy to program and computationally attractive compared to alternative estimators for random coefficient models. We prove consistency and provide the rate of convergence under deterministic and stochastic choices for the sieve approximating space. We present a Monte Carlo study and an empirical application to dynamic programming discrete choice with a serially-correlated unobserved state variable.
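    A toy version of the estimator's main computational step can be written in a few lines: approximate the distribution of a random coefficient by non-negative weights on a fixed grid, chosen so that a weighted average of grid-point choice probabilities fits observed shares. The sketch below (simulated logit data, a soft sum-to-one constraint imposed via an extra row, and a final renormalization) illustrates that idea under assumed settings; it is not the paper's exact estimator or its sieve asymptotics.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(1)
        T, J = 200, 3                           # markets, products per market
        x = rng.normal(size=(T, J))             # one product characteristic
        b_true = rng.normal(0.5, 1.0, size=T)   # heterogeneous coefficient across markets

        def logit_shares(b, x):
            """Logit choice probabilities for coefficient(s) b and characteristics x."""
            u = np.exp(b * x)
            return u / u.sum(axis=-1, keepdims=True)

        shares = logit_shares(b_true[:, None], x)     # "observed" shares, shape (T, J)

        grid = np.linspace(-3, 4, 25)                 # sieve grid of candidate coefficients
        A = np.column_stack([logit_shares(b, x).ravel() for b in grid])  # (T*J) x R design
        y = shares.ravel()

        lam = 10.0                                    # soft sum-to-one constraint weight
        A_aug = np.vstack([A, lam * np.ones((1, len(grid)))])
        y_aug = np.append(y, lam)
        w, _ = nnls(A_aug, y_aug)                     # least squares with w >= 0
        w /= w.sum()                                  # renormalize to a distribution
        print("estimated mean of b:", (w * grid).sum(), "vs. simulated mean:", b_true.mean())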

    Quantum-assisted quantum compiling

    Compiling quantum algorithms for near-term quantum computers (accounting for connectivity and native gate alphabets) is a major challenge that has received significant attention by both industry and academia. Avoiding the exponential overhead of classical simulation of quantum dynamics will allow compilation of larger algorithms, and a strategy for this is to evaluate an algorithm's cost on a quantum computer. To this end, we propose a variational hybrid quantum-classical algorithm called quantum-assisted quantum compiling (QAQC). In QAQC, we use the overlap between a target unitary U and a trainable unitary V as the cost function to be evaluated on the quantum computer. More precisely, to ensure that QAQC scales well with problem size, our cost involves not only the global overlap Tr(V†U) but also the local overlaps with respect to individual qubits. We introduce novel short-depth quantum circuits to quantify the terms in our cost function, and we prove that our cost cannot be efficiently approximated with a classical algorithm under reasonable complexity assumptions. We present both gradient-free and gradient-based approaches to minimizing this cost. As a demonstration of QAQC, we compile various one-qubit gates on IBM's and Rigetti's quantum computers into their respective native gate alphabets. Furthermore, we successfully simulate QAQC up to a problem size of 9 qubits, and these simulations highlight both the scalability of our cost function and the noise resilience of QAQC. Future applications of QAQC include algorithm depth compression, black-box compiling, noise mitigation, and benchmarking. Comment: 19 + 10 pages, 14 figures. Added larger-scale implementations and a proof that the cost function is DQC1-hard.
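    As a concrete, classical illustration of the global term in such a cost (the full QAQC cost also mixes in local, per-qubit overlaps and is evaluated with short-depth circuits on the quantum device), the sketch below computes 1 - |Tr(V†U)|²/d² directly with NumPy for a one-qubit example; the Hadamard decomposition used is a standard identity chosen only for illustration.

        import numpy as np

        def global_overlap_cost(U, V):
            """Compilation cost of the form 1 - |Tr(V^dagger U)|^2 / d^2,
            which is zero iff V matches U up to a global phase."""
            d = U.shape[0]
            return 1.0 - abs(np.trace(V.conj().T @ U))**2 / d**2

        def rz(t):
            return np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])

        def rx(t):
            c, s = np.cos(t / 2), -1j * np.sin(t / 2)
            return np.array([[c, s], [s, c]])

        # Compile a Hadamard as Rz(pi/2) Rx(pi/2) Rz(pi/2), correct up to a global phase.
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        V = rz(np.pi / 2) @ rx(np.pi / 2) @ rz(np.pi / 2)
        print(global_overlap_cost(H, V))   # ~0: the decomposition matches H up to global phase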

    The SNF2-Family Member Fun30 Promotes Gene Silencing in Heterochromatic Loci

    Chromatin regulates many key processes in the nucleus by controlling access to the underlying DNA. SNF2-like factors are ATP-driven enzymes that play key roles in the dynamics of chromatin by remodelling nucleosomes and other nucleoprotein complexes. Even simple eukaryotes such as yeast contain members of several subfamilies of SNF2-like factors. The FUN30/ETL1 subfamily of SNF2 remodellers is conserved from yeasts to humans, but is poorly characterized. We show that the deletion of FUN30 leads to sensitivity to the topoisomerase I poison camptothecin and to severe cell cycle progression defects when the Orc5 subunit is mutated. We demonstrate a role of FUN30 in promoting silencing at the heterochromatin-like mating-type locus HMR, at telomeres and at the rDNA repeats. Chromatin immunoprecipitation experiments demonstrate that Fun30 binds at the boundary element of the silent HMR and within the silent HMR. Mapping of nucleosomes in vivo using micrococcal nuclease demonstrates that deletion of FUN30 leads to changes in the chromatin structure at the boundary element. A point mutation in the ATP-binding site abrogates the silencing function of Fun30 as well as its toxicity upon overexpression, indicating that the ATPase activity is essential for these roles of Fun30. We identify by amino acid sequence analysis a putative CUE motif as a feature of FUN30/ETL1 factors and show that this motif assists Fun30 activity. Our work suggests that Fun30 is directly involved in silencing by regulating the chromatin structure within or around silent loci.

    Sensitivity Analysis of Hybrid Propulsion Transportation System for Human Mars Expeditions

    The National Aeronautics and Space Administration continues to develop and refine various transportation options to successfully field a human Mars campaign. One of these transportation options is the Hybrid Transportation System, which utilizes both solar electric propulsion and chemical propulsion. The Hybrid propulsion system utilizes chemical propulsion to perform high-thrust maneuvers where impulsive delta-V is most effective, saving time and leveraging the Oberth effect. It then utilizes solar electric propulsion to augment the chemical burns throughout the interplanetary trajectory. This eliminates the need for the development of two separate vehicles for crew and cargo missions. Previous studies considered single point designs of the architecture, with fixed payload mass and propulsion system performance parameters. As the architecture matures, it is inevitable that the payload mass and the performance of the propulsion system will change. It is desirable to understand how these changes will impact the in-space transportation system's mass and power requirements. This study presents an in-depth sensitivity analysis of the Hybrid crew transportation system to payload mass growth and solar electric propulsion performance. This analysis is used to identify the breakpoints of the current architecture and to inform future architecture and campaign design decisions.
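    A back-of-the-envelope illustration of why such architectures reserve chemical propulsion for burns deep in a gravity well (the Oberth effect): the same impulsive delta-V yields far more hyperbolic excess speed when applied at high speed near a planet than when applied far away. The parameters below (Earth values, a 1 km/s burn applied while coasting at local escape speed) are illustrative assumptions, not values from the study.

        import numpy as np

        MU_EARTH = 3.986e14  # m^3/s^2, standard gravitational parameter (illustrative)

        def v_inf_from_burn(r_burn_m, dv_ms):
            """Hyperbolic excess speed produced by an impulsive burn of size dv applied
            while moving at local escape speed (i.e., on a parabolic trajectory)."""
            v_esc = np.sqrt(2 * MU_EARTH / r_burn_m)
            return np.sqrt((v_esc + dv_ms)**2 - v_esc**2)   # energy gain is larger at high speed

        dv = 1000.0  # m/s, the same chemical burn in both cases
        for label, r in (("deep (300 km altitude)", 6371e3 + 300e3),
                         ("shallow (1,000,000 km)", 1_000_000e3)):
            print(f"{label:>24}: v_inf = {v_inf_from_burn(r, dv):6.0f} m/s")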