
    Multivariate discrimination and the Higgs + W/Z search

    A systematic method for optimizing multivariate discriminants is developed and applied to the important example of a light Higgs boson search at the Tevatron and the LHC. The Significance Improvement Characteristic (SIC), defined as the signal efficiency of a cut or multivariate discriminant divided by the square root of the background efficiency, is shown to be an extremely powerful visualization tool. SIC curves demonstrate numerical instabilities in the multivariate discriminants, show convergence as the number of variables is increased, and display the sensitivity to the optimal cut values. For our application, we concentrate on Higgs boson production in association with a W or Z boson with H -> bb and compare to the irreducible standard model background, Z/W + bb. We explore thousands of experimentally motivated, physically motivated, and unmotivated single-variable discriminants. Along with the standard kinematic variables, a number of new ones, such as twist, are described which should have applicability to many processes. We find that some single variables, such as the pull angle, are weak discriminants, but when combined with others they provide important marginal improvement. We also find that multiple Higgs boson-candidate mass measures, such as from mildly and aggressively trimmed jets, when combined may provide additional discriminating power. Comparing the significance improvement from our variables to those used in recent CDF and DZero searches, we find that a 10-20% improvement in significance against Z/W + bb is possible. Our analysis also suggests that the H + W/Z channel with H -> bb is viable at the LHC, without requiring a hard cut on the W/Z transverse momentum. Comment: 41 pages, 5 tables, 29 figures
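    As a toy illustration of the SIC definition above (signal efficiency divided by the square root of background efficiency, scanned over cut values), the sketch below applies it to two hypothetical Gaussian discriminant distributions. The distributions, the cut grid, and all variable names are illustrative assumptions, not quantities from the paper.

    ```python
    import numpy as np

    def sic_curve(signal, background, cuts):
        """Return (eps_s, eps_b, sic) for cuts applied as `x > cut`.

        sic = eps_s / sqrt(eps_b), the Significance Improvement
        Characteristic as defined in the abstract.
        """
        eps_s = np.array([(signal > c).mean() for c in cuts])
        eps_b = np.array([(background > c).mean() for c in cuts])
        with np.errstate(divide="ignore", invalid="ignore"):
            sic = np.where(eps_b > 0, eps_s / np.sqrt(eps_b), 0.0)
        return eps_s, eps_b, sic

    # Toy discriminant: signal shifted by one unit relative to background.
    rng = np.random.default_rng(0)
    signal = rng.normal(1.0, 1.0, 100_000)
    background = rng.normal(0.0, 1.0, 100_000)
    cuts = np.linspace(-2.0, 3.0, 51)

    eps_s, eps_b, sic = sic_curve(signal, background, cuts)
    best = cuts[np.argmax(sic)]  # cut value maximizing the significance gain
    ```

    Plotting `sic` against `eps_s` gives the SIC curve described in the abstract; the peak of the curve identifies the optimal cut for this toy discriminant.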

    The use of biomedicine, complementary and alternative medicine, and ethnomedicine for the treatment of epilepsy among people of South Asian origin in the UK

    Studies have shown that a significant proportion of people with epilepsy use complementary and alternative medicine (CAM). CAM use is known to vary between different ethnic groups and cultural contexts; however, little attention has been devoted to inter-ethnic differences within the UK population. We studied the use of biomedicine, complementary and alternative medicine, and ethnomedicine in a sample of people with epilepsy of South Asian origin living in the north of England. Interviews were conducted with 30 people of South Asian origin and 16 carers drawn from a sampling frame of patients over 18 years old with epilepsy, compiled from epilepsy registers and hospital databases. All interviews were tape-recorded, translated if required and transcribed. A framework approach was adopted to analyse the data. All those interviewed were taking conventional anti-epileptic drugs. Most had also sought help from traditional South Asian practitioners, but only two people had tried conventional CAM. Decisions to consult a traditional healer were taken by families rather than by individuals with epilepsy. Those who made the decision to consult a traditional healer were usually older family members, and their motivations and perceptions of safety and efficacy often differed from those of the recipients of the treatment. No one had discussed the use of traditional therapies with their doctor. The patterns observed in the UK mirrored those reported among people with epilepsy in India and Pakistan. The health care-seeking behaviour of study participants, although mainly confined within the ethnomedicine sector, shared much in common with that of people who use global CAM. The appeal of traditional therapies lay in their religious and moral legitimacy within the South Asian community, especially to the older generation who were disproportionately influential in the determination of treatment choices. As a second generation made up of people of Pakistani origin born in the UK reaches the age when they are the influential decision makers in their families, resort to traditional therapies may decline. People had long experience of navigating plural systems of health care and avoided potential conflict by maintaining strict separation between different sectors. Health care practitioners need to approach these issues with sensitivity and to regard traditional healers as potential allies, rather than competitors or quacks.

    Laser Cooling of Optically Trapped Molecules

    Calcium monofluoride (CaF) molecules are loaded into an optical dipole trap (ODT) and subsequently laser cooled within the trap. Starting with magneto-optical trapping, we sub-Doppler cool CaF and then load 150(30) CaF molecules into an ODT. Enhanced loading by a factor of five is obtained when sub-Doppler cooling light and trapping light are on simultaneously. For trapped molecules, we directly observe efficient sub-Doppler cooling to a temperature of 60(5) μK. The trapped molecular density of 8(2)×10^7 cm^-3 is an order of magnitude greater than in the initial sub-Doppler cooled sample. The trap lifetime of 750(40) ms is dominated by background gas collisions. Comment: 5 pages, 5 figures

    NLO Higgs boson production plus one and two jets using the POWHEG BOX, MadGraph4 and MCFM

    We present a next-to-leading order calculation of Higgs boson production plus one and two jets via gluon fusion interfaced to shower Monte Carlo programs, implemented according to the POWHEG method. For this implementation we have used a new interface of the POWHEG BOX with MadGraph4, which generates the codes for generic Born and real processes automatically. The virtual corrections have been taken from the MCFM code. We carry out a simple phenomenological study of our generators, comparing them among each other and with fixed next-to-leading order results. Comment: 27 pages, 21 figures

    Endemicity of Zoonotic Diseases in Pigs and Humans in Lowland and Upland Lao PDR: Identification of Socio-cultural Risk Factors

    In Lao People's Democratic Republic pigs are kept in close contact with families. Human risk of infection with pig zoonoses arises from direct contact and consumption of unsafe pig products. This cross-sectional study was conducted in Luang Prabang (north) and Savannakhet (central-south) Provinces. A total of 59 villages, 895 humans and 647 pigs were sampled and serologically tested for zoonotic pathogens, including hepatitis E virus (HEV), Japanese encephalitis virus (JEV) and Trichinella spiralis; in addition, human sera were tested for Taenia spp. and cysticercosis. Seroprevalence of zoonotic pathogens in humans was high for HEV (Luang Prabang: 48.6%, Savannakhet: 77.7%) and T. spiralis (Luang Prabang: 59.0%, Savannakhet: 40.5%), and lower for JEV (around 5%), Taenia spp. (around 3%) and cysticercosis (Luang Prabang: 6.1%, Savannakhet: 1.5%). Multiple correspondence analysis and hierarchical clustering of principal components was performed on descriptive data of human hygiene practices, contact with pigs and consumption of pork products. Three clusters were identified: Cluster 1 had low pig contact and good hygiene practices, but had higher risk of T. spiralis. Most people in cluster 2 were involved in pig slaughter (83.7%), handled raw meat or offal (99.4%) and consumed raw pigs' blood (76.4%). Compared to cluster 1, cluster 2 had increased odds of testing seropositive for HEV and JEV. Cluster 3 had the lowest sanitation access and had the highest risk of HEV, cysticercosis and Taenia spp. Farmers who kept their pigs tethered (as opposed to penned) and who disposed of manure in water sources had 0.85 (95% CI: 0.18 to 0.91) and 2.39 (95% CI: 1.07 to 5.34) times the odds of having pigs test seropositive for HEV, respectively. The results have been used to identify entry points for intervention and management strategies to reduce disease exposure in humans and pigs, informing control activities in a cysticercosis hyper-endemic village.

    A Model-Based Analysis of GC-Biased Gene Conversion in the Human and Chimpanzee Genomes

    GC-biased gene conversion (gBGC) is a recombination-associated process that favors the fixation of G/C alleles over A/T alleles. In mammals, gBGC is hypothesized to contribute to variation in GC content, rapidly evolving sequences, and the fixation of deleterious mutations, but its prevalence and general functional consequences remain poorly understood. gBGC is difficult to incorporate into models of molecular evolution and so far has primarily been studied using summary statistics from genomic comparisons. Here, we introduce a new probabilistic model that captures the joint effects of natural selection and gBGC on nucleotide substitution patterns, while allowing for correlations along the genome in these effects. We implemented our model in a computer program, called phastBias, that can accurately detect gBGC tracts about 1 kilobase or longer in simulated sequence alignments. When applied to real primate genome sequences, phastBias predicts gBGC tracts that cover roughly 0.3% of the human and chimpanzee genomes and account for 1.2% of human-chimpanzee nucleotide differences. These tracts fall in clusters, particularly in subtelomeric regions; they are enriched for recombination hotspots and fast-evolving sequences; and they display an ongoing fixation preference for G and C alleles. They are also significantly enriched for disease-associated polymorphisms, suggesting that they contribute to the fixation of deleterious alleles. The gBGC tracts provide a unique window into historical recombination processes along the human and chimpanzee lineages. They supply additional evidence of long-term conservation of megabase-scale recombination rates accompanied by rapid turnover of hotspots. Together, these findings shed new light on the evolutionary, functional, and disease implications of gBGC. The phastBias program and our predicted tracts are freely available. © 2013 Capra et al

    Advancing Tests of Relativistic Gravity via Laser Ranging to Phobos

    Phobos Laser Ranging (PLR) is a concept for a space mission designed to advance tests of relativistic gravity in the solar system. PLR's primary objective is to measure the curvature of space around the Sun, represented by the Eddington parameter γ, with an accuracy of two parts in 10^7, thereby improving today's best result by two orders of magnitude. Other mission goals include measurements of the time rate of change of the gravitational constant G and of the gravitational inverse-square law at 1.5 AU distances, with up to two orders of magnitude improvement in each. The science parameters will be estimated using laser ranging measurements of the distance between an Earth station and an active laser transponder on Phobos capable of reaching mm-level range resolution. A transponder on Phobos sending 0.25 mJ, 10 ps pulses at 1 kHz, and receiving asynchronous 1 kHz pulses from Earth via a 12 cm aperture, will permit links that exceed a photon per second even at maximum range. A total measurement precision of 50 ps demands a few hundred photons to average down to 1 mm (3.3 ps) range precision. Existing satellite laser ranging (SLR) facilities, with appropriate augmentation, may be able to participate in PLR. Since Phobos' orbital period is about 8 hours, each observatory is guaranteed visibility of the Phobos instrument every Earth day. Given the current technology readiness level, PLR could be started in 2011 for launch in 2016 for 3 years of science operations. We discuss the PLR's science objectives, instrument, and mission design. We also present the details of science simulations performed to support the mission's primary objectives. Comment: 25 pages, 10 figures, 9 tables
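    The averaging arithmetic quoted above (50 ps per-photon precision averaged down to 1 mm, i.e. about 3.3 ps) can be checked with a back-of-envelope sketch. The only inputs are the numbers stated in the abstract plus the speed of light; the 1/sqrt(N) scaling assumes uncorrelated timing errors.

    ```python
    # Light travels roughly 0.3 mm per picosecond.
    C_MM_PER_PS = 0.2998

    single_shot_ps = 50.0               # total per-photon timing precision
    target_mm = 1.0                     # desired range precision
    target_ps = target_mm / C_MM_PER_PS  # ~3.3 ps, as quoted in the abstract

    # Uncorrelated errors average down as 1/sqrt(N), so the photon count
    # needed to shrink 50 ps to the target is:
    n_photons = (single_shot_ps / target_ps) ** 2
    print(round(n_photons))  # a few hundred, consistent with the abstract
    ```

    This confirms the "few hundred photons" figure: roughly two hundred photons per normal point suffice for mm-level range precision under these assumptions.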

    A Triple Protostar System Formed via Fragmentation of a Gravitationally Unstable Disk

    Binary and multiple star systems are a frequent outcome of the star formation process, and as a result, almost half of all Sun-like stars have at least one companion star. Theoretical studies indicate that there are two main pathways that can operate concurrently to form binary/multiple star systems: large-scale fragmentation of turbulent gas cores and filaments, or smaller-scale fragmentation of a massive protostellar disk due to gravitational instability. Observational evidence for turbulent fragmentation on scales of >1000 AU has recently emerged. Previous evidence for disk fragmentation was limited to inferences based on the separations of more-evolved pre-main-sequence and protostellar multiple systems. The triple protostar system L1448 IRS3B is an ideal candidate to search for evidence of disk fragmentation. L1448 IRS3B is in an early phase of the star formation process, likely less than 150,000 years in age, and all protostars in the system are separated by <200 AU. Here we report observations of dust and molecular gas emission that reveal a disk with spiral structure surrounding the three protostars. Two protostars near the center of the disk are separated by 61 AU, and a tertiary protostar is coincident with a spiral arm in the outer disk at a 183 AU separation. The inferred mass of the central pair of protostellar objects is ~1 M_sun, while the disk surrounding the three protostars has a total mass of ~0.30 M_sun. The tertiary protostar itself has a minimum mass of ~0.085 M_sun. We demonstrate that the disk around L1448 IRS3B appears susceptible to disk fragmentation at radii between 150 AU and 320 AU, overlapping with the location of the tertiary protostar. This is consistent with models for a protostellar disk that has recently undergone gravitational instability, spawning one or two companion stars. Comment: Published in Nature on Oct. 27th. 24 pages, 8 figures

    Climate Change Meets the Law of the Horse

    The climate change policy debate has only recently turned its full attention to adaptation - how to address the impacts of climate change we have already begun to experience and that will likely increase over time. Legal scholars have in turn begun to explore how the many different fields of law will and should respond. During this nascent period, one overarching question has gone unexamined: how will the legal system as a whole organize around climate change adaptation? Will a new distinct field of climate change adaptation law and policy emerge, or will legal institutions simply work away at the problem through unrelated, duly self-contained fields, as in the famous Law of the Horse? This Article is the first to examine that question comprehensively, to move beyond thinking about the law and climate change adaptation to consider the law of climate change adaptation. Part I of the Article lays out our methodological premises and approach. Using a model we call Stationarity Assessment, Part I explores how legal fields are structured and sustained based on assumptions about the variability of natural, social, and economic conditions, and how disruptions to that regime of variability can lead to the emergence of new fields of law and policy. Case studies of environmental law and environmental justice demonstrate the model’s predictive power for the formation of new distinct legal regimes. Part II applies the Stationarity Assessment model to the topic of climate change adaptation, using a case study of a hypothetical coastal region and the potential for climate change impacts to disrupt relevant legal doctrines and institutions. We find that most fields of law appear capable of adapting effectively to climate change. 
In other words, without some active intervention, we expect the law and policy of climate change adaptation to follow the path of the Law of the Horse - a collection of fields independently adapting to climate change - rather than organically coalescing into a new distinct field. Part III explores why, notwithstanding this conclusion, it may still be desirable to seek a different trajectory. Focusing on the likelihood of systemic adaptation decisions with perverse, harmful results, we identify the potential benefits offered by intervening to shape a new and distinct field of climate change adaptation law and policy. Part IV then identifies the contours of such a field, exploring the distinct purposes of reducing vulnerability, ensuring resiliency, and safeguarding equity. These features provide the normative policy components for a law of climate change adaptation that would be more than just a Law of the Horse. This new field would not replace or supplant any existing field, however, as environmental law did with regard to nuisance law, and it would not be dominated by substantive doctrine. Rather, like the field of environmental justice, this new legal regime would serve as a holistic overlay across other fields to ensure more efficient, effective, and just climate change adaptation solutions.