
    NLSP Gluino Search at the Tevatron and early LHC

    We investigate the collider phenomenology of the gluino-bino co-annihilation scenario at both the Tevatron and the 7 TeV LHC. This scenario can be realized, for example, in a class of realistic supersymmetric models with non-universal gaugino masses and t-b-\tau Yukawa unification. The NLSP gluino and LSP bino must be nearly degenerate in mass, so the typical gluino search channels involving leptons or hard jets are not available. Consequently, the gluino can be lighter than various bounds on its mass from direct searches. We propose a new search for the NLSP gluino involving multi-b final states, arising from the three-body decay \tilde{g} \to b\bar{b}\tilde{\chi}_1^0. We identify two realistic models with a gluino mass of around 300 GeV for which the three-body decay is dominant, and show that a 4.5 \sigma observation sensitivity can be achieved at the Tevatron with an integrated luminosity of 10 fb^{-1}. For the 7 TeV LHC with 50 pb^{-1} of integrated luminosity, the number of signal events for the two models is O(10), compared with a negligible SM background. Comment: 14 pages, 4 figures and 3 tables; minor modifications made and accepted for publication in JHEP
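The quoted Tevatron sensitivity is essentially a counting-experiment estimate: expected signal and background events scale as cross section times luminosity. A minimal sketch of such an estimate, assuming placeholder signal and background rates (the paper's actual selection and cross sections are not reproduced here):

```python
import math

def significance(sig_xsec_fb, bkg_xsec_fb, lumi_fb, eff=1.0):
    """Gaussian significance S/sqrt(B) of a counting experiment:
    S and B are expected event counts, cross section x luminosity x efficiency."""
    s = sig_xsec_fb * lumi_fb * eff
    b = bkg_xsec_fb * lumi_fb * eff
    return s / math.sqrt(b)

# Placeholder rates for a multi-b selection at the Tevatron with 10 fb^-1;
# illustrative numbers only, not taken from the paper.
z = significance(sig_xsec_fb=3.0, bkg_xsec_fb=45.0, lumi_fb=10.0)
```

Note that the significance grows only as the square root of the luminosity, which is why the quoted sensitivity requires the full 10 fb^{-1} dataset.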

    A Standardised Procedure for Evaluating Creative Systems: Computational Creativity Evaluation Based on What it is to be Creative

    Computational creativity is a flourishing research area, with a variety of creative systems being produced and developed. Creativity evaluation, however, has not kept pace with system development: systematic evaluation of the creativity of these systems is notably absent from the literature. This is partially due to difficulties in defining what it means for a computer to be creative; indeed, there is no consensus on this for human creativity, let alone its computational equivalent. This paper proposes a Standardised Procedure for Evaluating Creative Systems (SPECS). SPECS is a three-step process: stating what it means for a particular computational system to be creative, then deriving and performing tests based on these statements. To assist this process, the paper offers a collection of key components of creativity, identified empirically from discussions of human and computational creativity. Using this approach, the SPECS methodology is demonstrated through a comparative case study evaluating computational creativity systems that improvise music.

    QCD Corrections to Scalar Diquark Production at Hadron Colliders

    We calculate the next-to-leading order QCD corrections to quark-quark annihilation into a scalar resonant state ("diquark") in a color representation of antitriplet or sextet at Tevatron and LHC energies. At the LHC, we find that the enhancement (K-factor) is typically about 1.31--1.35 for the antitriplet diquark and about 1.22--1.32 for the sextet diquark with initial-state valence quarks. The full transverse-momentum spectrum of the diquarks at the LHC is also calculated by performing soft-gluon resummation to the leading logarithm and all orders in the strong coupling. Comment: 24 pages, 17 figures
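The K-factor quoted above is the ratio of NLO to LO cross sections, so an LO prediction is rescaled multiplicatively. A trivial sketch of that rescaling, with a hypothetical LO cross section (only the K-factor range comes from the abstract):

```python
def nlo_xsec(sigma_lo_pb, k_factor):
    """Rescale a leading-order cross section by the NLO K-factor:
    sigma_NLO = K * sigma_LO."""
    return k_factor * sigma_lo_pb

# Antitriplet diquark at the LHC: K ~ 1.31-1.35 per the abstract, so a
# hypothetical 10 pb LO cross section becomes roughly 13.1-13.5 pb at NLO.
band = (nlo_xsec(10.0, 1.31), nlo_xsec(10.0, 1.35))
```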

    Increasing confidence and changing behaviors in primary care providers engaged in genetic counselling.

    Background: Screening and counseling for genetic conditions is an increasingly important part of primary care practice, particularly given the paucity of genetic counselors in the United States. However, primary care physicians (PCPs) often have an inadequate understanding of evidence-based screening; communication approaches that encourage shared decision-making; ethical, legal, and social implication (ELSI) issues related to screening for genetic mutations; and the basics of clinical genetics. This study explored whether an interactive, web-based genetics curriculum directed at PCPs in non-academic primary care settings was superior to a traditional educational approach at changing practice knowledge, attitudes, and behaviors, particularly when discussing common genetic conditions. Methods: One hundred twenty-one PCPs in California and Pennsylvania physician practices were randomized to either an Intervention Group (IG) or a Control Group (CG). IG physicians completed a 6-hour interactive web-based curriculum covering communication skills, basics of genetic testing, risk assessment, ELSI issues, and practice behaviors. CG physicians were provided with a traditional approach to Continuing Medical Education (CME) (clinical review articles) offering equivalent information. Results: PCPs in the Intervention Group showed greater increases in knowledge than the Control Group. Intervention PCPs were also more satisfied with the educational materials and more confident in their genetics knowledge and skills than those receiving traditional CME materials. Intervention PCPs felt that the web-based curriculum covered medical management, genetics, and ELSI issues significantly better than the traditional curricula did in the Control Group. The Intervention Group felt the online tools offered several advantages and engaged in better shared decision-making with standardized patients; however, there was no difference between groups in behavior change with regard to increases in ELSI discussions between PCPs and patients. Conclusion: While our intervention was deemed more enjoyable, demonstrated significant factual learning and retention, and increased shared decision-making practices, there were few differences in behavior change around ELSI discussions. Barriers to implementing behavior change in clinical genetics are not unique to our intervention. Perhaps the missing element is that busy physicians need systems-level support to engage in meaningful discussions around genetics issues. The next step in promoting active engagement between doctors and patients may be to put in place the tools PCPs need to access materials easily at the point of care and engage in joint discussions around clinical genetics.

    Stainable hepatic iron in 341 African American adults at coroner/medical examiner autopsy

    BACKGROUND: Results of previous autopsy studies indicate that increased hepatic iron stores or hepatic iron overload is common in African Americans dying in hospitals, but there are no reports of hepatic iron content in other cohorts of African Americans. METHODS: We investigated the prevalence of heavy liver iron deposition in African American adults. Using established histochemical criteria, we graded Perls' acid ferrocyanide-reactive iron in the hepatocytes and Kupffer cells of 341 consecutive African American adults autopsied in the coroner/medical examiner office. Heavy staining was defined as grade 3 or 4 hepatocyte iron or grade 3 Kupffer cell iron. RESULTS: There were 254 men and 85 women (mean age ± 1 SD: 44 ± 13 y vs. 48 ± 14 y, respectively; p = 0.0255); gender was unstated or unknown for two subjects. Approximately one-third of subjects died of natural causes. Heavy staining was observed in 10.2% of men and 4.7% of women. Twenty-three subjects had heavy hepatocyte staining only, six had heavy Kupffer cell staining only, and one had a mixed pattern of heavy staining. Fifteen subjects had histories of chronic alcoholism; three of these had heavy staining confined to hepatocytes. We analyzed the relationships of three continuous variables (age at death in years, hepatocyte iron grade, Kupffer cell iron grade) and two categorical variables (sex; cause of death, natural vs. non-natural) in all 341 subjects using a correlation matrix with Bonferroni correction. This revealed two positive correlations: hepatocyte with Kupffer cell iron grade (p < 0.01), and male sex with hepatocyte iron grade (p < 0.05). We also analyzed the relationships among steatosis, inflammation, and fibrosis/cirrhosis in 30 subjects with heavy iron staining using a correlation matrix with Bonferroni correction. There were significant positive correlations of steatosis with inflammation (r = 0.5641; p < 0.01) and of inflammation with fibrosis/cirrhosis (r = 0.6124; p < 0.01). CONCLUSIONS: The present results confirm and extend previous observations that heavy liver iron staining is relatively common in African Americans. The pertinence of these observations to genetic and acquired causes of iron overload in African Americans is discussed.
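The Bonferroni-corrected correlation screen described in the Results can be sketched as follows: with k variables there are k(k-1)/2 pairwise tests, and each per-pair p-value is compared against alpha divided by the number of tests. The variable names below follow the abstract, but the example p-values are illustrative, not the study's data:

```python
from itertools import combinations

def bonferroni_screen(pvalues, alpha=0.05):
    """pvalues: dict mapping (var_a, var_b) -> pairwise p-value.
    Returns the pairs still significant after Bonferroni correction,
    i.e. those with p < alpha / number_of_tests."""
    threshold = alpha / len(pvalues)
    return sorted(pair for pair, p in pvalues.items() if p < threshold)

# Five variables, as in the all-subjects analysis, give 10 pairwise tests.
variables = ["age", "hepatocyte_grade", "kupffer_grade", "sex", "cause_of_death"]
pairs = list(combinations(variables, 2))
```

The correction makes the screen conservative: a raw p of 0.04 that would pass an uncorrected 0.05 cutoff fails once the threshold is divided by the number of comparisons.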

    Left-right symmetry at LHC and precise 1-loop low energy data

    Despite many tests, even the Minimal Manifest Left-Right Symmetric Model (MLRSM) has never been ultimately confirmed or falsified. The LHC offers a new possibility to test directly the most conservative version of left-right symmetric models at energy scales not reachable so far. Once precise limits on the model coming from low-energy processes, such as muon decay, are taken into account, possible LHC signals are strongly constrained through correlations of parameters among heavy neutrinos, heavy gauge bosons, and heavy Higgs particles. To illustrate the situation in the context of the LHC, we consider the "golden" process pp \to e^+ N. For instance, in the case of degenerate heavy neutrinos and heavy Higgs masses at 15 TeV (in agreement with FCNC bounds), we get \sigma(pp \to e^+ N) > 10 fb at \sqrt{s} = 14 TeV consistent with muon decay data only for a very limited range of W_2 masses, (3008 GeV, 3040 GeV). Without the restrictions coming from the muon data, the W_2 masses would lie in the range (1.0 TeV, 3.5 TeV). The influence of the heavy Higgs particles themselves on the considered LHC process is negligible (the same is true for the light, SM-like neutral Higgs scalar). Decay modes of the right-handed heavy gauge bosons and heavy neutrinos are also discussed. Both scenarios with typical see-saw light-heavy neutrino mixings and with mixings independent of the heavy neutrino masses are considered. In the second case, heavy neutrino decays to the heavy charged gauge bosons do not necessarily dominate over decay modes that include only light, SM-like particles. Comment: 16 pages, 10 figs; K_L-K_S and new ATLAS limits taken into account

    Effectiveness analysis of resistance and tolerance to infection

    Background: Tolerance and resistance provide animals with two distinct strategies to fight infectious pathogens and may exhibit different evolutionary dynamics. However, few studies have investigated these mechanisms for animal diseases under commercial constraints. Methods: The paper proposes a method to simultaneously describe (1) the dynamics of transmission of a contagious pathogen between animals, (2) the growth and death of the pathogen within infected hosts, and (3) the effects on host performance. The effectiveness of increasing individual levels of tolerance and resistance is evaluated by the number of infected animals and the performance at the population level. Results: The model is applied to a particular set of parameters and different combinations of values. Given these values, higher levels of individual tolerance are shown to be more effective than increased levels of resistance in commercial populations. As a practical example, a method is proposed to measure levels of animal tolerance to bovine mastitis. Conclusions: The model provides a general framework and tools to maximize the health and performance of a population under infection. Limits and assumptions of the model are clearly identified so that it can be improved for different epidemiological settings.
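The three coupled ingredients in the Methods (between-host transmission, within-host pathogen dynamics, and a performance penalty) can be sketched in a minimal discrete-time simulation. Here "resistance" scales down within-host pathogen growth and "tolerance" scales down the performance loss per unit pathogen load; all parameter values and functional forms are illustrative placeholders, not the paper's model:

```python
def simulate(n_animals=100, n_infected0=1, days=50,
             beta=0.02, growth=0.3, clearance=0.2,
             resistance=0.0, tolerance=0.0):
    """Toy coupled model: returns (total infected, final population performance).
    Resistance in [0, 1] reduces within-host pathogen growth; tolerance in
    [0, 1] reduces the performance cost per unit pathogen load."""
    susceptible = n_animals - n_infected0
    loads = [1.0] * n_infected0            # within-host pathogen loads
    performance = float(n_animals)         # population performance, arbitrary units
    for _ in range(days):
        # between-host transmission: force of infection ~ total pathogen load
        force = beta * sum(loads) / n_animals
        new_cases = min(susceptible, round(force * susceptible))
        susceptible -= new_cases
        loads += [1.0] * new_cases
        # within-host dynamics: net growth is damped by resistance
        loads = [l * (1 + growth * (1 - resistance) - clearance) for l in loads]
        loads = [l for l in loads if l > 0.01]   # hosts with tiny loads clear
        # performance penalty per unit load, damped by tolerance
        performance -= 0.001 * (1 - tolerance) * sum(loads)
    return n_animals - susceptible, performance
```

In this toy setting tolerance leaves the epidemic curve unchanged but protects performance, while resistance suppresses within-host loads and hence transmission, mirroring the qualitative distinction the abstract draws between the two strategies.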