1,065 research outputs found

    SuperWIMP Cosmology and Collider Physics

    Dark matter may be composed of superWIMPs, superweakly-interacting massive particles produced in the late decays of other particles. We focus here on the well-motivated supersymmetric example of gravitino LSPs. Gravitino superWIMPs share several virtues with the well-known case of neutralino dark matter: they are present in the same supersymmetric frameworks (supergravity with R-parity conservation) and naturally have the desired relic density. In contrast to neutralinos, however, gravitino superWIMPs are impossible to detect by conventional dark matter searches, may explain an existing discrepancy in Big Bang nucleosynthesis, predict observable distortions in the cosmic microwave background, and imply spectacular signals at future particle colliders.

    Comment: 12 pages, to appear in the proceedings of SUSY2004, the 12th International Conference on Supersymmetry and Unification of Fundamental Interactions, Tsukuba, Japan, 17-23 June 200

    Supergravity with a Gravitino LSP

    We investigate supergravity models in which the lightest supersymmetric particle (LSP) is a stable gravitino. We assume that the next-lightest supersymmetric particle (NLSP) freezes out with its thermal relic density before decaying to the gravitino at time t ~ 10^4 s - 10^8 s. In contrast to studies that assume a fixed gravitino relic density, the thermal relic density assumption implies upper, not lower, bounds on superpartner masses, with important implications for particle colliders. We consider slepton, sneutrino, and neutralino NLSPs, and determine what superpartner masses are viable in all of these cases, applying CMB and electromagnetic and hadronic BBN constraints to the leading two- and three-body NLSP decays. Hadronic constraints have been neglected previously, but we find that they provide the most stringent constraints in much of the natural parameter space. We then discuss the collider phenomenology of supergravity with a gravitino LSP. We find that colliders may provide important insights to clarify BBN and the thermal history of the Universe below temperatures around 10 GeV and may even provide precise measurements of the gravitino's mass and couplings.

    Comment: 24 pages, updated figures and minor changes, version to appear in Phys.Rev.

    Adaptive Evolution of Conserved Noncoding Elements in Mammals

    Conserved noncoding elements (CNCs) are an abundant feature of vertebrate genomes. Some CNCs have been shown to act as cis-regulatory modules, but the function of most CNCs remains unclear. To study the evolution of CNCs, we have developed a statistical method called the “shared rates test” to identify CNCs that show significant variation in substitution rates across branches of a phylogenetic tree. We report an application of this method to alignments of 98,910 CNCs from the human, chimpanzee, dog, mouse, and rat genomes. We find that ∼68% of CNCs evolve according to a null model where, for each CNC, a single parameter models the level of constraint acting throughout the phylogeny linking these five species. The remaining ∼32% of CNCs show departures from the basic model, including speed-ups and slow-downs on particular branches and occasionally multiple rate changes on different branches. We find that a subset of the significant CNCs have evolved significantly faster than the local neutral rate on a particular branch, providing strong evidence for adaptive evolution in these CNCs. The distribution of these signals on the phylogeny suggests that adaptive evolution of CNCs occurs in occasional short bursts of evolution. Our analyses suggest a large set of promising targets for future functional studies of adaptation.
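    The abstract does not spell out the shared rates test, but the core idea it describes (testing a single shared level of constraint against branch-specific rates) can be sketched as a likelihood-ratio test. The toy model below is an illustrative assumption, not the authors' implementation: per-branch substitution counts are treated as Poisson with mean rate × branch length, and the statistic would be compared to a chi-square distribution with (number of branches − 1) degrees of freedom.

    ```python
    from math import lgamma, log

    def poisson_loglik(counts, rates, lengths):
        """Log-likelihood of per-branch substitution counts under
        count_i ~ Poisson(rate_i * length_i)."""
        ll = 0.0
        for c, r, t in zip(counts, rates, lengths):
            mu = r * t
            if c == 0:
                ll += -mu  # c*log(mu) term vanishes when c == 0
            else:
                ll += c * log(mu) - mu - lgamma(c + 1)
        return ll

    def shared_rate_lr(counts, lengths):
        """Likelihood-ratio statistic comparing one shared rate (null)
        against free per-branch rates (alternative).
        Hypothetical toy version of a 'shared rates'-style test."""
        n = len(counts)
        # MLE under the null: a single rate shared by all branches
        shared = sum(counts) / sum(lengths)
        ll0 = poisson_loglik(counts, [shared] * n, lengths)
        # MLE under the alternative: each branch gets its own rate
        free = [c / t for c, t in zip(counts, lengths)]
        ll1 = poisson_loglik(counts, free, lengths)
        return 2.0 * (ll1 - ll0)
    ```

    Branches with similar counts per unit length yield a statistic near zero, while a single accelerated branch (a candidate for adaptive evolution) inflates it.
    
    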

    Differential expression analysis with global network adjustment

    Background: Large-scale chromosomal deletions or other non-specific perturbations of the transcriptome can alter the expression of hundreds or thousands of genes, and it is of biological interest to understand which genes are most profoundly affected. We present a method for predicting a gene’s expression as a function of other genes, thereby accounting for the effect of transcriptional regulation that confounds the identification of genes differentially expressed relative to a regulatory network. The challenge in constructing such models is that the number of possible regulator transcripts within a global network is on the order of thousands, while the number of biological samples is typically on the order of 10. Nevertheless, there are large gene expression databases that can be used to construct networks that could be helpful in modeling transcriptional regulation in smaller experiments.

    Results: We demonstrate a type of penalized regression model that can be estimated from large gene expression databases and then applied to smaller experiments. The ridge parameter is selected by minimizing the cross-validation error of the predictions in the independent out-sample. This tends to increase the model stability and leads to a much greater degree of parameter shrinkage, but the resulting biased estimation is mitigated by a second round of regression. The proposed computationally efficient “over-shrinkage” method outperforms previously used LASSO-based techniques. In two independent datasets, we find that the median proportion of explained variability in expression is approximately 25%, and this results in a substantial increase in the signal-to-noise ratio, allowing more powerful inferences on differential gene expression and leading to biologically intuitive findings. We also show that a large proportion of gene dependencies are conditional on the biological state, a finding that would be impossible with standard differential expression methods.

    Conclusions: By adjusting for the effects of the global network on individual genes, both the sensitivity and reliability of differential expression measures are greatly improved.
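    The over-shrinkage procedure described above (ridge penalty chosen by out-of-sample validation error, then a second regression to offset the heavy shrinkage) can be sketched as follows. This is a minimal illustration under assumed inputs, not the paper's actual pipeline: `X_train`/`y_train` stand in for the large reference database and `X_val`/`y_val` for the smaller independent experiment.

    ```python
    import numpy as np

    def ridge_fit(X, y, lam):
        """Closed-form ridge coefficients: (X'X + lam*I)^-1 X'y."""
        p = X.shape[1]
        return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

    def over_shrinkage_fit(X_train, y_train, X_val, y_val, lams):
        """Pick the ridge penalty minimizing validation MSE, then
        rescale the shrunken predictions with a second univariate
        regression of observed on predicted values (the
        'over-shrinkage' idea). Hypothetical sketch only."""
        best_lam, best_err, best_beta = None, np.inf, None
        for lam in lams:
            beta = ridge_fit(X_train, y_train, lam)
            err = np.mean((X_val @ beta - y_val) ** 2)
            if err < best_err:
                best_lam, best_err, best_beta = lam, err, beta
        # Second round: one free slope corrects the shrinkage bias
        pred = X_val @ best_beta
        scale = (pred @ y_val) / (pred @ pred)
        return best_lam, best_beta, scale
    ```

    A heavily penalized fit systematically under-predicts; the second-round slope (`scale` above) is greater than 1 in that case and pulls the predictions back toward the observed scale.
    
    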

    PAMELA and FERMI-LAT limits on the neutralino-chargino mass degeneracy

    Searches for Dark Matter (DM) particles with indirect detection techniques have reached important milestones with the precise measurements of the anti-proton and gamma-ray spectra, notably by the PAMELA and FERMI-LAT experiments. While the gamma-ray results have been used to test the thermal Dark Matter hypothesis and constrain the Dark Matter annihilation cross section into Standard Model (SM) particles, the anti-proton flux measured by the PAMELA experiment remains relatively unexploited. Here we show that the latter can be used to set a constraint on the neutralino-chargino mass difference. To illustrate our point we use a supersymmetric model in which the gauginos are light, the sfermions are heavy, and the Lightest Supersymmetric Particle (LSP) is the neutralino. In this framework the W^+ W^- production is expected to be significant, thus leading to large anti-proton and gamma-ray fluxes. After determining a generic limit on the Dark Matter pair annihilation cross section into W^+ W^- from the anti-proton data only, we show that one can constrain scenarios in which the neutralino-chargino mass difference is as large as ~ 20 GeV for a mixed neutralino (and intermediate choices of the anti-proton propagation scheme). This result is consistent with the limit obtained by using the FERMI-LAT data. As a result, we can safely rule out the pure wino neutralino hypothesis if it is lighter than 450 GeV and constitutes all the Dark Matter.

    Comment: 22 pages

    ImageNet Large Scale Visual Recognition Challenge

    The ImageNet Large Scale Visual Recognition Challenge is a benchmark in object category classification and detection on hundreds of object categories and millions of images. The challenge has been run annually from 2010 to present, attracting participation from more than fifty institutions. This paper describes the creation of this benchmark dataset and the advances in object recognition that have been possible as a result. We discuss the challenges of collecting large-scale ground truth annotation, highlight key breakthroughs in categorical object recognition, provide a detailed analysis of the current state of the field of large-scale image classification and object detection, and compare the state-of-the-art computer vision accuracy with human accuracy. We conclude with lessons learned in the five years of the challenge, and propose future directions and improvements.

    Comment: 43 pages, 16 figures. v3 includes additional comparisons with PASCAL VOC (per-category comparisons in Table 3, distribution of localization difficulty in Fig 16), a list of queries used for obtaining object detection images (Appendix C), and some additional references