
    Supersymmetric dark matter in the light of LEP

    The negative outcome of searches for supersymmetry performed at LEP has been used to derive indirect constraints on the parameters of the most plausible models for cold dark matter based on supersymmetric extensions of the Standard Model. The main results are summarized. Comment: 3 pages, 1 figure, to be published in the proceedings of the XIII Italian meeting on Physics at LEP, LEPTRE, Rome, April 2001

    Dark Matter and SUSY: LEP results

    The negative outcome of searches for supersymmetry performed at LEP has been used to derive indirect constraints on the parameters of the most plausible supersymmetric candidates for cold dark matter, in particular for the lightest neutralino. We review the basic ideas leading to the present lower limit on the lightest neutralino mass of about 37 GeV, with emphasis on the underlying assumptions. Comment: 12 pages, 8 figures, to appear in the proceedings of Vulcano Workshop 2000: Frontier Objects In Astrophysics And Particle Physics, 22-27 May 2000, Vulcano, Italy
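
    For orientation, the lightest neutralino discussed here is the lightest eigenstate of the tree-level MSSM neutralino mass matrix; in the basis $(\tilde{B}, \tilde{W}^0, \tilde{H}_d^0, \tilde{H}_u^0)$ it takes the standard textbook form (quoted for context, not taken from the paper itself):

        \[
        M_{\chi^0} =
        \begin{pmatrix}
          M_1 & 0 & -m_Z c_\beta s_W & m_Z s_\beta s_W \\
          0 & M_2 & m_Z c_\beta c_W & -m_Z s_\beta c_W \\
          -m_Z c_\beta s_W & m_Z c_\beta c_W & 0 & -\mu \\
          m_Z s_\beta s_W & -m_Z s_\beta c_W & -\mu & 0
        \end{pmatrix}
        \]

    with $c_\beta = \cos\beta$, $s_\beta = \sin\beta$, $s_W = \sin\theta_W$, $c_W = \cos\theta_W$; the quoted lower limit comes from scanning this parameter space subject to the LEP chargino and neutralino search constraints.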

    Selected topics from non-Higgs searches at LEP

    Extensive searches for new phenomena have been performed at LEP. The principal aspects and results of those not related to Higgs bosons are reviewed here. Comment: 4 pages, 4 figures, presented at "XXXVIIth Rencontres de Moriond on QCD and High Energy Hadronic Interactions", Les Arcs, France, 16-23 March 2002

    Search for SUSY with R-parity violation at LEP

    Searches for supersymmetry at LEP allowing for R-parity violation are reviewed. The results are compared with the R-parity conserving scenarios. Comment: 8 pages, 8 figures, to be included in the proceedings of the 29th "International Conference on High Energy Physics", Vancouver, 1998
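
    As background, "R-parity violation" refers to the extra renormalizable superpotential terms that become allowed once R parity is given up; in standard notation (textbook form, not specific to this review):

        \[
        W_{\not{R}_p} = \tfrac{1}{2}\lambda_{ijk} L_i L_j \bar{e}_k
                      + \lambda'_{ijk} L_i Q_j \bar{d}_k
                      + \tfrac{1}{2}\lambda''_{ijk} \bar{u}_i \bar{d}_j \bar{d}_k
                      + \kappa_i L_i H_u
        \]

    The first two sets of couplings violate lepton number, the third violates baryon number, and any nonzero coupling lets the LSP decay, which is what distinguishes these searches from the R-parity-conserving scenarios.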

    Supersymmetric Dark Matter in the Light of LEP and the Tevatron Collider

    We analyze the accelerator constraints on the parameter space of the Minimal Supersymmetric extension of the Standard Model, comparing those now available from LEP II and anticipating the likely sensitivity of Tevatron Run II. The most important limits are those from searches for charginos, neutralinos and Higgs bosons at LEP, and searches for stop squarks, charginos and neutralinos at the Tevatron Collider. We also incorporate the constraints derived from b --> s + gamma decay, and discuss the relevance of charge- and colour-breaking minima in the effective potential. We combine and compare the different constraints on the Higgs-mixing parameter mu, the gaugino-mass parameter m_{1/2} and the scalar-mass parameter m0, incorporating radiative corrections to the physical particle masses. We focus on the resulting limitations on supersymmetric dark matter, assumed to be the lightest neutralino, incorporating coannihilation effects in the calculation of the relic abundance. We find that m_chi > 51 GeV and tan(beta) > 2.2 if all soft supersymmetry-breaking scalar masses are universal, including those of the Higgs bosons, and that these limits weaken to m_chi > 46 GeV and tan(beta) > 1.9 if non-universal scalar masses are allowed. Light neutralino dark matter cannot be primarily Higgsino in composition. Comment: 39 pages in LaTeX, including 44 encapsulated postscript figures
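
    For context, the relic-abundance side of such an analysis rests on the standard freeze-out estimate (textbook form; the paper's actual computation, including coannihilation, is more involved):

        \[
        \Omega_\chi h^2 \approx \frac{3 \times 10^{-27}\,\mathrm{cm^3\,s^{-1}}}{\langle \sigma_{\rm ann} v \rangle},
        \]

    with coannihilation entering through an effective cross section that weights the channels among nearly degenerate states by their equilibrium densities:

        \[
        \langle \sigma_{\rm eff} v \rangle = \sum_{i,j} \langle \sigma_{ij} v \rangle\,
            \frac{n_i^{\rm eq}\, n_j^{\rm eq}}{\left(\sum_k n_k^{\rm eq}\right)^2}.
        \]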

    Charginos and Neutralinos in the Light of Radiative Corrections: Sealing the Fate of Higgsino Dark Matter

    We analyze the LEP constraints from searches for charginos $\chi^\pm$ and neutralinos $\chi_i$, taking into account radiative corrections to the relations between their masses and the underlying Higgs-mixing and gaugino-mass parameters $\mu$, $m_{1/2}$ and the trilinear mass parameter $A_t$. Whilst radiative corrections do not alter the excluded domain in $m_{\chi^\pm}$ as a function of $m_{\chi^\pm} - m_\chi$, its mapping into the $(\mu, m_{1/2})$ plane is altered. We update our previous lower limits on the mass of gaugino dark matter and on $\tan\beta$, the ratio of Higgs vacuum expectation values, in the light of the latest LEP data and these radiative corrections. We also discuss the viability of Higgsino dark matter, incorporating co-annihilation effects into the calculation of the Higgsino relic abundance. We find that Higgsino dark matter is viable for only a very limited range of $\mu$ and $m_{1/2}$, which will be explored completely by upcoming LEP runs. Comment: Version to appear in Phys. Rev. D; 21 pages in LaTeX, including 10 encapsulated postscript figures; uses epsf.sty; figures modified (one deleted), conclusions unchanged
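
    At tree level, the chargino masses referred to above are the singular values of the standard 2x2 mass matrix (textbook form; the paper's point is precisely the radiative corrections to these relations):

        \[
        M_{\chi^\pm} = \begin{pmatrix} M_2 & \sqrt{2}\, m_W \sin\beta \\ \sqrt{2}\, m_W \cos\beta & \mu \end{pmatrix}
        \]

    so the LEP chargino bound maps into the $(\mu, m_{1/2})$ plane through $M_2$, $\mu$ and $\tan\beta$, and loop corrections shift that mapping.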

    The Need for a Versioned Data Analysis Software Environment

    Scientific results in high-energy physics and in many other fields often rely on complex software stacks. In order to support reproducibility and scrutiny of the results, it is good practice to use open source software and to cite software packages and versions. With the ever-growing complexity of scientific software on one side and IT life-cycles of only a few years on the other, however, it turns out that, despite source code availability, setting up and validating a minimal usable analysis environment can easily become prohibitively expensive. We argue that there is a substantial gap between merely having access to versioned source code and being able to create a data analysis runtime environment. In order to preserve all the different variants of the data analysis runtime environment, we developed a snapshotting file system optimized for software distribution. We report on our experience in preserving analysis environments for high-energy physics, such as the software landscape used to discover the Higgs boson at the Large Hadron Collider.
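
    To make the idea concrete, here is a minimal, self-contained C++ sketch of content-addressed snapshotting, purely illustrative and not the file system described in the paper: a snapshot is a cheap path-to-digest map, and unchanged files are stored only once.

        // snapshot_sketch.cpp -- illustrative only, NOT the file system from the paper.
        // Core idea of a content-addressed snapshotting store: file contents are
        // deduplicated by digest, and a snapshot is just a cheap path->digest map.
        #include <cstddef>
        #include <functional>
        #include <iostream>
        #include <map>
        #include <string>
        #include <unordered_map>

        using Digest = std::size_t;  // std::hash stands in for a real digest (e.g. SHA-1)

        struct Store {
            std::unordered_map<Digest, std::string> blobs;  // content-addressed data
            std::map<std::string, std::map<std::string, Digest>> snapshots;  // name -> (path -> digest)

            Digest put(const std::string& content) {
                Digest d = std::hash<std::string>{}(content);
                blobs.emplace(d, content);  // identical content is stored only once
                return d;
            }

            void snapshot(const std::string& name,
                          const std::map<std::string, std::string>& tree) {
                auto& snap = snapshots[name];
                for (const auto& [path, content] : tree)
                    snap[path] = put(content);
            }
        };

        int main() {
            Store s;
            s.snapshot("v1", {{"/bin/app", "binary-v1"}, {"/lib/libfoo", "lib-code"}});
            s.snapshot("v2", {{"/bin/app", "binary-v2"}, {"/lib/libfoo", "lib-code"}});
            // Two complete environment versions, but the unchanged library is stored once:
            std::cout << "blobs stored: " << s.blobs.size() << "\n";  // prints 3, not 4
        }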

    Supersymmetric Dark Matter and the Energy of a Linear Electron-Positron Collider

    We suggest that supersymmetric dark matter be used to set the energy scale of a linear $e^+ e^-$ collider. Assuming that the lightest supersymmetric particle (LSP) is a stable neutralino $\chi$, as in many incarnations of the MSSM with conserved R parity, previous calculations that include coannihilation effects have delineated the region of the $(m_{1/2}, m_0)$ plane where the LSP cosmological relic density lies in the preferred range $0.1 \lesssim \Omega_{\chi} h^2 \lesssim 0.3$. We evaluate here the total cross section for $e^+ e^- \to$ visible pairs of supersymmetric particles, for different values of $m_{1/2}$ and $m_0$, and investigate how much of the dark matter region can be explored by $e^+ e^-$ colliders with different centre-of-mass energies $E_{CM}$. We find that a collider with $E_{CM} = 500$ GeV or 1 TeV can only explore part of the cosmological region, and that a collider with $E_{CM} = 1.5$ TeV with sufficient luminosity can explore all of the supersymmetric dark matter region.
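
    The kinematic core of the argument is worth making explicit: pair-producing sparticles $\tilde{X}_1 \tilde{X}_2$ is only possible above threshold,

        \[
        E_{CM} \ge m_{\tilde{X}_1} + m_{\tilde{X}_2},
        \]

    so the reach of a given $E_{CM}$ at each $(m_{1/2}, m_0)$ point is set by the lightest visible sparticle pair; the coannihilation tail of the cosmological region extends to sparticle masses beyond the reach of the 500 GeV and 1 TeV options, consistent with the finding quoted above. (Standard kinematics, stated here for context rather than taken from the paper.)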

    PROOF as a Service on the Cloud: a Virtual Analysis Facility based on the CernVM ecosystem

    PROOF, the Parallel ROOT Facility, is a ROOT-based framework which enables interactive parallelism for event-based tasks on a cluster of computing nodes. Although PROOF can be used simply from within a ROOT session with no additional requirements, deploying and configuring a PROOF cluster used to be far less straightforward. Recently, considerable effort has gone into provisioning generic PROOF analysis facilities with zero configuration, which also benefits stability and scalability and makes the deployment operations feasible even for the end user. Since a growing share of large-scale computing resources is nowadays made available by Cloud providers in virtualized form, we have developed the Virtual PROOF-based Analysis Facility: a cluster appliance combining the solid CernVM ecosystem and PoD (PROOF on Demand), ready to be deployed on the Cloud and leveraging distinctive Cloud features such as elasticity. We show how this approach is effective both for sysadmins, who will have little or no configuration to do to run it on their Clouds, and for end users, who remain in full control of their PROOF cluster and can easily restart it themselves in the unfortunate event of a major failure. We also show how elasticity leads to more efficient and uniform usage of Cloud resources. Comment: Talk from Computing in High Energy and Nuclear Physics 2013 (CHEP2013), Amsterdam (NL), October 2013, 7 pages, 4 figures
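
    For a flavor of what the end user actually runs, a session on such a facility boils down to opening a PROOF connection and processing a chain through it. A minimal sketch as a ROOT macro; the tree name, input files, and selector are hypothetical placeholders, and "lite://" stands in for a facility's actual master URL:

        // vaf_sketch.C -- minimal PROOF usage sketch (ROOT macro)
        void vaf_sketch() {
           // "lite://" runs PROOF-Lite on the local machine; a Virtual Analysis
           // Facility would instead be reached through its PROOF master URL.
           TProof *p = TProof::Open("lite://");
           if (!p) return;
           TChain chain("Events");           // hypothetical tree name
           chain.Add("data/*.root");         // hypothetical input files
           chain.SetProof();                 // route processing through PROOF
           chain.Process("MySelector.C+");   // user-supplied TSelector, compiled on the fly
        }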

    ROOT - A C++ Framework for Petabyte Data Storage, Statistical Analysis and Visualization

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, ROOT offers packages for complex data modeling and fitting, as well as multivariate classification based on machine learning techniques. A central piece in these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way.
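
    As a concrete taste of the workflow described above, a minimal macro that writes a TTree, projects it into a histogram, and fits the result; file, tree, and branch names are invented for illustration:

        // demo.C -- minimal ROOT macro: TTree storage, histogramming, fitting
        void demo() {
           TFile f("demo.root", "RECREATE");
           TTree t("t", "demo tree");
           double x;
           t.Branch("x", &x, "x/D");          // one double-precision branch
           TRandom3 rng(0);
           for (int i = 0; i < 10000; ++i) { x = rng.Gaus(0, 1); t.Fill(); }
           t.Write();
           TH1D h("h", "x;x;entries", 100, -5, 5);
           t.Draw("x >> h");                  // project the branch into the histogram
           h.Fit("gaus");                     // built-in Gaussian fit
           f.Close();
        }

    Run it interactively with `root demo.C`, or compile it on the fly with `root demo.C+` for full speed - the step-by-step-then-compiled pattern the abstract describes.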