17 research outputs found

    Magnetization in superconducting corrector magnets and impact on luminosity-calibration scans in the Large Hadron Collider

    Full text link
    Superconducting accelerator magnets have a nonlinear dependence of field on current due to the magnetization associated with the iron or with persistent currents in the superconducting filaments. This also gives rise to hysteresis phenomena that make the field depend on the powering history. Magnetization effects are of particular importance for luminosity-calibration scans in the Large Hadron Collider, during which a small number of Nb-Ti superconducting orbit correctors are excited at low field and with frequent flipping of the sign of the current ramp. This paper focuses on the analysis of special measurements carried out to estimate these nonlinear effects under the special cycling conditions used in these luminosity scans. For standard powering cycles, we evaluate the effect of the main magnetization loop; for complex operational schemes, magnetization-branch transitions occur that depend on the details of the current cycle. The modelling of these effects is not included in the magnetic-field prediction software currently implemented in the LHC control system; here we present an approach to predict the transitions between the main magnetization branches. The final aim is to estimate the impact of magnetic hysteresis on the accuracy of luminosity-calibration scans. Comment: To be submitted to The European Physical Journal - Plus (EPJ Plus); document available on the CERN Document Server (CDS).
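    The branch-transition behaviour described in the abstract can be illustrated with a toy model. This is only a sketch, not the CERN field-prediction approach: the branch shapes, the tanh magnetization term, and the exponential approach constant tau are all invented for illustration. On each ramp-sign reversal, the field leaves its current branch and relaxes toward the opposite main branch.

    ```python
    import math

    # Toy main-loop branches: field vs current with a small hysteretic offset.
    # Shapes and constants are invented for illustration only.
    def b_up(i):   # ascending branch of the main magnetization loop
        return 0.01 * i - 0.002 * math.tanh(i / 5.0)

    def b_dn(i):   # descending branch of the main magnetization loop
        return 0.01 * i + 0.002 * math.tanh(i / 5.0)

    def track_field(currents, tau=3.0):
        """Follow the field along a current cycle, starting a new transition
        branch on each ramp-sign reversal (exponential approach to the
        target main branch over a current scale tau)."""
        fields = []
        b_prev = b_up(currents[0])
        i_rev, b_rev = currents[0], b_prev
        direction = 1
        for i_prev, i in zip(currents, currents[1:]):
            new_dir = 1 if i >= i_prev else -1
            if new_dir != direction:      # ramp reversed: remember reversal point
                direction, i_rev, b_rev = new_dir, i_prev, b_prev
            target = b_up(i) if direction == 1 else b_dn(i)
            target_rev = b_up(i_rev) if direction == 1 else b_dn(i_rev)
            b_prev = target + (b_rev - target_rev) * math.exp(-abs(i - i_rev) / tau)
            fields.append(b_prev)
        return fields
    ```

    Ramping up and back down through the same current then yields two different field values at equal current, which is the hysteresis error the paper aims to predict for the orbit correctors.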

    Performance of the ALICE luminosity levelling software architecture in the Pb-Pb physics run

    Get PDF
    Luminosity leveling is performed in the ALICE experiment of the Large Hadron Collider (LHC) in order to limit the event pile-up probability and ensure safe operation for the detectors. It will be even more important during Run 3, when 50 kHz Pb ion-Pb ion (Pb-Pb) collisions will be delivered in IP2. On the ALICE side, it is handled by the ALICE-LHC Interface project, which also ensures an online data exchange between ALICE and the LHC. An automated luminosity leveling algorithm was developed for the proton-proton physics run and was also deployed for the Pb-Pb run, with some minor changes following experience gained. The algorithm is implemented in the SIMATIC WinCC SCADA environment and determines the leveling step from measured beam parameters received from the LHC and the luminosity recorded by ALICE. In this paper, the software architecture of the luminosity leveling software is presented, and the performance achieved during the Pb-Pb run and Van der Meer scans is discussed.

    LHC Luminosity Performance

    No full text
    This thesis addresses several approaches with the common goal of assessing, understanding and improving the luminosity of the Large Hadron Collider (LHC). To better exploit existing margins for maximum luminosity while fulfilling the requirements of the LHC experiments, new techniques for luminosity levelling are studied and developed to an operational state, such as changing the crossing angle or β∗ (beam size) at the interaction points with the beams in collision. In 2017 LHC operation, the crossing-angle reduction in collisions improved the integrated luminosity by ∼2 fb⁻¹ (∼4% of the yearly production). For additional diagnostics, a new method is shown for measuring beam sizes and orbits for each circulating bunch using the luminosity measurement during beam-separation scans. The results of these Emittance Scans improved the understanding of the LHC luminosity reach and of the orbit offsets introduced by beam-beam long-range effects.
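    Extracting a per-bunch beam size from a separation scan can be sketched as follows. This is a minimal illustration, not the thesis implementation: it assumes the luminosity profile versus separation is Gaussian, so fitting a parabola to the logarithm of the luminosity yields the convolved beam size (from the curvature) and the orbit offset (from the vertex). Function and variable names are assumptions.

    ```python
    import numpy as np

    def fit_scan(sep, lumi):
        """Estimate convolved beam size and orbit offset from a separation scan.

        Assumes a Gaussian profile L(d) = L0 * exp(-(d - d0)^2 / (2 * S^2)),
        where S is the convolved beam size and d0 the residual orbit offset,
        so log(L) is quadratic in the separation d."""
        a, b, _c = np.polyfit(sep, np.log(lumi), 2)   # highest degree first
        sigma_conv = np.sqrt(-1.0 / (2.0 * a))        # curvature -> beam size
        offset = -b / (2.0 * a)                       # vertex -> orbit offset
        return sigma_conv, offset
    ```

    For two equal round Gaussian beams the convolved size relates to the single-beam size as S = √2 σ, which is how a scan in each plane translates into a per-bunch emittance estimate.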

    TCT levelling strategy for Run 3

    No full text
    Presentation at the 260th LHC Collimation Working Group meeting, https://indico.cern.ch/event/1083339

    Enabling the ATLAS Experiment at the LHC for High Performance Computing

    No full text
    In this thesis, I studied the feasibility of running data-analysis programs from the Worldwide LHC Computing Grid, in particular large-scale simulations of the ATLAS experiment at the CERN LHC, on current general-purpose High Performance Computing (HPC) systems. An approach for integrating HPC systems into the Grid is proposed, which has been implemented and tested on the "Todi" HPC machine at the Swiss National Supercomputing Centre (CSCS). Over the course of the test, more than 500,000 CPU-hours of processing time were provided to ATLAS, roughly equivalent to the combined computing power of the two ATLAS clusters at the University of Bern. This showed that current HPC systems can be used to efficiently run large-scale simulations of the ATLAS detector and of the detected physics processes. A first conclusion of this work is that running large-scale tasks on a few large machines may prove more cost-effective than running on relatively small dedicated computing clusters. The second part of the thesis covers a study of the discovery potential for supersymmetry (SUSY) in ATLAS events with one lepton, two b-jets and missing transverse momentum in the final state. Using flat-random distributed pMSSM models, I identified some models that could lead to the discovery of SUSY through this specific channel.

    Start collisions at 1 m β∗ (MD 3349)

    No full text
    This document summarises the LHC Machine Development session on colliding at β∗ = 1 m and squeezing with colliding beams to β∗ = 25 cm in IP 1 and 5 using the capabilities of the LumiServer. To do so, the beams were brought into collision at the end of the ramp without playing the regular squeeze to 30 cm. Once in collision, the β∗-levelling functionality of the LumiServer was used to squeeze and de-squeeze the beams in two cycles between β∗ = 1 m and β∗ = 25 cm. After each squeeze step, a luminosity-optimization scan was performed in IP 1 and 5 to assess the orbit stability, ensuring that the beams remained in collision.

    A VdM Sequence File Web Application

    No full text
    To perform Van der Meer scans at the LHC, the experimental collaborations have to supply a text file containing a command sequence specifying the required beam positions. Until now these had to be made by hand and sent by email, both tedious and error-prone procedures. This project developed a web-based application that addresses the process by combining both tasks into one simple UI that fits in a single browser window. By integrating GitLab into the application, the distribution of files can be done directly from the application by accessing a global repository. The application also features a dynamic code editor that recognises errors, auto-generates line numbers, and has structure-sensitive auto-completion.

    beta* leveling with telescopic ATS squeeze (MD 2410)

    No full text
    Luminosity leveling by beta* is the baseline operational scenario of the HL-LHC, and this leveling technique may be used in 2018 or during Run 3, depending on the beam parameters and beta* range. During this MD, beta* leveling was commissioned successfully for the first time with the telescopic squeeze over the beta* range of 40 cm to 30 cm. A novel beta*-leveling controls technique based on a modification of the LSA trim was also tested during the MD.

    An upgraded luminosity leveling procedure for the ALICE Experiment

    No full text
    The Large Hadron Collider (LHC) Interface project (LHC_IF) ensures that A Large Ion Collider Experiment (ALICE) can operate congruently and safely with the LHC in different machine configurations and beam modes. Measurements in proton–proton (pp) collisions are vital for ALICE and serve as a reference to calibrate the Pb ion–Pb ion (Pb–Pb) measurements. However, during pp operation, to limit the event pile-up (number of pp collisions per bunch crossing) and ensure a high-quality data sample, the instantaneous luminosity in ALICE must be limited to 5×10³⁰ cm⁻² s⁻¹. This is achieved by applying a beam–beam separation in the separation plane of up to several beam σ (beam-size units), which is known as luminosity leveling. Since the luminosity in a collider is expressed by a well-known formula in terms of parameters such as the beam separation, bunch intensities, number of colliding bunches and beam size, the beam separation can be determined from the other parameters for both the target luminosity and the measured instantaneous luminosity. From the difference of the two beam separations, step sizes spanning from 0.5 σ down to 0.025 σ are calculated and transmitted to the LHC, which accordingly steers the beams until the instantaneous luminosity smoothly reaches the target within ±5%. In this note, the results achieved with the new beam-leveling procedure over almost a year of operation, as well as comparisons with simulations, are presented.
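    The separation-based step computation described above can be sketched as follows. This is only an illustration: the Gaussian luminosity model for equal round beams, the function names, and the simulation loop are assumptions; the step sizes and the ±5% tolerance are taken from the text. Separations are expressed in units of the transverse beam size σ.

    ```python
    import math

    # Predefined leveling step sizes, in units of beam sigma (from the text).
    STEP_SIZES = [0.5, 0.25, 0.1, 0.05, 0.025]

    def leveling_step(l_target, l_meas, u_meas):
        """Compute the next separation step: infer the head-on luminosity
        from the current measurement, invert the assumed Gaussian model
        L(u) = L0 * exp(-u^2 / 4) for the target separation, and quantize
        the move to the largest predefined step that fits."""
        l0 = l_meas * math.exp(u_meas ** 2 / 4.0)       # head-on luminosity
        if l_target >= l0:
            u_target = 0.0                              # target above head-on
        else:
            u_target = 2.0 * math.sqrt(math.log(l0 / l_target))
        delta = u_target - u_meas
        for step in STEP_SIZES:                         # largest step that fits
            if step <= abs(delta):
                return math.copysign(step, delta)
        return delta                                    # closer than 0.025 sigma

    def level_to_target(l0_true, l_target, tol=0.05, max_steps=50):
        """Toy leveling loop: step the separation until the 'measured'
        luminosity is within tol of the target."""
        u = 0.0
        for _ in range(max_steps):
            l_meas = l0_true * math.exp(-u ** 2 / 4.0)  # simulated measurement
            if abs(l_meas - l_target) <= tol * l_target:
                return u, l_meas
            u += leveling_step(l_target, l_meas, u)
        return u, l_meas
    ```

    Starting from head-on collisions, the loop takes a few 0.5 σ steps followed by progressively smaller ones, mirroring the smooth approach to the target luminosity described in the note.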