49 research outputs found

    IMPLEMENTATION OF LUMINOSITY LEVELING BY BETATRON FUNCTION ADJUSTMENT AT THE LHC INTERACTION POINTS

    Abstract: Growing expectations for integrated luminosity during upcoming LHC runs introduce new challenges for LHC beam operation in the scope of online luminosity control. Because some LHC experiments are limited in the maximum event rate they can handle, their luminosity is leveled to a constant value. Various techniques may be used for luminosity leveling; changing the betatron function at the interaction point is one of them. This paper explains the main operational requirements of a betatron function leveling scheme for the upcoming LHC run. Issues concerning the beam optics, orbits and collimator settings are discussed, as well as the proposed architecture for control system integration. A few operational scenarios with different beam configurations foreseen for the next LHC run are presented.

    LUMINOSITY
    An important parameter that affects the quality of the recorded luminosity at the LHC is the event pile-up, the number of simultaneous particle interactions during one bunch crossing. A high event pile-up complicates the physics analysis and degrades the data quality for certain types of physics channels. The event pile-up µ is directly proportional to the bunch-pair luminosity L_bb, µ = L_bb × σ_P / f, where σ_P is the total cross section for pp interactions at the LHC, σ_P = 70-85 mbarn (dividing the event rate L_bb × σ_P by the crossing frequency f of one bunch pair gives the events per crossing). The total luminosity L_p is given by L_p = k L_bb, where k is the number of bunch crossings per turn. The bunch-pair luminosity for round beams at an interaction point can be written as

        L_bb = (N² f γ) / (4π ε_N β*) × F × exp(−d² / (4σ*²)),    σ*² = ε_N β* / γ,

    where N stands for the number of particles in the bunch, ε_N for the normalized emittance and β* for the betatron function at the interaction point. f is the revolution frequency and γ the relativistic factor. F is a correction factor for the crossing angle. For round beams ε_N and β* are identical for both transverse planes. d is the transverse offset (separation) between the colliding beams.
    The transverse separation d and the betatron function β* can thus both be used to control the luminosity.

    LHC RUN 2 BEAM PROJECTIONS
    After the long shutdown the LHC will restart beam operation in 2015 at an energy of 6.5 TeV. The LHC has two high-luminosity experiments, ATLAS and CMS, installed at interaction points 1 and 5 (IR1 and IR5). Those experiments can cope with a maximum average pile-up of 50 and a time-averaged pile-up of 30 to 40. The LHCb experiment in IR8, on the other hand, will operate at a maximum pile-up of µ = 1.6. Luminosity leveling is required for the LHCb experiment in all scenarios, while for the high-luminosity experiments only the 50 ns scenario definitely requires leveling. With 25 ns bunch spacing, leveling is required in IR1 and IR5 only for the brightest beams. For the LHC luminosity upgrade HL-LHC (from 2023) [1], luminosity leveling by β* is part of the operational baseline.
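    The relations in the abstract above can be tied together in a short numerical sketch. The formulas follow the text (with µ taken as events per bunch crossing, i.e. the event rate L_bb × σ_P divided by the crossing frequency f); all beam parameter values below are illustrative assumptions, not official machine settings:

```python
import math

def bunch_luminosity(N, eps_n, beta_star, f_rev, gamma, F=1.0, d=0.0):
    """Round-beam bunch-pair luminosity in m^-2 s^-1, using the symbols of the text."""
    sigma2 = eps_n * beta_star / gamma              # transverse beam size squared at the IP (m^2)
    separation = math.exp(-d**2 / (4.0 * sigma2))   # reduction from a transverse offset d (m)
    return N**2 * f_rev * gamma * F * separation / (4.0 * math.pi * eps_n * beta_star)

def beta_star_for_pileup(mu_target, N, eps_n, gamma, sigma_p, F=1.0):
    """Head-on (d = 0) beta* that levels the pile-up to mu_target,
    obtained by inverting mu = L_bb * sigma_p / f_rev (f_rev cancels)."""
    return N**2 * gamma * F * sigma_p / (4.0 * math.pi * eps_n * mu_target)

# Illustrative Run 2-like parameters (assumed values)
N         = 1.2e11    # protons per bunch
eps_n     = 2.5e-6    # normalized emittance (m rad)
beta_star = 0.6       # betatron function at the IP (m)
f_rev     = 11245.0   # LHC revolution frequency (Hz)
gamma     = 6928.0    # relativistic factor for 6.5 TeV protons
sigma_p   = 8.0e-30   # total pp cross section, 80 mbarn, in m^2

L_bb = bunch_luminosity(N, eps_n, beta_star, f_rev, gamma)
mu = L_bb * sigma_p / f_rev   # pile-up per bunch crossing, roughly 40 here
```

    During a fill the bunch intensity N decays, so leveling to a constant pile-up amounts to periodically recomputing `beta_star_for_pileup` (or increasing the separation d) as N drops, which is the control problem the paper addresses.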

    Direct Observation of a Superconducting Vortex Diode

    The interplay between magnetism and superconductivity can lead to unconventional proximity and Josephson effects. A related phenomenon that has recently attracted considerable attention is the superconducting diode effect, in which a non-reciprocal critical current emerges. Although superconducting diodes based on superconducting/ferromagnetic (S/F) bilayers were demonstrated more than a decade ago, the precise underlying mechanism remains unclear. While not formally linked to this effect, the Fulde-Ferrell-Larkin-Ovchinnikov (FFLO) state is a plausible mechanism, due to the two-fold rotational symmetry breaking caused by the finite center-of-mass momentum of the Cooper pairs. Here, we directly observe, for the first time, a tunable superconducting vortex diode in Nb/EuS (S/F) bilayers. Based on our nanoscale SQUID-on-tip (SOT) microscope and supported by in-situ transport measurements, we propose a theoretical model that captures our key results. Thus, we determine the origin of the vortex diode effect, which builds a foundation for new device concepts.

    Gamma Factory at CERN – novel research tools made of light

    We discuss the possibility of creating novel research tools by producing and storing highly relativistic beams of highly ionised atoms in the CERN accelerator complex, and by exciting their atomic degrees of freedom with lasers to produce high-energy photon beams. The intensity of such photon beams would be several orders of magnitude higher than that offered by presently operating light sources, in the particularly interesting gamma-ray energy domain of 0.1-400 MeV. In this energy range, the high-intensity photon beams can be used to produce secondary beams of polarised electrons, polarised positrons, polarised muons, neutrinos, neutrons and radioactive ions. New research opportunities in a wide domain of fundamental and applied physics can be opened by the Gamma Factory scientific programme based on the above primary and secondary beams. Comment: 12 pages; presented by W. Placzek at the XXV Cracow Epiphany Conference on Advances in Heavy Ion Physics, 8-11 January 2019, Cracow, Poland.

    Leveling options and strategy

    This paper gives an overview of the possibilities for luminosity leveling in LHC Run 2. Different scenarios together with detailed proposals are presented. Since luminosity leveling by transverse offset is already operationally proven, part of this paper describes in detail how luminosity leveling will be done using β* adjustment, taking LHCb as an example.

    Concept and Prototype for a Distributed Analysis Framework for the LHC Machine Data

    The Large Hadron Collider (LHC) at CERN produces more than 50 TB of diagnostic data every year, split between normal running periods and commissioning periods.

    Beam Losses Through the LHC Operational Cycle in 2012

    We review the losses through the nominal LHC cycle for physics operation in 2012. The loss patterns are studied and categorized according to timescale, distribution, time in the cycle, which bunches are affected, and whether the losses are coherent or incoherent. Possible causes and correlations are identified, e.g. with machine parameters or instability signatures. A comparison with losses in the previous years of operation is also shown.

    Beam losses through the cycle

    We review the losses throughout the nominal LHC cycle for physics operation in 2012 and for a few fills in 2011. The loss patterns are studied and categorized according to timescale, distribution, time in the cycle, and which bunches are affected. Possible causes and correlations are identified, e.g. with machine parameters or the BBQ amplitude signal.