Conceptual design of hollow electron lenses for beam halo control in the Large Hadron Collider
Collimation with hollow electron beams is a technique for halo control in
high-power hadron beams. It is based on an electron beam (possibly pulsed or
modulated in intensity) guided by strong axial magnetic fields which overlaps
with the circulating beam in a short section of the ring. The concept was
tested experimentally at the Fermilab Tevatron collider using a hollow electron
gun installed in one of the Tevatron electron lenses. Within the US LHC
Accelerator Research Program (LARP) and the European FP7 HiLumi LHC Design
Study, we are proposing a conceptual design for applying this technique to the
Large Hadron Collider at CERN. A prototype hollow electron gun for the LHC was
built and tested. The expected performance of the hollow electron beam
collimator was based on Tevatron experiments and on numerical tracking
simulations. Halo removal rates and enhancements of halo diffusivity were
estimated as a function of beam and lattice parameters. Proton beam core
lifetimes and emittance growth rates were checked to ensure that undesired
effects were suppressed. Hardware specifications were based on the Tevatron
devices and on preliminary engineering integration studies in the LHC machine.
Required resources and a possible timeline were also outlined, together with a
brief discussion of alternative halo-removal schemes and of other possible uses
of electron lenses to improve the performance of the LHC.
Comment: 24 pages, 1 table, 10 figures
Collimator settings generation, management and verification
Different collimator settings are required throughout the LHC operational cycle, following the evolution of key beam parameters such as energy, orbit and β-functions. Beam-based alignment is used to determine the beam centers and beam sizes at the collimators at discrete times in the cycle, such as injection, flat-top and collisions. These parameters are then used to generate setting functions for the collimator positions and interlock limits. An overview of the settings generation, management and verification cycle is presented, and potential error scenarios in the settings generation are identified. Improvements foreseen for post-LS1 operation are discussed. The present collimator status monitoring system is reviewed, with suggestions for improvement. The role of MAD-X online is discussed.
Finally, the results and current status of work towards maximizing the potential of the embedded-BPM collimators to be installed in 18 collimator slots during LS1 are presented, including the tested automatic alignment procedure, software interlocks and orbit monitoring.
Anomaly detection for beam loss maps in the Large Hadron Collider
In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss-map campaigns, as well as in standard operation. Due to the complexity of the system, which consists of more than 50 collimators per beam, small changes in the collimation hierarchy, such as those caused by setting errors or beam orbit drifts, are difficult to identify with these methods. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy.
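A minimal sketch of such a pipeline, using synthetic loss maps and scikit-learn (the channel count, noise level and injected anomaly are illustrative assumptions, not the paper's data):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(1)

# Synthetic "loss maps": per-collimator loss fractions for 100 fills
# (an illustrative stand-in for real BLM data; 54 channels ~ collimators).
baseline = rng.random(54)
baseline /= baseline.sum()
maps = baseline + rng.normal(0, 0.002, size=(100, 54))

# One anomalous map: a single collimator suddenly takes far more loss,
# as would happen after a setting error or an orbit drift.
bad = baseline.copy()
bad[10] += 0.05
X = np.vstack([maps, bad])

# Compress to a few principal components, then score with LOF.
Z = PCA(n_components=5).fit_transform(X)
labels = LocalOutlierFactor(n_neighbors=20, contamination=0.01).fit_predict(Z)
print(np.where(labels == -1)[0])   # index 100 (the injected map) is flagged
```

PCA removes the correlated, "normal" variation of the loss pattern; LOF then flags maps whose local density in the reduced space differs from their neighbours'.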
Bound-free pair production from nuclear collisions and the steady-state quench limit of the main dipole magnets of the CERN Large Hadron Collider
During its Run 2 (2015-2018), the Large Hadron Collider (LHC) operated at
almost twice the beam energy, and provided Pb-Pb collisions with an order of
magnitude higher luminosity, than in the previous Run 1. In consequence, the
power of the secondary beams emitted from the interaction points by the
bound-free pair production (BFPP) process increased by a factor ~20, while the
propensity of the bending magnets to quench increased with the higher magnetic
field. This beam power is about 35 times greater than that contained in the
luminosity debris from hadronic interactions and is focused on specific
locations that fall naturally inside superconducting magnets. The risk of
quenching these magnets has long been recognized as severe and there are
operational limitations due to the dynamic heat load that must be evacuated by
the cryogenic system. High-luminosity operation was nevertheless possible
thanks to orbit bumps that were introduced in the dispersion suppressors around
the ATLAS and CMS experiments to prevent quenches by displacing and spreading
out these beam losses. Further, in 2015, the BFPP beams were manipulated to
induce a controlled quench, thus providing the first direct measurement of the
steady-state quench level of an LHC dipole magnet. The same experiment
demonstrated the need for new collimators that are being installed around the
ALICE experiment to intercept the secondary beams in the future. This paper
discusses the experience with BFPP at luminosities very close to the future
High Luminosity LHC (HL-LHC) target, gives results on the risk reduction by
orbit bumps and presents a detailed analysis of the controlled quench
experiment.
Comment: 16 pages, 11 figures
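The rigidity change behind these localized losses can be estimated in one line. A minimal sketch (the charge numbers follow from the abstract; the dispersion value is an illustrative assumption, not from the text):

```python
# A 208Pb82+ ion that captures an electron via BFPP becomes 208Pb81+ at
# essentially unchanged momentum, so its magnetic rigidity B*rho = p/q
# grows by the charge ratio, which the optics sees as a momentum offset.
Z = 82                       # charge state of the circulating Pb beam
delta = Z / (Z - 1) - 1      # effective fractional momentum offset
print(f"delta = {delta:.4%}")        # about 1.23 %

# With a dispersion of order D ~ 2 m (assumed, illustrative) this maps to
# a transverse displacement x = D * delta of a few centimetres, enough to
# steer the secondary beam into a magnet in the dispersion suppressor.
D = 2.0                      # m, assumed
print(f"x = {D * delta * 1e3:.1f} mm")
```

This is why the BFPP beam emerges well separated from the circulating beam and impacts at a well-defined location, which the orbit bumps described above then displace and spread out.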
Measured and simulated heavy-ion beam loss patterns at the CERN Large Hadron Collider
The Large Hadron Collider (LHC) at CERN pushes forward to new regimes in terms of beam energy and intensity. In view of the combination of very energetic and intense beams with sensitive machine components, in particular the superconducting magnets, the LHC is equipped with a collimation system to provide protection and intercept uncontrolled beam losses. Beam losses could cause a superconducting magnet to quench or, in the worst case, damage the hardware. The collimation system, which is optimized to provide good protection with proton beams, has shown a cleaning efficiency with heavy-ion beams that is worse by up to two orders of magnitude. The reason for this reduced cleaning efficiency is the fragmentation of heavy-ion beams into isotopes with different mass-to-charge ratios through the interaction with the collimator material. In order to ensure sufficient collimation performance in future ion runs, a detailed theoretical understanding of ion collimation is needed. The simulation of heavy-ion collimation must include processes in which 208Pb82+ ions fragment into dozens of new isotopes. The ions and their fragments must be tracked through the magnetic lattice of the LHC to determine their loss positions. This paper gives an overview of the physical processes important for the description of heavy-ion loss patterns. Loss maps simulated by means of the two tools ICOSIM [1,2] and the newly developed STIER (SixTrack with Ion-Equivalent Rigidities) are compared with experimental data measured during LHC operation. The comparison shows that STIER is in better agreement.
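The idea behind an ion-equivalent rigidity can be illustrated with a short sketch. Assuming fragments emerge with roughly the velocity of the reference beam and approximating mass by mass number (both simplifications; the exact treatment is in the paper), each isotope gets an effective momentum offset from its mass-to-charge ratio:

```python
# Sketch of the "ion-equivalent rigidity" idea (assumed, simplified form):
# a fragment with mass number A and charge Z, travelling at the velocity of
# the reference 208Pb82+ beam, has its rigidity scaled by (A/Z)/(A0/Z0).
# Tracking it with this effective momentum offset predicts its loss position.
A0, Z0 = 208, 82             # reference Pb ion

def effective_delta(A, Z):
    """Effective fractional momentum offset of a fragment (A, Z)."""
    return (A / Z) / (A0 / Z0) - 1

# A few illustrative isotopes (examples only, not the paper's fragment list):
for A, Z in [(208, 82), (207, 82), (207, 81)]:
    print(f"A={A} Z={Z}: delta = {effective_delta(A, Z):+.4f}")
```

Fragments with delta outside the momentum acceptance miss the secondary collimators and are lost where the dispersion rises, consistent with the dispersion-suppressor losses discussed in these abstracts.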
Classification of LHC beam loss spikes using support vector machines
The collimation system of the CERN Large Hadron Collider (LHC) is the most complex beam cleaning system ever designed. It requires frequent setups to determine the beam centres and beam sizes at the 86 collimator positions. A collimator jaw is aligned to the beam halo when a clear beam loss spike is detected on a Beam Loss Monitor (BLM) downstream of the collimator. This paper presents a technique for identifying such clear loss spikes with the aid of Support Vector Machines. The training data was gathered from setups held during the first three months of the 2011 LHC run, and the model was tested with data from a machine development period.
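A small scikit-learn sketch of this kind of classifier on synthetic BLM time series (the feature set, signal shapes and labels are invented for the example and are not those of the paper):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def features(signal):
    # Simple hand-crafted features (illustrative, not the paper's set):
    # peak height and peak-to-mean ratio of the BLM time series.
    return [signal.max(), signal.max() / (signal.mean() + 1e-12)]

# Synthetic BLM signals: flat noise vs. noise with a clear loss spike.
noise  = [rng.normal(1.0, 0.05, 100) for _ in range(50)]
spikes = []
for _ in range(50):
    s = rng.normal(1.0, 0.05, 100)
    s[60:70] += np.linspace(5, 0, 10)   # sharp rise with a decaying tail
    spikes.append(s)

X = np.array([features(s) for s in noise + spikes])
y = np.array([0] * 50 + [1] * 50)       # 0 = no spike, 1 = clear spike

clf = SVC(kernel="linear").fit(X, y)

test = rng.normal(1.0, 0.05, 100)
test[60:70] += np.linspace(5, 0, 10)    # an unseen clear spike
print(clf.predict([features(test)]))    # [1]
```

In practice the features would be derived from the real BLM acquisition during jaw movement; the point of the sketch is only the train-then-classify structure.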
Multi-turn losses and cleaning
In the LHC all multi-turn losses should occur at the collimators in the cleaning insertions. The cleaning inefficiency (leakage rate) is the figure of merit used to describe the performance. In combination with the quench limit of the superconducting magnets and the instantaneous lifetime of the beam, this defines the cleaning-dependent beam intensity limit of the LHC. In addition, limits can arise from radiation-induced effects, such as radiation damage and radiation effects on electronics. In this paper the collimator settings used, the required setup time, the reliability of collimation (all multi-turn losses at collimators), and the achieved proton/ion cleaning inefficiency are discussed. Observed and expected losses are compared. The performance evolution during the months of operation is reviewed. In addition, the peak losses during high-intensity runs, losses caused by instabilities, and the resulting beam lifetimes are discussed. Taking these observations into account, the intensity reach with collimation at 3.5 and 4 TeV is reviewed.
First ion collimation commissioning results at the LHC
First commissioning of the LHC lead ion beams to 1.38 A TeV beam energy was successfully achieved in November 2010. Ion collimation has been predicted to be less efficient than for protons at the LHC, because of the complexity of the physical processes involved: nuclear fragmentation and electromagnetic dissociation in the primary collimators create fragments with a wide range of Z/A ratios that are not intercepted by the secondary collimators but are lost in the dispersion suppressor sections of the ring. In this article we present first comparisons of measured loss maps with theoretical predictions from simulation runs with the ICOSIM code. An extrapolation to define the ultimate intensity limit for Pb beams is attempted. The scope of possible improvements in collimation efficiency coming from the installation of new collimators in the cold dispersion suppressors and combined betatron and momentum cleaning is also explored.
End-of-fill study on collimator tight settings
In 2010 and 2011 the collimation system was operated with relaxed settings, i.e. with retractions between different collimator families larger than the nominal settings that provide optimum cleaning. This configuration ensured a sufficient cleaning performance at 3.5 TeV while allowing larger tolerances on orbit control. Tighter collimator settings were proposed to push the cleaning performance and to allow larger orbit margins between the TCDQ dump protection and the tertiary collimators. With the same margins as with the relaxed settings, β* could be reduced. After having verified with beam that the cleaning is improved as expected, the feasibility of tighter collimator settings must be addressed with high stored intensity. For this purpose, an end-of-fill study was proposed after a standard physics fill with 1380 nominal bunches at 3.5 TeV, for a total stored energy of 95 MJ. During this test, primary and secondary collimators were moved to tight settings after about 8 hours of stable physics conditions in all experiments. This note summarises the operational procedure followed and the results of beam measurements during this study.
Comparison of LHC collimator beam-based alignment to BPM-interpolated centers
The beam centers at the Large Hadron Collider collimators are determined by beam-based alignment, where both jaws of a collimator are moved in separately until a loss spike is detected on a Beam Loss Monitor downstream. Orbit drifts of more than a few hundred micrometers cannot be tolerated, as they would compromise the performance of the collimation system. Beam Position Monitors (BPMs) are installed at various locations around the LHC ring, and a linear interpolation of the orbit can be obtained at the collimator positions. In this paper, the results obtained from beam-based alignment are compared with the orbit interpolated from the BPM data throughout the 2011 and 2012 LHC proton runs.
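The interpolation step itself is simple; a numpy sketch with invented BPM positions and readings (real orbit interpolation also accounts for the optics between monitors, which a pure linear fit ignores):

```python
import numpy as np

# Hypothetical longitudinal positions (m) of BPMs around a collimator and
# their horizontal orbit readings (mm); names and values are illustrative.
bpm_s  = np.array([100.0, 250.0, 410.0, 560.0])
bpm_x  = np.array([0.20, -0.35, 0.10, 0.45])
coll_s = 330.0    # longitudinal position of the collimator

# Linear interpolation of the orbit at the collimator between the two
# nearest BPMs; this is the value compared with the beam-based center.
x_interp = np.interp(coll_s, bpm_s, bpm_x)
print(f"interpolated beam center: {x_interp:.3f} mm")   # -0.125 mm
```

The difference between this interpolated center and the beam-based alignment result is then a direct measure of orbit drift or BPM/alignment systematics at that collimator.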