
    Luminosity Scans for Beam Diagnostics

    A new type of fast luminosity separation scans ("Emittance Scans") was introduced at the CERN Large Hadron Collider (LHC) in 2015. The scans were performed systematically in every fill with full-intensity beams in physics production conditions at the Interaction Point (IP) of the Compact Muon Solenoid (CMS) experiment. They provide both emittance and closed-orbit measurements at a bunch-by-bunch level. The precise measurement of beam-beam closed-orbit differences allowed a direct, quantitative observation of long-range beam-beam PACMAN effects, which agrees well with numerical simulations from an improved version of the TRAIN code.
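    The extraction behind such a separation scan can be sketched as a Gaussian fit of luminosity versus beam-beam separation. The sketch below uses synthetic numbers (not the CMS analysis code); fitting a parabola to the logarithm of the luminosity yields the convolved beam size from the curvature and the closed-orbit offset from the vertex:

```python
import numpy as np

def fit_lumi_scan(sep_mm, lumi):
    """Fit L(d) = L0 * exp(-(d - d0)^2 / (2 Sigma^2)) via a parabolic
    fit to ln(L). Returns (Sigma, d0): convolved beam size and the
    closed-orbit offset between the two beams."""
    a, b, _ = np.polyfit(sep_mm, np.log(lumi), 2)  # ln L = a d^2 + b d + c
    sigma = np.sqrt(-1.0 / (2.0 * a))              # curvature -> Sigma
    d0 = -b / (2.0 * a)                            # vertex -> orbit offset
    return sigma, d0

# synthetic scan: Sigma = 0.12 mm convolved size, 0.02 mm orbit offset
d = np.linspace(-0.3, 0.3, 13)
lumi = 5e3 * np.exp(-(d - 0.02) ** 2 / (2 * 0.12 ** 2))
sigma, d0 = fit_lumi_scan(d, lumi)
print(round(sigma, 3), round(d0, 3))  # recovers 0.12 and 0.02
```

    On noiseless synthetic data the fit is exact; on a real scan the same quadratic fit would be applied to background-subtracted, noisy points.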

    Status of JMAD, the Java API for MADX

    MadX (Methodical Accelerator Design) is the de facto standard software for modeling accelerator lattices at CERN. This feature-rich software package is implemented and still maintained in the programming languages C and FORTRAN. Nevertheless, the controls environment of modern accelerators at CERN, e.g. of the LHC, is dominated by Java applications. Many of these applications, for example for lattice measurement and fitting, require close interaction with the numerical models, which are all defined using the proprietary MadX scripting language. To close this gap, an API to MadX for the Java programming language (JMad) was developed. JMad was first presented to the public about one year ago. In the meantime, a number of improvements were made, and additional MadX features (e.g., tracking) were made available for Java applications. Additionally, the graphical user interface was improved and JMad was released as open-source software. This paper describes the current status and some new features of the project, as well as some usage examples.

    Electromagnetic meson form factor from a relativistic coupled-channel approach

    Point-form relativistic quantum mechanics is used to derive an expression for the electromagnetic form factor of a pseudoscalar meson for space-like momentum transfers. The elastic scattering of an electron by a confined quark-antiquark pair is treated as a relativistic two-channel problem for the $q\bar{q}e$ and $q\bar{q}e\gamma$ states. With the approximation that the total velocity of the $q\bar{q}e$ system is conserved at (electromagnetic) interaction vertices, this simplifies to an eigenvalue problem for a Bakamjian-Thomas type mass operator. After elimination of the $q\bar{q}e\gamma$ channel, the electromagnetic meson current and form factor can be directly read off from the one-photon-exchange optical potential. By choosing the invariant mass of the electron-meson system large enough, cluster-separability violations become negligible. An equivalence with the usual front-form expression, resulting from a spectator current in the $q^+=0$ reference frame, is established. The generalization of this multichannel approach to electroweak form factors for an arbitrary bound few-body system is quite obvious. By an appropriate extension of the Hilbert space, this approach is also able to accommodate exchange-current effects. Comment: 30 pages, 5 figures

    Toolchain for Online Modeling of the LHC

    The control of high-intensity beams in a high-energy, superconducting machine with complex optics like the CERN Large Hadron Collider (LHC) is challenging not only from the design aspect but also for operation towards physics production. To support the LHC beam commissioning, efforts were devoted to the design and implementation of a software infrastructure aimed at using the computing power of the beam dynamics code MAD-X in the framework of the Java-based LHC control and measurement environment. Alongside interfaces to measurement data as well as to settings of the control system, the best knowledge of machine aperture and optics models is provided. In this paper, we present the status of the toolchain and illustrate how it has been used during commissioning and operation of the LHC. Possible future implementations are discussed.

    LHC orbit system, performance and stability

    During the LHC run period in 2009, the orbit system proved to be very reliable. In the following, the analysis results of the first data collected during various beam processes (stable periods, ramp and squeeze) are shown, and several correction alternatives are proposed. The commissioning status of the beam position monitors, orbit correctors and the real-time feedback system is summarized, and open issues are listed at the end.

    Novel Concepts for Optimization of the CERN Large Hadron Collider Injection Lines

    The Large Hadron Collider (LHC) is presently the particle accelerator with the highest center-of-mass energy in the world and is for that reason the most promising instrument for particle physics discoveries in the near future. The transfer lines TI2 and TI8, which transfer the beam from the last pre-accelerator, the Super Proton Synchrotron (SPS), to the LHC, are, with a total length of about 6 km, the longest in the world, which makes high-precision optics matching necessary. Tests between 2004 and 2008 revealed several previously unpredicted effects in these lines: an asymmetry in betatron phase between the two transverse planes, a dispersion mismatch at the injection point from the transfer lines into the LHC, and unexpectedly strong transverse coupling at the same location. In this thesis, we introduce the methods and tools that we developed to investigate these discrepancies. We describe the analysis of the available data, measurements of the transfer line optics and the calculation of optics corrections. Further, we show that the optics mismatch can be explained by a sextupolar field component in the injection main bends, which we deduced from beam measurements and which was later confirmed by numerical magnet simulations. Finally, we describe the measures taken to improve the underlying numerical models and demonstrate the very good measurement-model agreement for kick-response measurements and dispersion up to second order, which guarantees an excellent beam quality in the LHC.
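    Measuring "dispersion up to second order" amounts to recording the orbit position at each monitor for several momentum offsets and fitting x(δ) = x₀ + D₁δ + D₂δ². A minimal sketch with made-up dispersion values (not transfer-line data):

```python
import numpy as np

# illustrative momentum offsets delta = dp/p and the horizontal position
# (metres) they produce at one monitor; D1 = 2.1 m, D2 = 35 m are made up
delta = np.array([-3e-3, -2e-3, -1e-3, 0.0, 1e-3, 2e-3, 3e-3])
x = 2.1 * delta + 35.0 * delta**2  # x = D1*delta + D2*delta^2

# a quadratic least-squares fit recovers first- and second-order dispersion
D2, D1, x0 = np.polyfit(delta, x, 2)  # highest-order coefficient first
print(round(D1, 3), round(D2, 1))
```

    In practice the fit would be repeated at every monitor and the measured D₁, D₂ compared against the model prediction to localize error sources such as the sextupolar bend component.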

    LHC Online Chromaticity Measurement - Experience after One Year of Operation

    The hardware and infrastructure required to measure chromaticity in the LHC have been available since the beginning. However, the calculation of the chromaticity was mostly done offline. This gap was closed in 2015 with the development of a dedicated application for the LHC control room, which takes the measured data, produces estimates of the chromaticity values immediately online, and allows chromaticity and tune to be corrected accordingly. This tool proved to be essential during commissioning as well as during every injection phase of the LHC. It became particularly important during the intensity ramp-up with 25 ns beams, where good control of the chromaticity at injection became crucial. This paper describes the concepts and algorithms behind this tool, the experience gained, and further plans for improvement.
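    At its core, a chromaticity measurement is a linear fit of the measured tune against the applied relative momentum offset, Q(δ) = Q₀ + Q'δ. A minimal sketch with synthetic values (the operational tool's actual pipeline is more involved):

```python
import numpy as np

def chromaticity(delta, tune):
    """Fit Q(delta) = Q0 + Q' * delta; returns (Q0, Q').
    delta is the relative momentum offset dp/p applied via the RF trim,
    tune is the fractional tune measured at each offset."""
    qprime, q0 = np.polyfit(delta, tune, 1)
    return q0, qprime

# synthetic measurement: Q0 = 0.28, Q' = 15 (illustrative numbers)
delta = np.array([-1.5e-3, -0.75e-3, 0.0, 0.75e-3, 1.5e-3])
tune = 0.28 + 15.0 * delta
q0, qp = chromaticity(delta, tune)
print(round(q0, 3), round(qp, 1))
```

    The online application repeats such a fit continuously on freshly acquired tune data, so the operators can trim the sextupole settings immediately instead of waiting for an offline analysis.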

    Testing the Untestable: A Realistic Vision of Fearlessly Testing (Almost) Every Single Accelerator Component Without Beam and Continuous Deployment Thereof

    Whenever a bug in some piece of software or hardware stops beam operation, the loss of time is rarely negligible and the cost (either in lost luminosity or in real financial terms) can be significant. Optimizing accelerator availability is a strong motivation to avoid such issues. Still, even at large accelerator labs like CERN, the release cycles of many accelerator components are managed in a "deploy and pray" manner. In this paper we give a short overview of testing strategies commonly used in software development projects and illustrate their application to accelerator components, both hardware and software. Finally, we show several CERN systems to which these techniques were or will be applied (the LHC Beam-Based Feedbacks and the LHC Luminosity Server) and describe why it is worth doing so.
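    A key ingredient of "testing without beam" is replacing hardware with a test double behind the same interface, so the control logic can be exercised on every commit. A generic sketch (the interface and names are invented for illustration, not a CERN API):

```python
class PowerSupply:
    """Minimal interface the controller expects (hypothetical)."""
    def set_current(self, amps): raise NotImplementedError
    def read_current(self): raise NotImplementedError

class FakePowerSupply(PowerSupply):
    """Test double standing in for real hardware; records every setting."""
    def __init__(self):
        self.current = 0.0
        self.log = []
    def set_current(self, amps):
        self.log.append(amps)
        self.current = amps
    def read_current(self):
        return self.current

def ramp(supply, target, step):
    """Logic under test: approach the target current in bounded steps."""
    while abs(supply.read_current() - target) > 1e-9:
        delta = target - supply.read_current()
        delta = max(-step, min(step, delta))
        supply.set_current(supply.read_current() + delta)

# a unit test can now assert on the exact sequence of commands issued,
# something impossible to verify safely against live hardware
ps = FakePowerSupply()
ramp(ps, 10.0, 4.0)
print(ps.log)  # [4.0, 8.0, 10.0]
```

    The same pattern scales up: a simulated front end behind the operational interface lets a whole release candidate run against realistic scenarios before any deployment touches the machine.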

    Operational Experience with Luminosity Scans for Beam Size Estimation in 2016 LHC Proton Physics Operation

    Luminosity scans have been performed regularly at the CERN Large Hadron Collider (LHC) since 2015 as a complementary method for measuring the beam size. The CMS experiment provides bunch-by-bunch luminosities at rates sufficient to allow the evaluation of bunch-by-bunch beam sizes, and the scans are performed in the horizontal and vertical planes separately. Closed-orbit differences between bunches can also be derived from this analysis. During 2016 LHC operation, these scans were also performed in an automated manner on a regular basis, and the analysis was improved to significantly reduce the systematic uncertainty, especially in the crossing plane. This contribution first highlights the recent improvements to the analysis and elaborates on their impact. The measured beam sizes during 2016 proton physics operation are then shown and compared to measurements from synchrotron-light telescopes and to estimates based on the absolute luminosities of the LHC experiments.
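    A scan yields the convolved width Σ of the two colliding beams; to quote a single-beam size (and an emittance) one conventionally assumes equal beams, σ = Σ/√2, and εₙ = γσ²/β*. A sketch with illustrative numbers (β*, γ, and Σ below are assumptions, not 2016 measurement values):

```python
import math

def single_beam_size(capital_sigma):
    """Assuming both beams have equal size: sigma = Sigma / sqrt(2)."""
    return capital_sigma / math.sqrt(2.0)

def normalized_emittance(sigma_m, beta_star_m, gamma_rel):
    """eps_n = gamma * sigma^2 / beta*  (relativistic beta ~ 1 here)."""
    return gamma_rel * sigma_m**2 / beta_star_m

Sigma = 16.6e-6                    # convolved size from the scan fit, metres
sigma = single_beam_size(Sigma)    # single-beam size under the equal-beam assumption
eps_n = normalized_emittance(sigma, beta_star_m=0.4,     # beta* = 40 cm
                             gamma_rel=6928.6)           # ~6.5 TeV protons
print(round(sigma * 1e6, 2), round(eps_n * 1e6, 2))      # in micrometres
```

    Comparing εₙ obtained this way with the synchrotron-light telescopes and with absolute-luminosity estimates is exactly the cross-check the contribution describes.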

    Test-driven software upgrade of the LHC beam-based feedback systems

    The beam-based feedback system is essential for the operation of the LHC. It comprises two C++ servers: a FESA-based (a framework for real-time systems developed at CERN) acquisition and configuration proxy, and a non-FESA-based controller which sanitises the acquisition data and feeds it to multiple real-time feedback algorithms (orbit control, radial-loop control and tune control), ensuring a stable orbit of the LHC's beams. Responsibility for the further development and maintenance of the servers was recently transferred to a new team, who have made considerable efforts to document the existing system as well as to improve its operational reliability, performance, maintainability and compliance with CERN's software and operational standards. Software changes are accompanied by rigorous unit testing, with future releases tested outside the operational environment, thus minimizing the potential for beam downtime. This approach proved very effective during re-commissioning for the LHC's Run 2, where the systems underwent significant changes. In a bid to homogenize the operational procedures for configuring LHC systems, a demand to improve the real-time configuration of the system's feedback references and optics was identified. To replace the existing ad-hoc method of real-time configuration, a new waveform-based server, pre-configured with sequences of N-dimensional values versus time, autonomously ensures that the system is re-configured at precisely the correct time. This paper describes the design choices, software architecture, integration and preliminary testing of the new waveform-based server. In particular, considerable effort was put into reducing the impact of changing already established and tested behaviour.
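    The core of such a waveform-based configuration can be pictured as a table of N-dimensional reference values versus time-in-cycle, interpolated at the current moment. A minimal sketch (data shape and values are invented for illustration, not the server's actual format):

```python
import numpy as np

# hypothetical waveform: two feedback reference values tabulated against
# time in the cycle (seconds); the server looks this up autonomously
times_s = np.array([0.0, 10.0, 20.0, 30.0])
refs = np.array([
    [0.0,  0.0],
    [1.0, -0.5],
    [2.0, -1.0],
    [2.0, -1.0],   # held constant after t = 20 s
])

def reference_at(t, times, values):
    """Interpolate each dimension of the waveform at time t,
    clamped to the tabulated time range."""
    t = min(max(t, times[0]), times[-1])
    return np.array([np.interp(t, times, values[:, k])
                     for k in range(values.shape[1])])

r = reference_at(15.0, times_s, refs)
print(r)  # halfway between the 10 s and 20 s rows: [1.5, -0.75]
```

    Because the whole sequence is loaded up front, the server needs no external trigger at each breakpoint; it simply evaluates the waveform at the right time, which is what removes the ad-hoc real-time resends of the old scheme.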