    A Sequencer for the LHC Era

    The Sequencer is a high-level software application that helps operators and physicists commission and control the LHC. It is an important operational tool for the LHC and a core part of the control system, interacting with all LHC sub-systems. This paper describes the architecture and design of the Sequencer and illustrates some innovative parts of the implementation, which is based on modern Java technology.
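
    To make the task-oriented nature of such a tool concrete, the following is a minimal sketch of the composite task/sequence abstraction a sequencer is typically built around; all names (Task, Sequence, execute) are illustrative assumptions, not the actual Sequencer API.

        import java.util.Arrays;
        import java.util.List;

        // Hypothetical sketch: a sequence is a composite of executable tasks,
        // so sequences can be nested and run as a whole.
        interface Task {
            String name();
            void execute() throws Exception; // a failure pauses or aborts the run
        }

        class Sequence implements Task {
            private final String name;
            private final List<Task> children;

            Sequence(String name, Task... children) {
                this.name = name;
                this.children = Arrays.asList(children);
            }

            public String name() { return name; }

            // Runs the children in order; a real sequencer would additionally
            // support pausing, skipping and resuming from the operator console.
            public void execute() throws Exception {
                for (Task child : children) {
                    System.out.println("Executing: " + child.name());
                    child.execute();
                }
            }
        }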

    The LHC Post Mortem Analysis Framework

    The LHC, with its unprecedented complexity and criticality of beam operation, will need thorough analysis of data taken from systems such as power converters, interlocks and beam instrumentation during events like magnet quenches and beam loss. The causes of beam aborts or, in the worst case, of equipment damage have to be revealed to improve operational procedures and protection systems. The correct functioning of the protection systems, with their required redundancy, has to be verified after each such event. In view of the large number of systems and the data volume, post-mortem analysis software with automated analysis packages has been prepared for the control room. This paper recalls the requirements for the LHC Beam Post Mortem System (PM) and the necessity for highly reliable data collection. It describes in detail the redundant architecture for data collection as well as the chosen implementation of a multi-level analysis framework, allowing for automated analysis and qualification of a beam dump event based on expert-provided analysis modules. It concludes with an example of the data taken during the first beam tests in September 2008 with a first version of the system.
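
    To make the notion of expert-provided analysis modules concrete, here is a minimal sketch of what such a module contract and the event qualification could look like; the interface and all names are illustrative assumptions, not the actual PM API.

        // Hypothetical sketch: an expert-provided module inspects the data of
        // one beam dump event and contributes to its overall qualification.
        enum Qualification { OK, WARNING, FAULT }

        interface PmEventData { } // placeholder for signals, timestamps, ...

        interface PmAnalysisModule {
            String system();                         // e.g. "power converters"
            Qualification analyse(PmEventData data);
        }

        // The framework combines the module verdicts: the worst one wins, so a
        // single FAULT flags the whole event for expert follow-up.
        final class PmEventQualifier {
            Qualification qualify(PmEventData data, Iterable<PmAnalysisModule> modules) {
                Qualification worst = Qualification.OK;
                for (PmAnalysisModule m : modules) {
                    Qualification q = m.analyse(data);
                    if (q.compareTo(worst) > 0) {
                        worst = q;
                    }
                }
                return worst;
            }
        }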

    Procedures to be followed for new XPOC release

    The eXternal Post Operational Check (XPOC) of the beam dumping system is automatically performed after each LHC beam dump. A failure of the automatic XPOC check will stop injection of the beams. This note describes the procedures to be followed when a new release of the XPOC or related Post Mortem software is made.

    LSA - the High Level Application Software of the LHC - and Its Performance During the First Three Years of Operation

    The LHC Software Architecture (LSA) [1] project was started in 2001 with the aim of developing the high-level core software for the control of the LHC accelerator.

    Plug-in Based Analysis Framework for LHC Post-Mortem Analysis

    Plug-in based software architectures [1] are extensible, enforce modularity and allow several teams to work in parallel, but they bring certain technical and organizational challenges, which we discuss in this paper. We gained our experience when developing the Post-Mortem Analysis (PMA) system, a mission-critical system for the Large Hadron Collider (LHC). We used a plug-in based architecture with a general-purpose analysis engine, for which physicists and equipment experts code plugins containing the analysis algorithms. We have over 45 analysis plugins developed by a dozen domain experts. This paper focuses on the design challenges we faced in order to mitigate the risks of executing third-party code: assurance that even a badly written plugin does not perturb the work of the overall application; plugin execution control that allows misbehaviour to be detected and reacted to; a robust communication mechanism between plugins; facilitation of diagnostics in case of plugin failure; testing of plugins before integration into the application; and so on.
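
    One standard way to keep a misbehaving plugin from perturbing the overall application is to execute it on a separate, time-limited worker thread. The sketch below uses plain java.util.concurrent for this; the class name and return convention are illustrative assumptions, not the actual PMA implementation.

        import java.util.concurrent.*;

        // Hypothetical sketch: run an analysis plugin with a timeout so that a
        // hanging or crashing plugin cannot block the rest of the application.
        final class PluginRunner {
            private final ExecutorService pool = Executors.newCachedThreadPool();

            String runWithTimeout(Callable<String> plugin, long timeoutSeconds) {
                Future<String> result = pool.submit(plugin);
                try {
                    return result.get(timeoutSeconds, TimeUnit.SECONDS);
                } catch (TimeoutException e) {
                    result.cancel(true);               // interrupt the stuck plugin
                    return "TIMEOUT";                  // react, e.g. flag for diagnostics
                } catch (ExecutionException e) {
                    return "FAILED: " + e.getCause();  // plugin crashed; we keep running
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return "INTERRUPTED";
                }
            }
        }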

    External Post-operational Checks for the LHC Beam Dumping System

    The LHC Beam Dumping System (LBDS) is a critical part of the LHC machine protection system. After every LHC beam dump action, the various signals and transient data recordings of the beam dumping control systems and beam instrumentation measurements are automatically analysed by the eXternal Post-Operational Checks (XPOC) system to verify the correct execution of the dump action and the integrity of the related equipment. This software system complements the LHC machine protection hardware and has to ascertain that the beam dumping system is 'as good as new' before the start of the next operational cycle; this is the only way the stringent reliability requirements can be met. The XPOC system has been developed within the framework of the LHC "Post-Mortem" system, allowing highly dependable data acquisition, data archiving, live analysis of acquired data and replay of previously recorded events. It is composed of various analysis modules, each dedicated to the analysis of measurements coming from specific equipment. This paper describes the global architecture of the XPOC system and gives examples of the analyses performed by some of the most important analysis modules. It explains the integration of the XPOC into the LHC control infrastructure and into the decision chain that allows beam operation to proceed. Finally, it discusses the operational experience with the XPOC system acquired during the first years of LHC operation and illustrates examples of internal system faults or abnormal beam dump executions which it has detected.
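
    As an illustration of the decision-chain integration described above, the sketch below combines per-equipment module verdicts into a single injection permit; the names and the boolean verdict are illustrative assumptions, not the actual XPOC code.

        import java.util.List;

        // Hypothetical sketch: each module checks one equipment system after a
        // beam dump; injection is only allowed if every module passes.
        interface DumpData { } // placeholder for transient recordings and signals

        interface XpocModule {
            String equipment();              // e.g. "dump kickers"
            boolean analyse(DumpData data);  // true = dump executed correctly
        }

        final class XpocCheck {
            // A single failing module blocks injection: operators must
            // investigate and reset before beam operation can proceed.
            boolean injectionPermit(DumpData data, List<XpocModule> modules) {
                boolean allOk = true;
                for (XpocModule m : modules) {
                    if (!m.analyse(data)) {
                        System.err.println("XPOC failure in: " + m.equipment());
                        allOk = false;
                    }
                }
                return allOk;
            }
        }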

    Automated Execution and Tracking of the LHC Commissioning Tests

    To ensure correct operation and to prevent system failures, which in the worst case can lead to equipment damage, all critical systems in the Large Hadron Collider (LHC), among them the superconducting circuits, have to be tested thoroughly during dedicated commissioning phases after each intervention. In view of the roughly 7,000 individual tests to be performed each year after a Christmas stop, considerable effort was put into the automation of these tests from the beginning of LHC hardware commissioning in 2005, to assure their dependable execution and analysis. To further increase productivity during the commissioning campaigns and to enforce a more consistent workflow, the development of a dedicated testing framework was launched. This new framework is designed to schedule and track the automated tests for all systems of the LHC and will also be extendable, e.g., to beam commissioning tests. This is achieved by re-using different, already existing execution frameworks. In this paper, we outline the motivation for this new framework and the related improvements in the commissioning process. Further, we sketch its design and present first experience from the recommissioning campaign in early 2012.
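
    As a sketch of what scheduling and tracking such tests could look like, the fragment below models a simple per-test state machine; the states and names are illustrative assumptions, not the framework's actual design.

        import java.util.LinkedHashMap;
        import java.util.Map;

        // Hypothetical sketch: track every commissioning test through its
        // lifecycle and gate progress on all tests having passed.
        enum TestState { SCHEDULED, RUNNING, ANALYSIS_PENDING, PASSED, FAILED }

        final class TestTracker {
            private final Map<String, TestState> tests = new LinkedHashMap<>();

            void schedule(String testId) {
                tests.put(testId, TestState.SCHEDULED);
            }

            void transition(String testId, TestState next) {
                tests.put(testId, next);
            }

            // A system may only advance to the next commissioning step once
            // all of its tests have passed.
            boolean allPassed() {
                return tests.values().stream()
                        .allMatch(s -> s == TestState.PASSED);
            }
        }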

    JMAD - Integration of MADX into the Java World

    MADX (Methodical Accelerator Design) is the de-facto standard software for modeling accelerator lattices at CERN. This feature-rich software package is implemented and still maintained in the programming languages C and FORTRAN. Nevertheless, the controls environment of modern accelerators at CERN, e.g. of the LHC, is dominated by Java applications. Many of these applications, for example for lattice measurement and fitting, require close interaction with the numerical models, which are all defined using the proprietary MADX scripting language. To close this gap, an API to MADX for the Java programming language (JMAD) was developed. The current implementation already provides access to a large subset of the MADX capabilities (e.g. Twiss calculations, matching, or querying and setting arbitrary model parameters) without any need to define the models in yet another environment. This paper briefly describes the design of this project as well as its current status, and gives some usage examples.
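
    As a usage sketch of the kind of interaction the paper describes (loading a model, setting a parameter, running a Twiss calculation), consider the fragment below; all type and method names are illustrative assumptions, defined inline so the sketch compiles, and may differ from the real JMAD API.

        // Hypothetical sketch of Java-side interaction with a MADX model.
        interface TwissResult {
            double betaX(String element); // horizontal beta function at an element
        }

        interface OpticsModel {
            void setParameter(String name, double value); // set an arbitrary model knob
            TwissResult twiss();                          // run a Twiss calculation in MADX
        }

        public class JmadSketch {
            public static void main(String[] args) {
                OpticsModel model = loadModel("lhc"); // assumed loader for an LHC model
                model.setParameter("on_x1", 160.0);   // assumed crossing-scheme knob
                System.out.println("Beta_x at IP1: " + model.twiss().betaX("IP1"));
            }

            // Stub so the sketch is self-contained; a real model delegates to MADX.
            static OpticsModel loadModel(String name) {
                return new OpticsModel() {
                    public void setParameter(String n, double v) { /* send to MADX */ }
                    public TwissResult twiss() { return element -> 0.55; /* dummy */ }
                };
            }
        }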

    Using a Java Embedded DSL for LHC Test Analysis

    The Large Hadron Collider (LHC) at CERN requires thousands of systems to work in close cooperation.
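
    As an illustration of what an embedded test-analysis DSL in Java can look like, below is a minimal fluent-assertion sketch; the vocabulary is invented for illustration and is not the DSL from the paper.

        // Hypothetical sketch of a fluent, embedded test-analysis DSL in Java.
        final class SignalAssertion {
            private final String signal;
            private final double[] samples;

            private SignalAssertion(String signal, double[] samples) {
                this.signal = signal;
                this.samples = samples;
            }

            static SignalAssertion assertThat(String signal, double[] samples) {
                return new SignalAssertion(signal, samples);
            }

            SignalAssertion staysBelow(double limit) {
                for (double s : samples) {
                    if (s >= limit) {
                        throw new AssertionError(signal + " exceeded limit " + limit);
                    }
                }
                return this; // fluent style: checks can be chained
            }
        }

        // Example: SignalAssertion.assertThat("I_MEAS", samples).staysBelow(12000.0);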