
    A technique for automatically reprogramming an embedded Linux system

    This paper presents a method for automatically reprogramming an embedded Linux system using a USB (Universal Serial Bus) pen drive. New software stored on the pen drive is automatically transferred and installed into the embedded Linux system on insertion of the pen drive. The technique was developed with the aim of creating a multi-purpose device that can easily be reprogrammed at will. Such a device allows a user to bypass a computer when transferring data between USB peripherals.

    Data-driven cross-talk modeling of beam losses in LHC collimators

    The Large Hadron Collider at CERN is equipped with a collimation system to intercept potentially dangerous beam halo particles before they damage its sensitive equipment. The collimator settings are determined following a beam-based alignment procedure, in which the collimator jaws are moved towards the beam until losses appear in the beam loss monitors. When a collimator reaches the beam envelope, beam losses propagate mainly in the direction of the beam and are therefore also observed by other nearby beam loss monitors. This phenomenon is known as cross-talk. Because of it, collimators are aligned sequentially, so that the losses generated by each collimator can be identified and cross-talk between closely spaced beam loss monitors is avoided. This paper quantifies the levels of cross-talk observed by beam loss monitors when multiple collimators are moving, in order to determine the actual beam loss signals generated by the corresponding collimators. The resulting models successfully predicted the amount of cross-talk observed for each of the cases tested in this study. The approach was then extended to predict loss map case studies and the proton impacts at each collimator by comparison with simulations.
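The cross-talk described above can be pictured as a linear mixing problem. The sketch below (all matrices and numbers are invented for illustration and are not taken from the paper) recovers per-collimator loss contributions from combined BLM readings by least squares:

```python
import numpy as np

# Hypothetical mixing matrix: mixing[i, j] = fraction of collimator j's
# losses that BLM i observes (cross-talk is the off-diagonal leakage).
mixing = np.array([
    [1.00, 0.30, 0.05],   # BLM 1 mostly sees collimator 1
    [0.10, 1.00, 0.25],   # BLM 2 mostly sees collimator 2
    [0.02, 0.15, 1.00],   # BLM 3 mostly sees collimator 3
])

true_losses = np.array([4.0, 1.5, 0.8])   # per-collimator losses (arbitrary units)
observed = mixing @ true_losses           # what the BLMs actually read

# Unmix: recover the per-collimator contributions by least squares
recovered, *_ = np.linalg.lstsq(mixing, observed, rcond=None)
```

With a well-conditioned mixing matrix the true contributions are recovered almost exactly; in practice the mixing coefficients would themselves have to be estimated from data, which is the harder, data-driven part of the problem.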

    Operational results with fast automatic beam-based LHC collimator alignment

    The CERN Large Hadron Collider (LHC) is the largest and highest-energy particle accelerator ever built. It is designed to collide particles at a centre-of-mass energy of 14 TeV to explore the fundamental forces and constituents of matter. Due to the potentially destructive high-energy particle beams, with a total design energy of 362 MJ, the collider is equipped with a series of machine protection systems. The beam cleaning or collimation system is designed to passively intercept and absorb particles at large amplitudes. The cleaning efficiency depends heavily on the accurate positioning of the jaws with respect to the beam trajectory. Beam-based collimator alignment is currently the only feasible technique for determining the beam centre and beam size at the collimator locations. If the alignment is performed without any automation, it can require up to 30 hours to complete for all collimators, reducing the beam time available for physics experiments. This article provides a brief recap of the algorithms and software developed to automate and speed up the alignment procedure, and presents the operational results achieved with fast automatic beam-based alignment in the 2011-2013 LHC runs.
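The core of the beam-based alignment procedure, stepping a jaw towards the beam until the beam loss monitor fires, can be sketched as below. The function names, step size, and threshold are hypothetical placeholders, not the actual LHC control interface:

```python
# Illustrative sketch (not CERN's control code): a jaw is stepped towards
# the beam until the BLM signal crosses a threshold, marking the beam edge.
def align_jaw(read_blm, move_jaw, start_mm, step_mm=0.1, threshold=1e-4):
    """Step a collimator jaw inward until losses exceed `threshold`.

    read_blm()       -> current beam-loss-monitor signal (arbitrary units)
    move_jaw(pos_mm) -> command the jaw to an absolute position
    Returns the jaw position (mm) at which the loss spike appeared.
    """
    pos = start_mm
    while read_blm() < threshold:
        pos -= step_mm          # move one step closer to the beam
        move_jaw(pos)
    return pos

# Toy "machine": losses appear once the jaw is within 2.0 mm of the beam axis
def make_fake_machine(edge_mm=2.0):
    state = {"pos": None}
    def move_jaw(p):
        state["pos"] = p
    def read_blm():
        return 1e-3 if state["pos"] is not None and state["pos"] <= edge_mm else 0.0
    return read_blm, move_jaw

read_blm, move_jaw = make_fake_machine()
move_jaw(5.0)                   # park the jaw well clear of the beam
edge = align_jaw(read_blm, move_jaw, start_mm=5.0)
```

Repeating this for both jaws of a collimator yields the beam centre (the midpoint of the two touch positions) and the local beam size; automating when to stop stepping, amid noisy loss signals, is what the algorithms in the article address.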

    Collimator settings generation, management and verification

    Different collimator settings are required throughout the LHC operational cycle, following the evolution of key beam parameters like energy, orbit and β-functions. Beam-based alignment is used to determine the beam centers and beam sizes at the collimators at discrete times in the cycle, such as injection, flat-top and collisions. These parameters are then used to generate setting functions for the collimator positions and interlock limits. An overview of the settings generation, management and verification cycle is presented, and potential error scenarios in the settings generation are identified. Improvements foreseen for post-LS1 operation are discussed. The present collimator status monitoring system is reviewed, with suggestions for improvement. The role of MAD-X online is discussed. Finally, the results and current status of work towards maximizing the potential of the embedded-BPM collimators to be installed in 18 collimator slots during LS1 are presented, including the tested automatic alignment procedure, software interlocks and orbit monitoring.

    Machine learning applied at the LHC for beam loss pattern classification

    Beam losses at the LHC are constantly monitored because they can heavily impact the performance of the machine. One of the highest risks is quenching the LHC superconducting magnets in the presence of losses, leading to long machine downtime to recover cryogenic conditions. Smaller losses are more likely to occur and also affect machine performance, reducing luminosity production or shortening the lifetime of accelerator systems, such as the magnets, through radiation effects. Understanding the characteristics of a beam loss, such as the beam and the plane involved, is crucial for correcting it. Dedicated loss map measurements are performed regularly during the year to validate the beam halo cleaning of the collimation system. These loss maps have the particular advantage of being performed in well-controlled conditions, and can therefore be used by a machine learning algorithm to classify the type of losses during the LHC machine cycle. This study shows the results of the beam loss classification and its retrospective application to beam loss data from the 2017 run.
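One simple way to picture such a classification, assuming each loss map is reduced to a normalised loss profile (the actual features and model used in the study are not specified here), is a nearest-centroid classifier trained on labelled loss maps:

```python
import numpy as np

# Hedged sketch with synthetic data: each "loss map" is a normalised
# profile of losses along the ring; each class has a characteristic shape.
rng = np.random.default_rng(0)

def make_profile(peak, n=50, noise=0.02):
    """Synthetic loss profile: a Gaussian bump at `peak` plus noise."""
    x = np.arange(n)
    p = np.exp(-0.5 * ((x - peak) / 2.0) ** 2) + noise * rng.random(n)
    return p / p.sum()                     # normalise like a loss map

# "Training" loss maps for two invented loss types (e.g. beam-1 horizontal
# vs beam-2 vertical), each peaking at a different ring location
classes = {"B1H": 10, "B2V": 35}
centroids = {c: np.mean([make_profile(k) for _ in range(20)], axis=0)
             for c, k in classes.items()}

def classify(profile):
    """Assign the class whose centroid is closest in Euclidean distance."""
    return min(centroids, key=lambda c: np.linalg.norm(profile - centroids[c]))

label = classify(make_profile(35))         # an unseen "B2V"-like profile
```

The value of the well-controlled loss map measurements is exactly that they supply cleanly labelled training profiles like these; the classifier can then be applied retrospectively to unlabelled losses from the machine cycle.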

    Gingival reactive lesions in orally rehabilitated patients by free revascularized flap

    The aim is to discuss four cases of gingival reactive hyperplastic lesions in patients with a history of excision of oral neoplastic lesions and rehabilitation by a free revascularized flap of the iliac crest. One female and three male patients were referred due to the presence of exophytic lesions at the rehabilitated sites. Clinical examination revealed that poor oral hygiene was the common trigger factor in all the cases, in addition to trauma from the upper left second molar in the first case, pericoronitis related to a partially erupted lower right third molar in the third case, and poor stability of an upper removable partial denture in the fourth case. All the cases underwent elimination of the suspected triggering factors, exclusion of dysplasia, excisional biopsy by CO2 laser, and five follow-up visits. Histological examination confirmed the diagnosis of pyogenic granuloma in all the cases. These cases suggest that limitations in oral function and in maintaining oral hygiene following free revascularized flap reconstruction surgery probably played a role in the development of gingival reactive hyperplastic lesions in the presence of trigger factors such as local trauma, chronic infection, or an inadequate prosthesis.

    Operational results on the fully automatic LHC collimator alignment

    The Large Hadron Collider has a complex collimation system installed to protect its sensitive equipment from normal and abnormal beam losses. The collimators are set around the beam following a multistage transverse setting hierarchy. The insertion position of each collimator is established using beam-based alignment techniques to determine the local beam position and rms beam size at each collimator. In previous years, collimator alignments were performed semiautomatically, with collimation experts present to oversee and control the alignment. During Run II, a new fully automatic alignment tool was developed and used for collimator alignments throughout 2018. This paper discusses the improvements made to the alignment software to automate it using machine learning, focusing on the operational results obtained when testing the new software in the LHC. The alignment tests were conducted with both proton and ion beams, and angular alignments were performed with proton beams. The upgraded software successfully decreased the alignment time by a factor of 3 and made the results more reproducible, which is particularly important when performing angular alignments.

    Software architecture for automatic LHC collimator alignment using machine learning

    The Large Hadron Collider at CERN relies on a collimation system to absorb unavoidable beam losses before they reach the superconducting magnets. The collimators are positioned close to the beam in a transverse setting hierarchy achieved by aligning each collimator with a precision of a few tens of micrometres. In previous years, collimator alignments were performed semi-automatically, requiring collimation experts to be present to oversee and control the entire process. In 2018, expert control of the alignment procedure was replaced by dedicated machine learning algorithms, and this new software was used for collimator alignments throughout the year. This paper gives an overview of the software re-design required to achieve fully automatic collimator alignments, describing in detail the software architecture and controls systems involved. Following this successful deployment, this software will be used in the future as the default alignment software for the LHC.

    Feasibility of hardware acceleration in the LHC orbit feedback controller

    Orbit correction in accelerators typically makes use of a linear model of the machine, called the Response Matrix (RM), that relates local beam deflections to position changes. The RM is used to obtain a Pseudo-Inverse (PI), which is used in a feedback configuration, where positional errors from the reference orbit, as measured by Beam Position Monitors (BPMs), are used to calculate the required change in the current flowing through the Closed Orbit Dipoles (CODs). The calculation of the PIs from the RMs is a crucial part of the LHC's Orbit Feedback Controller (OFC); however, in the present implementation of the OFC this calculation is omitted, as it takes too long and is thus unsuitable for a real-time system. As a temporary solution, the LHC operators pre-calculate the new PIs outside the OFC and then manually upload them to the OFC in advance. In this paper we aim to remove this computational bottleneck through hardware acceleration, in order to react automatically and as quickly as possible to COD and/or BPM failures by re-calculating the PIs within the OFC. These results will eventually be used in the renovation of the OFC for the LHC's Run 3.
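The PI computation at the heart of the bottleneck can be sketched with a truncated-SVD pseudo-inverse, the standard way to regularise a near-singular response matrix. The toy response matrix, dimensions, and cutoff below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

n_bpms, n_cods = 8, 4
R = rng.standard_normal((n_bpms, n_cods))     # toy response matrix

def pseudo_inverse(R, sv_cutoff=1e-6):
    """Truncated-SVD pseudo-inverse: small singular values are discarded
    (relative to the largest) to avoid amplifying measurement noise."""
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    s_inv = np.where(s > sv_cutoff * s.max(), 1.0 / s, 0.0)
    return Vt.T @ (s_inv[:, None] * U.T)

PI = pseudo_inverse(R)

# One feedback step: orbit error at the BPMs -> corrector (COD) changes
dtheta_true = rng.standard_normal(n_cods)     # an unknown perturbation
orbit_error = R @ dtheta_true                 # what the BPMs would measure
correction = -PI @ orbit_error                # cancels the perturbation
residual = R @ (dtheta_true + correction)     # orbit error after correction
```

The SVD is the expensive step: for the full LHC (hundreds of BPMs and correctors) it must be redone whenever a BPM or COD drops out of the matrix, which is why the paper investigates hardware acceleration.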

    Evaluation of flight parameters during approach and landing phases by applying principal component analysis

    This paper adopts an unsupervised learning technique, Principal Component Analysis (PCA), to analyze flight data. While the flight parameters for a stable approach have long been established, the paper re-evaluates them using PCA for a set of airports across the United States of America. Some flight parameters were found to be more sensitive at certain airports than at others. The parameters were cross-checked with experts in the industry to better interpret their significance.
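A minimal PCA sketch in the spirit of the study, with rows as flights and columns as approach parameters (all data, dimensions, and parameter correlations below are synthetic, not the paper's dataset):

```python
import numpy as np

rng = np.random.default_rng(2)

n_flights, n_params = 200, 5
# Correlated synthetic data: two latent factors (e.g. energy state and
# descent profile) drive all five recorded parameters, plus small noise.
latent = rng.standard_normal((n_flights, 2))
mix = rng.standard_normal((2, n_params))
X = latent @ mix + 0.1 * rng.standard_normal((n_flights, n_params))

# PCA via eigendecomposition of the covariance of the centred data
Xc = X - X.mean(axis=0)
cov = (Xc.T @ Xc) / (n_flights - 1)
eigvals, eigvecs = np.linalg.eigh(cov)        # returned in ascending order
order = np.argsort(eigvals)[::-1]             # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()           # explained-variance ratios
```

With two latent factors the first two components capture almost all of the variance; inspecting the loadings in `eigvecs` then shows which flight parameters dominate each component, which is the kind of interpretation the paper cross-checks with industry experts.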