Commissioning, Operation and Performance of the CMS Drift Tube Chambers
The CMS muon spectrometer, designed to trigger on, identify, reconstruct, and measure muons with high efficiency and accuracy, is equipped with Drift Tube (DT) chambers in the barrel region. The DT system has been fully commissioned using cosmic muons, both with and without magnetic field, and during months of cosmic data taking it has provided millions of triggers to the rest of the CMS detector. This contribution describes the challenges in the operation of the DT system, including calibration procedures, monitoring and reconstruction performance, and the results of the analysis of the collected cosmic data.
Offline Calibration Procedure of the Drift Tube Detectors
A detailed description of the calibration of the DT local reconstruction algorithm is reported. After inter-channel synchronization has been verified through the appropriate hardware procedure, the time pedestal can be extracted directly from the distribution of the digi times. Further corrections for time of flight and time of signal propagation are applied as soon as the three-dimensional hit position within the chamber is known. The different effects of a time-pedestal miscalibration on the two main hit reconstruction algorithms are shown. The drift velocity calibration algorithm is based on the meantimer technique, with different meantimer relations used for different track angles and patterns of hit cells. The same algorithm can also be used to determine the uncertainty of the reconstructed hit position.
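A minimal sketch of the meantimer idea described above, assuming the classic relation for three consecutive, half-cell-staggered layers; the cell half-width, the pedestal quantile, and all function names are illustrative choices, not values from the note:

```python
import numpy as np

# Illustrative DT cell half-width in mm (assumed value, not from the note).
HALF_CELL_WIDTH_MM = 21.0

def meantimer(t_prev, t_mid, t_next):
    """Classic meantimer for three consecutive, half-cell-staggered layers.

    For a track crossing staggered drift cells, the combination
    T_max = (t_prev + t_next) / 2 + t_mid is (to first order) independent
    of the track position and estimates the maximum drift time.
    """
    return 0.5 * (t_prev + t_next) + t_mid

def drift_velocity(t_prev, t_mid, t_next):
    """Estimate the drift velocity from one meantimer triplet (mm/ns if times are in ns)."""
    return HALF_CELL_WIDTH_MM / meantimer(t_prev, t_mid, t_next)

def time_pedestal(digi_times_ns, fraction=0.05):
    """Crude time-pedestal estimate: the rising edge of the digi-time
    distribution, approximated here by a low quantile (hypothetical choice)."""
    return np.quantile(np.asarray(digi_times_ns), fraction)
```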
Local Muon Reconstruction in the Drift Tube Detectors
This note describes the local reconstruction in the Drift Tube subdetector of the CMS muon system. The local reconstruction is the sequence of steps leading from the TDC measurements to reconstructed three-dimensional segments inside each DT chamber; these segments are the input to the muon track reconstruction. This note updates and supersedes CMS NOTE 2002/04.
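The TDC-to-segment chain can be illustrated with a toy two-dimensional version: convert each drift time to a distance, resolve the left/right ambiguity by trying both hit hypotheses per wire, and keep the straight-line fit with the best residual sum. The drift velocity value and the brute-force ambiguity handling are assumptions for illustration, not the actual CMS algorithm:

```python
import itertools
import numpy as np

V_DRIFT_MM_PER_NS = 0.0545  # illustrative drift velocity, not from the note

def reconstruct_segment(wire_x, layer_z, drift_times_ns):
    """Toy local reconstruction in one superlayer projection."""
    drift_dist = V_DRIFT_MM_PER_NS * np.asarray(drift_times_ns)
    best = None
    # Each hit may lie on either side of its wire: scan all sign patterns.
    for signs in itertools.product((-1.0, 1.0), repeat=len(wire_x)):
        x = np.asarray(wire_x) + np.asarray(signs) * drift_dist
        # Fit x = slope * z + intercept and score by the residual sum.
        coeffs, residuals, *_ = np.polyfit(layer_z, x, 1, full=True)
        chi2 = residuals[0] if residuals.size else 0.0
        if best is None or chi2 < best[0]:
            best = (chi2, coeffs)
    chi2, (slope, intercept) = best
    return slope, intercept, chi2
```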
Anomaly Detection With Conditional Variational Autoencoders
Exploiting the rapid advances in probabilistic inference, in particular variational Bayes and variational autoencoders (VAEs), for anomaly detection (AD) tasks remains an open research question. Previous works argued that training VAE models only with inliers is insufficient and that the framework should be significantly modified in order to discriminate anomalous instances. In this work, we exploit the deep conditional variational autoencoder (CVAE) and define an original loss function, together with a metric, that targets AD for hierarchically structured data. Our motivating application is a real-world problem: monitoring the trigger system, a basic component of many particle physics experiments at the CERN Large Hadron Collider (LHC). In our experiments we show the superior performance of this method on classical machine learning (ML) benchmarks and on our application.
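For orientation, here is a generic CVAE training loss in PyTorch. The conditioning via label concatenation, the network sizes, and the β weight are standard textbook choices, not the paper's original loss or metric:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CVAE(nn.Module):
    """Toy conditional VAE: the condition vector is concatenated
    to both the encoder input and the latent code."""

    def __init__(self, x_dim, cond_dim, z_dim=8, hidden=64):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim + cond_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, z_dim)
        self.logvar = nn.Linear(hidden, z_dim)
        self.dec = nn.Sequential(
            nn.Linear(z_dim + cond_dim, hidden), nn.ReLU(), nn.Linear(hidden, x_dim)
        )

    def forward(self, x, cond):
        h = self.enc(torch.cat([x, cond], dim=-1))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.dec(torch.cat([z, cond], dim=-1)), mu, logvar

def cvae_loss(model, x, cond, beta=1.0):
    """ELBO-style loss: reconstruction error plus beta-weighted KL term."""
    recon, mu, logvar = model(x, cond)
    rec = F.mse_loss(recon, x, reduction="mean")
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + beta * kl
```

At inference time the same quantity (or the reconstruction term alone) can serve as an anomaly score, with a threshold calibrated on inliers.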
Measurement of Drift Velocity in the CMS Barrel Muon Chambers at the CMS Magnet Test Cosmic Challenge
This note reports the results of the analysis performed on the data collected by the CMS Barrel Muon system during the Magnet Test-Cosmic Challenge, aimed at studying the behavior of the Drift Tube chambers at the nominal value of the CMS magnetic field. In particular, the analysis is devoted to the study of the drift velocity in the various equipped regions of the apparatus. It is shown that the drift velocity is significantly affected by the presence of a residual magnetic field in the chamber volume only in the innermost stations, MB1, of Wheel +2, where the maximal variation inside the chamber is 4 percent; this does not prevent good functionality of the DT trigger even in this most critical region.
Improving data quality monitoring via a partnership of technologies and resources between the CMS experiment at CERN and industry
The Compact Muon Solenoid (CMS) experiment dedicates significant effort to assessing the quality of its data, both online and offline. A real-time data quality monitoring system is in place to spot and diagnose problems as promptly as possible and avoid data loss, while the a posteriori evaluation of processed data categorizes it in terms of its usability for physics analysis. These activities produce data quality metadata. The data quality evaluation relies on a visual inspection of the monitoring features. This practice has a cost in terms of human resources and is naturally subject to human arbitration. Potential limitations are linked to the ability to spot a problem within the overwhelming number of quantities to monitor, or to a lack of understanding of evolving detector conditions. In view of Run 3, CMS aims at integrating deep learning techniques into the online workflow to promptly recognize and identify anomalies and to improve the precision of the data quality metadata. The CMS experiment has engaged in a partnership with IBM with the objective of supporting the online operations through automation and of generating benchmark technological results. The research goals agreed within the CERN Openlab framework, how they matured into a demonstration application, and how they are achieved through a collaborative contribution of technologies and resources are presented.
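As an illustration of the kind of automated anomaly flagging described above, a minimal sketch of an autoencoder-based check on batches of monitoring histograms, again in PyTorch; the architecture, the normalization, and the threshold policy are generic assumptions, not the CMS/IBM demonstrator:

```python
import torch
import torch.nn as nn

def make_histogram_autoencoder(n_bins, latent=16):
    """Small dense autoencoder, trained beforehand on known-good histograms."""
    return nn.Sequential(
        nn.Linear(n_bins, 64), nn.ReLU(),
        nn.Linear(64, latent), nn.ReLU(),
        nn.Linear(latent, 64), nn.ReLU(),
        nn.Linear(64, n_bins),
    )

@torch.no_grad()
def flag_anomalies(model, histograms, threshold):
    """Per-histogram reconstruction error; entries above a threshold
    calibrated on reference runs are flagged for expert inspection."""
    histograms = histograms / histograms.sum(dim=1, keepdim=True).clamp(min=1)
    errors = ((model(histograms) - histograms) ** 2).mean(dim=1)
    return errors > threshold, errors
```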
Beta-Blocker Use in Older Hospitalized Patients Affected by Heart Failure and Chronic Obstructive Pulmonary Disease: An Italian Survey From the REPOSI Register
Beta (β)-blockers (BBs) are useful in reducing morbidity and mortality in patients with heart failure (HF) and concomitant chronic obstructive pulmonary disease (COPD). Nevertheless, the use of BBs could induce bronchoconstriction due to β2-blockade. For this reason, both the ESC and GOLD guidelines strongly suggest the use of selective β1-BBs in patients with HF and COPD. However, low adherence to guidelines has been observed in multiple clinical settings. The aim of the study was to investigate BB use in older patients affected by HF and COPD recorded in the REPOSI register. Of 942 patients affected by HF, 47.1% were treated with BBs. The use of BBs was significantly lower in patients with HF and COPD than in patients affected by HF alone, both at admission and at discharge (admission, 36.9% vs. 51.3%; discharge, 38.0% vs. 51.7%). In addition, no further BB users were found at discharge. The probability of being treated with a BB was significantly lower in patients with HF also affected by COPD (adj. OR, 95% CI: 0.50, 0.37-0.67), while the diagnosis of COPD was not associated with the choice of a selective β1-BB (adj. OR, 95% CI: 1.33, 0.76-2.34). Despite clear recommendations by clinical guidelines, a significant underuse of BBs was also observed after hospital discharge. In patients affected by COPD, physicians unreasonably withhold BBs altogether rather than choosing a β1-BB. The expected improvement of BB prescriptions after hospitalization was not observed. A multidisciplinary approach among hospital physicians, general practitioners, and pharmacologists should be pursued for better drug management and adherence to guideline recommendations.
A Roadmap for HEP Software and Computing R&D for the 2020s
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.