
    Quantum-inspired Machine Learning on high-energy physics data

    Tensor Networks, a numerical tool originally designed for simulating quantum many-body systems, have recently been applied to solve Machine Learning problems. Exploiting a tree tensor network, we apply a quantum-inspired machine learning technique to an important and challenging big-data problem in high-energy physics: the analysis and classification of data produced by the Large Hadron Collider at CERN. In particular, we present how to effectively classify so-called b-jets, i.e. jets originating from b-quarks, in proton-proton collisions recorded by the LHCb experiment, and how to interpret the classification results. We exploit the Tensor Network approach to select important features and to adapt the network geometry based on information acquired during the learning process. Finally, we show how to adapt the tree tensor network to achieve optimal precision or fast response time without the need to repeat the learning process. These results pave the way to the implementation of high-frequency real-time applications, a key ingredient needed, among others, for current and future LHCb event classification able to trigger events at the tens-of-MHz scale. Comment: 13 pages, 4 figures
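As an illustration of the general approach described in this abstract (not the authors' actual implementation), a binary tree tensor network classifier can be sketched in a few lines of NumPy: each input feature is embedded as a small local vector, adjacent vectors are merged pairwise by trainable rank-3 tensors, and the single vector left at the root holds the class scores. All names, shapes, and the feature map below are illustrative assumptions.

```python
import numpy as np

def feature_map(x):
    """Encode each scalar feature as a 2-component local vector
    (a common [cos, sin] embedding in tensor-network ML)."""
    return np.array([np.cos(np.pi / 2 * x), np.sin(np.pi / 2 * x)])

def ttn_classify(features, layers):
    """Forward pass of a binary tree tensor network.

    `features`: scalars in [0, 1], length a power of 2.
    `layers`: list of layers; each layer is a list of rank-3 tensors
    of shape (bond_out, bond_in, bond_in) merging adjacent vectors.
    The root tensor leaves a 2-component vector of class scores.
    """
    vecs = [feature_map(x) for x in features]
    for layer in layers:
        vecs = [np.einsum('oij,i,j->o', T, a, b)
                for T, (a, b) in zip(layer, zip(vecs[::2], vecs[1::2]))]
    return vecs[0]  # e.g. (score_b, score_light)

# Random (untrained) example: 4 features -> 2 merge layers -> 2 scores
rng = np.random.default_rng(0)
layers = [[rng.standard_normal((2, 2, 2)) for _ in range(2)],
          [rng.standard_normal((2, 2, 2))]]
scores = ttn_classify([0.1, 0.7, 0.3, 0.9], layers)
```

In a real application the tensors would be trained (e.g. by gradient descent on a classification loss), and the bond dimensions enlarged to trade precision against evaluation speed, which is the adaptivity the abstract refers to.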

    Preliminary Report on the Study of Beam-Induced Background Effects at a Muon Collider

    Physics at a multi-TeV muon collider requires a change of perspective in detector design, due to the large amount of background induced by muon beam decays. Preliminary studies, based on simulated data, of the composition and characteristics of the particles originating from the muon decays and reaching the detectors are presented here. The reconstruction performance for the physics processes H → bb̄ and Z → bb̄ has been investigated, for the time being without the effect of the machine-induced background. A preliminary study of the environmental hazard due to the radiation induced by neutrino interactions with matter is presented, using the FLUKA simulation program.

    Quantum Machine Learning for b-jet charge identification

    Machine Learning algorithms have played an important role in hadronic jet classification problems. The large variety of models applied to Large Hadron Collider data has demonstrated that there is still room for improvement. In this context, Quantum Machine Learning is a new and almost unexplored methodology, in which the intrinsic properties of quantum computation could be used to exploit particle correlations to improve jet classification performance. In this paper, we present a new approach to identify whether a jet contains a hadron formed by a b or b̄ quark at the moment of production, based on a Variational Quantum Classifier applied to simulated data of the LHCb experiment. Quantum models are trained and evaluated using LHCb simulation. The jet identification performance is compared with a Deep Neural Network model to assess which method performs better.
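For readers unfamiliar with Variational Quantum Classifiers, the following is a minimal single-qubit sketch simulated with plain NumPy — not the model or data used in the paper. The feature is angle-encoded, trainable rotations are applied, and the sign of the Pauli-Z expectation value serves as the class label; the toy training loop and all parameter choices are illustrative assumptions.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def vqc_expectation(x, params):
    """One-qubit variational classifier: angle-encode feature x,
    apply trainable rotations, measure <Z>. sign(<Z>) is the label."""
    state = np.array([1.0, 0.0])       # |0>
    state = ry(np.pi * x) @ state      # data encoding
    for theta in params:               # variational layers
        state = ry(theta) @ state
    z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state @ z @ state)    # Pauli-Z expectation value

# Toy training: finite-difference gradient descent on one sample
x_train, y_train = 0.3, 1.0
params = np.array([0.1, -0.2])
for _ in range(200):
    grad = np.zeros_like(params)
    for i in range(len(params)):
        shift = np.zeros_like(params)
        shift[i] = 1e-4
        loss_p = (vqc_expectation(x_train, params + shift) - y_train) ** 2
        loss_m = (vqc_expectation(x_train, params - shift) - y_train) ** 2
        grad[i] = (loss_p - loss_m) / 2e-4
    params -= 0.5 * grad
```

A realistic classifier would encode many jet features across several qubits, entangle them, and train on hardware or a simulator; the principle — optimizing circuit parameters to separate the two classes — is the same.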

    CDF experience with Monte Carlo production using the LCG Grid

    The upgrades of the Tevatron collider and the CDF detector have considerably increased the demand on computing resources, in particular for Monte Carlo production. This has forced the collaboration to move beyond the usage of dedicated resources and start exploiting the Grid. The CDF Analysis Farm (CAF) model has been reimplemented as LcgCAF in order to access Grid resources using the LCG/EGEE middleware. Many sites in Italy and across Europe are accessed through this portal by CDF users, mainly to produce Monte Carlo data but also for other analysis jobs. We review here the setup used to submit jobs to Grid sites and retrieve the output, including the CDF-specific configuration of some Grid components. We also describe the batch and interactive monitoring tools developed to allow users to check job status throughout their lifetime in the Grid environment. Finally, we analyze the efficiency and typical failure modes of the current Grid infrastructure, reporting the performance of the different parts of the system.

    A new CDF model for data movement based on SRM

    Being a large international collaboration established well before the full development of the Grid as the main computing tool for High Energy Physics, CDF has recently changed and improved its computing model, decentralizing some parts of it in order to exploit the rising number of distributed resources available nowadays. Despite those efforts, while the large majority of CDF Monte Carlo production has moved to the Grid, data processing is still mainly performed in dedicated farms hosted at FNAL, requiring centralized management of the data and Monte Carlo samples needed for physics analysis. This raises the question of how to transfer the produced Monte Carlo samples from remote Grid sites to FNAL in an efficient way; up to now CDF has relied on a non-scalable centralized solution based on dedicated data servers accessed through the rcp protocol, which has proven to be unsatisfactory. A new data transfer model has been designed that uses SRMs as local caches for remote Monte Carlo production sites, interfaces them with SAM, the experiment's data catalog, and finally realizes the file movement by exploiting the features provided by the data catalog's transfer layer. We describe here the model and its integration within the current CDF computing architecture.
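The transfer model described in this abstract can be sketched schematically. The toy simulation below uses in-memory dictionaries as stand-ins for site-local SRM caches and a SAM-like catalog — none of these names or methods are real SAM/SRM APIs, they only illustrate the intended flow: produce a file into a site-local cache, register it in the catalog, and let the catalog layer move it to central storage.

```python
from dataclasses import dataclass, field

@dataclass
class Site:
    """Toy model of a production site with an SRM-style local cache."""
    name: str
    cache: dict = field(default_factory=dict)   # filename -> bytes

@dataclass
class Catalog:
    """Toy stand-in for a SAM-like data catalog: it records where each
    file currently lives and drives the transfer to central storage."""
    locations: dict = field(default_factory=dict)

    def register(self, filename, site):
        self.locations[filename] = site.name

    def transfer(self, filename, sites, central):
        src = next(s for s in sites if s.name == self.locations[filename])
        central.cache[filename] = src.cache[filename]  # copy to central
        self.locations[filename] = central.name        # update location

# A remote site produces a Monte Carlo file into its local cache,
# registers it, and the catalog layer ships it to central storage.
cnaf, fnal = Site("CNAF"), Site("FNAL")
catalog = Catalog()
cnaf.cache["mc_sample_001.root"] = b"...events..."
catalog.register("mc_sample_001.root", cnaf)
catalog.transfer("mc_sample_001.root", [cnaf], fnal)
```

The key design point this models is that the catalog, not each production site, owns the transfer logic — which is what makes the scheme scalable compared with per-site rcp pushes.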

    Crilin: A Semi-Homogeneous Calorimeter for a Future Muon Collider

    Calorimeters, like other detectors, have to face the increasing performance demands of the new energy-frontier experiments. For a future Muon Collider the main challenge is the Beam-Induced Background (BIB), which may limit the physics performance. However, it is possible to reduce the BIB impact by exploiting some of its characteristics, provided the calorimeter offers high granularity, excellent timing, longitudinal segmentation and good energy resolution. The proposed design, the Crilin calorimeter, is an alternative semi-homogeneous ECAL barrel for the Muon Collider, based on Lead Fluoride (PbF2) crystals with a surface-mount, UV-extended Silicon Photomultiplier (SiPM) readout, in a design optimized for a future Muon Collider.

    Quantum Computing for High-Energy Physics: State of the Art and Challenges. Summary of the QC4HEP Working Group

    Quantum computers offer an intriguing path for a paradigmatic change of computing in the natural sciences and beyond, with the potential for achieving a so-called quantum advantage, namely a significant (in some cases exponential) speed-up of numerical simulations. The rapid development of hardware devices with various realizations of qubits enables the execution of small-scale but representative applications on quantum computers. In particular, the high-energy physics community plays a pivotal role in accessing the power of quantum computing, since the field is a driving source of challenging computational problems. This concerns, on the theoretical side, the exploration of models which are very hard or even impossible to address with classical techniques and, on the experimental side, the enormous data challenge of newly emerging experiments, such as the upgrade of the Large Hadron Collider. In this roadmap paper, led by CERN, DESY and IBM, we provide the status of high-energy physics quantum computations and give examples of theoretical and experimental target benchmark applications, which can be addressed in the near future. Having the IBM 100 x 100 challenge in mind, where possible we also provide resource estimates for the examples given, using error-mitigated quantum computing.
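As a concrete example of the error-mitigated quantum computing mentioned above, zero-noise extrapolation is a widely used mitigation technique: the same expectation value is measured at several artificially amplified noise levels, and a fit is extrapolated back to the zero-noise limit. The sketch below uses synthetic numbers, not data from the roadmap.

```python
import numpy as np

def zero_noise_extrapolate(noise_factors, measured):
    """Richardson-style zero-noise extrapolation: fit the expectation
    values measured at scaled noise levels with a polynomial and
    evaluate the fit at zero noise."""
    coeffs = np.polyfit(noise_factors, measured, deg=len(noise_factors) - 1)
    return np.polyval(coeffs, 0.0)

# Synthetic example: true value 1.0, noise damps the signal linearly,
# so measurements at noise scales 1x, 2x, 3x read 0.9, 0.8, 0.7.
factors = [1.0, 2.0, 3.0]
measured = [1.0 - 0.1 * f for f in factors]
estimate = zero_noise_extrapolate(factors, measured)  # ≈ 1.0
```

On real hardware the noise scaling is done physically (e.g. by stretching pulses or folding gates), and the extrapolation recovers an estimate of the noiseless expectation value at the cost of extra circuit executions.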