Lamarr: LHCb ultra-fast simulation based on machine learning models deployed within Gauss
About 90% of the computing resources available to the LHCb experiment have
been spent to produce simulated data samples for Run 2 of the Large Hadron
Collider at CERN. The upgraded LHCb detector will be able to collect larger
data samples, requiring many more simulated events to analyze the data to be
collected in Run 3. Simulation is essential for analyses to interpret signal,
reject background, and measure efficiencies. The needed simulation will far
exceed the pledged resources, requiring an evolution in technologies and
techniques to produce these simulated data samples. In this contribution, we
discuss Lamarr, a Gaudi-based framework to speed up simulation production by
parameterizing both the detector response and the reconstruction algorithms of
the LHCb experiment. Deep Generative Models powered by several algorithms and
strategies are employed to effectively parameterize the high-level response of
the individual components of the LHCb detector, encoding within neural networks
the experimental errors and uncertainties introduced in the detection and
reconstruction phases. Where possible, models are trained directly on real
data, statistically subtracting any background components by applying
appropriate reweighting procedures. Embedding Lamarr in the general LHCb Gauss
simulation framework allows its execution to be combined seamlessly with any of
the available generators. The resulting software package enables a simulation
process independent of the detailed simulation used to date.
Comment: Under review in Journal of Physics: Conference Series (ACAT 2022).
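The reweighting step mentioned above can be illustrated with a minimal sketch: per-event weights (for instance, sWeights from an sPlot fit) multiply each event's contribution to the training objective, so background entries cancel statistically. Everything below, toy data included, is an illustrative assumption and not Lamarr code.

    import numpy as np

    # Toy sample: "signal" events (mean 0) mixed with "background" (mean 3).
    rng = np.random.default_rng(seed=7)
    signal = rng.normal(0.0, 1.0, size=20_000)
    background = rng.normal(3.0, 1.0, size=10_000)
    x = np.concatenate([signal, background])

    # Hypothetical sPlot-like weights: ~1 for signal, ~0 (possibly negative)
    # for background; in a real analysis they come from a fit to a
    # discriminating variable such as an invariant mass.
    w = np.concatenate([rng.normal(1.0, 0.1, signal.size),
                        rng.normal(0.0, 0.1, background.size)])

    # Weighted maximum-likelihood fit of a Gaussian: with the weights applied,
    # the background component is statistically subtracted and the fit
    # recovers the signal shape.
    mu = np.average(x, weights=w)
    sigma = np.sqrt(np.average((x - mu) ** 2, weights=w))
    print(f"fitted mean = {mu:.3f}, fitted sigma = {sigma:.3f}")  # close to 0 and 1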
Towards Reliable Neural Generative Modeling of Detectors
The increasing luminosities of future data taking at the Large Hadron Collider
and at next-generation collider experiments require an unprecedented number of
simulated events to be produced. Such large-scale productions demand a
significant amount of valuable computing resources, calling for new approaches
to event generation and to the simulation of detector responses. In this
paper, we discuss the application of generative adversarial networks (GANs) to
the simulation of LHCb experiment events. We emphasize the main pitfalls in
the application of GANs and study the systematic effects in detail. The
presented results are based on the Geant4 simulation of the LHCb Cherenkov
detector.
Comment: 6 pages, 4 figures.
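As a rough illustration of the technique rather than the paper's model (the architecture, optimizer settings, and toy one-dimensional "detector response" below are assumptions), a GAN alternates discriminator and generator updates:

    import torch
    import torch.nn as nn

    latent_dim = 8
    G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))
    D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCEWithLogitsLoss()

    # Stand-in for Geant4-simulated responses: a shifted, narrow Gaussian.
    real = 2.0 + 0.5 * torch.randn(10_000, 1)

    for step in range(2_000):
        batch = real[torch.randint(len(real), (256,))]
        z = torch.randn(256, latent_dim)
        fake = G(z)

        # Discriminator update: push real samples toward label 1, generated toward 0.
        loss_d = bce(D(batch), torch.ones(256, 1)) + bce(D(fake.detach()), torch.zeros(256, 1))
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()

        # Generator update: try to make the discriminator call fakes real.
        loss_g = bce(D(G(z)), torch.ones(256, 1))
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()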
The LHCb ultra-fast simulation option, Lamarr: design and validation
Detailed detector simulation is the major consumer of CPU resources at LHCb,
having used more than 90% of the total computing budget during Run 2 of the
Large Hadron Collider at CERN. As data is collected by the upgraded LHCb
detector during Run 3 of the LHC, larger simulated data samples are required,
and the demand will far exceed the pledged resources of the experiment, even
with existing fast simulation options. An evolution of the technologies and
techniques used to produce simulated samples is mandatory to meet the upcoming
needs of analyses to interpret signal versus background and to measure
efficiencies. In this context, we propose Lamarr, a Gaudi-based framework
designed to offer the fastest solution for the simulation of the LHCb
detector. Lamarr consists of a pipeline of modules parameterizing both the
detector response and the reconstruction algorithms of the LHCb experiment.
Most of the parameterizations are made of Deep Generative Models and Gradient
Boosted Decision Trees trained on simulated samples or, where possible, on
real data. Embedding Lamarr in the general LHCb Gauss simulation framework
allows its execution to be combined seamlessly with any of the available
generators. Lamarr has been validated by comparing key reconstructed
quantities with detailed simulation; good agreement of the simulated
distributions is obtained, with a two-order-of-magnitude speed-up of the
simulation phase.
Comment: Under review in EPJ Web of Conferences (CHEP 2023).
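The pipeline idea can be sketched in a few lines of Python; the module names, interfaces, and toy parameterizations below are invented for illustration and do not reflect the actual Lamarr or Gaudi APIs.

    from dataclasses import dataclass, replace
    import random

    @dataclass
    class Particle:
        p: float              # generator-level momentum [GeV]
        p_reco: float = 0.0   # parameterized "reconstructed" momentum
        pid_dll: float = 0.0  # parameterized particle-identification response

    class TrackingModule:
        """Stand-in for a learned resolution parameterization (e.g. a generative model)."""
        def propagate(self, particle):
            # Smear the true momentum with a toy 0.5% resolution.
            return replace(particle, p_reco=particle.p * random.gauss(1.0, 0.005))

    class PidModule:
        """Stand-in for a learned PID-response parameterization (e.g. a GBDT)."""
        def propagate(self, particle):
            # Toy momentum-dependent log-likelihood difference.
            return replace(particle, pid_dll=random.gauss(5.0 if particle.p > 10 else 0.0, 2.0))

    pipeline = [TrackingModule(), PidModule()]
    event = [Particle(p=random.uniform(2, 100)) for _ in range(5)]
    for module in pipeline:  # each module replaces one simulation/reconstruction step
        event = [module.propagate(part) for part in event]
    print(event)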
The LHCb ultra-fast simulation option, Lamarr: design and validation
Detailed detector simulation is the major consumer of CPU resources at LHCb, having used more than 90% of the total computing budget during Run 2 of the Large Hadron Collider at CERN. As data is collected by the upgraded LHCb detector during Run 3 of the LHC, larger simulated data samples are required, and the demand will far exceed the pledged resources of the experiment, even with existing fast simulation options. The evolution of technologies and techniques for simulation production is therefore mandatory to meet the upcoming needs for the analysis of most of the data collected by the LHCb experiment. In this context, we propose Lamarr, a Gaudi-based framework designed to offer the fastest solution for the simulation of the LHCb detector. Lamarr consists of a pipeline of modules parameterizing both the detector response and the reconstruction algorithms of the LHCb experiment. Most of the parameterizations are made of Deep Generative Models and Gradient Boosted Decision Trees trained on simulated samples or, where possible, on real data. Embedding Lamarr in the general LHCb Gauss simulation framework allows its execution to be combined seamlessly with any of the available generators. Lamarr has been validated by comparing key reconstructed quantities with detailed simulation. Good agreement of the simulated distributions is obtained, with a two-order-of-magnitude speed-up of the simulation phase.
Multidifferential study of identified charged hadron distributions in Z-tagged jets in proton-proton collisions at √s = 13 TeV
Jet fragmentation functions are measured for the first time in proton-proton
collisions for charged pions, kaons, and protons within jets recoiling against
a Z boson. The charged-hadron distributions are studied longitudinally and
transversely to the jet direction for jets with transverse momentum
p_T > 20 GeV and in the pseudorapidity range 2.5 < η < 4. The data sample was
collected with the LHCb experiment at a center-of-mass energy of 13 TeV,
corresponding to an integrated luminosity of 1.64 fb⁻Âč. Triple-differential
distributions as a function of the hadron longitudinal momentum fraction,
hadron transverse momentum, and jet transverse momentum are also measured for
the first time. This helps constrain transverse-momentum-dependent
fragmentation functions. Differences in the shapes and magnitudes of the
measured distributions for the different hadron species provide insights into
the hadronization process for jets predominantly initiated by light quarks.
Comment: All figures and tables, along with machine-readable versions and any
supplementary material and additional information, are available at
https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-013.html (LHCb
public pages).
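For reference, the longitudinal momentum fraction and the hadron momentum transverse to the jet axis referred to above are conventionally defined as follows (standard definitions, assumed here rather than quoted from the paper):

    z = \frac{\vec{p}_{\mathrm{hadron}} \cdot \vec{p}_{\mathrm{jet}}}{|\vec{p}_{\mathrm{jet}}|^{2}},
    \qquad
    j_{T} = \frac{|\vec{p}_{\mathrm{hadron}} \times \vec{p}_{\mathrm{jet}}|}{|\vec{p}_{\mathrm{jet}}|}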
Study of the B⁻ → Λc⁺Λ̄c⁻K⁻ decay
The B⁻ → Λc⁺Λ̄c⁻K⁻ decay is studied in proton-proton collisions at a
center-of-mass energy of √s = 13 TeV using data corresponding to an integrated
luminosity of 5 fb⁻Âč collected by the LHCb experiment. In the Λc⁺K⁻ system,
the Ξc(2930)⁰ state observed at the BaBar and Belle experiments is resolved
into two narrower states, Ξc(2923)⁰ and Ξc(2939)⁰, whose masses and widths are
measured, with the first uncertainties statistical and the second systematic.
The results are consistent with a previous LHCb measurement using a prompt
Λc⁺K⁻ sample. Evidence of a new Ξc(2880)⁰ state is found, and its mass and
width are measured. In addition, evidence of a new decay mode is found, and
its branching fraction relative to the B⁻ → Λc⁺Λ̄c⁻K⁻ decay is measured, where
the first uncertainty is statistical, the second systematic, and the third
originates from the branching fractions of charm hadron decays.
Comment: All figures and tables, along with any supplementary material and
additional information, are available at
https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-028.html (LHCb
public pages).
Measurement of the ratios of branching fractions R(D*) and R(D⁰)
The ratios of branching fractions R(D*) and R(D⁰) are measured, assuming
isospin symmetry, using a sample of proton-proton collision data corresponding
to 3.0 fb⁻Âč of integrated luminosity recorded by the LHCb experiment during
2011 and 2012. The tau lepton is identified in the decay mode
τ⁻ → Ό⁻Μ̄ΌΜτ. The measured values are R(D*) = 0.281 ± 0.018 ± 0.024 and
R(D⁰) = 0.441 ± 0.060 ± 0.066, where the first uncertainty is statistical and
the second is systematic. The correlation between these measurements is
−0.43. The results are consistent with the current average of these quantities
and lie at a combined 1.9 standard deviations from the predictions based on
lepton flavor universality in the Standard Model.
Comment: All figures and tables, along with any supplementary material and
additional information, are available at
https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-039.html (LHCb
public pages).
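For context, these lepton-universality ratios are conventionally defined as follows (standard definitions, assumed here rather than copied from the paper):

    R(D^{*}) = \frac{\mathcal{B}(B \to D^{*}\tau\nu_{\tau})}{\mathcal{B}(B \to D^{*}\mu\nu_{\mu})},
    \qquad
    R(D^{0}) = \frac{\mathcal{B}(B \to D^{0}\tau\nu_{\tau})}{\mathcal{B}(B \to D^{0}\mu\nu_{\mu})}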
Study of charmonium-state resonances in two decay channels with the LHCb experiment at CERN
This thesis is an in-depth study of hadronic physics focused on charmonium states and the related experimental techniques for their detection and analysis. Thanks to a grant from the University of Florence, awarded through a public competition, I spent part of the thesis period at the LHCb experiment at CERN, working on data analysis as well as on the characterization of network systems relevant to the LHCb upgrade to be installed starting in 2019. The data analysis activity benefited from the computing resources offered by CERN and concluded with a presentation to the physics working group dedicated to the study of quarkonium states. In that context, I requested and obtained approval for the production of the simulated data samples needed to complete the analysis. During the thesis work, I had the opportunity to become familiar with software packages widely used in science, such as git, ROOT, and Python; I spent several days in the experiment's control room during data taking, and I presented and discussed the status of my work with Italian and foreign experts. This thesis focuses on the analysis of two decay channels that exhibit resonant contributions from several charmonium states. Chapter 1 gives a brief introduction to the Standard Model, with particular attention to its symmetries and to the implications of the resulting conservation rules in the context of quarkonium theory. Chapter 2 is devoted to the description of the LHC and of the LHCb experimental apparatus. Chapter 3 develops the analysis of the first decay channel, performed for the first time on the data collected from 2015 onward, confirming some of the results of the recently published Run 1 analysis. Finally, Chapter 4 describes the preliminary studies of the second decay channel in the search for a further decay mode. The analysis strategy and its implementation are original contributions of this thesis. The concluding chapter is also devoted to future prospects for the study of charmonium states reconstructed in these final states.
LHCb-Lamarr: LHCb ultra-fast simulation based on machine learning models
About 90% of the computing resources available to the LHCb experiment have been spent to produce simulated data samples for Run 2 of the Large Hadron Collider. The upgraded LHCb detector will operate at much-increased luminosity, requiring many more simulated events for Run 3. Simulation is essential for analyses to interpret data in terms of signal and background and to estimate the relevant efficiencies. The amount of simulation required will far exceed the pledged resources, requiring an evolution in technologies and techniques to produce simulated data samples. In this conference contribution, we discuss Lamarr, a Gaudi-based framework to speed up simulation production by parameterizing both the detector response and the reconstruction algorithms of the LHCb experiment. Deep Generative Models powered by several algorithms and strategies are employed to effectively parameterize the high-level response of the individual components of the LHCb detector, encoding within neural networks the experimental errors and uncertainties introduced in the detection and reconstruction phases. Where possible, models are trained directly on real data, statistically subtracting any background components through the application of weights. Embedding Lamarr in the general LHCb simulation framework (Gauss) allows its execution to be combined seamlessly with any of the available generators. The resulting software package enables a simulation process completely independent of the detailed simulation used to date.