Properties and Decays of the B_c meson
Recent studies of the properties and decays of the B_c meson by the LHC
experiments are presented. Mass and lifetime measurements are discussed, and
some of the many newly observed decays are reported.
Comment: Presented at the 2014 Flavor Physics and CP Violation conference (FPCP-2014),
Marseille, France, May 26-30, 2014. 10 pages, 6 figures
Measurement of the B_c^+ meson lifetime using B_c^+ → J/ψμ+νX decays
Using 2 fb^-1 of data collected in 2012 at √s = 8 TeV, the LHCb Collaboration measured the lifetime of the B_c^+ meson by studying the semileptonic decays B_c^+ → J/ψμ+νX. The result, τ(B_c^+) = 509 ± 8 ± 12 fs, is the world's best measurement of the B_c^+ lifetime.
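The quoted lifetime can be illustrated with a toy unbinned fit: for a pure exponential decay-time distribution, the maximum-likelihood estimate of the lifetime is simply the sample mean of the decay times. This is only a minimal sketch with an invented sample size; the real measurement must additionally correct for acceptance, resolution, and the partially reconstructed final state X.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy decay-time sample (fs), generated with an assumed true lifetime of 509 fs.
TRUE_TAU_FS = 509.0
times = rng.exponential(TRUE_TAU_FS, size=100_000)

# For a pure exponential p(t|tau) = exp(-t/tau)/tau, the maximum-likelihood
# estimator of the lifetime is the sample mean of the decay times.
tau_hat = times.mean()

# Statistical uncertainty on the MLE: tau_hat / sqrt(N).
tau_err = tau_hat / np.sqrt(len(times))

print(f"tau = {tau_hat:.1f} +/- {tau_err:.1f} fs")
```

With 100,000 toy decays the statistical precision is already at the few-fs level, consistent in order of magnitude with the quoted ±8 fs statistical uncertainty.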
Fast Data-Driven Simulation of Cherenkov Detectors Using Generative Adversarial Networks
The increasing luminosities of future Large Hadron Collider runs and the next
generation of collider experiments will require an unprecedented amount of
simulated events to be produced. Such large scale productions are extremely
demanding in terms of computing resources. Thus new approaches to event
generation and simulation of detector responses are needed. In LHCb, the
accurate simulation of Cherenkov detectors takes a sizeable fraction of CPU
time. An alternative approach is described here, in which one generates high-level
reconstructed observables using a generative neural network to bypass low-level
details. This network is trained to reproduce the particle species likelihood
function values based on the track kinematic parameters and detector occupancy.
The fast simulation is trained using real data samples collected by LHCb during
Run 2. We demonstrate that this approach provides high-fidelity results.
Comment: Proceedings for the 19th International Workshop on Advanced Computing and
Analysis Techniques in Physics Research. (Fixed typos and added one missing
reference in the revised version.)
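The core idea above is a conditional generator: latent noise is concatenated with per-track conditions (kinematics and detector occupancy) and mapped to surrogate particle-ID likelihood values. A minimal sketch follows; the layer sizes, feature choices, and randomly initialised weights are invented stand-ins for a trained network, not the actual LHCb model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 3 conditioning features (momentum, pseudorapidity,
# occupancy), a 16-dim latent noise vector, 5 species log-likelihood outputs.
N_COND, N_LATENT, N_OUT, N_HIDDEN = 3, 16, 5, 64

# Randomly initialised weights stand in for a trained generator.
W1 = rng.normal(0, 0.1, (N_COND + N_LATENT, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0, 0.1, (N_HIDDEN, N_OUT))
b2 = np.zeros(N_OUT)

def generate(conditions: np.ndarray) -> np.ndarray:
    """One generator pass: concatenate noise with per-track conditions and
    map them to surrogate particle-ID log-likelihood values."""
    noise = rng.normal(size=(conditions.shape[0], N_LATENT))
    x = np.concatenate([conditions, noise], axis=1)
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU hidden layer
    return h @ W2 + b2                # unconstrained likelihood-like outputs

# Fake batch of tracks: momentum [GeV], pseudorapidity, occupancy fraction.
tracks = np.array([[10.0, 2.5, 0.3],
                   [50.0, 3.1, 0.7]])
fake_pid = generate(tracks)
print(fake_pid.shape)  # (2, 5)
```

Because the noise input varies event by event, repeated calls with the same conditions produce a distribution of outputs, which is what the adversarial training shapes to match the real-data likelihood distributions.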
Towards Reliable Neural Generative Modeling of Detectors
The increasing luminosities of future data taking at the Large Hadron Collider
and next-generation collider experiments require an unprecedented amount of
simulated events to be produced. Such large-scale productions demand a
significant amount of valuable computing resources, which calls for
new approaches to event generation and simulation of detector responses. In
this paper, we discuss the application of generative adversarial networks
(GANs) to the simulation of LHCb experiment events. We emphasize the main
pitfalls in the application of GANs and study the systematic effects in detail.
The presented results are based on the Geant4 simulation of the LHCb Cherenkov
detector.
Comment: 6 pages, 4 figures
Model independent measurements of Standard Model cross sections with Domain Adaptation
With the ever-growing amount of data collected by the ATLAS and CMS
experiments at the CERN LHC, fiducial and differential measurements of the
Higgs boson production cross section have become important tools to test the
standard model predictions with an unprecedented level of precision, as well as
to seek deviations that could reveal the presence of physics beyond the
standard model. These measurements are in general designed to be easily
comparable to any present or future theoretical prediction, and to achieve this
goal it is important to keep the model dependence to a minimum. Nevertheless,
reducing the model dependence usually comes at the expense of the
measurement precision, preventing full exploitation of the signal
extraction procedure. In this paper a novel methodology based on the machine
learning concept of domain adaptation is proposed, which allows using a complex
deep neural network in the signal extraction procedure while ensuring a minimal
dependence of the measurements on the theoretical modelling of the signal.
Comment: 16 pages, 10 figures
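Domain adaptation of the kind described above is commonly realised with a gradient-reversal scheme: a shared feature map feeds both the signal/background classifier and an adversarial head that tries to identify which theoretical model (domain) an event came from, and the domain gradient flowing into the shared weights is negated. The sketch below illustrates one such update step on a toy linear model; all shapes, labels, and the loss bookkeeping are invented for illustration and are not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: shared linear features, a signal/background head, and an
# adversarial "domain" head distinguishing two signal models.
n, d_in, d_feat = 256, 4, 3
X = rng.normal(size=(n, d_in))
y = rng.integers(0, 2, n)        # signal vs background label
dom = rng.integers(0, 2, n)      # theoretical-model (domain) label
LAMBDA, LR = 0.5, 0.1

W = rng.normal(0, 0.1, (d_in, d_feat))   # shared feature weights
w_cls = rng.normal(0, 0.1, d_feat)       # classifier head
w_dom = rng.normal(0, 0.1, d_feat)       # domain head

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce_grad(p, t):
    # d(binary cross-entropy)/d(logit) for logistic outputs, averaged over n
    return (p - t) / len(t)

# Forward pass.
F = X @ W                        # shared features
p_cls = sigmoid(F @ w_cls)
p_dom = sigmoid(F @ w_dom)

# Backward pass: the domain gradient into the shared weights is REVERSED
# (scaled by -LAMBDA), pushing the features to be uninformative about the
# signal model while staying useful for signal/background separation.
g_cls = X.T @ np.outer(bce_grad(p_cls, y), w_cls)
g_dom = X.T @ np.outer(bce_grad(p_dom, dom), w_dom)
W -= LR * (g_cls - LAMBDA * g_dom)   # gradient reversal on the domain term

# The heads themselves update normally (no reversal).
w_cls -= LR * F.T @ bce_grad(p_cls, y)
w_dom -= LR * F.T @ bce_grad(p_dom, dom)
print(W.shape)
```

The hyperparameter LAMBDA trades classification power against domain invariance; in the limit of a well-trained adversary, the extracted signal yield becomes insensitive to which theoretical model generated the training signal.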
Fabrication and First Full Characterisation of Timing Properties of 3D Diamond Detectors
Tracking detectors at future high-luminosity hadron colliders are expected to withstand unprecedented levels of radiation as well as to efficiently reconstruct a huge number of tracks and primary vertices. To face the challenges posed by radiation damage, new, extremely radiation-hard materials and sensor designs will be needed, while the track and vertex reconstruction problem can be significantly mitigated by introducing detectors with excellent timing capabilities. Indeed, the time coordinate provides extremely powerful information to disentangle overlapping tracks and hits in the harsh hadronic collision environment. Diamond 3D pixel sensors optimised for timing applications provide an appealing solution to the above problems, as the 3D geometry enhances the already outstanding radiation hardness and makes it possible to exploit the excellent timing properties of diamond. We report here the first full timing characterisation of 3D diamond sensors fabricated by electrode laser graphitisation in Florence. Results from a 270 MeV pion beam test of a first prototype and from tests with a β source on a recently fabricated 55×55 μm² pitch sensor are discussed. First results on sensor simulation are also presented.
Muon identification for LHCb Run 3
Muon identification is of paramount importance for the physics programme of
LHCb. In the upgrade phase, starting from Run 3 of the LHC, the trigger of the
experiment will be solely based on software. The luminosity increase to
2×10^33 cm^-2 s^-1 will require an improvement of the muon
identification criteria, aiming at performance equal to or better than that of
Run 2, but in a much more challenging environment. In this paper, two new muon
identification algorithms developed in view of the LHCb upgrade are presented,
and their performance in terms of signal efficiency versus background reduction
is shown.
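The quoted figure of merit, signal efficiency versus background reduction, is just the pair of rates obtained by cutting on a classifier score. A minimal sketch with invented score distributions (Gaussian stand-ins for the output of an identification algorithm, not LHCb's actual classifiers):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy muon-ID scores: true muons peak high, misidentified hadrons low.
sig = rng.normal(0.8, 0.15, 10_000)   # scores for true muons
bkg = rng.normal(0.3, 0.15, 10_000)   # scores for hadron background

def working_point(cut: float):
    """Signal efficiency and background rejection at a given score cut."""
    eff = np.mean(sig > cut)
    rej = 1.0 - np.mean(bkg > cut)
    return eff, rej

for cut in (0.4, 0.5, 0.6):
    eff, rej = working_point(cut)
    print(f"cut={cut:.1f}: muon efficiency={eff:.3f}, "
          f"background rejection={rej:.3f}")
```

Scanning the cut traces out the full efficiency-versus-rejection curve on which the algorithms in the paper are compared.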
The LHCb ultra-fast simulation option, Lamarr: design and validation
Detailed detector simulation is the major consumer of CPU resources at LHCb,
having used more than 90% of the total computing budget during Run 2 of the
Large Hadron Collider at CERN. As data is collected by the upgraded LHCb
detector during Run 3 of the LHC, larger requests for simulated data samples
are necessary, and will far exceed the pledged resources of the experiment,
even with existing fast simulation options. An evolution of the technologies and
techniques used to produce simulated samples is mandatory to meet the upcoming needs
of analyses, which rely on simulation to separate signal from background and to measure efficiencies. In
this context, we propose Lamarr, a Gaudi-based framework designed to offer the
fastest solution for the simulation of the LHCb detector. Lamarr consists of a
pipeline of modules parameterizing both the detector response and the
reconstruction algorithms of the LHCb experiment. Most of the parameterizations
are made of Deep Generative Models and Gradient Boosted Decision Trees trained
on simulated samples or, where possible, on real data. Embedding
Lamarr in the general LHCb Gauss Simulation framework allows combining its
execution with any of the available generators in a seamless way. Lamarr has
been validated by comparing key reconstructed quantities with Detailed
Simulation. Good agreement of the simulated distributions is obtained, with a
two-order-of-magnitude speed-up of the simulation phase.
Comment: Under review in EPJ Web of Conferences (CHEP 2023)
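The pipeline-of-parameterisations idea described above can be sketched as a chain of stages, each replacing one detailed-simulation step with a fast parametric response. The stage names, smearing widths, and dictionary layout below are invented placeholders, not Lamarr's actual modules or parameterisations.

```python
import random

random.seed(7)

# Each stage maps a particle record to an updated record; the pipeline
# chains them, mimicking how parameterisations replace detailed
# simulation and reconstruction steps.
def tracking(p):
    # Parametric momentum smearing in place of detailed tracking simulation
    # (0.5% relative resolution, an illustrative value).
    p["reco_p"] = p["p"] * random.gauss(1.0, 0.005)
    return p

def pid(p):
    # Placeholder particle-ID response: a muon/hadron log-likelihood
    # difference drawn from species-dependent toy distributions.
    p["dll_mu"] = random.gauss(2.0 if p["is_muon"] else -2.0, 1.0)
    return p

PIPELINE = [tracking, pid]

def simulate(particle):
    """Run a generator-level particle through every parameterisation stage."""
    for stage in PIPELINE:
        particle = stage(particle)
    return particle

# A generator-level muon with 25 GeV momentum.
event = simulate({"p": 25.0, "is_muon": True})
print(sorted(event))
```

Because each stage only consumes and produces plain particle records, new parameterisations (or trained generative models) can be swapped in without touching the rest of the chain, which is the property that lets such a framework sit behind any event generator.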