Triggering new discoveries: development of advanced HLT1 algorithms for detection of long-lived particles at LHCb
The work presented in this thesis constitutes a significant contribution to the first high-level trigger (HLT1) of the LHCb experiment, based on the Allen project. In Allen, the entire HLT1 sequence of reconstruction algorithms has been designed to be executed on GPU cards. The work in this thesis has helped propel the project forward, enabling the LHCb trigger during Run 3 to successfully select events in real time at a rate of 30 MHz. An extensive effort has been made during the Allen development program, leading to the creation of the Allen performance portability layer, which enables the framework to be executed on several architectures. Furthermore, several key algorithms have been developed inside this framework. One of these algorithms, termed HybridSeeding, efficiently reconstructs the tracks produced in the SciFi detector (T-tracks). Another algorithm, named VELO-SciFi Matching, building upon the former, allows the reconstruction of long tracks with a momentum precision better than 1%. Additionally, a new algorithm named Downstream has been conceived, developed, and incorporated into HLT1 for the first time. It performs a fast and efficient search for hits in the UT detector and applies a fast neural network (NN) to reject ghost tracks, allowing downstream tracks to be reconstructed with an efficiency of 70% and a ghost rate below 20%. This is the first time a NN has been developed for GPUs inside Allen. The new algorithm will allow the selection of long-lived particles at the HLT1 level, opening up an unprecedented realm within both the Standard Model and its extensions. Of particular note is its role in expanding the search scope for exotic long-lived particles with lifetimes from 100 ps to several nanoseconds, a domain unexplored until now by the LHCb experiment. This, in turn, enhances the sensitivity to new particles predicted by theories that include a dark sector, heavy neutral leptons, supersymmetry, or axion-like particles.
In addition, LHCb's ability to detect particles from the Standard Model, such as Lambda baryons and K_S mesons, is greatly augmented, thereby enhancing the precision of analyses involving b- and c-hadron decays. The integration of the HLT1 selection lines derived from the Downstream algorithm into the LHCb real-time monitoring infrastructure will be important for data taking during Run 3 and beyond, and notably for the ongoing alignment and calibration of the UT detector. The precision in measuring observables sensitive to physics beyond the Standard Model, such as the rare Lambda_b to Lambda gamma decay channel, will be greatly improved. In this thesis, a study of the measurement of the branching fraction of the Lambda_b to Lambda gamma decay relative to the B to K* gamma channel has been performed. The analysis procedure, including selection, reconstruction, and background rejection, has been described, and an evaluation of the main systematic uncertainties affecting the measurement has been included. It is concluded that the statistical precision for Run 3 will be below 2% as a result of the inclusion of downstream tracks. The measurement of the photon polarisation in these transitions will also benefit from the increase in yield, reaching a 10% precision on the alpha_gamma parameter. Measurements of the CP asymmetry in Lambda_b to Lambda gamma decays will also reach higher precision.
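The abstract above describes ghost rejection as a small neural network applied to downstream track candidates. The following is an illustrative sketch only, not the actual Allen/Downstream code: a tiny fixed-weight feed-forward network that scores candidates so that high-score candidates can be rejected as ghosts (fake tracks built from unrelated hits). The feature set, weights, and threshold are all invented for the example.

```python
# Hypothetical sketch of NN-based ghost rejection: a one-hidden-layer
# network with tanh activations and a sigmoid output in [0, 1].
# All weights and feature choices are made up for illustration.
import math

def mlp_ghost_score(features, w1, b1, w2, b2):
    """Forward pass: hidden tanh layer, then sigmoid output."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, features)) + b)
              for row, b in zip(w1, b1)]
    logit = sum(w * h for w, h in zip(w2, hidden)) + b2
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical per-track features:
# [track-fit chi2/ndf, number of UT hits, distance to extrapolated position]
W1 = [[0.8, -0.5, 0.6],
      [0.3, -0.9, 0.4]]
B1 = [-0.2, 0.1]
W2 = [1.1, 0.9]
B2 = -0.5

def keep_track(features, threshold=0.5):
    """Reject the candidate when its ghost score exceeds the threshold."""
    return mlp_ghost_score(features, W1, B1, W2, B2) < threshold

good = [1.2, 4.0, 0.1]   # good fit, many UT hits: likely a real track
bad  = [9.0, 1.0, 3.5]   # poor fit, few hits, large residual: likely a ghost
print(keep_track(good), keep_track(bad))  # -> True False
```

In the real trigger such an inference step runs on GPU over many candidates in parallel; this scalar version only illustrates the classification logic.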
The CMS monitoring applications for LHC Run 3
Data taking at the Large Hadron Collider (LHC) at CERN restarted in 2022. The CMS experiment relies on a distributed computing infrastructure based on the WLCG (Worldwide LHC Computing Grid) to support the LHC Run 3 physics program. The CMS computing infrastructure is highly heterogeneous and relies on a set of centrally provided services, such as distributed workload management and data management, and on computing resources hosted at almost 150 sites worldwide. Smooth data taking and processing requires all computing subsystems to be fully operational, and the available computing and storage resources need to be continuously monitored. During the long shutdown between LHC Run 2 and Run 3, the CMS monitoring infrastructure underwent major changes to increase the coverage of monitored applications and services, while becoming more sustainable and easier to operate and maintain. The technologies used are based on open-source solutions, either provided by the CERN IT department through the MONIT infrastructure or managed by the CMS monitoring team. Monitoring applications for distributed workload management, the HTCondor-based submission infrastructure, distributed data management, and facilities have been ported from mostly custom-built applications to common data flow and visualization services. Data are mostly stored in NoSQL databases and storage technologies such as ElasticSearch, VictoriaMetrics, Prometheus, InfluxDB, and HDFS, accessed via programmatic APIs, Apache Spark, or Sqoop jobs, and visualized preferentially using Grafana. Most CMS monitoring applications are deployed on Kubernetes clusters to minimize maintenance operations. In this contribution we present the full stack of CMS monitoring services and show how we leveraged common technologies to cover a variety of monitoring applications and cope with the computing challenges of LHC Run 3.
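To make the Prometheus-based part of such a stack concrete, here is a minimal, generic sketch (not CMS code; metric and label names are invented) of producing gauges in the standard Prometheus text exposition format, the kind of feed a Prometheus server in a monitoring stack like the one described above could scrape.

```python
# Illustrative sketch: render site-availability gauges in the Prometheus
# text exposition format ("# HELP" / "# TYPE" headers, then samples).
# The metric name, labels, and values are hypothetical.
def render_gauge(name, help_text, samples):
    """samples: list of (labels_dict, value) pairs."""
    lines = [f"# HELP {name} {help_text}", f"# TYPE {name} gauge"]
    for labels, value in samples:
        label_str = ",".join(f'{k}="{v}"' for k, v in sorted(labels.items()))
        lines.append(f"{name}{{{label_str}}} {value}")
    return "\n".join(lines)

text = render_gauge(
    "site_availability",  # hypothetical metric name
    "Fraction of successful test jobs per site.",
    [({"site": "T1_DE_KIT"}, 0.98),
     ({"site": "T2_CH_CERN"}, 1.0)],
)
print(text)
```

A real deployment would expose this text over HTTP (e.g. via an exporter) rather than printing it; the sketch only shows the data format.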
LHCb potential to discover long-lived new physics particles with lifetimes above 100 ps
For years, it has been believed that the main LHC detectors can play only a limited role as lifetime frontier experiments exploring the parameter space of long-lived particles (LLPs), hypothetical particles with tiny couplings to the Standard Model. This paper demonstrates that the LHCb experiment may become a powerful lifetime frontier experiment if it uses the new Downstream algorithm, which reconstructs tracks that do not leave hits in the LHCb vertex tracker. In particular, for many LLP scenarios, LHCb may be as sensitive as the proposed experiments beyond the main LHC detectors for various LLP models, including heavy neutral leptons, dark scalars, dark photons, and axion-like particles.
Multidifferential study of identified charged hadron distributions in Z-tagged jets in proton-proton collisions at 13 TeV
Jet fragmentation functions are measured for the first time in proton-proton collisions for charged pions, kaons, and protons within jets recoiling against a Z boson. The charged-hadron distributions are studied longitudinally and transversely to the jet direction for jets with transverse momentum above 20 GeV and in the pseudorapidity range. The data sample was collected with the LHCb experiment at a center-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 1.64 fb^-1. Triple differential distributions as a function of the hadron longitudinal momentum fraction, hadron transverse momentum, and jet transverse momentum are also measured for the first time. This helps constrain transverse-momentum-dependent fragmentation functions. Differences in the shapes and magnitudes of the measured distributions for the different hadron species provide insights into the hadronization process for jets predominantly initiated by light quarks.
Comment: All figures and tables, along with machine-readable versions and any supplementary material and additional information, are available at https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-013.html (LHCb public pages).
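For reference, the longitudinal momentum fraction and the hadron momentum transverse to the jet axis referred to in this abstract are conventionally defined (standard notation, not quoted from the paper) as

```latex
z = \frac{\vec{p}_{h}\cdot\vec{p}_{\mathrm{jet}}}{\left|\vec{p}_{\mathrm{jet}}\right|^{2}},
\qquad
j_{T} = \frac{\left|\vec{p}_{h}\times\vec{p}_{\mathrm{jet}}\right|}{\left|\vec{p}_{\mathrm{jet}}\right|},
```

where \(\vec{p}_{h}\) is the hadron momentum and \(\vec{p}_{\mathrm{jet}}\) the jet momentum.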
Study of the decay
The decay is studied in proton-proton collisions at a center-of-mass energy of TeV using data corresponding to an integrated luminosity of 5 collected by the LHCb experiment. In the system, the state observed at the BaBar and Belle experiments is resolved into two narrower states, and , whose masses and widths are measured to be , where the first uncertainties are statistical and the second systematic. The results are consistent with a previous LHCb measurement using a prompt sample. Evidence of a new state is found with a local significance of , whose mass and width are measured to be and , respectively. In addition, evidence of a new decay mode is found with a significance of . The relative branching fraction of with respect to the decay is measured to be , where the first uncertainty is statistical, the second systematic and the third originates from the branching fractions of charm hadron decays.
Comment: All figures and tables, along with any supplementary material and additional information, are available at https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-028.html (LHCb public pages).
Measurement of the ratios of branching fractions and
The ratios of branching fractions and are measured, assuming isospin symmetry, using a sample of proton-proton collision data corresponding to 3.0 fb^-1 of integrated luminosity recorded by the LHCb experiment during 2011 and 2012. The tau lepton is identified in the decay mode . The measured values are and , where the first uncertainty is statistical and the second is systematic. The correlation between these measurements is . The results are consistent with the current average of these quantities and are at a combined 1.9 standard deviations from the predictions based on lepton flavor universality in the Standard Model.
Comment: All figures and tables, along with any supplementary material and additional information, are available at https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-039.html (LHCb public pages).
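For context, lepton-universality ratios of the kind measured in this abstract are conventionally defined (standard notation, not quoted from the abstract) as

```latex
\mathcal{R}\!\left(D^{(*)}\right) \;\equiv\;
\frac{\mathcal{B}\!\left(B \to D^{(*)}\,\tau^{-}\bar{\nu}_{\tau}\right)}
     {\mathcal{B}\!\left(B \to D^{(*)}\,\mu^{-}\bar{\nu}_{\mu}\right)},
```

in which many experimental and theoretical uncertainties cancel in the ratio, making it a clean probe of lepton flavor universality.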