    Triggering new discoveries: development of advanced HLT1 algorithms for detection of long-lived particles at LHCb

    The work presented in this thesis constitutes a significant contribution to the first high-level trigger (HLT1) of the LHCb experiment, based on the Allen project. In Allen, the entire HLT1 sequence of reconstruction algorithms has been designed to be executed on GPU cards. The work in this thesis has helped propel the project forward, enabling the LHCb trigger during Run 3 to successfully select events in real time at a rate of 30 MHz. An extensive effort was carried out during the Allen development programme, leading to the creation of the Allen performance-portability layer, which enables the framework to be executed on several architectures. Furthermore, several key algorithms have been developed inside this framework. One of these algorithms, termed HybridSeeding, efficiently reconstructs the tracks produced in the SciFi detector (T-tracks). Another algorithm, named VELO-SciFi Matching, builds upon the former and allows the reconstruction of long tracks with a momentum precision better than 1%. Additionally, a new algorithm named Downstream has been conceived, developed and incorporated into HLT1 for the first time. It performs a fast and efficient search for hits in the UT detector and applies a fast neural network (NN) to reject ghost tracks, allowing downstream tracks to be reconstructed with an efficiency of 70% and a ghost rate below 20%. This is the first time that a NN has been developed for GPUs inside Allen. This new algorithm will allow the selection of long-lived particles at the HLT1 level, opening up an unprecedented realm within both the Standard Model and its extensions. Of particular note is its role in expanding the search scope for exotic long-lived particles with lifetimes from 100 ps to several nanoseconds, a domain unexplored until now by the LHCb experiment. This, in turn, enhances the sensitivity to new particles predicted by theories that include a dark sector, heavy neutral leptons, supersymmetry, or axion-like particles. In addition, the ability of LHCb to detect Standard Model particles, such as the Lambda and the K_S, is greatly augmented, thereby enhancing the precision of analyses involving b- and c-hadron decays. The integration of the HLT1 selection lines derived from the Downstream algorithm into the LHCb real-time monitoring infrastructure will be important for data taking during Run 3 and beyond, and notably for the current alignment and calibration of the UT detector. The precision in measuring observables sensitive to physics beyond the Standard Model, such as the rare Lambda_b to Lambda gamma decay channel, will be greatly improved. In this thesis, a study of the measurement of the branching fraction of the Lambda_b to Lambda gamma decay relative to the B to K*gamma channel has been performed. The analysis procedure, including selection, reconstruction and background rejection, is described, and an evaluation of the main systematic uncertainties affecting the measurement is included. It is concluded that the statistical precision for Run 3 will be below 2% as a result of the inclusion of downstream tracks. The measurement of the photon polarisation in these transitions will also benefit from the increase in yield, reaching a 10% precision on the alpha_gamma parameter. Measurements of the CP asymmetry in Lambda_b to Lambda gamma decays will also reach higher precision.
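
    The abstract above describes the Downstream algorithm as a fast search for UT hits followed by a small neural network that rejects ghost tracks on the GPU. The following is a minimal sketch of that final classification step only, assuming a tiny fully connected network with placeholder weights and invented input features; the actual Allen GPU implementation, its inputs and its architecture are not described in the abstract and are not reproduced here.

```python
# Minimal sketch of an MLP-based ghost-track classifier (illustrative only;
# the feature names, network size and weights are hypothetical, not Allen's).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-candidate features: chi2 of the UT-hit fit,
# number of UT hits, and the track-extrapolation residual.
candidates = np.array([
    [1.2, 4, 0.03],   # plausible downstream track
    [9.8, 3, 0.41],   # plausible ghost (poor fit, large residual)
])

# Tiny two-layer network with random placeholder weights.
W1, b1 = rng.normal(size=(3, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def ghost_probability(x):
    """Forward pass: ReLU hidden layer, sigmoid output in [0, 1]."""
    h = np.maximum(0.0, x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

scores = ghost_probability(candidates).ravel()
keep = scores < 0.5   # reject candidates classified as ghosts
print(scores, keep)
```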

    Tracking performance for long-lived particles at LHCb

    The LHCb experiment is dedicated to the study of $c$- and $b$-hadron decays, including long-lived particles such as the $K_s$ and strange baryons ($\Lambda^0$, $\Xi^-$, etc.). These kinds of particles are difficult to reconstruct with the LHCb tracking system since they escape detection in the first tracker. A new method to evaluate the performance of the different tracking algorithms for long-lived particles using real data samples has been developed. Special emphasis is laid on particles hitting only part of the tracking system of the new LHCb upgrade detector. Comment: Proceedings for Connecting the Dots and Workshop on Intelligent Trackers (CTD/WIT 2019).
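
    Performance figures from such data-driven studies are commonly quoted as efficiencies, i.e. the fraction of reconstructible long-lived candidates that a given tracking algorithm actually finds. The sketch below only illustrates that bookkeeping with invented counts and a simple binomial uncertainty; it is not the method developed in the proceedings.

```python
# Illustrative tracking-efficiency calculation with a binomial uncertainty;
# the counts below are invented, not results from the proceedings.
import math

n_reconstructible = 12000   # candidates the algorithm could in principle find
n_reconstructed = 9480      # candidates it actually found

eff = n_reconstructed / n_reconstructible
err = math.sqrt(eff * (1.0 - eff) / n_reconstructible)
print(f"efficiency = {eff:.3f} +/- {err:.3f}")
```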

    Radiative $b$-baryon decays to measure the photon and $b$-baryon polarization

    The radiative decays of $b$-baryons facilitate the direct measurement of the photon helicity in $b\to s\gamma$ transitions, thus serving as an important test of physics beyond the Standard Model. In this paper we analyze the complete angular distribution of ground-state $b$-baryon ($\Lambda_{b}^{0}$ and $\Xi_{b}^{-}$) radiative decays to multibody final states, assuming an initially polarized $b$-baryon sample. Our sensitivity study suggests that the photon polarization asymmetry can be extracted to good accuracy along with a simultaneous measurement of the initial $b$-baryon polarization. With higher yields of $b$-baryons, achievable in subsequent runs of the Large Hadron Collider (LHC), we find that the photon polarization measurement can play a pivotal role in constraining different new physics scenarios. Comment: Typos corrected, reference added.
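
    A schematic single-angle projection of such a distribution, in the form often quoted for $\Lambda_b \to \Lambda\gamma$ with a polarized initial baryon (the full multibody distribution analyzed in the paper is richer, and sign conventions vary between analyses), is

    \[ \frac{1}{\Gamma}\frac{d\Gamma}{d\cos\theta_\gamma} = \frac{1}{2}\,\bigl(1 - \alpha_\gamma\, P_{\Lambda_b} \cos\theta_\gamma\bigr), \]

    where $P_{\Lambda_b}$ is the polarization of the initial baryon, $\alpha_\gamma$ is the photon polarization asymmetry, and $\theta_\gamma$ is the photon emission angle with respect to the polarization axis in the baryon rest frame.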

    The CMS monitoring applications for LHC Run 3

    Data taking at the Large Hadron Collider (LHC) at CERN restarted in 2022. The CMS experiment relies on a distributed computing infrastructure based on the WLCG (Worldwide LHC Computing Grid) to support the LHC Run 3 physics program. The CMS computing infrastructure is highly heterogeneous and relies on a set of centrally provided services, such as distributed workload management and data management, and on computing resources hosted at almost 150 sites worldwide. Smooth data taking and processing require all computing subsystems to be fully operational, and available computing and storage resources need to be continuously monitored. During the long shutdown between LHC Run 2 and Run 3, the CMS monitoring infrastructure underwent major changes to increase the coverage of monitored applications and services, while becoming more sustainable and easier to operate and maintain. The technologies used are based on open-source solutions, either provided by the CERN IT department through the MONIT infrastructure or managed by the CMS monitoring team. Monitoring applications for distributed workload management, for the HTCondor-based submission infrastructure, for distributed data management and for facilities have been ported from mostly custom-built applications to common data-flow and visualization services. Data are mostly stored in non-SQL databases and storage technologies such as ElasticSearch, VictoriaMetrics, Prometheus, InfluxDB and HDFS, and are accessed either via programmatic APIs, Apache Spark or Sqoop jobs, or visualized, preferentially using Grafana. Most CMS monitoring applications are deployed on Kubernetes clusters to minimize maintenance operations. In this contribution we present the full stack of CMS monitoring services and show how we leveraged common technologies to cover a variety of monitoring applications and cope with the computing challenges of LHC Run 3.
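
    As an illustration of the kind of programmatic access mentioned above, the sketch below runs an instant query against a Prometheus-compatible endpoint (both Prometheus and VictoriaMetrics expose the /api/v1/query HTTP API); the host name and metric name are placeholders, not actual CMS monitoring services.

```python
# Minimal sketch: query a Prometheus-compatible monitoring endpoint.
# The URL and metric name are placeholders, not real CMS services.
import requests

PROMETHEUS_URL = "http://monitoring.example.cern.ch:9090"  # hypothetical host

def instant_query(expr: str) -> list:
    """Run an instant PromQL query and return the result vector."""
    resp = requests.get(
        f"{PROMETHEUS_URL}/api/v1/query",
        params={"query": expr},
        timeout=10,
    )
    resp.raise_for_status()
    payload = resp.json()
    if payload.get("status") != "success":
        raise RuntimeError(f"query failed: {payload}")
    return payload["data"]["result"]

# Example: number of running jobs per site (hypothetical metric name).
for series in instant_query("sum by (site) (running_jobs)"):
    print(series["metric"].get("site", "unknown"), series["value"][1])
```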

    LHCb potential to discover long-lived new physics particles with lifetimes above 100 ps

    For years, it has been believed that the main LHC detectors can play only a limited role as a lifetime-frontier experiment exploring the parameter space of long-lived particles (LLPs), hypothetical particles with tiny couplings to the Standard Model. This paper demonstrates that the LHCb experiment may become a powerful lifetime-frontier experiment if it uses the new Downstream algorithm, which reconstructs tracks that do not leave hits in the LHCb vertex tracker. In particular, LHCb may be as sensitive as the proposed experiments beyond the main LHC detectors for various LLP models, including heavy neutral leptons, dark scalars, dark photons, and axion-like particles.
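
    The lifetime reach discussed above follows from simple decay kinematics: an LLP with proper lifetime tau and boost beta*gamma has a mean flight distance beta*gamma*c*tau, so the fraction decaying inside a given z-range of the detector is a difference of exponentials. The sketch below uses illustrative numbers; the boost and the z-ranges standing in for the vertex tracker and the region upstream of the downstream trackers are assumptions, not values from the paper.

```python
# Illustrative: probability for an LLP to decay inside a given z-interval,
# P(z1 < z_decay < z2) = exp(-z1/L) - exp(-z2/L), with L = beta*gamma*c*tau.
# The boost and z-ranges below are placeholders, not numbers from the paper.
import math

C = 299792458.0  # speed of light, m/s

def decay_fraction(tau_s, beta_gamma, z1_m, z2_m):
    length = beta_gamma * C * tau_s  # mean flight distance in the lab frame
    return math.exp(-z1_m / length) - math.exp(-z2_m / length)

beta_gamma = 20.0  # assumed typical boost
for tau_ps in (10.0, 100.0, 1000.0):
    tau = tau_ps * 1e-12
    in_velo = decay_fraction(tau, beta_gamma, 0.0, 0.8)      # vertex-tracker region (assumed)
    downstream = decay_fraction(tau, beta_gamma, 0.8, 2.3)   # after it, before the UT (assumed)
    print(f"tau = {tau_ps:6.0f} ps: vertex region {in_velo:.2f}, downstream-only {downstream:.2f}")
```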

    Impact of the high-level trigger for detecting long-lived particles at LHCb

    Long-lived particles (LLPs) are very challenging to search for with current detectors, and demanding on computing, because of their very displaced vertices. This study evaluates the ability of the trigger algorithms used in the Large Hadron Collider beauty (LHCb) experiment to detect long-lived particles and attempts to adapt them to enhance the sensitivity of the experiment to undiscovered long-lived particles. One of the challenges in the track reconstruction is coping with the large combinatorics of hits, and a dedicated algorithm has been developed to cope with the large data output. When fully implemented, this algorithm would greatly increase the reconstruction efficiency for any long-lived particle in the forward region, both within the Standard Model of particle physics and beyond.
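
    The combinatorics problem mentioned above is essentially that pairing every hit with every other hit scales quadratically with occupancy, so reconstruction algorithms typically restrict the pairing to a narrow search window. The following is only a generic illustration of that idea with fabricated hit positions; it does not reproduce the dedicated LHCb algorithm.

```python
# Generic illustration of limiting hit combinatorics with a search window.
# Hit coordinates are fabricated; this is not the LHCb algorithm itself.
import bisect

hits_layer_a = sorted([1.2, 3.4, 3.9, 7.0, 12.5])   # x positions, arbitrary units
hits_layer_b = sorted([1.1, 3.6, 6.8, 7.3, 12.6])

WINDOW = 0.5  # only pair hits whose positions differ by less than this

pairs = []
for xa in hits_layer_a:
    lo = bisect.bisect_left(hits_layer_b, xa - WINDOW)
    hi = bisect.bisect_right(hits_layer_b, xa + WINDOW)
    pairs.extend((xa, xb) for xb in hits_layer_b[lo:hi])

print(pairs)  # far fewer than len(hits_layer_a) * len(hits_layer_b) candidate pairs
```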

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.

    Multidifferential study of identified charged hadron distributions in $Z$-tagged jets in proton-proton collisions at $\sqrt{s}=13$ TeV

    Jet fragmentation functions are measured for the first time in proton-proton collisions for charged pions, kaons, and protons within jets recoiling against a $Z$ boson. The charged-hadron distributions are studied longitudinally and transversely to the jet direction for jets with transverse momentum $20 < p_{\textrm{T}} < 100$ GeV and in the pseudorapidity range $2.5 < \eta < 4$. The data sample was collected with the LHCb experiment at a center-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 1.64 fb$^{-1}$. Triple-differential distributions as a function of the hadron longitudinal momentum fraction, hadron transverse momentum, and jet transverse momentum are also measured for the first time. This helps constrain transverse-momentum-dependent fragmentation functions. Differences in the shapes and magnitudes of the measured distributions for the different hadron species provide insights into the hadronization process for jets predominantly initiated by light quarks. Comment: All figures and tables, along with machine-readable versions and any supplementary material and additional information, are available at https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-013.html (LHCb public pages).
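
    The longitudinal and transverse observables referred to above are commonly defined relative to the jet axis: the longitudinal momentum fraction as the projection of the hadron momentum onto the jet momentum, and the transverse momentum j_T as the component perpendicular to it. The sketch below uses those commonly used definitions with invented three-momenta; the exact conventions of the measurement are those of the paper itself.

```python
# Sketch of per-hadron jet-fragmentation variables relative to the jet axis:
#   z   = (p_hadron . p_jet) / |p_jet|^2     (longitudinal momentum fraction)
#   j_T = |p_hadron x p_jet| / |p_jet|       (momentum transverse to the jet)
# The three-momenta below are invented, purely for illustration.
import numpy as np

p_jet = np.array([5.0, 2.0, 60.0])      # GeV
p_hadron = np.array([1.0, 0.5, 11.0])   # GeV

z = np.dot(p_hadron, p_jet) / np.dot(p_jet, p_jet)
j_t = np.linalg.norm(np.cross(p_hadron, p_jet)) / np.linalg.norm(p_jet)

print(f"z = {z:.3f}, j_T = {j_t:.3f} GeV")
```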

    Study of the $B^{-} \to \Lambda_{c}^{+} \bar{\Lambda}_{c}^{-} K^{-}$ decay

    The decay $B^{-} \to \Lambda_{c}^{+} \bar{\Lambda}_{c}^{-} K^{-}$ is studied in proton-proton collisions at a center-of-mass energy of $\sqrt{s}=13$ TeV using data corresponding to an integrated luminosity of 5 fb$^{-1}$ collected by the LHCb experiment. In the $\Lambda_{c}^{+} K^{-}$ system, the $\Xi_{c}(2930)^{0}$ state observed at the BaBar and Belle experiments is resolved into two narrower states, $\Xi_{c}(2923)^{0}$ and $\Xi_{c}(2939)^{0}$, whose masses and widths are measured to be $m(\Xi_{c}(2923)^{0}) = 2924.5 \pm 0.4 \pm 1.1$ MeV, $m(\Xi_{c}(2939)^{0}) = 2938.5 \pm 0.9 \pm 2.3$ MeV, $\Gamma(\Xi_{c}(2923)^{0}) = 4.8 \pm 0.9 \pm 1.5$ MeV, $\Gamma(\Xi_{c}(2939)^{0}) = 11.0 \pm 1.9 \pm 7.5$ MeV, where the first uncertainties are statistical and the second systematic. The results are consistent with a previous LHCb measurement using a prompt $\Lambda_{c}^{+} K^{-}$ sample. Evidence of a new $\Xi_{c}(2880)^{0}$ state is found with a local significance of $3.8\,\sigma$; its mass and width are measured to be $2881.8 \pm 3.1 \pm 8.5$ MeV and $12.4 \pm 5.3 \pm 5.8$ MeV, respectively. In addition, evidence of a new decay mode $\Xi_{c}(2790)^{0} \to \Lambda_{c}^{+} K^{-}$ is found with a significance of $3.7\,\sigma$. The relative branching fraction of $B^{-} \to \Lambda_{c}^{+} \bar{\Lambda}_{c}^{-} K^{-}$ with respect to the $B^{-} \to D^{+} D^{-} K^{-}$ decay is measured to be $2.36 \pm 0.11 \pm 0.22 \pm 0.25$, where the first uncertainty is statistical, the second systematic, and the third originates from the branching fractions of charm hadron decays. Comment: All figures and tables, along with any supplementary material and additional information, are available at https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-028.html (LHCb public pages).
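
    When a single overall uncertainty is wanted for a ratio like the relative branching fraction above, the three quoted components (statistical, systematic, and the external charm branching fractions) are often combined in quadrature, assuming they are independent; this is a common convention, not necessarily the one used in the paper. A short arithmetic check with the numbers quoted above:

```python
# Combine the three quoted uncertainty components in quadrature, assuming
# they are independent (a common, though not universal, convention).
value = 2.36
stat, syst, ext = 0.11, 0.22, 0.25

total = (stat**2 + syst**2 + ext**2) ** 0.5
print(f"{value} +/- {total:.2f}")   # ~2.36 +/- 0.35
```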