
    Global Interpretation of ττ Events in the Context of the Standard Model and Beyond

    The nature of interactions between elementary particles is nowadays successfully described by the Standard Model of particle physics (SM), with predictions covering a wide range of observed phenomena, including the existence of the Higgs boson, discovered independently by the ATLAS and CMS experiments. Measurements of its properties are compatible with the SM; however, there are strong reasons to consider it part of an extended Higgs sector, motivating searches for additional heavy Higgs bosons. Until now, measurements of the properties of the observed Higgs boson, and their interpretation in models beyond the Standard Model (BSM) in the framework of effective field theories, have been performed independently of searches for additional heavy Higgs bosons. The main topic of this thesis is the unification of the two analysis approaches, using the example of the H → ττ decay channel, into one consistent, global interpretation of ττ events, based on Run 2 CMS data at a centre-of-mass energy of 13 TeV comprising 137 fb⁻¹. First, a measurement of the signal strengths of the observed Higgs boson yields an observed (expected) significance of 6.1 (5.0)σ for the gluon fusion and 1.9 (3.8)σ for the vector boson fusion production channels, confirming good expected sensitivity to the observed Higgs boson. Then, a classic search for additional heavy Higgs bosons is performed, superseding the current CMS results based on data collected in 2016. In the last step, the two analyses are combined into a consistent interpretation of ττ events in the framework of BSM benchmark scenarios, demonstrating an increased sensitivity to deviations in the scenario predictions that require the BSM Higgs sector to be compatible with the observed Higgs boson, and leading to an increased exclusion power on BSM benchmark scenarios.

    Performance of the bwHPC cluster in the production of μ → τ embedded events used for the prediction of background for H → ττ analyses

    In high energy physics, a main challenge is the accurate prediction of background events at a particle detector. These events are usually estimated by simulation. As an alternative, data-driven methods use observed events to derive a background prediction and are often less computationally expensive than simulation. The lepton embedding method is a data-driven method to estimate the background from Z → ττ events for Higgs boson analyses in the same final state. Z → μμ events recorded by the CMS experiment are selected, the muons are removed from the event, and they are replaced with simulated τ leptons with the same kinematic properties as the removed muons. The resulting hybrid event provides an improved description of pile-up and the underlying event compared to the simulation of the full proton-proton collision. In this paper the production of these hybrid events used by the CMS collaboration is described. The production relies on the resources made available by the bwHPC project. The data used for this purpose correspond to 65 million di-muon events collected in 2017 by CMS.

    Efficient interface to the GridKa tape storage system

    Providing a high-performance and reliable tape storage system is GridKa's top priority. The GridKa tape storage system was recently migrated from IBM SP to the High Performance Storage System (HPSS) for LHC and non-LHC HEP experiments. These are two different tape backends, each with its own design and specifics that need to be studied thoroughly. Taking into account the features and characteristics of HPSS, a new approach has been developed for flushing and staging files to and from the tape storage system. This new approach allows better optimized and more efficient flush and stage operations and leads to a substantial improvement in the overall performance of the GridKa tape storage system. The efficient interface that was developed for IBM SP has now been adapted to the HPSS use case to connect the experiments' access points to the tape storage system. This contribution provides details on these changes and the results of the Tape Challenge 2022 within the new HPSS tape storage configuration.
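The general idea behind optimized stage operations can be illustrated with a minimal sketch: pending stage (recall) requests are grouped by tape volume and ordered by their position on tape, so that each tape is mounted only once and read in a single forward pass. This is a simplified illustration, not the GridKa implementation; the function name and dictionary keys are hypothetical, and the real HPSS interface exposes this metadata differently.

```python
from collections import defaultdict

def order_stage_requests(requests):
    """Group pending stage requests by tape volume and sort each group
    by position on tape, so each tape is mounted once and read in one
    forward pass.  `requests` is a list of dicts with hypothetical keys
    'file', 'tape', and 'position'."""
    by_tape = defaultdict(list)
    for req in requests:
        by_tape[req["tape"]].append(req)
    schedule = []
    for tape in sorted(by_tape):  # one mount per tape volume
        schedule.extend(sorted(by_tape[tape], key=lambda r: r["position"]))
    return schedule
```

Ordering by on-tape position matters because tape is a sequential medium: random access across a tape means costly repositioning, while a single sorted pass amortizes the mount and seek overhead over all files on that volume.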

    Modeling Distributed Computing Infrastructures for HEP Applications

    Predicting the performance of various infrastructure design options in complex federated infrastructures with computing sites distributed over a wide area network that support a plethora of users and workflows, such as the Worldwide LHC Computing Grid (WLCG), is not trivial. Due to the complexity and size of these infrastructures, it is not feasible to deploy experimental test-beds at large scales merely for the purpose of comparing and evaluating alternate designs. An alternative is to study the behaviours of these systems using simulation. This approach has been used successfully in the past to identify efficient and practical infrastructure designs for High Energy Physics (HEP). A prominent example is the Monarc simulation framework, which was used to study the initial structure of the WLCG. New simulation capabilities are needed to simulate large-scale heterogeneous computing systems with complex networks, data access and caching patterns. A modern tool to simulate HEP workloads that execute on distributed computing infrastructures, based on the SimGrid and WRENCH simulation frameworks, is outlined. Studies of its accuracy and scalability are presented using HEP as a case study. Hypothetical adjustments to prevailing computing architectures in HEP are studied, providing insights into the dynamics of a part of the WLCG and yielding candidates for improvements.
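The kind of question such simulations answer can be shown with a toy discrete-event sketch (this is not SimGrid or WRENCH, which additionally model networks, data access, and caching): jobs with given CPU-time demands are scheduled onto sites with a fixed number of slots and relative speeds, and the overall makespan is computed. All names and the scheduling policy are illustrative assumptions.

```python
import heapq

def simulate(jobs, sites):
    """Toy discrete-event simulation: assign each job (a CPU-time
    demand) to the earliest-free slot across all sites and return the
    makespan.  Each site dict has hypothetical keys 'slots' and
    'speed' (relative processing speed)."""
    # Priority queue of (time the slot becomes free, slot speed).
    slots = [(0.0, site["speed"]) for site in sites
             for _ in range(site["slots"])]
    heapq.heapify(slots)
    makespan = 0.0
    for demand in sorted(jobs, reverse=True):  # longest jobs first
        free_at, speed = heapq.heappop(slots)
        done = free_at + demand / speed
        heapq.heappush(slots, (done, speed))
        makespan = max(makespan, done)
    return makespan
```

Comparing the makespan under alternative site configurations (more slots at one site, faster hardware at another) is the simplest instance of the design-comparison studies described above; the real frameworks answer the same question at far greater fidelity.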

    Measurement of the top quark forward-backward production asymmetry and the anomalous chromoelectric and chromomagnetic moments in pp collisions at √s = 13 TeV

    The parton-level top quark (t) forward-backward asymmetry and the anomalous chromoelectric (d̂t) and chromomagnetic (μ̂t) moments have been measured using LHC pp collisions at a center-of-mass energy of 13 TeV, collected in the CMS detector in a data sample corresponding to an integrated luminosity of 35.9 fb⁻¹. The linearized variable AFB(1) is used to approximate the asymmetry. Candidate tt̄ events decaying to a muon or electron and jets in final states with low and high Lorentz boosts are selected and reconstructed using a fit of the kinematic distributions of the decay products to those expected for tt̄ final states. The values found for the parameters are AFB(1) = 0.048 +0.095/−0.087 (stat) +0.020/−0.029 (syst) and μ̂t = −0.024 +0.013/−0.009 (stat) +0.016/−0.011 (syst), and a limit is placed on the magnitude |d̂t| < 0.03 at 95% confidence level.

    An embedding technique to determine ττ backgrounds in proton-proton collision data

    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated τ leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the τ leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV corresponding to an integrated luminosity of 41.5 fb⁻¹.
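The core replacement step of the embedding idea can be sketched in a few lines: the two muons of a reconstructed di-muon event are removed and simulated τ leptons are seeded with the same kinematics, the only change being the lepton mass. This is a deliberately simplified illustration with hypothetical event-record names; the actual CMS workflow additionally re-simulates the τ decays and re-reconstructs the full hybrid event.

```python
TAU_MASS = 1.77686  # tau lepton mass in GeV

def embed_taus(dimuon_event):
    """Sketch of the embedding replacement step: strip the muons from
    a reconstructed di-muon event and seed tau leptons with identical
    pt/eta/phi.  `dimuon_event` is a dict with a hypothetical 'muons'
    key; all other event content (pile-up, underlying event) is kept
    untouched, which is the point of the technique."""
    taus = [
        {"pt": mu["pt"], "eta": mu["eta"], "phi": mu["phi"],
         "mass": TAU_MASS}  # same kinematics, tau mass instead of muon mass
        for mu in dimuon_event["muons"]
    ]
    hybrid = {k: v for k, v in dimuon_event.items() if k != "muons"}
    hybrid["taus"] = taus
    return hybrid
```

Because everything except the two leptons is taken from data, the hybrid event inherits the detector's real pile-up and underlying-event activity, which simulation struggles to describe.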

    Search for new particles in events with energetic jets and large missing transverse momentum in proton-proton collisions at √s = 13 TeV

    A search is presented for new particles produced at the LHC in proton-proton collisions at √s = 13 TeV, using events with energetic jets and large missing transverse momentum. The analysis is based on a data sample corresponding to an integrated luminosity of 101 fb⁻¹, collected in 2017-2018 with the CMS detector. Machine learning techniques are used to define separate categories for events with narrow jets from initial-state radiation and events with large-radius jets consistent with a hadronic decay of a W or Z boson. A statistical combination is made with an earlier search based on a data sample of 36 fb⁻¹, collected in 2016. No significant excess of events is observed with respect to the standard model background expectation determined from control samples in data. The results are interpreted in terms of limits on the branching fraction of an invisible decay of the Higgs boson, as well as constraints on simplified models of dark matter, on first-generation scalar leptoquarks decaying to quarks and neutrinos, and on models with large extra dimensions. Several of the new limits, specifically for spin-1 dark matter mediators, pseudoscalar mediators, colored mediators, and leptoquarks, are the most restrictive to date.

    MUSiC: a model-unspecific search for new physics in proton-proton collisions at √s = 13 TeV

    Results of the Model Unspecific Search in CMS (MUSiC), using proton-proton collision data recorded at the LHC at a centre-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 35.9 fb⁻¹, are presented. The MUSiC analysis searches for anomalies that could be signatures of physics beyond the standard model. The analysis is based on the comparison of observed data with the standard model prediction, as determined from simulation, in several hundred final states and multiple kinematic distributions. Events containing at least one electron or muon are classified based on their final state topology, and an automated search algorithm surveys the observed data for deviations from the prediction. The sensitivity of the search is validated using multiple methods. No significant deviations from the predictions have been observed. For a wide range of final state topologies, agreement is found between the data and the standard model simulation. This analysis complements dedicated search analyses by significantly expanding the range of final states covered using a model independent approach with the largest data set to date to probe phase space regions beyond the reach of previous general searches.

    Measurement of prompt open-charm production cross sections in proton-proton collisions at √s = 13 TeV

    The production cross sections for prompt open-charm mesons in proton-proton collisions at a center-of-mass energy of 13 TeV are reported. The measurement is performed using a data sample collected by the CMS experiment corresponding to an integrated luminosity of 29 nb⁻¹. The differential production cross sections of the D*±, D±, and D⁰ (D̄⁰) mesons are presented in the transverse momentum and pseudorapidity ranges 4 < pT < 100 GeV and |η| < 2.1, respectively. The results are compared to several theoretical calculations and to previous measurements.

    Measurement of the azimuthal anisotropy of Υ(1S) and Υ(2S) mesons in PbPb collisions at √sNN = 5.02 TeV

    The second-order Fourier coefficients (v2) characterizing the azimuthal distributions of Υ(1S) and Υ(2S) mesons produced in PbPb collisions at √sNN = 5.02 TeV are studied. The Υ mesons are reconstructed in their dimuon decay channel, as measured by the CMS detector. The collected data set corresponds to an integrated luminosity of 1.7 nb⁻¹. The scalar product method is used to extract the v2 coefficients of the azimuthal distributions. Results are reported for the rapidity range |y| < 2.4, in the transverse momentum interval 0 < pT < 50 GeV/c, and in three centrality ranges of 10-30%, 30-50% and 50-90%. In contrast to the J/ψ mesons, the measured v2 values for the Υ mesons are found to be consistent with zero. © 2021 The Author(s). Published by Elsevier B.V.