
    Efficiency measurement of b-tagging algorithms developed by the CMS experiment

    Identification of jets originating from b quarks (b-tagging) is a key element of many physics analyses at the LHC. Various algorithms for b-tagging have been developed by the CMS experiment to identify b jets with a typical efficiency between 40% and 70% while keeping the rate of misidentified light-quark jets between 0.1% and 10%. An important step towards using these tools in physics analyses is the determination of the efficiency for tagging b jets. Several methods to measure the efficiencies of the lifetime-based b-tagging algorithms are presented. Events containing jets with muons are used to enrich a jet sample in heavy-flavor content. The efficiency measurement relies on the transverse momentum of the muon relative to the jet axis, or on solving a system of equations that incorporates two uncorrelated taggers. Another approach uses the number of b-tagged jets in top-pair events to estimate the efficiency. The results obtained with 2010 data and the uncertainties of the different techniques are reported. The rate of misidentified light-quark jets has been measured using the "negative" tagging technique. Comment: 8 pages, 5 figures, Proceedings of the DPF-2011 Conference, Providence, RI, August 8-13, 2011
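    The two-uncorrelated-taggers idea can be illustrated with a deliberately simplified three-equation toy (the actual CMS method, "System8", solves eight coupled equations; all numbers below are invented, not CMS values). Assuming the light-jet mistag rates are known (e.g. from negative tagging), the tagged-jet counts under tagger A, tagger B, and both taggers determine the b fraction and the two b-tagging efficiencies:

    ```python
    import numpy as np
    from scipy.optimize import fsolve

    # Invented truth values for the toy (not CMS measurements)
    n_jets = 100_000
    fb_true, eA_true, eB_true = 0.6, 0.70, 0.55   # b fraction, b-tag efficiencies
    lA, lB = 0.01, 0.10                            # light mistag rates, assumed known

    # Pseudo-data: expected tagged counts, assuming the two taggers are uncorrelated
    N_A  = n_jets * (fb_true * eA_true + (1 - fb_true) * lA)
    N_B  = n_jets * (fb_true * eB_true + (1 - fb_true) * lB)
    N_AB = n_jets * (fb_true * eA_true * eB_true + (1 - fb_true) * lA * lB)

    def equations(p):
        """Residuals of the three counting equations for (fb, eA, eB)."""
        fb, eA, eB = p
        return [
            n_jets * (fb * eA + (1 - fb) * lA) - N_A,
            n_jets * (fb * eB + (1 - fb) * lB) - N_B,
            n_jets * (fb * eA * eB + (1 - fb) * lA * lB) - N_AB,
        ]

    fb, eA, eB = fsolve(equations, x0=[0.5, 0.5, 0.5])
    print(f"b fraction = {fb:.3f}, eff(A) = {eA:.3f}, eff(B) = {eB:.3f}")
    ```

    The factorization of the double-tag term is exactly where the "uncorrelated taggers" assumption enters; with correlated taggers an extra correlation factor would be needed.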

    Search for Exotic Top Partners at sqrt(s) = 8 TeV

    We present searches for heavy top- and bottom-quark partners at CMS using data collected at sqrt(s) = 8 TeV. Such partners, if vector-like, occur in models such as the Little Higgs and Large Extra Dimensions. Fermionic top partners could also occur in composite Higgs models. The searches presented here span a wide range of final states, from lepton plus jets to multi-leptonic, and exclusion limits are set on the masses and production cross sections as a function of the branching ratios of the heavy quarks to their decay products. Comment: 10 pages, 12 figures

    A sensitivity study of triboson production processes to dimension-6 EFT operators at the LHC

    We present the first parton-level study of anomalous effects in triboson production in both fully and semi-leptonic channels in proton-proton collisions at 13 TeV at the Large Hadron Collider (LHC). The sensitivity to anomalies induced by a minimal set of bosonic dimension-6 operators from the Warsaw basis is evaluated with specific analyses for each final state. A likelihood-based strategy is employed to identify the most sensitive kinematic observables per channel, with the contribution of Effective Field Theory operators parameterized at either the linear or quadratic level. The impact of the mutual interference terms of pairs of operators on the sensitivity is also examined. This benchmark study explores the complementarity and overlap in sensitivity between different triboson measurements and paves the way for future analyses at the LHC experiments. The statistical combination of the considered final states allows setting stringent bounds on five bosonic Wilson coefficients.
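    The linear vs. quadratic parameterization mentioned above can be sketched for a single Wilson coefficient c as sigma(c) = sigma_SM + c*sigma_int + c^2*sigma_quad. A minimal one-bin counting toy (all cross-section and luminosity numbers are invented placeholders, not results from the study) shows how keeping the quadratic term reshapes the allowed interval from a chi-square scan against an SM-like pseudo-measurement:

    ```python
    import numpy as np

    # Invented placeholder inputs; not values from the triboson study
    sigma_sm, sigma_int, sigma_quad = 10.0, 2.0, 1.5   # cross sections in fb
    lumi = 100.0                                       # integrated luminosity, fb^-1

    def sigma(c, quadratic):
        """EFT cross section: SM + linear interference (+ optional quadratic term)."""
        return sigma_sm + c * sigma_int + (c**2 * sigma_quad if quadratic else 0.0)

    def allowed_interval(quadratic):
        """Envelope of coefficient values allowed at ~95% CL by a chi-square scan
        against the SM (Asimov) expectation."""
        cs = np.linspace(-5.0, 5.0, 2001)
        n_obs = sigma_sm * lumi
        n_exp = sigma(cs, quadratic) * lumi
        chi2 = (n_exp - n_obs) ** 2 / n_obs
        allowed = cs[chi2 < 3.84]       # chi2 threshold for one parameter, 95% CL
        return allowed.min(), allowed.max()

    lo_lin, hi_lin = allowed_interval(quadratic=False)
    lo_q, hi_q = allowed_interval(quadratic=True)
    print(f"linear only:    c in [{lo_lin:.3f}, {hi_lin:.3f}]")
    print(f"with quadratic: c in [{lo_q:.3f}, {hi_q:.3f}]")
    ```

    With only the linear term the interval is symmetric around zero; the quadratic term breaks that symmetry (and can even split the allowed region into disjoint pieces, of which this toy only reports the envelope).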

    Czochralski Silicon as a Detector Material for S-LHC Tracker Volumes

    4 pages, 6 figures, 12th Vienna Conference on Instrumentation. With an expected ten-fold increase in luminosity at the S-LHC, the radiation environment in the tracker volumes will be considerably harsher for silicon-based detectors than the already harsh LHC environment. Since 2006, a group of CMS institutes, using a modified CMS DAQ system, has been exploring the use of Magnetic Czochralski silicon as a detector element for the strip tracker layers in S-LHC experiments. Both p+/n-/n+ and n+/p-/p+ sensors have been characterized, irradiated with proton and neutron sources, assembled into modules, and tested in a CERN beamline. Three beam studies have been carried out to date, and their results suggest that both p+/n-/n+ and n+/p-/p+ Magnetic Czochralski silicon are sufficiently radiation hard for the R > 25 cm regions of S-LHC tracker volumes. The group has also explored the use of forward biasing for heavily irradiated detectors; although this mode requires sensor temperatures below −50 °C, the charge collection efficiency appears promising. Peer reviewed.

    Report of the Topical Group on Electroweak Precision Physics and Constraining New Physics for Snowmass 2021

    The precise measurement of physics observables and the test of their consistency within the standard model (SM) are an invaluable approach, complemented by direct searches for new particles, to determine the existence of physics beyond the standard model (BSM). Studies of massive electroweak gauge bosons (W and Z bosons) are a promising target for indirect BSM searches, since the interactions of photons and gluons are strongly constrained by the unbroken gauge symmetries. They can be divided into two categories: (a) fermion scattering processes mediated by s- or t-channel W/Z bosons, also known as electroweak precision measurements; and (b) multi-boson processes, which include production of two or more vector bosons in fermion-antifermion annihilation, as well as vector boson scattering (VBS) processes. The latter category can test modifications of gauge-boson self-interactions, and the sensitivity typically improves with increased collision energy. This report evaluates the achievable precision of a range of future experiments, which depends on the statistics of the collected data sample, the experimental and theoretical systematic uncertainties, and their correlations. In addition, it presents a combined interpretation of these results, together with similar studies in the Higgs and top sector, in the Standard Model effective field theory (SMEFT) framework. This framework provides a model-independent prescription to put generic constraints on new physics and to study and combine large sets of experimental observables, assuming that the new physics scales are significantly higher than the EW scale. Comment: 55 pages; Report of the EF04 topical group for Snowmass 2021

    Performance of the CMS High Granularity Calorimeter prototype to charged pion beams of 20–300 GeV/c

    The upgrade of the CMS experiment for the high luminosity operation of the LHC comprises the replacement of the current endcap calorimeter by a high granularity sampling calorimeter (HGCAL). The electromagnetic section of the HGCAL is based on silicon sensors interspersed between lead and copper (or copper-tungsten) absorbers. The hadronic section uses layers of stainless steel as an absorbing medium and silicon sensors as an active medium in the regions of high radiation exposure, and scintillator tiles directly read out by silicon photomultipliers in the remaining regions. As part of the development of the detector and its readout electronic components, a section of a silicon-based HGCAL prototype detector, along with a section of the CALICE AHCAL prototype, was exposed to muons, electrons, and charged pions in beam test experiments at the H2 beamline at the CERN SPS in October 2018. The AHCAL uses the same technology as foreseen for the HGCAL but with much finer longitudinal segmentation. The performance of the calorimeters in terms of energy response and resolution, as well as longitudinal and transverse shower profiles, is studied using negatively charged pions and compared to GEANT4 predictions. This is the first report summarizing results of hadronic showers measured by the HGCAL prototype using beam test data. Comment: To be submitted to JINST

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, given the human inability to predict the future precisely, as written in Al-Qur'an surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective is to minimize the variance among all portfolios, or alternatively, to maximize the expected return among all portfolios that have at least a certain expected return. This study focuses on optimizing the risk portfolio with the so-called Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks used in the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimize the objective function x^T Σ x subject to the constraints μ^T x ≥ r and e^T x = 1, where x is the vector of portfolio weights, Σ the covariance matrix of returns, μ the vector of expected returns, r the required return, and e the all-ones vector. The outcome of this research is the optimal risk-portfolio solution for a set of investments, obtained with MATLAB R2007b software together with graphical analysis.
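    The minimum-variance quadratic program described above can be solved with any convex-QP solver; the abstract uses MATLAB, but a minimal equivalent sketch in Python (with invented returns and covariances for three assets, and a no-short-selling bound added for illustration) is:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Invented example data for three assets (not from the paper)
    mu = np.array([0.08, 0.12, 0.15])          # expected annual returns
    Sigma = np.array([[0.04, 0.01, 0.00],      # covariance matrix of returns
                      [0.01, 0.09, 0.02],
                      [0.00, 0.02, 0.16]])
    r_target = 0.10                            # required minimum expected return

    # Convex QP: minimize w' Sigma w  s.t.  mu' w >= r_target, sum(w) = 1, w >= 0
    res = minimize(
        lambda w: w @ Sigma @ w,               # portfolio variance
        x0=np.full(3, 1 / 3),                  # start from equal weights
        constraints=[
            {"type": "ineq", "fun": lambda w: mu @ w - r_target},
            {"type": "eq",   "fun": lambda w: w.sum() - 1.0},
        ],
        bounds=[(0.0, 1.0)] * 3,               # no short selling
        method="SLSQP",
    )
    w = res.x
    print("optimal weights:", np.round(w, 3))
    print("expected return: %.4f, variance: %.4f" % (mu @ w, w @ Sigma @ w))
    ```

    Because the covariance matrix is positive semidefinite, the problem is convex and the SLSQP solution is a global minimum; a dedicated QP solver (e.g. the quadratic-programming routine the paper invokes in MATLAB) would return the same weights.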

    Differential cross section measurements for the production of a W boson in association with jets in proton–proton collisions at √s = 7 TeV