    Comparison of Horace and Photos Algorithms for Multi-Photon Emission in the Context of the W Boson Mass Measurement

    The W boson mass measurement is sensitive to QED radiative corrections due to virtual photon loops and real photon emission. The largest shift in the measured mass, which depends on the transverse momentum spectrum of the charged lepton from the boson decay, is caused by the emission of real photons from the final-state lepton. There are a number of calculations and codes available to model final-state photon emission. We perform a detailed study comparing the results from the Horace and Photos implementations of final-state multi-photon emission in the context of a direct measurement of the W boson mass at the Tevatron. Mass fits are performed using a simulation of the CDF II detector.
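
    The mass fit mentioned above is, at heart, a template comparison: simulated lepton-pT distributions for hypothesized W masses are compared with the data and the best-matching mass is taken. Below is a minimal toy sketch of that idea; the Gaussian pT shape, all numbers, and the chi-square metric are invented for illustration and bear no relation to the actual CDF II simulation.

```python
# Toy sketch of a binned template fit for m_W from the lepton pT spectrum.
# The Gaussian shape and all numbers are illustrative assumptions only.
import numpy as np

def template(m_w, edges, n_events=200_000, seed=0):
    """Normalized toy lepton-pT histogram for a hypothesized W mass (GeV)."""
    rng = np.random.default_rng(seed)
    # Toy model: lepton pT clusters near m_W / 2 (the Jacobian edge region).
    pt = rng.normal(loc=m_w / 2.0, scale=5.0, size=n_events)
    hist, _ = np.histogram(pt, bins=edges)
    return hist / hist.sum()

edges = np.linspace(25.0, 55.0, 61)        # 0.5 GeV bins
data = template(80.4, edges, seed=1)       # pseudo-data at m_W = 80.4 GeV

# Scan hypothesis masses; keep the template closest to the pseudo-data.
masses = np.arange(80.0, 80.8, 0.02)
chi2 = [np.sum((data - template(m, edges)) ** 2) for m in masses]
print(f"fitted m_W ~ {masses[int(np.argmin(chi2))]:.2f} GeV")
```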

    Chemical fracture and distribution of extreme values

    When a corrosive solution reaches the limits of a solid sample, a chemical fracture occurs. An analytical theory for the probability of this chemical fracture is proposed and confirmed by extensive numerical experiments on a two-dimensional model. This theory follows from the general probability theory of extreme events given by Gumbel. The analytic law differs from the Weibull law commonly used to describe mechanical failures of brittle materials. However, a three-parameter fit with the Weibull law gives good results, confirming the empirical value of this kind of analysis.
    Comment: 7 pages, 5 figures, to appear in Europhysics Letters
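
    For reference, the two limiting laws contrasted above can be written in generic form (a sketch in our own notation; $\mu$, $\beta$, $x_0$, $\lambda$, $k$ are free location/scale/shape parameters, not the paper's fitted values):

```latex
% Generic forms of the two extreme-value laws contrasted above.
% \mu, \beta (location, scale) and x_0, \lambda, k (threshold, scale, shape)
% are free parameters, not the paper's fitted values.
F_{\mathrm{Gumbel}}(x) = \exp\!\left[-\exp\!\left(-\frac{x-\mu}{\beta}\right)\right],
\qquad
F_{\mathrm{Weibull}}(x) = 1 - \exp\!\left[-\left(\frac{x-x_0}{\lambda}\right)^{k}\right]
```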

    Data production models for the CDF experiment

    The data production for the CDF experiment is conducted on a large Linux PC farm designed to meet the needs of data collection at a maximum rate of 40 MByte/sec. We present two data production models that exploit advances in computing and communication technology. The first production farm is a centralized system that has achieved a stable data processing rate of approximately 2 TByte per day. The recently upgraded farm has been migrated to the SAM (Sequential Access to data via Metadata) data handling system. The software and hardware of the CDF production farms have been successful in providing large computing and data throughput capacity to the experiment.
    Comment: 8 pages, 9 figures; presented at HPC Asia 2005, Beijing, China, Nov 30 - Dec 3, 2005

    Data processing model for the CDF experiment

    The data processing model for the CDF experiment is described. Data processing reconstructs events from parallel data streams taken with different combinations of physics event triggers and further splits the events into specialized physics datasets. The design of the processing control system faces strict requirements on bookkeeping records, which trace the status of data files and event contents during processing and storage. The computing architecture was updated to meet the mass data flow of the Run II data collection, recently upgraded to a maximum rate of 40 MByte/sec. The data processing facility consists of a large cluster of Linux computers, with data movement managed by the CDF data handling system to a multi-petabyte Enstore tape library. The latest processing cycle has achieved a stable rate of 35 MByte/sec (3 TByte/day). It can be readily scaled by increasing CPU and data-handling capacity as required.
    Comment: 12 pages, 10 figures, submitted to IEEE-TNS
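
    The quoted daily throughput follows directly from the sustained rate:

```latex
35\ \mathrm{MByte/s} \times 86{,}400\ \mathrm{s/day}
  \approx 3.0 \times 10^{6}\ \mathrm{MByte/day}
  \approx 3\ \mathrm{TByte/day}
```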

    CMS Connect

    The CMS experiment collects and analyzes large amounts of data coming from high-energy particle collisions produced by the Large Hadron Collider (LHC) at CERN. This involves a huge amount of real and simulated data processing that needs to be handled on batch-oriented platforms. The CMS Global Pool of computing resources provides more than 100K dedicated CPU cores, plus another 50K to 100K CPU cores from opportunistic resources, for these kinds of tasks. Even though production and event-processing analysis workflows are already managed by existing tools, there is still no friendly way, integrated with other CMS services, for users familiar with Tier-3 or local computing facilities to submit final-stage Condor-style analysis jobs to these distributed resources. CMS Connect is a set of computing tools and services designed to augment existing services in the CMS physics community, focusing on this kind of Condor analysis job. It is based on the CI-Connect platform developed by the Open Science Grid and uses the CMS GlideInWMS infrastructure to transparently plug CMS global grid resources into a virtual pool accessed via a single submission machine. This paper describes the developments and deployment of CMS Connect beyond the CI-Connect platform that integrate the service with CMS-specific needs, including site-specific submission, job accounting, and automated reporting to standard CMS monitoring resources, in a way that is effortless for users.
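
    The "Condor-style analysis jobs" referred to above are described by a submit file handed to a scheduler. A minimal generic HTCondor submit description is sketched below; the executable and file names are placeholders, and any CMS Connect-specific attributes are deliberately omitted since the paper does not list them.

```
# Minimal generic HTCondor submit description (placeholder names only;
# CMS Connect-specific attributes are omitted).
universe             = vanilla
executable           = run_analysis.sh
arguments            = $(Process)
transfer_input_files = analysis_config.txt
output               = job_$(Process).out
error                = job_$(Process).err
log                  = job.log
queue 10
```

    Submitting this with condor_submit from the single submission machine fans the ten jobs out across the virtual pool.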

    Hands off: a handshake interaction detection and localization model for COVID-19 threat control

    A real-time handshake interaction localization model that may help mitigate the threat of COVID-19 transmission is presented, using computer vision as a non-intrusive technique. A real-time detection model (using YOLO, "you only look once") is proposed to identify handshake interactions in realistic scenarios. YOLO can detect multiple interactions in a single frame. The model can be applied in public spaces to identify handshake interactions. The study is the first to use a human interaction localization model in a multi-person setting. YOLO is a convolutional neural network (CNN) for real-time object detection.
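
    As a rough illustration of how such a detector is driven in practice, the sketch below runs a YOLO model over camera frames and draws the returned boxes. It assumes the ultralytics Python package (one of several YOLO implementations; the paper does not specify its framework) and a hypothetical custom weights file "handshake.pt" trained on a handshake class.

```python
# Illustrative sketch: run a YOLO detector on live camera frames and
# draw bounding boxes. "handshake.pt" is a hypothetical weights file;
# the ultralytics package is an assumption, not the paper's stated stack.
import cv2
from ultralytics import YOLO

model = YOLO("handshake.pt")  # hypothetical custom-trained weights

cap = cv2.VideoCapture(0)  # default camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    for result in model(frame, verbose=False):  # one Results object per frame
        for box in result.boxes:
            x1, y1, x2, y2 = map(int, box.xyxy[0])  # box corners in pixels
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 0, 255), 2)
    cv2.imshow("handshake detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```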

    Snowmass 2021 Computational Frontier CompF4 Topical Group Report: Storage and Processing Resource Access

    The Snowmass 2021 CompF4 topical group’s scope is facilities R&D, where we consider “facilities” to be the hardware and software infrastructure inside data centers, plus the networking between data centers, irrespective of who owns them and what policies govern their use. In other words, it includes commercial clouds, federally funded High Performance Computing (HPC) systems for all of science, and systems funded explicitly for a given experimental or theoretical program. However, we explicitly consider out of scope any data centers that are integrated into the data acquisition or trigger systems of the experiments. Those systems tend to have requirements that are quite distinct from the data center functionality required for “offline” processing and storage.

    Search for the Higgs boson in events with missing transverse energy and b quark jets produced in proton-antiproton collisions at s**(1/2)=1.96 TeV

    We search for the standard model Higgs boson produced in association with an electroweak vector boson in events with no identified charged leptons, a large imbalance in transverse momentum, and two jets where at least one contains a secondary vertex consistent with the decay of b hadrons. We use ~1 fb^-1 of integrated luminosity of proton-antiproton collisions at s**(1/2)=1.96 TeV recorded by the CDF II experiment at the Tevatron. We find 268 (16) single (double) b-tagged candidate events, where 248 +/- 43 (14.4 +/- 2.7) are expected from standard model background processes. We place 95% confidence level upper limits on the Higgs boson production cross section for several Higgs boson masses ranging from 110 GeV/c^2 to 140 GeV/c^2. For a mass of 115 GeV/c^2 the observed (expected) limit is 20.4 (14.2) times the standard model prediction.
    Comment: 8 pages, 2 figures, submitted to Phys. Rev. Lett.
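
    As a quick, purely illustrative plausibility check on the counting numbers above (the published limits come from a full likelihood treatment, not this naive Gaussian comparison):

```python
# Naive check: compare observed counts to expected background, combining
# the Poisson fluctuation with the quoted background uncertainty.
# Illustrative only; the paper's limits use a full likelihood treatment.
from math import sqrt

for n_obs, bkg, d_bkg, tag in [(268, 248.0, 43.0, "single"),
                               (16, 14.4, 2.7, "double")]:
    z = (n_obs - bkg) / sqrt(bkg + d_bkg**2)
    print(f"{tag} b-tag: excess ~ {z:.2f} sigma")
```

    Both excesses come out well under one standard deviation, consistent with the paper seeing no signal and proceeding to set upper limits.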

    A search for resonant production of $t\bar{t}$ pairs in $4.8\ \rm{fb}^{-1}$ of integrated luminosity of $p\bar{p}$ collisions at $\sqrt{s}=1.96\ \rm{TeV}$

    We search for resonant production of ttbar pairs in 4.8 fb^{-1} of integrated luminosity of ppbar collision data at sqrt{s}=1.96 TeV in the lepton+jets decay channel, where one top quark decays leptonically and the other hadronically. A matrix element reconstruction technique is used; for each event a probability density function (pdf) of the ttbar candidate invariant mass is sampled. These pdfs are used to construct a likelihood function, whereby the cross section for resonant ttbar production is estimated, given a hypothetical resonance mass and width. The data indicate no evidence of resonant production of ttbar pairs. A benchmark model of leptophobic Z' -> ttbar is excluded with m_{Z'} < 900 GeV at 95% confidence level.
    Comment: accepted for publication in Physical Review D Sep 21, 2011
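
    Schematically, the likelihood construction described above takes the following form (a sketch in our own notation, not the paper's exact definition): per event $i$, the sampled invariant-mass pdf $P_i(m)$ is folded with signal and background mass templates, and the signal fraction $f(\sigma)$ encodes the hypothesized resonant cross section $\sigma$:

```latex
% Schematic likelihood for the resonant ttbar cross section (our notation).
\mathcal{L}(\sigma) = \prod_{i=1}^{N_{\mathrm{ev}}}
  \Big[ f(\sigma)\, p_i^{\mathrm{sig}} + \big(1 - f(\sigma)\big)\, p_i^{\mathrm{bkg}} \Big],
\qquad
p_i^{\mathrm{sig/bkg}} = \int P_i(m)\, T^{\mathrm{sig/bkg}}(m)\, \mathrm{d}m
```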

    Observation and Mass Measurement of the Baryon $\Xi^-_b$

    We report the observation and measurement of the mass of the bottom, strange baryon $\Xi^-_b$ through the decay chain $\Xi^-_b \to J/\psi\, \Xi^-$, where $J/\psi \to \mu^+\mu^-$, $\Xi^- \to \Lambda\pi^-$, and $\Lambda \to p\pi^-$. Evidence for observation is based on a signal whose probability of arising from the estimated background is $6.6 \times 10^{-15}$, or 7.7 Gaussian standard deviations. The $\Xi^-_b$ mass is measured to be $5792.9 \pm 2.5$ (stat.) $\pm 1.7$ (syst.) MeV/$c^2$.
    Comment: Minor text changes for the second version. Accepted by Phys. Rev. Lett.
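
    The quoted significance can be cross-checked by converting the background-only probability to one-sided Gaussian standard deviations:

```python
# Convert the quoted background-only probability to a one-sided
# Gaussian significance (should reproduce ~7.7 sigma).
from scipy.stats import norm

p = 6.6e-15
z = norm.isf(p)  # inverse survival function: tail probability -> z-value
print(f"p = {p:.1e}  ->  z = {z:.1f} sigma")
```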