
    FPGA-Based Tracklet Approach to Level-1 Track Finding at CMS for the HL-LHC

    During High-Luminosity LHC operation, the CMS detector will need charged particle tracking at the hardware trigger level to maintain a manageable trigger rate and achieve its physics goals. The tracklet approach is a road-search-based track-finding algorithm that has been implemented on commercially available FPGA technology. The tracklet algorithm achieves high track-finding performance and completes tracking within 3.4 μs on a Xilinx Virtex-7 FPGA. An overview of the algorithm and its implementation on an FPGA is given, results are shown from a demonstrator test stand, and system performance studies are presented. Comment: Submitted to proceedings of Connecting The Dots / Intelligent Trackers 2017, Orsay, France.
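    As a toy illustration of the tracklet idea only (not the CMS firmware, which is fixed-point FPGA logic), the sketch below seeds a tracklet from a stub pair in two adjacent layers using the linearized helix approximation phi(r) ≈ phi0 − k·r, projects it to another layer, and keeps stubs inside a matching window. The layer radii, stub values, and window size are illustrative assumptions.

        import math

        def seed_tracklet(stub_inner, stub_outer):
            """Seed phi(r) ~ phi0 - k*r from two (r, phi) stubs, beamline-constrained."""
            r1, phi1 = stub_inner
            r2, phi2 = stub_outer
            k = (phi1 - phi2) / (r2 - r1)   # ~ half the track curvature (small-angle limit)
            phi0 = phi1 + k * r1            # azimuth extrapolated to the beamline
            return phi0, k

        def match_stubs(phi0, k, r_layer, stubs, window=0.01):
            """Keep stubs whose azimuthal residual to the projection is inside the window."""
            phi_exp = phi0 - k * r_layer    # projected azimuth at radius r_layer
            return [(r, phi) for r, phi in stubs
                    if abs(math.remainder(phi - phi_exp, 2 * math.pi)) < window]

        # Illustrative numbers: seed at r = 25 cm and 35 cm, project to r = 50 cm.
        phi0, k = seed_tracklet((25.0, 1.000), (35.0, 0.990))
        print(match_stubs(phi0, k, 50.0, [(50.0, 0.975), (50.0, 1.100)]))  # keeps only the first stub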

    Using a Neural Network to Approximate the Negative Log Likelihood Function

    An increasingly frequent challenge in HEP data analysis is to characterize the agreement between the observed data and a prediction that depends on a dozen or more model parameters, such as predictions from an effective field theory (EFT) framework. Traditionally, such characterizations take the form of a negative log likelihood (NLL) function, which can only be evaluated numerically. The lack of a closed-form description of the NLL function makes it difficult to convey the results of the statistical analysis. Typical results are limited to “best fit” values of the model parameters and 1D intervals or 2D contours extracted from scans of the higher-dimensional parameter space. It is desirable to explore these high-dimensional model parameter spaces in more sophisticated ways. One option for overcoming this challenge is to use a neural network to approximate the NLL function. The approximation is continuous and differentiable by construction, which are essential properties for an NLL function and may also provide useful handles for exploring the NLL as a function of the model parameters. In this talk, we describe the advantages and limitations of this approach in the context of a CMS data analysis using an EFT framework.
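    A minimal sketch of the surrogate idea under stated assumptions (this is not the analysis code; the parameter count, architecture, and training data below are illustrative stand-ins): fit a small multilayer perceptron to pairs of (parameter point, numerically evaluated NLL). Because the network is continuous and differentiable in its inputs, gradients over the model-parameter space come for free from autograd.

        import torch
        import torch.nn as nn

        n_params = 16                                # stand-in for the EFT parameter count
        surrogate = nn.Sequential(                   # small MLP: parameter point -> NLL estimate
            nn.Linear(n_params, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

        # Training pairs: sampled parameter points and their numerically evaluated
        # NLLs. Both are synthetic here; a quadratic mimics an NLL near its minimum.
        theta = torch.randn(4096, n_params)
        nll = (theta ** 2).sum(dim=1, keepdim=True)

        opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
        for _ in range(200):                         # simple mean-squared-error regression
            opt.zero_grad()
            nn.functional.mse_loss(surrogate(theta), nll).backward()
            opt.step()

        # The payoff: gradients of the learned NLL surface at any parameter point.
        point = torch.zeros(1, n_params, requires_grad=True)
        surrogate(point).sum().backward()
        print(point.grad)                            # dNLL/dtheta, usable for minimization or contours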

    Secondary hyperparathyroidism: Predictors and relationship with vitamin D status, bone turnover markers and bone mineral density

    Secondary hyperparathyroidism (SHPT) has adverse implications for bone health but is relatively understudied. In this study we examine the prevalence and determinants of SHPT and describe the relationship of SHPT with bone turnover markers and bone mineral density (BMD) in older Irish adults. Eligible participants (n = 4139) were identified from the Trinity-Ulster-Department of Agriculture (TUDA) study, a cohort of Irish adults aged ≥60 years. Exclusion criteria included an estimated glomerular filtration rate (eGFR) below a specified cut-off and calcium above 2.5 mmol/l, to remove hyperparathyroidism due to advanced chronic kidney disease (CKD) and primary hyperparathyroidism, respectively. The relationship between SHPT and bone turnover markers and BMD (measured by densitometry) was examined in a subsample (n = 1488). Vitamin D deficiency was defined as a serum 25-hydroxyvitamin D [25(OH)D] concentration below a specified cut-off.
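    The cohort-selection rule described above can be sketched as a simple filter. The numeric cut-offs and column names below are hypothetical placeholders, not the study's values (the abstract's eGFR threshold in particular is elided in the text above).

        import pandas as pd

        # Hypothetical placeholders only; NOT the TUDA study's thresholds.
        EGFR_CUTOFF_ML_MIN = 30.0     # assumed cut-off used to exclude advanced CKD
        CALCIUM_CUTOFF_MMOL_L = 2.5   # from the abstract: excludes primary hyperparathyroidism

        def select_cohort(df: pd.DataFrame) -> pd.DataFrame:
            """Drop participants whose raised PTH is explained by CKD or primary HPT."""
            keep = (df["egfr"] >= EGFR_CUTOFF_ML_MIN) & (df["calcium"] <= CALCIUM_CUTOFF_MMOL_L)
            return df[keep]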

    The U.S. CMS HL-LHC R&D Strategic Plan

    The HL-LHC run is anticipated to start at the end of this decade and will pose a significant challenge for the scale of the HEP software and computing infrastructure. The mission of the U.S. CMS Software & Computing Operations Program is to develop and operate the software and computing resources necessary to process CMS data expeditiously and to enable U.S. physicists to fully participate in the physics of CMS. We have developed a strategic plan to prioritize R&D efforts to reach this goal for the HL-LHC. This plan includes four grand challenges: modernizing physics software and improving algorithms, building infrastructure for exabyte-scale datasets, transforming the scientific data analysis process, and transitioning from R&D to operations. We are involved in a variety of R&D projects that fall within these grand challenges. In this talk, we will introduce our four grand challenges and outline the R&D program of the U.S. CMS Software & Computing Operations Program. Comment: CHEP2023 proceedings, to be published in EPJ Web of Conferences.

    Measurements of the Production, Decay and Properties of the Top Quark: A Review

    With the full Tevatron Run II and early LHC data samples, the opportunity to further our understanding of the properties of the top quark has never been more promising. Although the current knowledge of the top quark comes largely from Tevatron measurements, the experiments at the LHC are poised to probe top-quark production and decay in unprecedented regimes. While no current top quark measurement conclusively contradicts predictions of the standard model, the precision of most measurements remains statistically limited. Additionally, some measurements, most notably the forward-backward asymmetry in top quark pair production, show tantalizing hints of beyond-the-standard-model dynamics. The top quark sample is growing rapidly at the LHC, with initial results now public. This review examines the current status of top quark measurements, with a particular focus on the search for evidence of new physics, either directly via searches for beyond-the-standard-model phenomena or indirectly via precise measurements of standard-model top quark properties.
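    For reference, the forward-backward asymmetry mentioned above is conventionally defined at the Tevatron in terms of the rapidity difference between the top quark and antiquark:

        \[
          A_{FB}^{t\bar{t}} \;=\; \frac{N(\Delta y > 0) - N(\Delta y < 0)}{N(\Delta y > 0) + N(\Delta y < 0)},
          \qquad \Delta y = y_t - y_{\bar{t}}
        \]

    The standard model predicts a small positive value at next-to-leading order in QCD, which is why an anomalously large measured asymmetry was read as a possible hint of new dynamics.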
