
    A Search for Boosted Low Mass Resonances Decaying to the bb̅ Final State and Produced in Association with a Jet at √s = 13 TeV with the ATLAS Detector

    A search in the high-momentum regime for new resonances, produced in association with a jet and decaying into a pair of bottom quarks, is presented using an integrated luminosity of 80.5 fb⁻¹ of proton-proton collisions at a center-of-mass energy of √s = 13 TeV recorded by the ATLAS detector at the Large Hadron Collider. The search was performed for low-mass resonances, including the Standard Model Higgs boson and leptophobic Z′ dark matter mediators, in the mass range of 100 GeV to 200 GeV. For the Standard Model Higgs boson, the observed signal strength is μH = 5.8 ± 3.1 (stat.) ± 1.9 (syst.) ± 1.7 (th.), which is consistent with the background-only hypothesis at 1.6 standard deviations. No evidence of a significant excess of events beyond the expected background is found, and competitive limits on leptophobic Z′ boson axial-vector couplings to Standard Model quarks, with democratic couplings to all quark generations, are set for the mass range considered. The dominant background in this analysis is irreducible multijet events from QCD interactions, which I modeled using a parametric function that was robust to fitting bias and spurious signals.
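
    The parametric background model mentioned in the last sentence is the technical core of the analysis. As a hedged illustration of that strategy, the sketch below fits a smoothly falling parametric function to a toy mass spectrum in the 100-200 GeV window; the functional form, parameter values, and data are generic stand-ins, not the thesis's actual parametrization.

    ```python
    # Illustrative sketch only: a generic smoothly falling dijet-style ansatz,
    # not the analysis's actual functional form or data.
    import numpy as np
    from scipy.optimize import curve_fit

    def background(m, p0, p1, p2):
        """Smoothly falling background in the scaled mass x = m / sqrt(s)."""
        x = m / 13000.0  # sqrt(s) = 13 TeV, in GeV
        return p0 * (1.0 - x) ** p1 * x ** (-p2)

    masses = np.linspace(100.0, 200.0, 40)        # GeV, the search window
    truth = background(masses, 0.01, 10.0, 3.0)   # toy "true" spectrum
    observed = np.random.default_rng(1).poisson(truth)

    params, _ = curve_fit(background, masses, observed, p0=[0.01, 10.0, 3.0])
    print("fitted parameters:", params)
    ```

    Robustness to fitting bias and spurious signal is then typically assessed by injecting known signals into many such toy spectra and checking that the fitted background does not absorb them.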

    Bayesian Methodologies with pyhf

    bayesian_pyhf is a Python package that allows for the parallel Bayesian and frequentist evaluation of multi-channel binned statistical models. The Python library pyhf is used to build such models according to the HistFactory framework and already includes many frequentist inference methodologies. The pyhf-built models are then used as data-generating models for Bayesian inference and evaluated with the Python library PyMC. Based on Markov chain Monte Carlo (MCMC) methods, PyMC allows for Bayesian modelling and, together with the arviz library, offers a wide range of Bayesian analysis tools. Comment: 8 pages, 3 figures, 1 listing. Contribution to the Proceedings of the 26th International Conference on Computing in High Energy and Nuclear Physics (CHEP 2023).
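
    Since the abstract describes running frequentist and Bayesian inference over the same pyhf-built model, the sketch below mirrors that idea on a toy two-bin counting model. It is not the bayesian_pyhf package's actual API: the PyMC model is hand-written and approximates pyhf's constraint terms with Gaussians, and all numbers are illustrative.

    ```python
    # Hedged sketch: the same two-bin model evaluated frequentist-style with pyhf
    # and Bayesian-style with a hand-written PyMC mirror (not the bayesian_pyhf API).
    import numpy as np
    import pyhf
    import pymc as pm

    # HistFactory-style model built with pyhf: signal + background with uncertainty
    model = pyhf.simplemodels.uncorrelated_background(
        signal=[5.0, 10.0], bkg=[50.0, 60.0], bkg_uncertainty=[7.0, 8.0]
    )
    observations = [53.0, 65.0]
    data = observations + model.config.auxdata

    # Frequentist inference already built into pyhf: CLs at signal strength mu = 1
    cls = pyhf.infer.hypotest(1.0, data, model, test_stat="qtilde")
    print(f"observed CLs: {cls:.3f}")

    # Bayesian inference: PyMC model with the same likelihood structure
    with pm.Model():
        mu = pm.Uniform("mu", lower=0.0, upper=5.0)    # flat prior on the POI
        gamma = pm.Normal("gamma", mu=1.0, sigma=np.array([0.14, 0.13]), shape=2)
        rate = mu * np.array([5.0, 10.0]) + gamma * np.array([50.0, 60.0])
        pm.Poisson("obs", mu=rate, observed=np.array(observations))
        idata = pm.sample(2000, tune=1000)             # MCMC sampling (NUTS)
    print("posterior mean of mu:", float(idata.posterior["mu"].mean()))
    ```

    The returned InferenceData object is what arviz consumes for summaries and diagnostics, e.g. arviz.summary(idata).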

    Distributed statistical inference with pyhf enabled through funcX

    In High Energy Physics, facilities that provide High Performance Computing environments offer an opportunity to efficiently perform the statistical inference required for analysis of data from the Large Hadron Collider, but they can pose problems with orchestration and efficient scheduling. The compute architectures at these facilities do not easily support the Python compute model, and the configuration and scheduling of batch jobs for physics often requires expertise in multiple job scheduling services. The combination of the pure-Python libraries pyhf and funcX reduces the common problem in HEP analyses of performing statistical inference with binned models, which would traditionally take multiple hours and bespoke scheduling, to an on-demand (fitting) "function as a service" that can scalably execute across workers in just a few minutes, offering reduced time to insight and inference. We demonstrate execution of a scalable workflow using funcX to simultaneously fit 125 signal hypotheses from a published ATLAS search for new physics using pyhf with a wall time of under 3 minutes. We additionally show performance comparisons for other physics analyses with openly published probability models and argue for a blueprint of fitting-as-a-service systems at HPC centers. Comment: 9 pages, 1 figure, 2 listings, 1 table, submitted to the 25th International Conference on Computing in High Energy & Nuclear Physics.
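
    To make the "fitting as a service" pattern concrete, here is a hedged sketch of submitting a pyhf fit through funcX; the endpoint UUID is a placeholder, the single inline workspace stands in for the 125 signal-hypothesis workspaces fitted in the paper, and the exact client API may differ between funcX SDK versions.

    ```python
    # Hedged sketch of fitting-as-a-service with funcX; endpoint UUID is a placeholder.
    from funcx import FuncXClient

    def fit_workspace(spec):
        # Imports live inside the function because funcX serializes it and ships
        # it to the remote endpoint, where it runs in that endpoint's environment.
        import pyhf
        ws = pyhf.Workspace(spec)
        model = ws.model()
        return pyhf.infer.mle.fit(ws.data(model), model).tolist()

    # A minimal single-bin pyhf workspace standing in for one signal hypothesis
    spec = {
        "channels": [{"name": "ch", "samples": [
            {"name": "signal", "data": [5.0],
             "modifiers": [{"name": "mu", "type": "normfactor", "data": None}]},
            {"name": "background", "data": [50.0], "modifiers": []},
        ]}],
        "observations": [{"name": "ch", "data": [53.0]}],
        "measurements": [{"name": "fit", "config": {"poi": "mu", "parameters": []}}],
        "version": "1.0.0",
    }

    fxc = FuncXClient()
    func_id = fxc.register_function(fit_workspace)
    endpoint_id = "<hpc-endpoint-uuid>"  # placeholder for a configured endpoint

    task_id = fxc.run(spec, endpoint_id=endpoint_id, function_id=func_id)
    print(fxc.get_result(task_id))  # in practice, poll until the task completes
    ```

    Scaling to many hypotheses is then a matter of submitting one task per workspace and gathering the results as they complete.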

    Deep Learning for the Matrix Element Method

    Extracting scientific results from high-energy collider data involves the comparison of data collected from the experiments with synthetic data produced from computationally-intensive simulations. Comparisons of experimental data and predictions from simulations increasingly utilize machine learning (ML) methods to try to overcome these computational challenges and enhance the data analysis. There is increasing awareness of the challenges surrounding the interpretability of ML models applied to data, and of the need to explain these models and validate scientific conclusions based upon them. The matrix element (ME) method is a powerful technique for analysis of particle collider data that utilizes an ab initio calculation of the approximate probability density function for a collision event to be due to a physics process of interest. The ME method has several unique and desirable features, including (1) not requiring training data, since it is an ab initio calculation of event probabilities, (2) incorporating all available kinematic information of a hypothesized process, including correlations, without the need for feature engineering, and (3) a clear physical interpretation in terms of transition probabilities within the framework of quantum field theory. These proceedings briefly describe an application of deep learning that dramatically speeds up ME method calculations and novel cyberinfrastructure developed to execute ME-based analyses on heterogeneous computing platforms. Comment: 6 pages, 3 figures. Contribution to the Proceedings of the ICHEP 2022 Conference.
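
    As a hedged illustration of the speed-up idea (not the proceedings' actual network or training data), the sketch below trains a small regression surrogate on precomputed matrix-element values, after which evaluating the surrogate replaces the expensive ab initio calculation.

    ```python
    # Illustrative sketch with synthetic stand-in data: a fast neural-network
    # surrogate for an expensive matrix-element calculation.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(10_000, 8))  # stand-in for event kinematics
    y = np.log(1e-3 + (X**2).sum(axis=1))         # stand-in for log |M|^2 values
                                                  # a generator would provide

    surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
    surrogate.fit(X, y)                           # one-time training cost

    print(surrogate.predict(X[:5]))               # fast approximate log |M|^2
    ```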

    Reinterpretation and Long-Term Preservation of Data and Code

    Careful preservation of experimental data, simulations, analysis products, and theoretical work maximizes their long-term scientific return on investment by enabling new analyses and reinterpretation of the results in the future. Key infrastructure and technical developments needed for some high-value science targets are not in scope for the operations program of the large experiments and are often not effectively funded. Increasingly, the science goals of our projects require contributions that span the boundaries between individual experiments and surveys, and between the theoretical and experimental communities. Furthermore, the computational requirements and technical sophistication of this work are increasing. As a result, it is imperative that the funding agencies create programs that can devote significant resources to these efforts outside of the context of the operations of individual major experiments, including smaller experiments and theory/simulation work. In this Snowmass 2021 Computational Frontier topical group report (CompF7: Reinterpretation and long-term preservation of data and code), we summarize the current state of the field and make recommendations for the future. Comment: Snowmass 2021 Computational Frontier CompF7 Reinterpretation and long-term preservation of data and code topical group report.

    Software Citation in HEP: Current State and Recommendations for the Future

    In November 2022, the HEP Software Foundation (HSF) and the Institute for Research and Innovation for Software in High-Energy Physics (IRIS-HEP) organized a workshop on the topic of Software Citation and Recognition in HEP. The goal of the workshop was to bring together different types of stakeholders whose roles relate to software citation, and the associated credit it provides, in order to engage the community in a discussion on: the ways HEP experiments handle citation of software, recognition for software efforts that enable physics results disseminated to the public, and how the scholarly publishing ecosystem supports these activities. Reports were given from the publication board leadership of the ATLAS, CMS, and LHCb experiments and from HEP open source software community organizations (ROOT, Scikit-HEP, MCnet), and perspectives were given from publishers (Elsevier, JOSS) and related tool providers (INSPIRE, Zenodo). This paper summarizes key findings and recommendations from the workshop as presented at the 26th International Conference on Computing in High Energy and Nuclear Physics (CHEP 2023). Comment: 7 pages, 2 listings. Contribution to the Proceedings of the 26th International Conference on Computing in High Energy and Nuclear Physics (CHEP 2023).

    The Scikit-HEP Project -- overview and prospects

    Scikit-HEP is a community-driven and community-oriented project with the goal of providing an ecosystem for particle physics data analysis in Python. Scikit-HEP is a toolset of approximately twenty packages and a few "affiliated" packages. It expands the typical Python data analysis tools for particle physicists. Each package focuses on a particular topic and interacts with other packages in the toolset, where appropriate. Most of the packages are easy to install in many environments; much work has been done this year to provide binary "wheels" on PyPI and conda-forge packages. The Scikit-HEP project has been gaining interest and momentum by building a user and developer community that engages in collaboration across experiments. Some of the packages are being used by other communities, including the astroparticle physics community. An overview of the overall project and toolset will be presented, as well as a vision for development and sustainability. Comment: 6 pages, 3 figures, Proceedings of the 24th International Conference on Computing in High Energy and Nuclear Physics (CHEP 2019), Adelaide, Australia, 4-8 November 2019.
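
    As a flavor of how the toolset's packages interoperate, the sketch below combines two Scikit-HEP packages, awkward (jagged event data) and hist (histogramming); the event values are toy numbers.

    ```python
    # Two Scikit-HEP packages working together; all values are toy numbers.
    import awkward as ak
    import hist

    # Three events with a variable number of jet transverse momenta each
    events = ak.Array({"jet_pt": [[12.1, 33.5], [5.0], [27.3, 8.8, 41.0]]})
    leading_pt = ak.max(events.jet_pt, axis=1)  # per-event leading jet pT

    h = hist.Hist.new.Reg(10, 0, 50, name="pt", label="Leading jet pT [GeV]").Double()
    h.fill(pt=ak.to_numpy(leading_pt))
    print(h)
    ```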

    Data Science and Machine Learning in Education

    The growing role of data science (DS) and machine learning (ML) in high-energy physics (HEP) is well established and pertinent given the complex detectors, large data sets, and sophisticated analyses at the heart of HEP research. Moreover, exploiting symmetries inherent in physics data has inspired physics-informed ML as a vibrant sub-field of computer science research. HEP researchers benefit greatly from widely available materials for use in education, training, and workforce development. They are also contributing to these materials and providing software to DS/ML-related fields. Increasingly, physics departments are offering courses at the intersection of DS, ML, and physics, often using curricula developed by HEP researchers and involving open software and data used in HEP. In this white paper, we explore synergies between HEP research and DS/ML education, discuss opportunities and challenges at this intersection, and propose community activities that will be mutually beneficial. Comment: Contribution to Snowmass 2021.

    Reinterpretation of LHC Results for New Physics: Status and recommendations after Run 2

    We report on the status of efforts to improve the reinterpretation of searches and measurements at the LHC in terms of models for new physics, in the context of the LHC Reinterpretation Forum. We detail current experimental offerings in direct searches for new particles, measurements, technical implementations, and Open Data, and provide a set of recommendations for further improving the presentation of LHC results in order to better enable reinterpretation in the future. We also provide a brief description of existing software reinterpretation frameworks and recent global analyses of new physics that make use of the current data.
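
    As one concrete entry point to such reinterpretation, some experiments publish full probability models as pyhf JSON (for example, ATLAS likelihoods on HEPData). The sketch below loads one such workspace and runs a hypothesis test; the filename is a placeholder, and a real reinterpretation would additionally patch in a new signal model.

    ```python
    # Hedged sketch of reusing a published probability model; the filename is a
    # placeholder for a pyhf JSON workspace downloaded from HEPData.
    import json
    import pyhf

    with open("published_workspace.json") as f:
        ws = pyhf.Workspace(json.load(f))

    model = ws.model()   # a real reinterpretation would patch in a new signal here
    data = ws.data(model)

    # Observed CLs at signal strength mu = 1 for the (possibly patched) hypothesis
    cls = pyhf.infer.hypotest(1.0, data, model, test_stat="qtilde")
    print(f"CLs(mu=1) = {cls:.3f}")
    ```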