Software Citation in HEP: Current State and Recommendations for the Future
In November 2022, the HEP Software Foundation (HSF) and the Institute for
Research and Innovation for Software in High-Energy Physics (IRIS-HEP)
organized a workshop on the topic of Software Citation and Recognition in HEP.
The goal of the workshop was to bring together different types of stakeholders
whose roles relate to software citation and the associated credit it provides
in order to engage the community in a discussion on: the ways HEP experiments
handle citation of software, recognition for software efforts that enable
physics results disseminated to the public, and how the scholarly publishing
ecosystem supports these activities. Reports were given from the publication
board leadership of the ATLAS, CMS, and LHCb experiments and HEP open source
software community organizations (ROOT, Scikit-HEP, MCnet), and perspectives
were given from publishers (Elsevier, JOSS) and related tool providers
(INSPIRE, Zenodo). This paper summarizes key findings and recommendations from
the workshop as presented at the 26th International Conference on Computing in
High Energy and Nuclear Physics (CHEP 2023).
Comment: 7 pages, 2 listings. Contribution to the Proceedings of the 26th International Conference on Computing in High Energy and Nuclear Physics (CHEP 2023).
CMS Partial Releases: Model, Tools, and Applications. Online and Framework-Light Releases
The CMS Software project CMSSW comprises more than a thousand packages organized in subsystems for analysis, event display, reconstruction, simulation, detector description, data formats, framework, utilities, and tools. The release integration process is highly automated using tools developed or adopted by CMS. Packaging in rpm format is a built-in step in the software build process. For several well-defined applications it is highly desirable to have only a subset of the full CMSSW package bundle. For example, High Level Trigger algorithms that run on the Online farm, and need to be rebuilt in a special way, require no simulation, event display, or analysis packages. Physics analysis applications in a ROOT environment require only a few core libraries and the description of CMS-specific data formats. We present a model of CMS Partial Releases, used for the preparation of customized CMS software builds, including a description of the tools used, the implementation, and how we deal with technical challenges, such as resolving dependencies and meeting the special requirements of concrete applications in a highly automated fashion.
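The core of the dependency-resolution step described above can be sketched as a transitive closure over a package dependency graph: starting from the packages an application needs, pull in everything they depend on and nothing else. The package names and dependency map below are invented for illustration; they are not the actual CMSSW package graph or build tooling.

```python
def resolve_closure(requested, deps):
    """Return the transitive closure of `requested` under the `deps` map."""
    needed = set()
    stack = list(requested)
    while stack:
        pkg = stack.pop()
        if pkg in needed:
            continue
        needed.add(pkg)
        stack.extend(deps.get(pkg, ()))
    return needed

# Hypothetical fragment of a package dependency map (illustrative names only).
deps = {
    "HLTrigger": ["RecoTracker", "DataFormats"],
    "RecoTracker": ["DataFormats", "FWCore"],
    "DataFormats": ["FWCore"],
    "FWCore": [],
    "EventDisplay": ["DataFormats"],  # not pulled into the Online bundle
}

# An Online (HLT) partial release starts only from the trigger packages.
online_bundle = resolve_closure(["HLTrigger"], deps)
# EventDisplay stays out, since nothing in the HLT closure depends on it.
```

In practice the interesting work lies on top of this closure: rebuilding selected packages with special flags and packaging the resulting subset as rpms, as the abstract describes.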
The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe
The preponderance of matter over antimatter in the early Universe, the
dynamics of the supernova bursts that produced the heavy elements necessary for
life and whether protons eventually decay --- these mysteries at the forefront
of particle physics and astrophysics are key to understanding the early
evolution of our Universe, its current state and its eventual fate. The
Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed
plan for a world-class experiment dedicated to addressing these questions. LBNE
is conceived around three central components: (1) a new, high-intensity
neutrino source generated from a megawatt-class proton accelerator at Fermi
National Accelerator Laboratory, (2) a near neutrino detector just downstream
of the source, and (3) a massive liquid argon time-projection chamber deployed
as a far detector deep underground at the Sanford Underground Research
Facility. This facility, located at the site of the former Homestake Mine in
Lead, South Dakota, is approximately 1,300 km from the neutrino source at
Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino
charge-parity symmetry violation and mass ordering effects. This ambitious yet
cost-effective design incorporates scalability and flexibility and can
accommodate a variety of upgrades and contributions. With its exceptional
combination of experimental configuration, technical capabilities, and
potential for transformative discoveries, LBNE promises to be a vital facility
for the field of particle physics worldwide, providing physicists from around
the globe with opportunities to collaborate in a twenty to thirty year program
of exciting science. In this document we provide a comprehensive overview of
LBNE's scientific objectives, its place in the landscape of neutrino physics
worldwide, the technologies it will incorporate and the capabilities it will
possess.
Comment: Major update of previous version. This is the reference document for the LBNE science program and current status. Chapters 1, 3, and 9 provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate, and the capabilities it will possess. 288 pages, 116 figures.
Measurement of the Lifetime Difference Between B_s Mass Eigenstates
We present measurements of the lifetimes and polarization amplitudes for B_s
--> J/psi phi and B_d --> J/psi K*0 decays. Lifetimes of the heavy (H) and
light (L) mass eigenstates in the B_s system are separately measured for the
first time by determining the relative contributions of amplitudes with
definite CP as a function of the decay time. Using 203 +/- 15 B_s decays, we
obtain tau_L = (1.05 +{0.16}/-{0.13} +/- 0.02) ps and tau_H = (2.07
+{0.58}/-{0.46} +/- 0.03) ps. Expressed in terms of the difference DeltaGamma_s
and average Gamma_s, of the decay rates of the two eigenstates, the results are
DeltaGamma_s/Gamma_s = (65 +{25}/-{33} +/- 1)%, and DeltaGamma_s = (0.47
+{0.19}/-{0.24} +/- 0.01) inverse ps.
Comment: 8 pages, 3 figures, 2 tables; as published in Physical Review Letters on 16 March 2005; revisions are for length and typesetting only, no changes in results or conclusions.
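The quoted rate difference and ratio follow directly from the two measured lifetimes via Gamma = 1/tau, DeltaGamma_s = Gamma_L - Gamma_H, and Gamma_s = (Gamma_L + Gamma_H)/2. A quick numerical cross-check of the central values (uncertainties not propagated here):

```python
# Central values from the abstract, in picoseconds.
tau_L, tau_H = 1.05, 2.07

# Decay rates of the light and heavy eigenstates, in inverse ps.
Gamma_L, Gamma_H = 1.0 / tau_L, 1.0 / tau_H

dGamma = Gamma_L - Gamma_H          # DeltaGamma_s, about 0.47 inverse ps
Gamma_s = 0.5 * (Gamma_L + Gamma_H) # average decay rate
ratio = dGamma / Gamma_s            # about 0.65, i.e. the quoted 65%
```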
HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation
At the heart of experimental high energy physics (HEP) is the development of
facilities and instrumentation that provide sensitivity to new phenomena. Our
understanding of nature at its most fundamental level is advanced through the
analysis and interpretation of data from sophisticated detectors in HEP
experiments. The goal of data analysis systems is to realize the maximum
possible scientific potential of the data within the constraints of computing
and human resources in the least time. To achieve this goal, future analysis
systems should empower physicists to access the data with a high level of
interactivity, reproducibility and throughput capability. As part of the HEP
Software Foundation Community White Paper process, a working group on Data
Analysis and Interpretation was formed to assess the challenges and
opportunities in HEP data analysis and develop a roadmap for activities in this
area over the next decade. In this report, the key findings and recommendations
of the Data Analysis and Interpretation Working Group are presented.
Comment: arXiv admin note: text overlap with arXiv:1712.0659
Reinterpretation of LHC Results for New Physics: Status and recommendations after Run 2
We report on the status of efforts to improve the reinterpretation of searches and measurements at the LHC in terms of models for new physics, in the context of the LHC Reinterpretation Forum. We detail current experimental offerings in direct searches for new particles, measurements, technical implementations and Open Data, and provide a set of recommendations for further improving the presentation of LHC results in order to better enable reinterpretation in the future. We also provide a brief description of existing software reinterpretation frameworks and recent global analyses of new physics that make use of the current data.
A Roadmap for HEP Software and Computing R&D for the 2020s
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.