Assurance of data integrity in petabyte data samples
We present a method for clustering data from high energy physics experiments on physical media. Data clustering based upon physics information can lead to large gains in access speed. However, such clustering also increases vulnerability to data loss, as any loss of a physical data store removes a data sample with special physics characteristics. Measurements made with samples biased by losses of clustered data must be corrected for the effects of the loss. We discuss several methods for performing such corrections.
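As an illustration of one such correction, an inverse survival-fraction reweighting (a minimal sketch, not necessarily the method the paper adopts), the following Python snippet upweights surviving events by the fraction of their physics category that remains after a store is lost; the event-record fields and bookkeeping are hypothetical.

from collections import Counter

def loss_correction_weights(events, lost_stores):
    # Hypothetical event records: each event carries the physical
    # 'store' it was written to and the physics 'category' used to
    # cluster it. Pre-loss counts per category would normally come
    # from the file catalog; here we recompute them from the full list.
    total = Counter(e["category"] for e in events)
    survivors = [e for e in events if e["store"] not in lost_stores]
    surviving = Counter(e["category"] for e in survivors)
    weights = {}
    for cat, n in total.items():
        if surviving[cat] == 0:
            # A category stored on a single lost store cannot be recovered
            # by reweighting; this is exactly the vulnerability described above.
            raise ValueError(f"category {cat!r} lost entirely; cannot correct")
        weights[cat] = n / surviving[cat]  # inverse survival fraction
    # Each surviving event is reweighted so the category totals are restored.
    return [(e, weights[e["category"]]) for e in survivors]

This only works when each physics category is spread across several stores, which is one motivation for studying multiple correction strategies.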
DUNE Computing Tutorials
Providing computing training to the next generation of physicists is the principal driver for a biannual multi-day training workshop hosted by the DUNE Computing Consortium. Materials are cast in the Software Carpentry template, and topics have included storage space, data management, LArSoft, and grid job submission and monitoring. Moreover, experts provide extended breakout sessions to demonstrate the fundamentals of the unique software used in HEP analysis. Each session uses live documents for real-time correspondence and is captured on Zoom; afterwards, the videos are embedded on the corresponding web pages for review. Because the materials live in a GitHub repository, shared editing of the learning modules is straightforward, and the repository provides a trusted framework to extend to other training topics in the future. An overview of the tutorials and the machinery used, along with survey statistics and lessons learned, is presented.
Measurement of the muon antineutrino double-differential cross section for quasielastic-like scattering on hydrocarbon at Eν ∼ 3.5 GeV
We present double-differential measurements of antineutrino charged-current quasielastic scattering in the MINERvA detector. This study improves on a previous single-differential measurement by using updated reconstruction algorithms and interaction models, and provides a complete description of observed muon kinematics in the form of a double-differential cross section with respect to muon transverse and longitudinal momentum. We include in our signal definition zero-meson final states arising from multinucleon interactions and from resonant pion production followed by pion absorption in the primary nucleus. We find that model agreement is considerably improved by a model tuned to MINERvA inclusive neutrino scattering data that incorporates nuclear effects such as weak nuclear screening and two-particle, two-hole enhancements.
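For reference, a flux-integrated double-differential cross section of this kind is typically extracted bin by bin in a form like the following (a generic sketch, not taken from the paper; the unfolding matrix U, efficiency ε, integrated flux Φ, and target-nucleon count T stand for the usual ingredients):

\[
\left.\frac{d^{2}\sigma}{dp_{T}\,dp_{\parallel}}\right|_{i}
  = \frac{\sum_{j} U_{ij}\,\bigl(N_{j}^{\mathrm{data}} - N_{j}^{\mathrm{bkg}}\bigr)}
         {\epsilon_{i}\, T \,\Phi\, (\Delta p_{T})_{i}\,(\Delta p_{\parallel})_{i}},
\]

where \(N_{j}^{\mathrm{data}}\) and \(N_{j}^{\mathrm{bkg}}\) are the selected and predicted-background counts in reconstructed bin \(j\), \(U_{ij}\) unfolds to true bins \(i\), and the denominator corrects for efficiency, targets, flux, and the bin widths in transverse and longitudinal muon momentum.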
The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe
The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life, and whether protons eventually decay: these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab, a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty- to thirty-year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess.
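The claim that a 1,300 km baseline delivers optimal sensitivity can be made concrete with the standard oscillation phase 1.27 Δm² L/E. A minimal two-flavor sketch (ignoring the three-flavor matter and CP effects the full sensitivity argument relies on, and assuming Δm² ≈ 2.5 × 10⁻³ eV²) places the first oscillation maximum near the peak of a wide-band beam spectrum:

import math

def first_oscillation_max_energy_gev(baseline_km, dmsq_ev2=2.5e-3):
    # Two-flavor oscillation phase: 1.27 * dm^2 [eV^2] * L [km] / E [GeV].
    # The first oscillation maximum occurs where this phase equals pi/2.
    return 1.27 * dmsq_ev2 * baseline_km / (math.pi / 2)

print(f"{first_oscillation_max_energy_gev(1300):.1f} GeV")  # ~2.6 GeV

A first maximum near 2.6 GeV sits comfortably within the energy range of an accelerator-produced neutrino beam, which is the essence of the baseline choice.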
A Roadmap for HEP Software and Computing R&D for the 2020s
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer volume of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
Computing for the DUNE Long-Baseline Neutrino Oscillation Experiment
This paper is based on a talk given at Computing in High Energy Physics in Adelaide, Australia, in November 2019. It is intended in part to explain the context of DUNE computing to computing specialists.
The Deep Underground Neutrino Experiment (DUNE) collaboration consists of over 180 institutions from 33 countries. The experiment is in preparation now, with commissioning of the first 10 kt fiducial-volume liquid argon TPC expected over the period 2025-2028 and a long data-taking run with four modules expected from 2029 onward.
An active prototyping program is already in place, with a short test-beam run of a 700 t, 15,360-channel single-phase prototype at the Neutrino Platform at CERN in late 2018 and tests of a similarly sized dual-phase detector scheduled for mid-2019. The 2018 test-beam run was a valuable live test of our computing model. The detector produced raw data at rates of up to 2 GB/s. These data were stored at full rate on tape at CERN and Fermilab and replicated at sites in the UK and the Czech Republic. In total, 1.2 PB of raw data from beam and cosmic triggers were produced and reconstructed during the six-week test-beam run.
Baseline predictions for the full DUNE detector, starting in the late 2020s, are 30-60 PB of raw data per year. In contrast to traditional HEP computational problems, DUNE's liquid argon TPC data consist of simple but very large (many GB) 2D data objects which share many characteristics with astrophysical images. This presents opportunities to apply advances in machine learning and pattern recognition, and to be a frontier user of High Performance Computing facilities capable of massively parallel processing.
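The quoted figures permit a quick consistency check. A back-of-envelope sketch (assuming the 2 GB/s figure is a peak rather than a sustained rate) shows that the recorded 1.2 PB corresponds to roughly a sixth of continuous peak-rate running over the six weeks:

SECONDS_PER_WEEK = 7 * 24 * 3600

peak_rate_gb_s = 2.0   # peak raw-data rate quoted above
run_weeks = 6
recorded_pb = 1.2      # total raw data actually recorded

# Volume if the detector had streamed at peak rate continuously (1 PB = 1e6 GB).
continuous_pb = peak_rate_gb_s * run_weeks * SECONDS_PER_WEEK / 1e6
duty_factor = recorded_pb / continuous_pb

print(f"continuous peak-rate volume: {continuous_pb:.1f} PB")  # ~7.3 PB
print(f"implied effective duty factor: {duty_factor:.2f}")     # ~0.17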