
    A Study of Electron Energy Response Measurement in the NOvA Test Beam Detector

    The NuMI Off-axis electron-neutrino Appearance (NOvA) experiment at Fermilab is a long-baseline accelerator neutrino experiment designed to study neutrinos through their flavor oscillations between two functionally identical detectors: a 300-ton Near Detector and a 14-kton Far Detector, separated by 809 km and placed 14 mrad off-axis to the Neutrinos at the Main Injector (NuMI) neutrino beam produced at Fermi National Accelerator Laboratory (Fermilab). NOvA's key physics goals are to determine the neutrino mass hierarchy, probe CP violation in the leptonic sector, and conduct precise measurements of the neutrino mixing parameters. To further NOvA's physics reach, the NOvA Test Beam program operates a scaled-down 30-ton detector to measure the charged particles found in the final state of neutrino interactions, including electrons, muons, pions, kaons, and protons. These particles are identified and momentum-selected within a range of 0.3 to 2.0 GeV/c by a new tertiary beamline deployed at Fermilab. The Test Beam program will provide NOvA with an improved understanding of the largest systematic uncertainties impacting NOvA's oscillation analyses, including detector response and detector calibration. This thesis presents the current status of the NOvA Test Beam program and discusses the tagging of electrons and positrons with the beamline data, together with preliminary results from detector measurements of their electromagnetic activity and an analysis of its properties. A discrepancy seen between data and simulation in the energy variables suggests inaccurate detector calibration, and a fake-data study of the effect of reduced calibration systematics points to improved precision in the measurement of the neutrino oscillation parameters.

    DUNE Offline Computing Conceptual Design Report

    This document describes the conceptual design of the Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), the offline computing needed to accomplish the experiment's physics goals. These goals include (1) studying neutrino oscillations using a beam of neutrinos sent from Fermilab in Illinois to the Sanford Underground Research Facility (SURF) in Lead, South Dakota, (2) studying astrophysical neutrino sources and rare processes, and (3) understanding the physics of neutrino interactions in matter. We describe the development of the computing infrastructure needed to achieve these goals by storing, cataloging, reconstructing, simulating, and analyzing ∼30 PB of data per year from DUNE and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions and advanced algorithms as HEP computing evolves. We describe the physics objectives, organization, use cases, and proposed technical solutions.
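    To give a sense of scale, the quoted ∼30 PB/year can be converted into an average sustained data rate. This is only a back-of-envelope sketch: it assumes decimal petabytes and a uniform rate, whereas real ingest is bursty and tied to beam and detector operations.

```python
# Back-of-envelope: average data rate implied by ~30 PB of data per year.
# Assumes decimal units (1 PB = 1e15 bytes); illustrative only.
PB = 1e15
seconds_per_year = 365.25 * 24 * 3600

rate_GBps = 30 * PB / seconds_per_year / 1e9  # bytes/s -> GB/s
print(f"{rate_GBps:.2f} GB/s average")  # roughly 0.95 GB/s
```

    In other words, 30 PB/year corresponds to roughly 1 GB/s of continuous throughput that the storage, cataloging, and processing systems must absorb on average.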

    Reconstruction of interactions in the ProtoDUNE-SP detector with Pandora

    The Pandora Software Development Kit and algorithm libraries provide pattern-recognition logic essential to the reconstruction of particle interactions in liquid argon time projection chamber detectors. Pandora is the primary event reconstruction software used at ProtoDUNE-SP, a prototype for the Deep Underground Neutrino Experiment far detector. ProtoDUNE-SP, located at CERN, is exposed to a charged-particle test beam. This paper gives an overview of the Pandora reconstruction algorithms and how they have been tailored for use at ProtoDUNE-SP. In complex events with numerous cosmic-ray and beam background particles, the simulated reconstruction and identification efficiency for triggered test-beam particles is above 80% for the majority of particle type and beam momentum combinations. Specifically, simulated 1 GeV/c charged pions and protons are correctly reconstructed and identified with efficiencies of (86.1 ± 0.6)% and (84.1 ± 0.6)%, respectively. The efficiencies measured for test-beam data are shown to be within 5% of those predicted by the simulation.
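    Quoted uncertainties like ±0.6% on an 86.1% efficiency are consistent with the standard binomial treatment, σ = √(p(1−p)/N). The sketch below shows that calculation; the sample size used is purely illustrative (the paper does not state it here), and real analyses often prefer Clopper-Pearson or Bayesian intervals near p = 0 or 1.

```python
import math

def binomial_efficiency(n_pass: int, n_total: int) -> tuple[float, float]:
    """Efficiency p = n_pass/n_total with simple binomial uncertainty
    sigma = sqrt(p * (1 - p) / n_total). A minimal sketch, not the
    interval construction used in the paper."""
    p = n_pass / n_total
    sigma = math.sqrt(p * (1.0 - p) / n_total)
    return p, sigma

# Illustrative numbers only (true sample sizes are not given in the text):
p, sigma = binomial_efficiency(2583, 3000)
print(f"{100 * p:.1f} +/- {100 * sigma:.1f} %")  # prints "86.1 +/- 0.6 %"
```

    A sample of a few thousand triggered test-beam particles is enough to drive the binomial uncertainty down to the sub-percent level quoted in the abstract.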
