
    Neutrinos from Stored Muons nuSTORM: Expression of Interest

    The nuSTORM facility has been designed to deliver beams of electron and muon neutrinos from the decay of a stored muon beam with a central momentum of 3.8 GeV/c and a momentum spread of 10%. The facility is unique in that it will: serve the future long- and short-baseline neutrino-oscillation programmes by providing definitive measurements of electron-neutrino- and muon-neutrino-nucleus cross sections with percent-level precision; allow searches for sterile neutrinos of exquisite sensitivity to be carried out; and constitute the essential first step in the incremental development of muon accelerators as a powerful new technique for particle physics. Of the world's proton-accelerator laboratories, only CERN and FNAL have the infrastructure required to mount nuSTORM. Since no siting decision has yet been taken, the purpose of this Expression of Interest (EoI) is to request the resources required to: investigate in detail how nuSTORM could be implemented at CERN; and develop options for decisive European contributions to the nuSTORM facility and experimental programme wherever the facility is sited. The EoI defines a two-year programme culminating in the delivery of a Technical Design Report.
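    The quoted beam parameters fix the energy reach of the neutrino fluxes. As a rough illustration only (not taken from the EoI), the Python sketch below converts the 3.8 GeV/c central momentum and its 10% spread, read here as ±10%, into the maximum lab-frame energy of the decay neutrinos using standard muon-decay kinematics: the rest-frame endpoint m_mu/2 boosted forward, giving E_max = (E_mu + p_mu c)/2.

    import math

    M_MU = 0.1056584  # muon mass in GeV/c^2

    def max_neutrino_energy(p_mu_gev):
        """Lab-frame endpoint energy (GeV) of a neutrino from the decay of a muon
        with momentum p_mu_gev, neglecting the electron mass."""
        e_mu = math.hypot(p_mu_gev, M_MU)   # total muon energy
        return 0.5 * (e_mu + p_mu_gev)      # gamma * (1 + beta) * m_mu / 2

    central = 3.8  # central stored-muon momentum, GeV/c
    for p_mu in (0.9 * central, central, 1.1 * central):  # edges of the assumed +/-10% spread
        print(f"p_mu = {p_mu:4.2f} GeV/c  ->  E_nu,max ~ {max_neutrino_energy(p_mu):4.2f} GeV")

    With these numbers the decay-neutrino energies extend up to roughly the muon beam energy itself, about 3.4 to 4.2 GeV across the momentum acceptance.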

    Deriving Particle Distributions from In-Line Fraunhofer Holographic Data

    Holographic data are acquired during hydrodynamic experiments at the Pegasus Pulsed Power Facility at the Los Alamos National Laboratory. These experiments produce a fine spray of fast-moving particles. Snapshots of the spray are captured using in-line Fraunhofer holographic techniques. Roughly one cubic centimeter is recorded by the hologram. The minimum detectable particle size in the data extends down to 2 microns. In a holography reconstruction system, a laser illuminates the hologram as it rests in a three-axis actuator, recreating the snapshot of the experiment. A computer guides the actuators through an orderly sequence programmed by the user. At selected intervals, slices of this volume are captured and digitized with a CCD camera. Intermittent on-line processing of the image data and computer control of the camera functions optimize the statistics of the acquired image data for off-line processing. Tens of thousands of individual data frames (30 to 40 gigabytes of data) are required to recreate a digital representation of the snapshot. Throughput of the reduction system is 550 megabytes per hour (MB/hr). Objects and their associated features are subsequently extracted from the data during off-line processing. Discrimination and correlation tests reject noise, eliminate multiple counting of particles, and build an error model to estimate performance. Objects surviving these tests are classified as particles. The particle distributions are derived from the database formed by these particles, their locations, and their features. Throughput of the off-line processing exceeds 500 MB/hr. This paper describes the reduction system, outlines the off-line processing procedure, summarizes the discrimination and correlation tests, and reports numerical results for a sample data set.
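    The abstract gives only an outline of the off-line stages, so the Python sketch below is a guess at their shape rather than the authors' code: the field names (e.g. focus_score), the thresholds, and the simple distance-based duplicate merge are assumptions made for illustration; only the 2-micron minimum detectable particle size comes from the text above.

    from dataclasses import dataclass

    @dataclass
    class Candidate:
        x_mm: float
        y_mm: float
        z_mm: float
        diameter_um: float
        focus_score: float   # hypothetical sharpness metric from the reconstruction stage

    MIN_DIAMETER_UM = 2.0    # minimum detectable particle size quoted in the abstract
    MIN_FOCUS = 0.5          # assumed discrimination threshold, illustration only
    MATCH_RADIUS_MM = 0.05   # assumed correlation radius for removing multiple counting

    def discriminate(cands):
        """Reject noise: keep candidates that are large enough and sharp enough."""
        return [c for c in cands
                if c.diameter_um >= MIN_DIAMETER_UM and c.focus_score >= MIN_FOCUS]

    def correlate(cands):
        """Eliminate multiple counting: keep only the best-focused detection among
        candidates closer together than MATCH_RADIUS_MM."""
        kept = []
        for c in sorted(cands, key=lambda c: -c.focus_score):
            if all((c.x_mm - k.x_mm) ** 2 + (c.y_mm - k.y_mm) ** 2 + (c.z_mm - k.z_mm) ** 2
                   > MATCH_RADIUS_MM ** 2 for k in kept):
                kept.append(c)
        return kept

    def size_distribution(particles, bin_um=2.0):
        """Histogram of particle diameters, one of the distributions derived from the database."""
        hist = {}
        for p in particles:
            b = int(p.diameter_um // bin_um) * bin_um
            hist[b] = hist.get(b, 0) + 1
        return dict(sorted(hist.items()))

    For scale, the quoted rates imply that digitizing one 30 to 40 GB snapshot at 550 MB/hr takes on the order of 55 to 73 hours of reconstruction-system time, which is why the on-line optimization of the acquired frames matters before the off-line pass begins.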