13 research outputs found

    Design of the Magnet System of the Neutron Decay Facility PERC

    Full text link
    The PERC (Proton and Electron Radiation Channel) facility is currently under construction at the research reactor FRM II, Garching. It will serve as an intense and clean source of electrons and protons from neutron beta decay for precision studies. It aims to contribute to the determination of the Cabibbo-Kobayashi-Maskawa quark-mixing element V_{ud} from neutron decay data and to search for new physics via new effective couplings. PERC's central component is a 12 m long superconducting magnet system. It hosts an 8 m long decay region in a uniform field. An additional high-field region selects the phase space of electrons and protons which can reach the detectors and largely improves systematic uncertainties. We discuss the design of the magnet system and the resulting properties of the magnetic field.
    Comment: Proceedings of the International Workshop on Particle Physics at Neutron Sources PPNS 2018, Grenoble, France, May 24-26, 2018
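    As background to the phase-space selection mentioned above, here is a sketch of the standard magnetic-mirror argument (assumed textbook relation, not quoted from the paper): under adiabatic transport the quantity sin^2(theta)/B is conserved along a field line, so the ratio of the decay field B_0 to the high field B_1 sets the maximum accepted emission angle.

        % Sketch of adiabatic phase-space selection by a field barrier
        % (standard magnetic-mirror relation, not taken from the paper).
        \[
          \sin^2\theta(B) \;=\; \sin^2\theta_0 \,\frac{B}{B_0}
          \quad\Longrightarrow\quad
          \text{transmission through } B_1 > B_0 \text{ requires }
          \sin^2\theta_0 \le \frac{B_0}{B_1}.
        \]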

    TomOpt: Muon Tomography experiment optimization

    No full text
    The TomOpt software is a tool to optimize the geometrical layout and specifications of detectors designed for muon scattering tomography. Based on differentiable programming techniques, TomOpt consists of a modular pipeline that models all the aspects of a muon tomography task, from the generation and interaction of cosmic-ray muons with a parameterized detector and passive material, to the inference of the volume properties. This enables the optimization of the detector parameters via gradient descent, to suggest optimal detector configurations and specifications. The optimization is subject to various external constraints, such as cost, logistics, and material-identification efficiency.
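    As a minimal illustration of the idea (a generic PyTorch sketch, not TomOpt's actual API; the parameters, loss terms, and numbers below are hypothetical), detector parameters carrying gradients can be updated by gradient descent on a differentiable task loss:

        # Generic sketch of gradient-based detector optimization (not TomOpt's API).
        # A hypothetical differentiable pipeline maps detector parameters to a task loss.
        import torch

        # Hypothetical detector parameters: z-positions of two detection panels.
        panel_z = torch.tensor([0.5, 1.5], requires_grad=True)

        def pipeline_loss(z: torch.Tensor) -> torch.Tensor:
            """Stand-in for the differentiable simulation + inference chain: a toy
            inference-error term plus a soft cost term mimicking external constraints."""
            spacing = z[1] - z[0]
            inference_error = (spacing - 1.2) ** 2   # toy proxy for volume-inference error
            cost = 0.1 * z.abs().sum()               # toy proxy for a budget constraint
            return inference_error + cost

        optimizer = torch.optim.Adam([panel_z], lr=0.05)
        for step in range(200):
            optimizer.zero_grad()
            loss = pipeline_loss(panel_z)
            loss.backward()    # gradients of the loss w.r.t. the detector parameters
            optimizer.step()   # gradient-descent update of the detector layout

        print(panel_z.detach())  # suggested configuration under the toy loss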

    TomOpt: Differential optimisation for task- and constraint-aware design of particle detectors in the context of muon tomography

    No full text
    We describe a software package, TomOpt, developed to optimise the geometrical layout and specifications of detectors designed for tomography by scattering of cosmic-ray muons. The software exploits differentiable programming for the modelling of muon interactions with detectors and scanned volumes, the inference of volume properties, and the optimisation cycle performing the loss minimisation. In doing so, we provide the first demonstration of end-to-end-differentiable and inference-aware optimisation of particle physics instruments. We study the performance of the software on relevant benchmark scenarios and discuss its potential applications.
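    For context on why scattering angles carry information about the scanned volume (the standard multiple-Coulomb-scattering parameterisation, quoted approximately; not part of the paper):

        % Highland parameterisation of multiple Coulomb scattering (approximate):
        % a muon of momentum p crossing thickness x of material with radiation
        % length X_0 is deflected by a roughly Gaussian angle of width
        \[
          \theta_0 \simeq \frac{13.6\ \mathrm{MeV}}{\beta c\, p}
                    \sqrt{\frac{x}{X_0}}
                    \left[\, 1 + 0.038 \ln\frac{x}{X_0} \right],
        \]
        % so dense, high-Z materials (small X_0) broaden the scattering-angle
        % distribution, which the inference step can exploit to reconstruct the volume.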

    Toward the End-to-End Optimization of Particle Physics Instruments with Differentiable Programming: a White Paper

    Get PDF
    The full optimization of the design and operation of instruments whose functioning relies on the interaction of radiation with matter is a super-human task, given the large dimensionality of the space of possible choices for geometry, detection technology, materials, data-acquisition, and information-extraction techniques, and the interdependence of the related parameters. On the other hand, massive potential gains in performance over standard, "experience-driven" layouts are in principle within our reach if an objective function fully aligned with the final goals of the instrument is maximized by means of a systematic search of the configuration space. The stochastic nature of the involved quantum processes makes the modeling of these systems an intractable problem from a classical statistics point of view, yet the construction of a fully differentiable pipeline and the use of deep learning techniques may allow the simultaneous optimization of all design parameters. In this document we lay down our plans for the design of a modular and versatile modeling tool for the end-to-end optimization of complex instruments for particle physics experiments, as well as industrial and medical applications that share the detection of radiation as their basic ingredient. We consider a selected set of use cases to highlight the specific needs of different applications.
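    As one possible illustration of the differentiable-pipeline idea (a toy PyTorch sketch under stated assumptions, not the white paper's implementation): a neural surrogate is fit to a stochastic, non-differentiable simulator, and a design parameter is then optimised by gradient descent through the surrogate.

        # Toy sketch: neural surrogate of a stochastic simulator, then gradient-based
        # design optimisation through the surrogate (hypothetical setup, not from the paper).
        import torch
        import torch.nn as nn

        def noisy_simulator(design: torch.Tensor) -> torch.Tensor:
            """Hypothetical stochastic figure of merit to be minimised (optimum near 2.0)."""
            return (design - 2.0) ** 2 + 0.1 * torch.randn_like(design)

        # 1) Fit a small differentiable surrogate on sampled (design, response) pairs.
        surrogate = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
        fit_opt = torch.optim.Adam(surrogate.parameters(), lr=1e-2)
        designs = torch.rand(512, 1) * 6.0
        responses = noisy_simulator(designs)
        for _ in range(500):
            fit_opt.zero_grad()
            ((surrogate(designs) - responses) ** 2).mean().backward()
            fit_opt.step()

        # 2) Optimise the design through the (now differentiable) surrogate.
        for p in surrogate.parameters():
            p.requires_grad_(False)              # freeze the surrogate
        design = torch.tensor([[0.5]], requires_grad=True)
        design_opt = torch.optim.Adam([design], lr=5e-2)
        for _ in range(300):
            design_opt.zero_grad()
            surrogate(design).sum().backward()   # minimise the predicted objective
            design_opt.step()

        print(design.detach())                   # approaches the simulator's optimum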
