MICE: the Muon Ionization Cooling Experiment. Step I: First Measurement of Emittance with Particle Physics Detectors
The Muon Ionization Cooling Experiment (MICE) is a strategic R&D project intended to demonstrate ionization cooling, the only practical technique for producing the high-brilliance muon beams needed by a neutrino factory or muon collider. MICE is under development at the Rutherford Appleton Laboratory (RAL) in the United Kingdom. It comprises a dedicated beamline to generate a range of input muon emittances and momenta, with time-of-flight and Cherenkov detectors to ensure a pure muon beam. The emittance of the incoming beam will be measured in the upstream magnetic spectrometer with a scintillating fiber tracker. A cooling cell will then follow, alternating energy loss in liquid-hydrogen (LH2) absorbers with re-acceleration in RF cavities. A second spectrometer, identical to the first, and a second muon-identification system will measure the outgoing emittance. In the 2010 run at RAL the muon beamline and most detectors were fully commissioned, and a first measurement of the emittance of the muon beam with particle-physics (time-of-flight) detectors was performed. The analysis of these data was recently completed and is discussed in this paper. Future steps for MICE, in which beam emittance and emittance reduction (cooling) are to be measured with greater accuracy, are also presented.
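As an illustrative aside (not taken from the paper): the emittance a spectrometer measures is conventionally the RMS emittance, computed from second moments of the particle coordinates. A minimal 2D sketch in Python, assuming lists of transverse positions x and trace-space angles x':

```python
import math

def rms_emittance(x, xp):
    """2D RMS emittance eps = sqrt(<x^2><x'^2> - <x x'>^2),
    using moments centred on the beam mean."""
    n = len(x)
    mx = sum(x) / n
    mxp = sum(xp) / n
    sxx = sum((a - mx) ** 2 for a in x) / n       # <x^2>
    spp = sum((b - mxp) ** 2 for b in xp) / n     # <x'^2>
    sxp = sum((a - mx) * (b - mxp)                # <x x'>
              for a, b in zip(x, xp)) / n
    return math.sqrt(sxx * spp - sxp ** 2)
```

A fully correlated beam (all particles on one line in trace space) has zero emittance; decorrelating the coordinates makes it grow, which is the quantity cooling is meant to shrink.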
Pion contamination in the MICE muon beam
The international Muon Ionization Cooling Experiment (MICE) will perform a systematic investigation of ionization cooling with muon beams of momentum between 140 and 240 MeV/c at the Rutherford Appleton Laboratory ISIS facility. The measurement of ionization cooling in MICE relies on the selection of a pure sample of muons that traverse the experiment. To make this selection, the MICE Muon Beam is designed to deliver a beam of muons with less than 1% contamination. To make the final muon selection, MICE employs a particle-identification (PID) system upstream and downstream of the cooling cell. The PID system includes time-of-flight hodoscopes, threshold Cherenkov counters and calorimetry. The upper limit on the pion contamination measured in this paper is set at 90% C.L., including systematic uncertainties. The MICE Muon Beam therefore meets the stringent pion-contamination requirements of the study of ionization cooling. This work was supported by the Department of Energy and National Science Foundation (U.S.A.), the Istituto Nazionale di Fisica Nucleare (Italy), the Science and Technology Facilities Council (U.K.), the European Community under the European Commission Framework Programme 7 (AIDA project, grant agreement no. 262025; TIARA project, grant agreement no. 261905; and EuCARD), the Japan Society for the Promotion of Science, and the Swiss National Science Foundation in the framework of the SCOPES programme.
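Time-of-flight PID works because, at fixed momentum, the heavier pion is slower than the muon and so arrives later. A hedged sketch (the function name and the 7.64 m path length are illustrative assumptions, not values from the MICE analysis): inverting beta = L/(c t) gives the particle mass from momentum, path length and flight time.

```python
import math

C = 0.299792458  # speed of light [m/ns]

def mass_from_tof(p_mev, path_m, t_ns):
    """Reconstruct mass [MeV/c^2] from momentum p [MeV/c],
    path length [m] and time of flight [ns], via
    beta = L/(c t) and m = p * sqrt(1/beta^2 - 1)."""
    beta = path_m / (C * t_ns)
    if not 0.0 < beta < 1.0:
        raise ValueError("unphysical time of flight")
    return p_mev * math.sqrt(1.0 / beta ** 2 - 1.0)
```

At 200 MeV/c over 7.64 m the muon and pion hypotheses differ by a couple of nanoseconds, which is why nanosecond-resolution hodoscopes suffice for the selection.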
Measurement of the νe and total ⁸B solar neutrino fluxes with the Sudbury Neutrino Observatory phase-III data set
This paper details the solar neutrino analysis of the 385.17-day phase-III data set acquired by the Sudbury Neutrino Observatory (SNO). An array of ³He proportional counters was installed in the heavy-water target to measure precisely the rate of neutrino-deuteron neutral-current interactions. This technique to determine the total active ⁸B solar neutrino flux was largely independent of the methods employed in previous phases. The total flux of active neutrinos was measured to be 5.54 +0.33 −0.31 (stat.) +0.36 −0.34 (syst.) × 10⁶ cm⁻² s⁻¹, consistent with previous measurements and standard solar models. A global analysis of solar and reactor neutrino mixing parameters yielded best-fit values of Δm² = 7.59 +0.19 −0.21 × 10⁻⁵ eV² and θ = 34.4 +1.3 −1.2 degrees.
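For illustration only (the SNO analysis itself involves MSW matter effects and a full global fit, not this formula): in the two-flavour vacuum approximation the survival probability is P = 1 − sin²(2θ) sin²(1.27 Δm²[eV²] L[km] / E[GeV]). A minimal sketch:

```python
import math

def survival_prob(dm2_ev2, theta_deg, L_km, E_gev):
    """Two-flavour vacuum survival probability
    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)."""
    th = math.radians(theta_deg)
    phase = 1.27 * dm2_ev2 * L_km / E_gev
    return 1.0 - math.sin(2.0 * th) ** 2 * math.sin(phase) ** 2
```

Plugging in the best-fit Δm² and θ quoted above gives the familiar long-baseline oscillation pattern; for solar neutrinos the observed suppression is instead governed by matter-enhanced conversion in the Sun.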
Agile Scrum Development in an ad hoc Software Collaboration
Developing sustainable scientific software for the needs of the scientific community requires expertise in both software engineering and domain science. This can be challenging due to the unique needs of scientific software, the insufficient resources for modern software engineering practices in the scientific community, and the complexity of evolving scientific contexts for developers. These difficulties can be reduced if scientists and developers collaborate. We present a case study wherein scientists from the SuperNova Early Warning System collaborated with software developers from the Scalable Cyberinfrastructure for Multi-Messenger Astrophysics project. The collaboration addressed the difficulties of scientific software development, but presented additional risks to each team. For the scientists, there was concern about relying on external systems and losing control of the development process. For the developers, there was a risk in supporting the needs of a user group while maintaining core development. We mitigated these issues by adopting an Agile Scrum framework to orchestrate the collaboration. This promoted communication and cooperation, ensuring that the scientists had an active role in development while allowing the developers to quickly evaluate and implement the scientists' software requirements. While each system was still at an early stage, the collaboration provided benefits for each group: the scientists kick-started their development by using an existing platform, and the developers used the scientists' use case to improve their systems. This case study suggests that scientists and software developers can avoid some difficulties of scientific computing by collaborating, and can address emergent concerns using Agile Scrum methods.
MAUS: the MICE analysis user software
Science and Technology Facilities Council (U.K.); The European Commission Framework Programme
SNEWS 2.0: a next-generation supernova early warning system for multi-messenger astronomy
The next core-collapse supernova in the Milky Way or its satellites will represent a once-in-a-generation opportunity to obtain detailed information about the explosion of a star, and will provide significant scientific insight for a variety of fields because of the extreme conditions found within. Supernovae in our galaxy are not only rare on a human timescale but also happen at unscheduled times, so it is crucial to be ready and to use all available instruments to capture all possible information from the event. The first indication of a potential stellar explosion will be the arrival of a bright burst of neutrinos. Its observation by multiple detectors worldwide can provide an early warning for the subsequent electromagnetic fireworks, as well as a signal to other detectors with significant backgrounds so that they can store their recent data. The SuperNova Early Warning System (SNEWS) has been operating as a simple coincidence between neutrino experiments in automated mode since 2005. In the current era of multi-messenger astronomy there are new opportunities for SNEWS to optimize sensitivity to science from the next galactic supernova beyond the simple early alert. This document is the product of a workshop in June 2019 towards the design of SNEWS 2.0, an upgraded SNEWS with enhanced capabilities exploiting the unique advantages of prompt neutrino detection to maximize the science gained from such a valuable event.
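The coincidence idea can be sketched as follows (an illustrative toy, not the SNEWS implementation; the 10 s window and the detector names in the usage note are assumptions): raise an alert only when reports from at least two distinct detectors fall within a short sliding time window, so that a single experiment's background fluctuation cannot trigger a false alarm.

```python
def coincident_alert(alerts, window_s=10.0, min_detectors=2):
    """alerts: list of (timestamp_s, detector_name) tuples.
    Return True if at least min_detectors *distinct* detectors
    report within any sliding window of length window_s."""
    alerts = sorted(alerts)
    for i, (t0, _) in enumerate(alerts):
        names = {name for t, name in alerts[i:] if t - t0 <= window_s}
        if len(names) >= min_detectors:
            return True
    return False
```

Requiring distinct detector names is the key design choice: two pulses from the same experiment within the window count once, so they never form a coincidence on their own.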
GPU-based optical simulation of the DARWIN detector
Understanding the propagation of scintillation light is critical for maximizing the discovery potential of next-generation liquid xenon detectors that use dual-phase time projection chamber technology. This work describes a detailed optical simulation of the DARWIN detector implemented using Chroma, a GPU-based photon-tracking framework. To evaluate the framework and to explore ways of maximizing light-collection efficiency and minimizing collection time, we simulate several variations of the conventional detector design. Results of these selected studies are presented. More generally, we conclude that the approach used in this work allows one to investigate alternative designs faster and in more detail than with conventional Geant4 optical simulations, making it an attractive tool to guide the development of the ultimate liquid xenon observatory.
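As a toy illustration of the kind of optical bookkeeping such simulations perform (this is not Chroma code, and the function is an assumption for illustration): with an exponential absorption length λ, the fraction of photons surviving a straight path L should approach exp(−L/λ), which a per-photon Monte Carlo reproduces by sampling each photon's free path.

```python
import math
import random

def survival_fraction(path_m, att_len_m, n=100_000, seed=1):
    """Toy Monte Carlo: fraction of n photons whose sampled
    absorption length (exponential with mean att_len_m) exceeds
    the geometric path; expectation is exp(-path_m/att_len_m)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.expovariate(1.0 / att_len_m) > path_m)
    return hits / n
```

Real photon trackers add reflection, refraction and wavelength shifting at every surface; GPUs pay off because each photon's random walk is independent and trivially parallel.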
The XENON1T data acquisition system
The XENON1T liquid xenon time projection chamber is the most sensitive detector built to date for the measurement of direct interactions of weakly interacting massive particles with normal matter. The data acquisition system (DAQ) is constructed from commercial, open-source, and custom components to digitize signals from the detector and store them for later analysis. The system reaches an extremely low signal threshold by triggering each channel independently, achieving a single-photoelectron acceptance of (93 ± 3)%, and by deferring the global trigger to a later software stage. Event identification is based on MongoDB database queries and has over 98% efficiency at recognizing interactions at the analysis threshold in the center of the target. A readout bandwidth of over 300 MB/s is reached in calibration modes and is further expandable via parallelization. This DAQ system was successfully used during three years of operation of XENON1T.
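Software event building of this kind, where independently triggered channel pulses are grouped after the fact instead of by a hardware global trigger, can be sketched as follows (illustrative only, not the XENON1T code; the gap parameter is an assumption): sort all pulses in time and start a new event whenever the gap to the previous pulse exceeds a threshold.

```python
def build_events(pulses, gap_ns=1000.0):
    """Group per-channel pulses (time_ns, channel) into events.
    A new event starts whenever the gap to the previous pulse
    exceeds gap_ns; otherwise the pulse joins the current event."""
    events = []
    for t, ch in sorted(pulses):
        if events and t - events[-1][-1][0] <= gap_ns:
            events[-1].append((t, ch))   # extend current event
        else:
            events.append([(t, ch)])     # open a new event
    return events
```

Because no channel is ever vetoed by a hardware trigger decision, even single-photoelectron pulses are recorded and can be attached to an event later, which is what makes the very low threshold possible.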
XENONnT WIMP Search: Signal & Background Modeling and Statistical Inference
The XENONnT experiment searches for weakly interacting massive particle (WIMP) dark matter scattering off a xenon nucleus. In particular, XENONnT uses a dual-phase time projection chamber with a 5.9-tonne liquid xenon target, detecting both scintillation and ionization signals to reconstruct the energy, position, and type of recoil. A blind search for nuclear-recoil WIMPs with an exposure of 1.1 tonne-years yielded no signal excess over background expectations, from which competitive exclusion limits were derived on WIMP-nucleon elastic scattering cross sections, for WIMP masses ranging from 6 GeV/c² up to the TeV/c² scale. This work details the modeling and statistical methods employed in this search. By means of calibration data, we model the detector response, which is then used to derive background and signal models. The construction and validation of these models is discussed, alongside additional purely data-driven backgrounds. We also describe the statistical inference framework, including the definition of the likelihood function and the construction of confidence intervals.
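As a toy counterpart to the inference described (the actual analysis uses a profile likelihood over many nuisance parameters, not this): for a single-bin counting experiment with n ~ Poisson(s + b), a classical 90% C.L. upper limit on the signal s solves P(N ≤ n_obs | s + b) = 1 − C.L., found here by bisection.

```python
import math

def pois_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu ** k / math.factorial(k)
               for k in range(n + 1))

def upper_limit(n_obs, b, cl=0.90, tol=1e-6):
    """Classical (Neyman) upper limit on signal s for a counting
    experiment n ~ Poisson(s + b): the s at which observing
    n_obs or fewer events becomes a (1 - cl) tail probability."""
    lo, hi = 0.0, 100.0 + 10.0 * n_obs
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if pois_cdf(n_obs, mid + b) > 1.0 - cl:
            lo = mid   # tail still too likely: limit is larger
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For zero observed events and zero background this reproduces the textbook 90% limit of ln(10) ≈ 2.30 events; a full likelihood analysis replaces the single count with the measured (energy, position, signal-type) distributions.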