Calibration of the Gamma-RAy Polarimeter Experiment (GRAPE) at a Polarized Hard X-Ray Beam
The Gamma-RAy Polarimeter Experiment (GRAPE) is a concept for an astronomical hard X-ray Compton polarimeter operating in the 50–500 keV energy band. The instrument has been optimized for wide-field polarization measurements of transient outbursts from energetic astrophysical objects such as gamma-ray bursts and solar flares. The GRAPE instrument is composed of identical modules, each of which consists of an array of scintillator elements read out by a multi-anode photomultiplier tube (MAPMT). Incident photons Compton scatter in plastic scintillator elements and are subsequently absorbed in inorganic scintillator elements; a net polarization signal is revealed by a characteristic asymmetry in the azimuthal scattering angles. We have constructed a prototype GRAPE module containing a single CsI(Na) calorimeter element, at the center of the MAPMT, surrounded by 60 plastic elements. The prototype has been combined with custom readout electronics and software to create a complete "engineering model" of the GRAPE instrument. This engineering model has been calibrated using a nearly 100% polarized hard X-ray beam at the Advanced Photon Source at Argonne National Laboratory. We find modulation factors of 0.46 ± 0.06 and 0.48 ± 0.03 at 69.5 keV and 129.5 keV, respectively, in good agreement with Monte Carlo simulations. In this paper we present details of the beam test, data analysis, and simulations, and discuss the implications of our results for the further development of the GRAPE concept.
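As background to the quoted modulation factors, the sketch below illustrates how a modulation factor is commonly extracted in Compton polarimetry: the azimuthal scattering-angle histogram is fit with N(φ) = A[1 + μ cos 2(φ − φ0)], and for a fully polarized beam the fitted μ is the modulation factor. The histogram values, binning, and fit code here are illustrative assumptions, not the paper's analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def modulation_curve(phi, A, mu, phi0):
    """Standard Compton-polarimeter azimuthal model:
    N(phi) = A * (1 + mu * cos(2 * (phi - phi0)))."""
    return A * (1.0 + mu * np.cos(2.0 * (phi - phi0)))

# Hypothetical azimuthal scattering-angle histogram (bin centers in rad,
# Poisson-fluctuated counts); real data would come from coincidence events.
phi = np.radians(np.arange(7.5, 360.0, 15.0))            # 24 bins of 15 deg
expected = 1000.0 * (1.0 + 0.47 * np.cos(2.0 * (phi - 0.3)))
counts = np.random.default_rng(0).poisson(expected).astype(float)

popt, pcov = curve_fit(modulation_curve, phi, counts,
                       p0=[counts.mean(), 0.3, 0.0],
                       sigma=np.sqrt(counts), absolute_sigma=True)
mu_fit, mu_err = popt[1], np.sqrt(pcov[1, 1])
print(f"modulation factor mu = {mu_fit:.3f} +/- {mu_err:.3f}")
```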
Testing and simulation of silicon photomultiplier readouts for scintillators in high-energy astronomy and solar physics
Space-based gamma-ray detectors for high-energy astronomy and solar physics face severe constraints on mass, volume, and power, and must endure harsh launch conditions and operating environments. Historically, such instruments have usually been based on scintillator materials due to their relatively low cost, inherent ruggedness, high stopping power, and radiation hardness. New scintillator materials, such as LaBr3:Ce, feature improved energy and timing performance, making them attractive for future astronomy and solar physics space missions in an era of tightly constrained budgets. Despite this promise, the use of scintillators in space remains constrained by the volume, mass, power, and fragility of the associated light readout device, typically a vacuum photomultiplier tube (PMT). In recent years, silicon photomultipliers (SiPMs) have emerged as promising alternative light readout devices that offer gains and quantum efficiencies similar to those of PMTs, but with greatly reduced mass and volume, high ruggedness, low voltage requirements, and no sensitivity to magnetic fields. In order for SiPMs to replace PMTs in space-based instruments, however, it must be shown that they can provide comparable performance, and that their inherent temperature sensitivity can be corrected for. To this end, we have performed extensive testing and modeling of a small gamma-ray spectrometer composed of a 6 mm × 6 mm SiPM coupled to a 6 mm × 6 mm × 10 mm LaBr3:Ce crystal. A custom readout board monitors the temperature and adjusts the bias voltage to compensate for gain variations. We record an energy resolution of 5.7% (FWHM) at 662 keV at room temperature. We have also performed simulations of the scintillation process and optical light collection using Geant4, and of the SiPM response using the GosSiP package. The simulated energy resolution is in good agreement with the data from 22 keV to 662 keV. Above ~1 MeV, however, the measured energy resolution is systematically worse than the simulations. This discrepancy is likely due to the high input impedance of the readout board front-end electronics, which introduces a non-linear saturation effect in the SiPM for large light pulses. Analysis of the simulations indicates several additional steps that must be taken to optimize the energy resolution of SiPM-based scintillator detectors.
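To make the temperature-compensation scheme concrete: SiPM gain scales with overvoltage (bias minus breakdown voltage), and the breakdown voltage drifts roughly linearly with temperature, so holding the gain steady amounts to tracking that drift with the applied bias. The following is a minimal sketch of that logic; the breakdown voltage, temperature coefficient, and overvoltage values are hypothetical placeholders, not the parameters of the custom readout board described above.

```python
# Sketch of temperature-compensated SiPM biasing: keep the overvoltage
# (and hence the gain) constant by tracking the breakdown voltage's
# roughly linear temperature drift. All coefficients are hypothetical.

V_BR_REF = 24.5     # breakdown voltage [V] at the reference temperature
T_REF = 25.0        # reference temperature [deg C]
DVBR_DT = 0.021     # breakdown-voltage temperature coefficient [V / deg C]
V_OVER = 2.5        # desired constant overvoltage [V]

def compensated_bias(temp_c: float) -> float:
    """Bias voltage that holds the overvoltage (gain) constant at temp_c."""
    v_br = V_BR_REF + DVBR_DT * (temp_c - T_REF)
    return v_br + V_OVER

for t in (0.0, 25.0, 40.0):
    print(f"T = {t:5.1f} C  ->  V_bias = {compensated_bias(t):.3f} V")
```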
Calibration of the Fast Neutron Imaging Telescope (FNIT) Prototype Detector
The paper describes a novel detector for neutrons in the 1–20 MeV energy range with combined imaging and spectroscopic capabilities. The Fast Neutron Imaging Telescope (FNIT) was designed to detect solar neutrons from spacecraft deployed to the inner heliosphere. However, the potential application of this instrument to Special Nuclear Material (SNM) identification was also examined. In either case, neutron detection relies on double elastic neutron-proton (n-p) scattering in liquid scintillator. We optimized the design of FNIT through a combination of Monte Carlo simulations and laboratory measurements. We then assembled a scaled-down version of the full detector and assessed its performance by exposing it to a neutron beam and an SNM source. The results from these tests, which were used to characterize the response of the complete FNIT detector to fast neutrons, are presented herein.
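For readers unfamiliar with the double-scatter technique: the incident neutron energy is the sum of the recoil-proton energy deposited at the first scatter and the scattered neutron's energy, with the latter obtained from its (nonrelativistic) time of flight between the two scattering cells. A minimal sketch follows; the geometry and event values are illustrative assumptions, not FNIT calibration data.

```python
M_N_MEV = 939.565         # neutron rest mass [MeV/c^2]
C_M_PER_NS = 0.299792458  # speed of light [m/ns]

def scattered_neutron_energy(dist_m: float, tof_ns: float) -> float:
    """Nonrelativistic kinetic energy [MeV] of the scattered neutron,
    from its time of flight between the two scattering cells."""
    beta = dist_m / tof_ns / C_M_PER_NS
    return 0.5 * M_N_MEV * beta**2

def incident_neutron_energy(e_proton_mev: float,
                            dist_m: float, tof_ns: float) -> float:
    """E_n = E_p + E_n' for a double elastic n-p scatter."""
    return e_proton_mev + scattered_neutron_energy(dist_m, tof_ns)

# Hypothetical event: 3 MeV proton recoil, 0.30 m cell separation, 12 ns TOF.
print(f"incident neutron energy: "
      f"{incident_neutron_energy(3.0, 0.30, 12.0):.2f} MeV")
```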
Test and simulation of a Fast Neutron Imaging Telescope
The capability to detect fast neutrons with good angular and energy resolution is of growing interest for applications such as non-destructive testing, homeland security, and space-borne solar physics. To address the latter, we recently developed and tested a novel type of instrument, the Fast Neutron Imaging Telescope (FNIT), for neutron spectroscopy and imaging in the 1–20 MeV range. Assessments of the instrument prototype's performance, based on Monte Carlo simulations and on results from calibration tests performed in a monoenergetic neutron beam, are presented here. The purpose of the study is twofold: (1) to provide a comprehensive characterization of the prototype response, notably in terms of efficiency, event selection, and energy and angular resolution; and (2) to validate the simulation tool, both to support data analysis and reduction and to help in the design of more complex fast neutron telescopes.
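Imaging with such a telescope rests on the same double-scatter kinematics: each event restricts the source direction to a cone about the scattered neutron's flight path, with half-angle θ given by tan²θ = E_p/E_n' for nonrelativistic elastic n-p scattering. The sketch below computes that cone half-angle for a hypothetical event; it illustrates the principle and is not code from the study.

```python
import math

def scatter_cone_half_angle(e_proton_mev: float,
                            e_scattered_mev: float) -> float:
    """Half-angle [deg] of the source cone for a double n-p scatter,
    from nonrelativistic elastic kinematics: tan^2(theta) = E_p / E_n'."""
    return math.degrees(math.atan(math.sqrt(e_proton_mev / e_scattered_mev)))

# Hypothetical event: 3.0 MeV proton recoil, 3.3 MeV scattered neutron.
print(f"cone half-angle: {scatter_cone_half_angle(3.0, 3.3):.1f} deg")
```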