MOBILE, HYBRID COMPTON/CODED APERTURE IMAGING FOR DETECTION, IDENTIFICATION AND LOCALIZATION OF GAMMA-RAY SOURCES AT STAND-OFF DISTANCES
The Standoff Radiation Detection System (SORDS) program is an advanced technology demonstration (ATD) project of the Domestic Nuclear Detection Office (DNDO), with the goal of detecting, identifying, and localizing weak radiological sources in the presence of large, dynamic backgrounds. The Raytheon Tri-Modal Imager (TMI) is a mobile, truck-based, hybrid gamma-ray spectroscopic and imaging system able to quickly detect, identify, and localize radiation sources at standoff distances, using multiple detection modes to improve sensitivity while minimizing the false alarm rate. Reconstruction of gamma-ray sources is performed using a combination of gamma-ray spectroscopy and two imaging modalities: coded aperture and Compton scatter imaging. The TMI consists of 35 NaI crystals (5x5x2 in each) arranged in a random pattern, called the coded aperture (CA) array, followed by 30 position-sensitive NaI bars (24x2.5x3 in each), called the DA. The CA array acts both as a coded aperture mask and as a scattering detector for Compton events. The large-area DA array acts as the collection detector for both Compton-scattered events and coded aperture events. This thesis describes the implemented spectroscopic, coded aperture, Compton, and hybrid imaging algorithms along with their performance. It is shown that multiple imaging modalities can be fused to improve detection sensitivity over a broader energy range than any single mode alone. Since the TMI is a moving system, peripheral data such as GPS and INS measurements must also be incorporated, and a method of adapting static imaging algorithms to a moving platform has been developed. Algorithms were developed in parallel with the detector hardware through extensive GEANT4 simulations, which have been well validated against measured data. Results of the image reconstruction algorithms at various speeds and distances are presented, along with localization capability and the signal-to-noise gains that imaging information provides over spectroscopic algorithms alone.
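The coded-aperture half of such a system can be illustrated with a minimal cross-correlation decoder. The mask size, source position, and count rates below are purely illustrative, not the TMI's actual geometry; the balanced decoding array G = 2M - 1 is the standard choice for a random half-open mask.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (16, 16)                     # illustrative mask size, not the TMI's

# Random half-open mask (1 = open element) and a sky with one point source.
mask = (rng.random(shape) < 0.5).astype(float)
sky = np.zeros(shape)
sky[5, 9] = 100.0                    # source position and intensity (arbitrary)

def circ_conv(a, b):
    """Circular convolution via FFT (far-field, fully coded approximation)."""
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))

def circ_corr(a, b):
    """Circular cross-correlation via FFT."""
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))

# Shadowgram recorded by the detector: the sky convolved with the mask,
# plus Poisson background counts.
shadowgram = circ_conv(sky, mask) + rng.poisson(5.0, shape)

# Balanced decoder: open elements count +1, closed elements -1, so a flat
# background correlates to approximately zero.
decoder = 2.0 * mask - 1.0
image = circ_corr(shadowgram, decoder)

peak = np.unravel_index(np.argmax(image), image.shape)
```

The reconstructed image peaks at the source position; with a uniformly redundant array instead of a random mask, the decoder's sidelobes would vanish exactly.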
Hyperspectral Unmixing Overview: Geometrical, Statistical, and Sparse Regression-Based Approaches
Imaging spectrometers measure electromagnetic energy scattered in their
instantaneous field of view in hundreds or thousands of spectral channels, with
higher spectral resolution than multispectral cameras. Imaging spectrometers
are therefore often referred to as hyperspectral cameras (HSCs). Higher
spectral resolution enables material identification via spectroscopic analysis,
which facilitates countless applications that require identifying materials in
scenarios unsuitable for classical spectroscopic analysis. Due to the low
spatial resolution of HSCs, microscopic material mixing, and multiple
scattering, the spectra measured by HSCs are mixtures of the spectra of the
materials in a scene. Thus, accurate estimation requires unmixing. Pixels are
assumed to be mixtures of a few materials, called endmembers. Unmixing
involves estimating all or some of:
the number of endmembers, their spectral signatures, and their abundances at
each pixel. Unmixing is a challenging, ill-posed inverse problem because of
model inaccuracies, observation noise, environmental conditions, endmember
variability, and data set size. Researchers have devised and investigated many
models searching for robust, stable, tractable, and accurate unmixing
algorithms. This paper presents an overview of unmixing methods from the time
of Keshava and Mustard's unmixing tutorial [1] to the present. Mixing models
are first discussed. Signal-subspace, geometrical, statistical, sparsity-based,
and spatial-contextual unmixing algorithms are described. Mathematical problems
and potential solutions are described. Algorithm characteristics are
illustrated experimentally.

Comment: This work has been accepted for publication in the IEEE Journal of
Selected Topics in Applied Earth Observations and Remote Sensing.
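The linear mixing model that underlies most of these algorithms can be sketched in a few lines. Here each pixel is y = Ma + n, where the columns of M are endmember spectra and a holds the abundances. The endmember spectra below are synthetic stand-ins, and the sum-to-one constraint is imposed via the common row-augmentation trick; a full FCLS solver would additionally enforce nonnegativity (e.g. via NNLS).

```python
import numpy as np

rng = np.random.default_rng(1)
n_bands, n_endmembers = 50, 3

# Synthetic endmember spectra as columns of M (stand-ins, not library spectra).
M = np.abs(rng.normal(1.0, 0.3, (n_bands, n_endmembers)))

a_true = np.array([0.6, 0.3, 0.1])               # ground-truth abundances
y = M @ a_true + rng.normal(0, 0.001, n_bands)   # observed mixed pixel

# Sum-to-one constrained least squares via row augmentation: append the
# constraint 1^T a = 1 as an extra, heavily weighted equation.
delta = 1e3
M_aug = np.vstack([M, delta * np.ones(n_endmembers)])
y_aug = np.append(y, delta)

a_est, *_ = np.linalg.lstsq(M_aug, y_aug, rcond=None)
# (A full FCLS solver would also clip/iterate to guarantee a_est >= 0.)
```

With low noise the recovered abundances match the ground truth closely, and the augmented row keeps their sum at one.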
Machine Learning on Neutron and X-Ray Scattering
Neutron and X-ray scattering represent two state-of-the-art materials
characterization techniques that measure materials' structural and dynamical
properties with high precision. These techniques play critical roles in
understanding a wide variety of materials systems, from catalysis to polymers,
nanomaterials to macromolecules, and energy materials to quantum materials. In
recent years, neutron and X-ray scattering have received a significant boost
due to the development and increased application of machine learning to
materials problems. This article reviews the recent progress in applying
machine learning techniques to augment various neutron and X-ray scattering
techniques. We highlight the integration of machine learning methods into the
typical workflow of scattering experiments. We focus on scattering problems
that were challenging for traditional methods but are addressable with machine
learning, such as leveraging the knowledge of simple materials to model more
complicated systems, learning with limited data or incomplete labels,
identifying meaningful spectra and materials' representations for learning
tasks, mitigating spectral noise, and many others. We present an outlook on a
few emerging roles machine learning may play in broad classes of scattering
and spectroscopic problems in the foreseeable future.

Comment: 56 pages, 12 figures. Feedback most welcome.
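As a toy illustration of the simulate-then-learn workflow (not an example taken from the review itself), one can train a linear model to read a structural parameter directly from noisy scattering curves. Here the parameter is a Guinier radius of gyration and the curves follow the Guinier approximation; the q-grid, noise level, and parameter range are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
q = np.linspace(0.01, 0.1, 40)                  # scattering-vector grid (1/A)

def guinier(rg):
    """Guinier-regime small-angle intensity for radius of gyration rg."""
    return np.exp(-(q ** 2) * rg ** 2 / 3.0)

# Simulated training set: curves for random radii, with additive noise as a
# crude stand-in for counting statistics.
rg_train = rng.uniform(10.0, 30.0, 500)
X = np.array([guinier(r) for r in rg_train])
X += rng.normal(0, 0.01, X.shape)

# Ridge regression in closed form: w = (X^T X + lam I)^(-1) X^T y.
Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # bias column
lam = 1e-3
w = np.linalg.solve(Xb.T @ Xb + lam * np.eye(Xb.shape[1]), Xb.T @ rg_train)

# Predict the radius for an unseen, noiseless curve.
rg_pred = np.append(guinier(20.0), 1.0) @ w
```

Even this linear surrogate recovers the radius to within a fraction of an angstrom on clean input; real applications replace the analytic forward model with expensive simulations and the ridge model with richer learners.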
Triplicated P-wave measurements for waveform tomography of the mantle transition zone
Triplicated body waves sample the mantle transition zone more extensively than any other wave type and interact strongly with the discontinuities at 410 km and 660 km depth. Since the seismograms bear a strong imprint of these geodynamically interesting features, it is highly desirable to invert them for the structure of the transition zone. This has rarely been attempted, due to the mismatch between the complex, band-limited data and the (ray-theoretical) modelling methods. Here we present a data processing and modelling strategy to harness such broadband seismograms for finite-frequency tomography. We include triplicated P-waves (epicentral distance range between 14° and 30°) across their entire broadband frequency range, for both deep and shallow sources. We show that it is possible to predict the complex sequence of arrivals in these seismograms, but only after a careful effort to estimate source time functions and other source parameters from the data, since these variables strongly influence the waveforms. Modelled and observed waveforms then yield decent cross-correlation fits, from which we measure finite-frequency traveltime anomalies. We discuss two such data sets, for North America and Europe, and conclude that their signal quality and azimuthal coverage should be adequate for tomographic inversion. In order to compute sensitivity kernels at the pertinent high body-wave frequencies, we use fully numerical forward modelling of the seismic wavefield through a spherically symmetric Earth.
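The cross-correlation traveltime measurement at the heart of this strategy can be sketched as follows. The waveform, noise level, and sample rate are illustrative, and the "observed" trace is simply the synthetic one shifted by a known anomaly, so the recovered delay can be checked.

```python
import numpy as np

rng = np.random.default_rng(3)
dt = 0.05                                   # sample interval (s), illustrative
t = np.arange(0.0, 40.0, dt)

# Synthetic arrival: a Gaussian-windowed sinusoid centred at t = 20 s.
synthetic = np.exp(-((t - 20.0) / 1.5) ** 2) * np.sin(2 * np.pi * 0.5 * (t - 20.0))

# "Observed" trace: the synthetic delayed by a known traveltime anomaly,
# plus a little noise.
true_shift = 0.85                           # seconds; positive = arrives late
observed = np.interp(t - true_shift, t, synthetic) + rng.normal(0, 0.01, t.size)

# Cross-correlate and pick the lag of maximum similarity.
cc = np.correlate(observed, synthetic, mode="full")
lags = np.arange(-t.size + 1, t.size)
delay = lags[np.argmax(cc)] * dt            # measured traveltime anomaly (s)
```

The lag of the correlation maximum recovers the imposed shift to within one sample; finite-frequency measurements additionally band-pass both traces and weight the anomaly by frequency-dependent sensitivity kernels.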
Methods for detecting and characterising clusters of galaxies
The main theme of this PhD thesis is the observation of clusters of galaxies at submillimetric wavelengths. The Sunyaev-Zel'dovich (SZ) effect, due to the interaction of cosmic microwave background (CMB) photons with electrons of the hot intra-cluster medium, causes a distinct modulation in the spectrum of the CMB and is a very promising tool for detecting clusters out to very large distances. In particular, the European PLANCK mission, a satellite dedicated to mapping CMB anisotropies, will be the first experiment to routinely detect clusters of galaxies by their SZ signature. This thesis presents an extensive simulation of PLANCK's SZ capabilities, which combines all-sky maps of the SZ effect with a realisation of the fluctuating CMB and the submillimetric emission components of the Milky Way and of the Solar system, and takes into account instrumental issues such as the satellite's point-spread function, the frequency response, the scan paths, and the detector noise of the receivers.
To isolate the weak SZ signal in the presence of overwhelming spurious components with complicated correlation properties across PLANCK's channels, multifrequency filters based on matched and scale-adaptive filtering have been extended to spherical topologies and applied to the simulated data. These filters were shown to efficiently amplify and extract the SZ signal by combining spatial band-filtering with a linear combination of observations at different frequencies, where the filter shapes and the combination coefficients follow from the cross- and autocorrelation properties of the sky maps, the anticipated profile of SZ clusters, and the known SZ spectral dependence. The characterisation of the resulting SZ sample yielded a total of 6000 detections above a statistical significance of 3 sigma, as well as the distribution of the detected clusters in mass, redshift, and position on the sky.
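A one-dimensional, single-channel caricature of matched filtering (the thesis works with multifrequency filters on the sphere) shows the two essential ingredients: a filter shaped as profile over noise power, and a normalisation that makes the filtered map an unbiased amplitude estimate at the source position. All spectra, scales, and amplitudes below are made up.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 512

# Cluster-profile stand-in: a Gaussian, rolled so its peak sits at sample 0,
# which makes its Fourier transform real.
x = np.arange(n)
profile = np.exp(-0.5 * ((x - n // 2) / 5.0) ** 2)
tau = np.real(np.fft.fft(np.roll(profile, -(n // 2))))

# "CMB-like" correlated noise with a red power spectrum.
k = np.fft.fftfreq(n)
P = 1.0 / (np.abs(k) + 0.01)
noise = 0.1 * np.real(np.fft.ifft(np.sqrt(P) * np.fft.fft(rng.normal(0, 1, n))))

amp_true = 5.0
data = amp_true * np.roll(profile, 100 - n // 2) + noise   # source at sample 100

# Matched filter: profile / noise power, normalised to unit response so the
# filtered map reads off the source amplitude directly.
psi = tau / P
psi /= np.sum(psi * tau) / n
filtered = np.real(np.fft.ifft(np.fft.fft(data) * psi))

amp_est = filtered[100]                    # amplitude estimate at the source
```

The filtered map peaks at the source and its value there estimates the source amplitude; the multifrequency version adds one such filter per channel, with the linear-combination weights fixed by the SZ spectral dependence.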
In a related project, a method of constructing morphological distance estimators for resolved SZ cluster images is proposed. This method measures a cluster's SZ morphology by wavelet decomposition. It was shown that the spectrum of wavelet moments can be modelled by elementary functions and has characteristic properties that are non-degenerate and indicative of cluster distance. Distance accuracies derived from a maximum-likelihood approach were as good as 5% relative deviation, and deteriorated only slightly when noise components such as instrumental noise or CMB fluctuations were added. Other complications, such as cool cores of clusters and finite instrumental resolution, were shown not to affect the wavelet distance estimation method significantly.
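A toy version of the wavelet-moment idea, using an orthonormal Haar transform on synthetic cluster images: as the apparent cluster size shrinks (a stand-in for increasing distance), the detail power shifts toward finer scales. The profiles, image sizes, and the Haar choice are illustrative, not the thesis's actual wavelets.

```python
import numpy as np

def gaussian_image(sigma, n=64):
    """Synthetic, circularly symmetric 'cluster' image of apparent size sigma."""
    y, x = np.mgrid[0:n, 0:n] - n / 2.0
    return np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))

def haar_detail_powers(img, levels=4):
    """Detail power per dyadic scale from the orthonormal 2D Haar transform.

    Energy conservation lets us read each level's detail power off as the
    energy lost when averaging 2x2 blocks.
    """
    a = np.asarray(img, dtype=float)
    powers = []
    for _ in range(levels):
        blocks = a.reshape(a.shape[0] // 2, 2, a.shape[1] // 2, 2)
        a_next = blocks.sum(axis=(1, 3)) / 2.0   # orthonormal approximation
        powers.append(np.sum(a ** 2) - np.sum(a_next ** 2))
        a = a_next
    return np.array(powers)

near = haar_detail_powers(gaussian_image(8.0))   # apparently large cluster
far = haar_detail_powers(gaussian_image(3.0))    # same cluster, "further away"

# Fraction of detail power in the finest scale rises for the distant cluster.
fine_near = near[0] / near.sum()
fine_far = far[0] / far.sum()
```

The distance estimator in the thesis goes further, fitting the full moment spectrum with elementary functions and inverting it by maximum likelihood; the monotone scale shift shown here is the property that makes that inversion possible.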
Another line of research is the Rees-Sciama (RS) effect, which is due to the gravitational interaction of CMB photons with non-stationary potential wells. This effect was shown to be a second-order gravitational lensing effect arising in the post-Newtonian expansion of general relativity; it measures the divergence of the gravitomagnetic potentials integrated along the line of sight. The spatial autocorrelation function of the RS effect was derived in perturbation theory and projected to yield the angular autocorrelation function, taking care of the differing time evolution of the various terms emerging in the perturbation expansion. The RS effect was shown to be detectable by PLANCK as a correction to the primordial CMB power spectrum at low multipoles. Within the same perturbative formalism, the gravitomagnetic corrections to the autocorrelation function of weak gravitational lensing observables such as cosmic shear could be determined. Those corrections were shown to be most important on the largest scales, beyond 1 Gpc, which are difficult to access observationally. For contemporary weak lensing surveys, gravitomagnetic corrections were confirmed not to play a significant role.
A byproduct of simulating CMB fluctuations on the basis of Gaussian random fields was a new way of generating coded mask patterns for X-ray and gamma-ray imaging. Coded mask cameras observe a source by recording the shadow cast by a mask onto a position-sensitive detector; the distribution of sources can be reconstructed from this shadowgram by correlation techniques. By using Gaussian random fields, coded mask patterns can be tailored to a predefined point-spread function, which yields significant sensitivity advantages in the observation of extended sources while providing moderate performance, compared to traditional mask generation schemes, in the observation of point sources. Coded mask patterns encoding Gaussian point-spread functions have been subjected to extensive ray-tracing studies in which their performance has been evaluated.
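The mask-generation idea can be sketched as thresholding a Gaussian random field realised with a prescribed power spectrum: the binary mask inherits the field's correlation structure, which shapes the camera's point-spread function. The spectrum, grid size, and the median threshold (giving a half-open mask) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 128

# Two-dimensional wavenumber grid and a prescribed (Gaussian) power spectrum.
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k2 = kx ** 2 + ky ** 2
sigma_k = 0.05                                # correlation scale, illustrative
power = np.exp(-k2 / (2.0 * sigma_k ** 2))

# Realise the Gaussian random field: colour white noise with sqrt(power).
white = np.fft.fft2(rng.normal(0, 1, (n, n)))
field = np.real(np.fft.ifft2(white * np.sqrt(power)))

# Threshold at the median for a half-open (50% throughput) binary mask.
mask = (field > np.median(field)).astype(int)
open_fraction = mask.mean()
```

Changing the prescribed spectrum changes the mask's autocorrelation, and hence the encoded point-spread function, which is the tailoring freedom the abstract refers to.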