
    Statistical analysis of hyper-spectral data: a non-Gaussian approach

    We investigate the statistical modeling of hyper-spectral data. Accurate modeling of experimental data is critical in target detection and classification applications: a statistical model that properly describes data variability leads to better decision strategies and a reliable assessment of algorithm performance. Most existing classification and target detection algorithms are based on the multivariate Gaussian model which, in many cases, deviates from the true statistical behavior of hyper-spectral data. This motivated us to investigate the capability of non-Gaussian models to represent data variability in each background class. In particular, we refer to models based on elliptically contoured (EC) distributions. We consider the multivariate EC-t distribution and two distinct mixture models based on EC distributions. We describe the methodology adopted for the statistical analysis and propose a technique to automatically estimate the unknown parameters of the statistical models. Finally, we discuss the results obtained by analyzing data gathered by the multispectral infrared and visible imaging spectrometer (MIVIS) sensor.
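    As an illustration of the kind of EC model the abstract describes, the sketch below fits a multivariate t distribution to a set of background pixel spectra with an EM-style iteration. It is a minimal example, not the estimation technique proposed in the paper: the degrees of freedom `nu` are assumed fixed, and the array `X` (pixels x bands) is a placeholder.

```python
import numpy as np

def fit_multivariate_t(X, nu=5.0, n_iter=50):
    """EM-style fit of the location and scatter of a multivariate t
    distribution with fixed degrees of freedom nu (an assumption here)."""
    n, d = X.shape
    mu = X.mean(axis=0)
    sigma = np.cov(X, rowvar=False)
    for _ in range(n_iter):
        diff = X - mu
        maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(sigma), diff)
        w = (nu + d) / (nu + maha)                    # E-step: latent scale weights
        mu = (w[:, None] * X).sum(axis=0) / w.sum()   # M-step: weighted mean
        diff = X - mu
        sigma = (w[:, None] * diff).T @ diff / n      # M-step: weighted scatter
    return mu, sigma

# X would be an (n_pixels, n_bands) array of background spectra from one class.
```

    The resulting location and scatter estimates could then feed a Mahalanobis-distance detector or a goodness-of-fit comparison against the Gaussian baseline.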

    A manifold learning approach to target detection in high-resolution hyperspectral imagery

    Imagery collected from airborne platforms and satellites provides an important medium for remotely analyzing the content in a scene. In particular, the ability to detect a specific material within a scene is of high importance to both civilian and defense applications. This may include identifying targets such as vehicles, buildings, or boats. Sensors that process hyperspectral images provide the high-dimensional spectral information necessary to perform such analyses. However, for a d-dimensional hyperspectral image, it is typical for the data to inherently occupy an m-dimensional space, with m << d. In the remote sensing community, this has led to a recent increase in the use of manifold learning, which aims to characterize the embedded lower-dimensional, non-linear manifold upon which the hyperspectral data inherently lie. Classic hyperspectral data models include statistical, linear subspace, and linear mixture models, but these can place restrictive assumptions on the distribution of the data; this is particularly true when implementing traditional target detection approaches, and the limitations of these models are well-documented. With manifold learning based approaches, the only assumption is that the data reside on an underlying manifold that can be discretely modeled by a graph. The research presented here focuses on the use of graph theory and manifold learning in hyperspectral imagery. Early work explored various graph-building techniques with application to the background model of the Topological Anomaly Detection (TAD) algorithm, which is a graph theory based approach to anomaly detection. This led toward a focus on target detection and the development of a specific graph-based model of the data with subsequent dimensionality reduction using manifold learning. An adaptive graph is built on the data, and then used to implement an adaptive version of locally linear embedding (LLE). We artificially induce a target manifold and incorporate it into the adaptive LLE transformation; the artificial target manifold helps to guide the separation of the target data from the background data in the new, lower-dimensional manifold coordinates. Then, target detection is performed in the manifold space.
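    For context on the manifold-learning step, the sketch below embeds the pixels of a hyperspectral cube with standard locally linear embedding via scikit-learn. It is a plain-LLE stand-in that assumes an in-memory (H, W, B) cube; the adaptive graph construction and the artificially induced target manifold described in the abstract are not reproduced here.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

def lle_embed(cube, n_neighbors=15, n_components=3):
    """Embed the pixels of an (H, W, B) hyperspectral cube with standard LLE."""
    H, W, B = cube.shape
    X = cube.reshape(-1, B).astype(float)             # one row per pixel spectrum
    lle = LocallyLinearEmbedding(n_neighbors=n_neighbors,
                                 n_components=n_components)
    Y = lle.fit_transform(X)                          # manifold coordinates per pixel
    return Y.reshape(H, W, n_components)

# Detection could then score each pixel by its distance, in manifold
# coordinates, to an embedded target signature (illustrative only).
```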

    Physics-Based Detection of Subpixel Targets in Hyperspectral Imagery

    Hyperspectral imagery provides the ability to detect targets that are smaller than the size of a pixel. It provides this ability by measuring the reflection and absorption of light at different wavelengths, creating a spectral signature for each pixel in the image. This spectral signature contains information about the different materials within the pixel; therefore, the challenge in subpixel target detection lies in separating the target's spectral signature from competing background signatures. Most research has approached this problem in a purely statistical manner. Our approach fuses statistical signal processing techniques with the physics of reflectance spectroscopy and radiative transfer theory. Using this approach, we provide novel algorithms for all aspects of subpixel detection, from parameter estimation to threshold determination. Characterization of the target and background spectral signatures is a key part of subpixel detection. We develop an algorithm to generate target signatures based on radiative transfer theory using only the image and a reference signature, without the need for calibration, weather information, or source-target-receiver geometries. For background signatures, our work identifies that even slight estimation errors in the number of background signatures can severely degrade detection performance. To this end, we present a new method to estimate the number of background signatures specifically for subpixel target detection. At the core of the dissertation is the development of two hybrid detectors which fuse spectroscopy with statistical hypothesis testing. Our results show that the hybrid detectors provide improved performance in three different ways: insensitivity to the number of background signatures, improved detection performance, and consistent performance across multiple images leading to improved receiver operating characteristic curves. Lastly, we present a novel adaptive threshold estimate via extreme value theory. The method can be used on any detector type, not just constant false alarm rate (CFAR) detectors. Even on CFAR detectors, our proposed method can estimate thresholds that are better than theoretical predictions due to the inherent mismatch between the CFAR model assumptions and real data. Additionally, our method works in the presence of target detections while still estimating an accurate threshold for a desired false alarm rate.
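    To illustrate the extreme-value-theory thresholding idea, the sketch below fits a generalized Pareto distribution to the upper tail of detector scores (a peaks-over-threshold approach) and inverts it for a desired false-alarm rate. It is a hedged stand-in, not the dissertation's method: the tail fraction `tail_frac` is a free choice, and `pfa` must be smaller than that fraction for the inversion to make sense.

```python
import numpy as np
from scipy.stats import genpareto

def evt_threshold(scores, pfa, tail_frac=0.02):
    """Threshold for a desired false-alarm rate pfa (< tail_frac), obtained by
    fitting a generalized Pareto distribution to the upper tail of the scores."""
    scores = np.sort(np.asarray(scores, dtype=float))
    u = scores[int((1.0 - tail_frac) * len(scores))]     # tail cut-off
    exceedances = scores[scores > u] - u
    c, _, scale = genpareto.fit(exceedances, floc=0.0)   # peaks-over-threshold fit
    # P(score > t) ~= tail_frac * (1 - F_GPD(t - u)); solve for t at P = pfa.
    return u + genpareto.ppf(1.0 - pfa / tail_frac, c, loc=0.0, scale=scale)

# Example: threshold = evt_threshold(detector_scores, pfa=1e-4)
```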

    Open-target sparse sensing of biological agents using DNA microarray

    Background: Current biosensors are designed to target and react to specific nucleic acid sequences or structural epitopes. These 'target-specific' platforms require creation of new physical capture reagents when new organisms are targeted. An 'open-target' approach to DNA microarray biosensing is proposed and substantiated using laboratory-generated data. The microarray consisted of 12,900 25 bp oligonucleotide capture probes derived from a statistical model trained on randomly selected genomic segments of pathogenic prokaryotic organisms. Open-target detection of organisms was accomplished using a reference library of hybridization patterns for three test organisms whose DNA sequences were not included in the design of the microarray probes.
    Results: A multivariate mathematical model based on partial least squares regression (PLSR) was developed to detect the presence of three test organisms in mixed samples. When all 12,900 probes were used, the model correctly detected the signature of the three test organisms in all mixed samples (mean R2 = 0.76, CI = 0.95), with a 6% false positive rate. A sampling algorithm was then developed to sparsely sample the probe space for a minimal number of probes required to capture the hybridization imprints of the test organisms. The PLSR detection model was capable of correctly identifying the presence of the three test organisms in all mixed samples using only 47 probes (mean R2 = 0.77, CI = 0.95) with nearly 100% specificity.
    Conclusions: We conceived an 'open-target' approach to biosensing, and hypothesized that a relatively small, non-specifically designed DNA microarray is capable of identifying the presence of multiple organisms in mixed samples. Coupled with a mathematical model applied to laboratory-generated data, and sparse sampling of capture probes, the prototype microarray platform was able to capture the signature of each organism in all mixed samples with high sensitivity and specificity. It was demonstrated that this new approach to biosensing closely follows the principles of sparse sensing.
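    The detection model in the abstract is built on PLSR; the toy sketch below shows the same pattern with scikit-learn on synthetic data. The probe count, the linear mixing of hypothetical hybridization signatures, and the 0.5 decision cut-off are illustrative assumptions, not values from the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_probes, n_organisms = 40, 200, 3        # toy sizes, not the 12,900-probe array
W = rng.normal(size=(n_organisms, n_probes))         # hypothetical probe signatures
Y = rng.integers(0, 2, size=(n_samples, n_organisms)).astype(float)  # presence labels
X = Y @ W + 0.1 * rng.normal(size=(n_samples, n_probes))             # mixed-sample intensities

pls = PLSRegression(n_components=5)
pls.fit(X[:30], Y[:30])                              # train on the first 30 mixtures
scores = pls.predict(X[30:])                         # continuous score per organism
detected = scores > 0.5                              # illustrative decision rule
```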

    Robust Bayesian target detection algorithm for depth imaging from sparse single-photon data

    This paper presents a new Bayesian model and associated algorithm for depth and intensity profiling using full waveforms from time-correlated single-photon counting (TCSPC) measurements in the limit of very low photon counts (i.e., typically less than 20 photons per pixel). The model represents each Lidar waveform as an unknown constant background level which, in the presence of a target, is combined with a known impulse response weighted by the target intensity, and is finally corrupted by Poisson noise. The joint target detection and depth imaging problem is expressed as a pixel-wise model selection and estimation problem which is solved using Bayesian inference. Prior knowledge about the problem is embedded in a hierarchical model that describes the dependence structure between the model parameters while accounting for their constraints. In particular, Markov random fields (MRFs) are used to model the joint distribution of the background levels and of the target presence labels, which are both expected to exhibit significant spatial correlations. An adaptive Markov chain Monte Carlo algorithm including reversible-jump updates is then proposed to compute the Bayesian estimates of interest. This algorithm is equipped with a stochastic optimization adaptation mechanism that automatically adjusts the parameters of the MRFs by maximum marginal likelihood estimation. Finally, the benefits of the proposed methodology are demonstrated through a series of experiments using real data.
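    As a concrete reading of the observation model, the sketch below evaluates the Poisson log-likelihood of a single pixel's TCSPC histogram under a shifted, scaled impulse response plus a constant background. It is a pixel-wise illustration only; the hierarchical priors, MRFs, and reversible-jump MCMC machinery of the paper are not implemented, and all argument names are placeholders.

```python
import numpy as np
from scipy.stats import poisson

def pixel_loglik(counts, irf, depth_bin, intensity, background):
    """Poisson log-likelihood of one TCSPC histogram under the model
    counts[t] ~ Poisson(background + intensity * irf[t - depth_bin])."""
    T = len(counts)
    shifted = np.zeros(T)
    idx = np.arange(T) - depth_bin
    valid = (idx >= 0) & (idx < len(irf))
    shifted[valid] = irf[idx[valid]]                 # impulse response at the candidate depth
    lam = background + intensity * shifted
    return poisson.logpmf(counts, lam).sum()

# 'Background only' alternative for the same pixel (illustrative model choice):
# poisson.logpmf(counts, background).sum()
```

    In the paper, this comparison between 'target present' and 'background only' models is carried out jointly across the image within the Bayesian hierarchy rather than independently per pixel.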