
    Hyperspectral Unmixing Overview: Geometrical, Statistical, and Sparse Regression-Based Approaches

    Imaging spectrometers measure electromagnetic energy scattered in their instantaneous field of view in hundreds or thousands of spectral channels with higher spectral resolution than multispectral cameras. Imaging spectrometers are therefore often referred to as hyperspectral cameras (HSCs). Higher spectral resolution enables material identification via spectroscopic analysis, which facilitates countless applications that require identifying materials in scenarios unsuitable for classical spectroscopic analysis. Due to the low spatial resolution of HSCs, microscopic material mixing, and multiple scattering, spectra measured by HSCs are mixtures of the spectra of the materials in a scene. Thus, accurate estimation requires unmixing: pixels are assumed to be mixtures of a few materials, called endmembers, and unmixing involves estimating all or some of the number of endmembers, their spectral signatures, and their abundances at each pixel. Unmixing is a challenging, ill-posed inverse problem because of model inaccuracies, observation noise, environmental conditions, endmember variability, and data set size. Researchers have devised and investigated many models in search of robust, stable, tractable, and accurate unmixing algorithms. This paper presents an overview of unmixing methods from the time of Keshava and Mustard's unmixing tutorial [1] to the present. Mixing models are discussed first. Signal-subspace, geometrical, statistical, sparsity-based, and spatial-contextual unmixing algorithms are then described, along with the underlying mathematical problems and potential solutions. Algorithm characteristics are illustrated experimentally. Comment: This work has been accepted for publication in the IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing.
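The linear mixing model underlying most of the surveyed methods treats each pixel spectrum as a non-negative combination of endmember signatures. As a hedged illustration only (the endmember matrix, pixel spectrum, and noise level below are made up), a minimal sketch of abundance estimation with non-negative least squares, a standard building block in linear unmixing:

```python
# Minimal sketch of linear-mixture abundance estimation (illustrative values only).
# Model: y ≈ E @ a, with a >= 0 (and, ideally, sum(a) == 1).
import numpy as np
from scipy.optimize import nnls

bands, n_endmembers = 5, 3
E = np.random.rand(bands, n_endmembers)         # hypothetical endmember signatures (bands x endmembers)
a_true = np.array([0.6, 0.3, 0.1])              # hypothetical true abundances
y = E @ a_true + 0.01 * np.random.randn(bands)  # observed mixed pixel with noise

a_hat, _ = nnls(E, y)                           # non-negativity-constrained least squares
a_hat /= a_hat.sum()                            # crude renormalization toward sum-to-one
print("estimated abundances:", np.round(a_hat, 3))
```

Full unmixing pipelines additionally estimate the endmember matrix itself, via geometrical, statistical, or sparse-regression approaches, which is where most of the surveyed algorithms differ.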

    Interpretable Hyperspectral AI: When Non-Convex Modeling meets Hyperspectral Remote Sensing

    Hyperspectral imaging, also known as image spectrometry, is a landmark technique in geoscience and remote sensing (RS). In the past decade, enormous effort has gone into processing and analyzing these hyperspectral (HS) products, mainly by seasoned experts. With the ever-growing volume of data, however, the cost in manpower and material resources creates pressure to reduce the burden of manual labor and improve efficiency, so there is an urgent need for more intelligent and automatic approaches to the various HS RS applications. Machine learning (ML) tools based on convex optimization have successfully handled numerous artificial intelligence (AI)-related tasks, but their ability to handle complex practical problems remains limited, particularly for HS data, owing to the spectral variability introduced during HS imaging and to the complexity and redundancy of high-dimensional HS signals. Compared with convex models, non-convex modeling, which can characterize more complex real scenes and provide model interpretability both technically and theoretically, has proven to be a feasible way to narrow the gap between challenging HS vision tasks and currently advanced intelligent data processing models.
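To make the convex/non-convex distinction concrete, a common illustration (my own, not taken from this paper) is sparse unmixing of a pixel y against an endmember library E: an ℓ1 penalty gives a convex problem, while an ℓq quasi-norm with 0 < q < 1 gives a non-convex surrogate closer to ℓ0:

```latex
% Illustrative only: convex vs. non-convex sparse regression for a pixel y,
% endmember library E, and abundance vector a >= 0.
\begin{align}
  \text{convex:}\quad
    & \min_{a \ge 0} \tfrac{1}{2}\,\lVert y - E a \rVert_2^2
      + \lambda \lVert a \rVert_1 \\
  \text{non-convex:}\quad
    & \min_{a \ge 0} \tfrac{1}{2}\,\lVert y - E a \rVert_2^2
      + \lambda \lVert a \rVert_q^q, \qquad 0 < q < 1
\end{align}
```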

    Graph-based Data Modeling and Analysis for Data Fusion in Remote Sensing

    Hyperspectral imaging provides increased sensitivity and discrimination over traditional imaging methods by combining standard digital imaging with spectroscopic methods. For each individual pixel in a hyperspectral image (HSI), a continuous spectrum is sampled as the spectral reflectance/radiance signature to facilitate identification of ground cover and surface material. This abundant spectral knowledge allows all available information in the data to be mined and supports wide-ranging applications such as mineral exploration, agriculture monitoring, and ecological surveillance.
    Processing massive high-dimensional HSI datasets is a challenge, since many data processing techniques have a computational complexity that grows exponentially with dimension. Moreover, an HSI dataset may contain a limited number of degrees of freedom due to the high correlations between data points and among the spectra. At the same time, relying only on the sampled spectrum of an individual HSI data point may produce inaccurate results because of the mixed nature of raw HSI data, such as mixed pixels and optical interference.
    Fusion strategies are widely adopted in data processing to achieve better performance, especially for classification and clustering. There are three main types: low-level data fusion, intermediate-level feature fusion, and high-level decision fusion. Low-level data fusion combines multi-source data that are expected to be complementary or cooperative. Intermediate-level feature fusion selects and combines features to remove redundant information. High-level decision fusion exploits a set of classifiers to provide more accurate results. These strategies have wide applications, including HSI data processing. With the rapid development of multiple remote sensing modalities, e.g. Very High Resolution (VHR) optical sensors and LiDAR, fusion of multi-source data can in principle produce more detailed information than any single source. In addition to the abundant spectral information contained in HSI data, features such as texture and shape may be employed to represent data points from a spatial perspective, and feature fusion also includes removing redundant and noisy features from the dataset.
    One of the major problems in machine learning and pattern recognition is developing appropriate representations for complex nonlinear data. In HSI processing, a data point is usually described as a vector whose coordinates correspond to the intensities measured in the spectral bands. This vector representation permits linear and nonlinear transformations from linear algebra to find alternative representations of the data. More generally, HSI is multi-dimensional in nature, and the vector representation may lose contextual correlations. Tensor representation provides a more sophisticated modeling technique and a higher-order generalization of linear subspace analysis. In graph theory, data points can be generalized as nodes whose connectivities are measured from the proximity of a local neighborhood. The graph-based framework efficiently characterizes the relationships among the data and allows convenient mathematical manipulation in many applications, such as data clustering, feature extraction, feature selection, and data alignment.
    In this thesis, graph-based approaches to multi-source feature and data fusion in remote sensing are explored. We mainly investigate the fusion of spatial, spectral, and LiDAR information with linear and multilinear algebra under a graph-based framework for data clustering and classification problems.
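As context for the graph-based framework described above, the following minimal sketch (illustrative only; the pixel data and parameter choices are made up) builds a k-nearest-neighbor graph over HSI pixel spectra and forms the graph Laplacian that graph-based clustering and feature-extraction methods typically operate on:

```python
# Illustrative k-NN graph construction over hyperspectral pixel spectra.
# Nodes are pixels; edge weights come from a Gaussian kernel on spectral distance.
import numpy as np
from scipy.spatial.distance import cdist

def knn_graph(X, k=10, sigma=1.0):
    """X: (n_pixels, n_bands) array of spectra. Returns a symmetric affinity matrix W."""
    D = cdist(X, X)                          # pairwise spectral distances
    W = np.zeros_like(D)
    for i in range(X.shape[0]):
        nn = np.argsort(D[i])[1:k + 1]       # k nearest neighbors, skipping self
        W[i, nn] = np.exp(-D[i, nn] ** 2 / (2 * sigma ** 2))
    return np.maximum(W, W.T)                # symmetrize

X = np.random.rand(200, 64)                  # hypothetical 200 pixels with 64 bands
W = knn_graph(X, k=10, sigma=0.5)
L = np.diag(W.sum(axis=1)) - W               # unnormalized graph Laplacian
print(L.shape)
```

Multi-source fusion can then enter, for example, by combining affinity matrices built separately from spectral, spatial, and LiDAR features before clustering or classification.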

    A convex model for non-negative matrix factorization and dimensionality reduction on physical space

    A collaborative convex framework for factoring a data matrix X into a non-negative product AS, with a sparse coefficient matrix S, is proposed. We restrict the columns of the dictionary matrix A to coincide with certain columns of the data matrix X, thereby guaranteeing a physically meaningful dictionary and dimensionality reduction. We use ℓ1,∞ regularization to select the dictionary from the data and show this leads to an exact convex relaxation of ℓ0 in the case of distinct noise-free data. We also show how to relax the restriction-to-X constraint by initializing an alternating minimization approach with the solution of the convex model, obtaining a dictionary close to but not necessarily in X. We focus on applications of the proposed framework to hyperspectral endmember and abundance identification and also show an application to blind source separation of NMR data. Comment: 14 pages, 9 figures. EE and JX were supported by NSF grants DMS-0911277 and PRISM-0948247, MM by the German Academic Exchange Service (DAAD), SO and MM by NSF grants DMS-0835863, DMS-0914561, and DMS-0914856 and ONR grant N00014-08-1119, and GS was supported by NSF, NGA, ONR, ARO, DARPA, and NSSEFF.
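As a rough sketch of the type of self-dictionary model the abstract describes (the notation, weight λ, and exact constraints below are my own paraphrase, not taken verbatim from the paper), the dictionary is restricted to the columns of X itself, and an ℓ1,∞ row-sparsity penalty encourages only a few of those columns to be used:

```latex
% Paraphrased sketch of a self-dictionary convex NMF model with l_{1,\infty}
% row-sparsity; lambda, S, and the exact constraints are illustrative.
\begin{equation}
  \min_{S \ge 0}\;
    \tfrac{1}{2}\,\lVert X - X S \rVert_F^2
    + \lambda \sum_{i} \max_{j} S_{ij}
\end{equation}
% Rows of S driven entirely to zero correspond to columns of X that are not
% selected as dictionary elements (endmembers).
```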

    Hyperspectral Remote Sensing Data Analysis and Future Challenges


    Exploitation of Intra-Spectral Band Correlation for Rapid Feature Selection and Target Identification in Hyperspectral Imagery

    This research extends the work produced by Capt. Robert Johnson for detecting target pixels within hyperspectral imagery (HSI). The methodology replaces Principal Components Analysis for dimensionality reduction with a clustering algorithm that associates spectral rather than spatial dimensions. By grouping similar spectral dimensions, the assumption of no a priori knowledge of the relationship between clustered members can be eliminated, and clusters are formed from highly correlated adjacent spectral bands. Following dimensionality reduction, Independent Components Analysis (ICA) is used to perform feature extraction. Kurtosis and Potential Target Fraction are added to Maximum Component Score and Potential Target Signal to Noise Ratio as mechanisms for discriminating between target and non-target maps. A new methodology exploiting Johnson’s Maximum Distance Secant Line method replaces the first zero bin method for identifying the breakpoint between signal and noise. A parameter known as Left Partial Kurtosis is defined and applied to determine when target pixels are likely to be found in the left tail of each signal histogram. A variable control over the number of iterations of Adaptive Iterative Noise filtering is introduced. Results of this modified algorithm are compared to those of Johnson’s AutoGAD [2007].
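The band-grouping idea described above, clustering adjacent spectral bands by their correlation, can be sketched roughly as follows (thresholds, data, and the mean-band representative are illustrative assumptions, not the thesis's actual algorithm):

```python
# Rough sketch: group adjacent spectral bands whose correlation exceeds a threshold,
# then represent each group by its mean band (illustrative, not the thesis algorithm).
import numpy as np

def group_correlated_bands(cube, threshold=0.95):
    """cube: (rows, cols, bands) HSI array. Returns a list of band-index groups."""
    X = cube.reshape(-1, cube.shape[-1])          # pixels x bands
    corr = np.corrcoef(X, rowvar=False)           # band-to-band correlation matrix
    groups, current = [], [0]
    for b in range(1, X.shape[1]):
        if corr[b, b - 1] >= threshold:           # adjacent bands highly correlated
            current.append(b)
        else:
            groups.append(current)
            current = [b]
    groups.append(current)
    return groups

cube = np.random.rand(50, 50, 30)                 # hypothetical 50x50 scene, 30 bands
groups = group_correlated_bands(cube, threshold=0.9)
reduced = np.stack([cube[..., g].mean(axis=-1) for g in groups], axis=-1)
print(len(groups), reduced.shape)
```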

    Total Variation and Signature-Based Regularizations on Coupled Nonnegative Matrix Factorization for Data Fusion
