
    GPU Parallel Implementation of Dual-Depth Sparse Probabilistic Latent Semantic Analysis for Hyperspectral Unmixing

    Hyperspectral unmixing (HU) is an important task for remotely sensed hyperspectral (HS) data exploitation. It comprises the identification of pure spectral signatures (endmembers) and of their corresponding fractional abundances in each pixel of the HS data cube. Several methods have been developed for (semi-)supervised and automatic identification of endmembers and abundances. Recently, the statistical dual-depth sparse probabilistic latent semantic analysis (DEpLSA) method was developed to tackle the HU problem as a latent topic-based approach in which endmembers and abundances are estimated simultaneously, according to the semantics encapsulated by the latent topic space. However, statistical models usually lead to computationally demanding algorithms, and the computational time of DEpLSA is often too high for practical use, in particular when the dimensionality of the HS data cube is large. To mitigate this limitation, this article resorts to graphics processing units (GPUs) to provide a new parallel version of DEpLSA, developed using the NVIDIA Compute Unified Device Architecture (CUDA). Our experimental results, conducted on four well-known HS datasets and two different GPU architectures (GTX 1080 and Tesla P100), show that our parallel versions of DEpLSA and the traditional pLSA approach provide accurate HU results fast enough for practical use, accelerating the corresponding serial versions by at least 30x on the GTX 1080 and by up to 147x on the Tesla P100. These significant acceleration factors grow with the image size, opening the possibility of fast processing of massive HS data repositories.
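The latent-topic formulation above can be pictured with a plain pLSA EM loop. The following is a minimal NumPy sketch of generic pLSA applied to a (pixels x bands) matrix, where topics play the role of endmembers and P(topic|pixel) the abundances; it is illustrative only, not the paper's DEpLSA or its CUDA implementation, and all names are hypothetical:

```python
import numpy as np

def plsa_unmix(X, n_topics, n_iter=50, seed=0):
    """Generic pLSA EM sketch. X: (pixels x bands) nonnegative data.
    Returns A = P(topic|pixel) (abundance-like) and B = P(band|topic)
    (endmember-like), each row-normalized to sum to one."""
    rng = np.random.default_rng(seed)
    n_pix, n_bands = X.shape
    B = rng.random((n_topics, n_bands))
    B /= B.sum(axis=1, keepdims=True)
    A = rng.random((n_pix, n_topics))
    A /= A.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        # E-step: responsibilities P(topic | pixel, band)
        R = A[:, :, None] * B[None, :, :]          # (pix, topics, bands)
        R /= R.sum(axis=1, keepdims=True) + 1e-12
        # M-step: reweight responsibilities by the observed intensities
        W = X[:, None, :] * R                      # expected counts
        A = W.sum(axis=2)
        A /= A.sum(axis=1, keepdims=True) + 1e-12
        B = W.sum(axis=0)
        B /= B.sum(axis=1, keepdims=True) + 1e-12
    return A, B
```

The E-step tensor makes the per-pixel, per-band work embarrassingly parallel, which is what makes this family of algorithms a natural fit for a GPU port.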

    Mineral identification using data-mining in hyperspectral infrared imagery

    The geological applications of hyperspectral infrared imagery mainly consist of mineral identification, mapping, airborne or portable instruments, and core logging. Finding mineral indicators offers considerable benefits for mineralogy and mineral exploration, which usually involve portable instruments and core logging. Moreover, the development of faster and more mechanized systems increases the precision of identifying mineral indicators and avoids possible misclassification. The objective of this thesis was therefore to create a tool that uses hyperspectral infrared imagery and processes the data through image analysis and machine learning methods to identify small mineral grains used as mineral indicators. Such a system could be applied in different circumstances as an assistant for geological analysis and mineral exploration. The experiments were conducted in laboratory conditions in the long-wave infrared (7.7 μm to 11.8 μm, LWIR), with a LWIR macro lens (to improve spatial resolution), an Infragold plate, and a heating source. The process began with a method to calculate the continuum removal: Non-negative Matrix Factorization (NMF) is applied to extract a rank-1 factorization and estimate the down-welling radiance, which is then compared with other conventional methods. The results indicate successful suppression of the continuum from the spectra, enabling the spectra to be compared with spectral libraries. Afterwards, toward an automated system, supervised and unsupervised approaches were tested for the identification of pyrope, olivine, and quartz grains.
The results indicated that the unsupervised approach was more suitable because of its independence from the training stage. Once these results were obtained, two algorithms were tested to create False Color Composites (FCC) using a clustering approach. The results of this comparison indicate significant computational efficiency (more than 20 times faster) and promising performance for mineral identification. Finally, the reliability of automated LWIR hyperspectral mineral identification was tested, and the difficulty of identifying irregular grain surfaces along with mineral aggregates was verified. The results were compared to two different Ground Truths (GT), rigid-GT and observed-GT, for quantitative evaluation; observed-GT increased the accuracy by up to 1.5 times over rigid-GT. The samples were also examined by Micro X-ray Fluorescence (XRF) and Scanning Electron Microscopy (SEM) in order to retrieve information on the mineral aggregates and the grain surfaces (biotite, epidote, goethite, diopside, smithsonite, tourmaline, kyanite, scheelite, pyrope, olivine, and quartz). The XRF imagery results were compared with automatic mineral identification techniques using ArcGIS, showed promising performance for automatic identification, and were used for GT validation. Overall, the four methods in this thesis (1. continuum removal; 2. classification or clustering for mineral identification; 3. two algorithms for clustering of mineral spectra; 4. reliability verification) represent beneficial methodologies for identifying minerals. These methods have the advantages of being non-destructive, relatively accurate, and of low computational complexity, which might qualify them for identifying and assessing mineral grains in laboratory conditions or in the field.
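The rank-1 NMF continuum estimation described above can be sketched with standard multiplicative update rules. This is a hedged NumPy illustration of the general idea (divide out the rank-1 background), not the thesis's exact procedure, and the function name is hypothetical:

```python
import numpy as np

def rank1_continuum(S, n_iter=200, eps=1e-9):
    """Rank-1 NMF sketch: factor the (pixels x bands) spectra matrix
    S ~ w @ h with standard multiplicative updates, treat the rank-1
    reconstruction as the continuum, and divide it out of each spectrum."""
    n, m = S.shape
    w = np.full((n, 1), S.mean() ** 0.5)
    h = np.full((1, m), S.mean() ** 0.5)
    for _ in range(n_iter):
        h *= (w.T @ S) / (w.T @ w @ h + eps)   # update H with W fixed
        w *= (S @ h.T) / (w @ h @ h.T + eps)   # update W with H fixed
    continuum = w @ h
    return S / (continuum + eps), continuum
```

For rank-1 data the multiplicative updates reduce to exact alternating least squares, so the continuum-removed spectra of a purely rank-1 input are flat (all ones).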

    Hyperspectral Image Unmixing Incorporating Adjacency Information

    While the spectral information contained in hyperspectral images is rich, their spatial resolution is in many cases very low. Many pixel spectra are mixtures of pure materials' spectra and therefore need to be decomposed into their constituents. This work investigates new decomposition methods that take into account spectral, spatial, and global 3D adjacency information, allowing for faster and more accurate decomposition results.
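The decomposition of a mixed pixel into its constituents is usually posed under the linear mixing model, x ≈ E a with nonnegative abundances a. The following NumPy sketch solves a single pixel by projected gradient descent with a sum-to-one renormalization; it is a generic baseline for illustration, not this work's adjacency-aware method:

```python
import numpy as np

def unmix_pixel(E, x, n_iter=500):
    """Linear mixing model sketch: pixel spectrum x (bands,) ~ E @ a,
    with E the (bands x endmembers) signature matrix and a >= 0 the
    fractional abundances. Projected gradient with a 1/L step size,
    then renormalization so the abundances sum to one."""
    a = np.full(E.shape[1], 1.0 / E.shape[1])
    step = 1.0 / np.linalg.norm(E.T @ E, 2)   # Lipschitz constant of the gradient
    for _ in range(n_iter):
        a = np.clip(a - step * (E.T @ (E @ a - x)), 0.0, None)
    s = a.sum()
    return a / s if s > 0 else a
```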

    Advances in Hyperspectral Image Classification Methods for Vegetation and Agricultural Cropland Studies

    Hyperspectral data are becoming more widely available via sensors on airborne and unmanned aerial vehicle (UAV) platforms, as well as proximal platforms. While space-based hyperspectral data continue to be limited in availability, multiple spaceborne Earth-observing missions on traditional platforms are scheduled for launch, and companies are experimenting with small satellites for constellations to observe the Earth, as well as for planetary missions. Land cover mapping via classification is one of the most important applications of hyperspectral remote sensing and will increase in significance as time series of imagery are more readily available. However, while the narrow bands of hyperspectral data provide new opportunities for chemistry-based modeling and mapping, challenges remain. Hyperspectral data are high dimensional, and many bands are highly correlated or irrelevant for a given classification problem. For supervised classification methods, the quantity of training data is typically limited relative to the dimension of the input space. The resulting Hughes phenomenon, often referred to as the curse of dimensionality, increases potential for unstable parameter estimates, overfitting, and poor generalization of classifiers. This is particularly problematic for parametric approaches such as Gaussian maximum likelihood-based classifiers that have been the backbone of pixel-based multispectral classification methods. This issue has motivated investigation of alternatives, including regularization of the class covariance matrices, ensembles of weak classifiers, development of feature selection and extraction methods, adoption of nonparametric classifiers, and exploration of methods to exploit unlabeled samples via semi-supervised and active learning. Data sets are also quite large, motivating computationally efficient algorithms and implementations.
This chapter provides an overview of the recent advances in classification methods for mapping vegetation using hyperspectral data. Three data sets that are used in the hyperspectral classification literature (e.g., Botswana Hyperion satellite data and AVIRIS airborne data over both Kennedy Space Center and Indian Pines) are described in Section 3.2 and used to illustrate methods described in the chapter. An additional high-resolution hyperspectral data set acquired by a SpecTIR sensor on an airborne platform over the Indian Pines area is included to exemplify the use of new deep learning approaches, and a multiplatform example of airborne hyperspectral data is provided to demonstrate transfer learning in hyperspectral image classification. Classical approaches for supervised and unsupervised feature selection and extraction are reviewed in Section 3.3. In particular, nonlinearities exhibited in hyperspectral imagery have motivated development of nonlinear feature extraction methods in manifold learning, which are outlined in Section 3.3.1.4. Spatial context is also important in classification of both natural vegetation with complex textural patterns and large agricultural fields with significant local variability within fields. Approaches to exploit spatial features at both the pixel level (e.g., co-occurrence-based texture and extended morphological attribute profiles [EMAPs]) and integration of segmentation approaches (e.g., HSeg) are discussed in this context in Section 3.3.2. Recently, classification methods that leverage nonparametric methods originating in the machine learning community have grown in popularity. An overview of both widely used and newly emerging approaches, including support vector machines (SVMs), Gaussian mixture models, and deep learning based on convolutional neural networks is provided in Section 3.4.
Strategies to exploit unlabeled samples, including active learning and metric learning, which combine feature extraction and augmentation of the pool of training samples in an active learning framework, are outlined in Section 3.5. Integration of image segmentation with classification to accommodate spatial coherence typically observed in vegetation is also explored, including as an integrated active learning system. Exploitation of multisensor strategies for augmenting the pool of training samples is investigated via a transfer learning framework in Section 3.5.1.2. Finally, we look to the future, considering opportunities soon to be provided by new paradigms, as hyperspectral sensing is becoming common at multiple scales from ground-based and airborne autonomous vehicles to manned aircraft and space-based platforms.
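The Hughes phenomenon described above is typically countered by reducing dimensionality before training a classifier on scarce labeled samples. As a generic illustration of the feature-extraction step (not one of the chapter's specific methods), a PCA projection via SVD:

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project (samples x bands) data onto the top principal components.
    Centering + SVD: rows of Vt are the principal directions, ordered by
    decreasing explained variance."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```

A classifier trained on the reduced features needs far fewer labeled samples to estimate stable parameters than one trained on hundreds of correlated bands.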

    A Comprehensive Survey on Particle Swarm Optimization Algorithm and Its Applications

    Particle swarm optimization (PSO) is a heuristic global optimization method, proposed originally by Kennedy and Eberhart in 1995. It is now one of the most commonly used optimization techniques. This survey presents a comprehensive investigation of PSO. On the one hand, we review advances in PSO, including its modifications (quantum-behaved PSO, bare-bones PSO, chaotic PSO, and fuzzy PSO), population topologies (fully connected, von Neumann, ring, star, random, etc.), hybridizations (with genetic algorithms, simulated annealing, Tabu search, artificial immune systems, ant colony algorithms, artificial bee colony, differential evolution, harmony search, and biogeography-based optimization), extensions (to multiobjective, constrained, discrete, and binary optimization), theoretical analysis (parameter selection and tuning, and convergence analysis), and parallel implementations (multicore, multiprocessor, GPU, and cloud computing). On the other hand, we survey applications of PSO in the following fields: electrical and electronic engineering, automation control systems, communication theory, operations research, mechanical engineering, fuel and energy, medicine, chemistry, and biology. We hope this survey will be beneficial for researchers studying PSO algorithms.
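The canonical PSO loop that all of the surveyed variants build on is compact. A minimal NumPy sketch follows; the inertia weight and acceleration coefficients (w=0.7, c1=c2=1.5) are common textbook defaults, not values prescribed by this survey:

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, n_iter=200, seed=0,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Canonical global-best PSO: each particle's velocity mixes inertia,
    attraction to its personal best, and attraction to the swarm best."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))      # positions
    v = np.zeros((n_particles, dim))                 # velocities
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()             # global best
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()
```

Swapping the global best `g` for a neighborhood best is all it takes to obtain the ring, von Neumann, and other topologies the survey discusses.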

    Mapping urban surface materials using imaging spectroscopy data

    Mapping urban surface materials is challenging due to complex spatial patterns.
Data from imaging spectrometers can capture detailed spectral features of surface materials through fine and continuous sampling of the electromagnetic spectrum, which cannot be achieved with the same accuracy using multispectral or RGB images. To date, numerous studies of urban surface material mapping have used data from airborne imaging spectrometers with high spatial resolution, demonstrating their potential and providing good results. Compared to these sensors, spaceborne imaging spectrometers offer regional or global coverage and high repeatability, and avoid expensive, time-consuming, and labor-intensive flight campaigns. However, the spatial resolution of current spaceborne imaging spectroscopy data (also known as hyperspectral data) is about 30 m, resulting in a mixed-pixel problem that is challenging to handle with conventional mapping approaches. The main objective of this study is to map urban surface materials with imaging spectroscopy data at different spatial scales, while exploring the information content of these data to detect the chemical and physical properties of surface materials and taking the mixed-pixel problem into account. Specifically, this thesis aims to (1) map solar photovoltaic modules using airborne imaging spectroscopy data based on their spectral features; (2) investigate the sampling robustness of urban material gradients; and (3) analyze the area transferability of urban material gradients.
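For the per-pixel material matching this kind of work relies on, the Spectral Angle Mapper (SAM) is a common baseline: it compares a pixel spectrum to a reference material spectrum by the angle between them, which makes it insensitive to illumination-driven scaling. SAM is shown here purely as an illustration, not as the thesis's own method:

```python
import numpy as np

def spectral_angle(s, r):
    """Angle (radians) between pixel spectrum s and reference spectrum r.
    Small angles indicate similar materials; scaling s or r by a positive
    constant leaves the angle unchanged."""
    cos = np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))
```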

    Fusion of data from different satellite sensors for monitoring water quality in coastal areas: application to the coastline of the PACA region

    Monitoring coastal areas requires a good spatial resolution, a good spectral resolution associated with a good signal-to-noise ratio, and a good temporal resolution to visualize rapid changes in water color. Sensors available now, and even those planned for the near future, do not provide good spatial, spectral AND temporal resolution at the same time. In this study, we are interested in the fusion of images from two future sensors that are both part of the Copernicus program of the European Space Agency: MSI on Sentinel-2 and OLCI on Sentinel-3. As MSI and OLCI do not yet provide images, it was necessary to simulate them; for this we used hyperspectral images from the HICO sensor. We then proposed three methods: an adaptation of the ARSIS method to the fusion of multispectral images (ARSIS), a fusion method based on non-negative tensor factorization (Tensor), and a fusion method based on matrix inversion (Inversion). These three methods were first evaluated using statistical parameters computed between the fused images and the "perfect" image, as well as on the estimation results for biophysical parameters obtained by minimizing the radiative transfer model in water.
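One way to picture a matrix-inversion style fusion, as a hedged sketch only (this is not the paper's actual "Inversion" method, and the function name is hypothetical): learn a least-squares linear map between the two sensors' band spaces at their shared coarse resolution, then apply it to the fine-resolution pixels.

```python
import numpy as np

def fuse_by_regression(ms_coarse, hs_coarse, ms_fine):
    """Learn a linear band mapping T from the high-spatial sensor's b bands
    to the high-spectral sensor's B bands at the shared coarse resolution
    (least squares), then apply T at fine resolution.
    Shapes: ms_coarse (N x b), hs_coarse (N x B), ms_fine (M x b)."""
    T, *_ = np.linalg.lstsq(ms_coarse, hs_coarse, rcond=None)
    return ms_fine @ T      # (M x B) fused product
```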

    Change Detection Methods for Remote Sensing in the Last Decade: A Comprehensive Review

    Change detection is an essential and widely utilized task in remote sensing that aims to detect and analyze changes occurring in the same geographical area over time, which has broad applications in urban development, agricultural surveys, and land cover monitoring. Detecting changes in remote sensing images is a complex challenge due to various factors, including variations in image quality, noise, registration errors, illumination changes, complex landscapes, and spatial heterogeneity. In recent years, deep learning has emerged as a powerful tool for feature extraction and addressing these challenges. Its versatility has resulted in its widespread adoption for numerous image-processing tasks. This paper presents a comprehensive survey of significant advancements in change detection for remote sensing images over the past decade. We first introduce some preliminary knowledge for the change detection task, such as problem definition, datasets, evaluation metrics, and transformer basics, and provide a detailed taxonomy of existing algorithms from three different perspectives: algorithm granularity, supervision modes, and frameworks, in the Methodology section. This survey enables readers to gain systematic knowledge of change detection tasks from various angles. We then summarize the state-of-the-art performance on several dominant change detection datasets, providing insights into the strengths and limitations of existing algorithms. Based on our survey, some future research directions for change detection in remote sensing are identified. This survey sheds some light on the topic for the community and will inspire further research efforts in the change detection task.
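Before the deep learning methods this survey focuses on, classical change detection often reduced to thresholding the per-pixel spectral change magnitude between two co-registered acquisitions. A minimal sketch of that baseline (threshold rule is a common heuristic, not taken from the survey):

```python
import numpy as np

def change_map(img_t1, img_t2, k=2.0):
    """Change-vector baseline: per-pixel magnitude of the spectral
    difference between two (H x W x bands) images, flagged as changed
    when it exceeds mean + k standard deviations."""
    diff = np.linalg.norm(img_t2.astype(float) - img_t1.astype(float), axis=-1)
    thr = diff.mean() + k * diff.std()
    return diff > thr       # boolean (H x W) change mask
```

Registration errors and illumination changes directly corrupt `diff`, which is exactly why the survey's learned feature representations outperform this baseline.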

    Advances in Possibilistic Clustering with Application to Hyperspectral Image Processing

    Clustering is a well-established data analysis methodology that has been used extensively in various fields of application over the last decades. The main focus of this thesis is a well-known family of cost-function-optimization-based clustering algorithms, the Possibilistic C-Means (PCM) algorithms. Specifically, the shortcomings of PCM algorithms are exposed, and novel batch and online PCM schemes are proposed to cope with them. These schemes rely on (i) the adaptation of certain parameters that remain fixed during the execution of the original PCMs and (ii) the adoption of sparsity. The incorporation of these two characteristics renders the proposed schemes (a) capable, in principle, of revealing the true number of physical clusters formed by the data; (b) capable of uncovering the underlying clustering structure even in demanding cases where the physical clusters are located close to each other and/or differ significantly in variance and/or density; and (c) robust to the presence of noise and outliers. Moreover, theoretical results concerning the convergence of the proposed algorithms, also applicable to the classical PCMs, are provided. The potential of the proposed methods is demonstrated via extensive experimentation on both synthetic and real data sets. In addition, they have been successfully applied to the challenging problem of clustering in HyperSpectral Images (HSIs). Finally, a feature selection technique suitable for HSIs has also been developed.
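The classical PCM iteration that this thesis extends can be sketched in a few lines. Note that the scale parameters `eta` are held fixed here, which is precisely the limitation the proposed adaptive schemes address; this is an illustrative sketch, not the thesis code:

```python
import numpy as np

def pcm(X, centers, eta, m=2.0, n_iter=50):
    """Classical Possibilistic C-Means sketch. Typicalities
    u_ij = 1 / (1 + (d_ij^2 / eta_j)^(1/(m-1))), centers as u^m-weighted
    means. X: (n x d) data, centers: (c x d) initial prototypes,
    eta: (c,) fixed scale parameters."""
    C = np.asarray(centers, dtype=float).copy()
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)   # (n x c)
        U = 1.0 / (1.0 + (d2 / eta) ** (1.0 / (m - 1.0)))
        Um = U ** m
        C = (Um.T @ X) / (Um.sum(axis=0)[:, None] + 1e-12)
    return U, C
```

Unlike fuzzy c-means, the typicalities of a point are not forced to sum to one across clusters, which is what makes PCM robust to outliers but also sensitive to the choice of `eta`.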