Spectral-spatial classification of hyperspectral images: three tricks and a new supervised learning setting
Spectral-spatial classification of hyperspectral images has been the subject
of many studies in recent years. In the presence of only very few labeled
pixels, this task becomes challenging. In this paper we address the following
two research questions: 1) Can a simple neural network with just a single
hidden layer achieve state-of-the-art performance in the presence of few
labeled pixels? 2) How is the performance of hyperspectral image classification
methods affected when using disjoint train and test sets? We give a positive
answer to the first question by using three tricks within a very basic shallow
Convolutional Neural Network (CNN) architecture: a tailored loss function, and
smooth- and label-based data augmentation. The tailored loss function enforces
that neighboring wavelengths contribute similarly to the features
generated during training. A newly proposed label-based technique favors the
selection of pixels from smaller classes, which is beneficial in the presence of
very few labeled pixels and skewed class distributions. To address the second
question, we introduce a new sampling procedure that generates disjoint train
and test sets. The train set is used to fit the CNN model, which is then
applied to pixels in the test set to estimate their labels. We assess the
efficacy of the simple neural network method on five publicly available
hyperspectral images. On these images our method significantly outperforms
the considered baselines. Notably, with just 1% of labeled pixels per class, our
method achieves accuracies ranging from 86.42% on the most challenging dataset
to 99.52% on the easiest one. Furthermore, we show that the simple neural
network method improves over the other baselines in the new, more challenging
supervised setting. Our analysis substantiates the highly beneficial effect of
using the entire image (i.e., both train and test data) when constructing a
model.
Comment: Remote Sensing 201
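The label-based selection idea described above can be sketched as inverse-class-frequency sampling, a minimal sketch of one plausible weighting; the paper's exact scheme may differ:

```python
import numpy as np

def inverse_frequency_probs(labels):
    # Sample each pixel with probability inversely proportional to the
    # size of its class, so that every class is drawn equally often
    # overall and small classes are favored per pixel.
    _, inverse, counts = np.unique(labels, return_inverse=True,
                                   return_counts=True)
    p = 1.0 / counts[inverse]
    return p / p.sum()

rng = np.random.default_rng(0)
# Toy skewed label distribution: 90 / 9 / 1 pixels per class.
labels = np.array([0] * 90 + [1] * 9 + [2] * 1)
p = inverse_frequency_probs(labels)
draw = rng.choice(labels, size=3000, p=p)  # heavily re-samples classes 1 and 2
```

Under this scheme every class receives the same total sampling mass, so each pixel of a rare class is selected far more often than a pixel of a common one.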
Kernel Multivariate Analysis Framework for Supervised Subspace Learning: A Tutorial on Linear and Kernel Multivariate Methods
Feature extraction and dimensionality reduction are important tasks in many
fields of science dealing with signal processing and analysis. The relevance of
these techniques is increasing as current sensory devices are developed with
ever higher resolution, and problems involving multimodal data sources become
more common. A plethora of feature extraction methods are available in the
literature collectively grouped under the field of Multivariate Analysis (MVA).
This paper provides a uniform treatment of several methods: Principal Component
Analysis (PCA), Partial Least Squares (PLS), Canonical Correlation Analysis
(CCA) and Orthonormalized PLS (OPLS), as well as their non-linear extensions
derived by means of the theory of reproducing kernel Hilbert spaces. We also
review their connections to other methods for classification and statistical
dependence estimation, and introduce some recent developments to deal with the
extreme cases of large-scale and low-sized problems. To illustrate the wide
applicability of these methods in both classification and regression problems,
we analyze their performance in a benchmark of publicly available data sets,
and pay special attention to specific real applications involving audio
processing for music genre prediction and hyperspectral satellite images for
Earth and climate monitoring.
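As a minimal illustration of the linear-versus-kernel contrast that the tutorial develops, the following numpy sketch implements PCA and a kernelized counterpart; the RBF kernel and its `gamma` value are illustrative choices, not the tutorial's specific setup:

```python
import numpy as np

def pca(X, k):
    """Linear PCA: project onto the top-k eigenvectors of the
    sample covariance matrix."""
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / (len(X) - 1)
    vals, vecs = np.linalg.eigh(C)          # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]
    return Xc @ vecs[:, order]

def kernel_pca(X, k, gamma=1.0):
    """Kernel PCA with an RBF kernel: eigendecompose the doubly
    centered Gram matrix instead of the covariance."""
    sq = (X ** 2).sum(axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = len(X)
    H = np.eye(n) - np.ones((n, n)) / n     # centering matrix
    vals, vecs = np.linalg.eigh(H @ K @ H)
    order = np.argsort(vals)[::-1][:k]
    # Training-sample projections: sqrt(lambda) times eigenvector entries.
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

X = np.random.default_rng(1).standard_normal((20, 5))
Z_lin, Z_ker = pca(X, 2), kernel_pca(X, 2, gamma=0.5)
```

The kernel variant never forms features explicitly: all computation goes through the Gram matrix, which is what allows the same recipe to extend PLS, CCA, and OPLS as well.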
Advances in Hyperspectral Image Classification: Earth monitoring with statistical learning methods
Hyperspectral images show similar statistical properties to natural grayscale
or color photographic images. However, the classification of hyperspectral
images is more challenging because of the very high dimensionality of the
pixels and the small number of labeled examples typically available for
learning. These peculiarities lead to particular signal processing problems,
mainly characterized by indetermination and complex manifolds. The framework of
statistical learning has gained popularity in the last decade. New methods have
been presented to account for the spatial homogeneity of images, to include
user interaction via active learning, to take advantage of the manifold
structure with semisupervised learning, to extract and encode invariances, or
to adapt classifiers and image representations to unseen yet similar scenes.
This tutorial reviews the main advances in hyperspectral remote sensing image
classification through illustrative examples.
Comment: IEEE Signal Processing Magazine, 201
Regularization and Kernelization of the Maximin Correlation Approach
Robust classification becomes challenging when each class consists of
multiple subclasses. Examples include multi-font optical character recognition
and automated protein function prediction. In correlation-based
nearest-neighbor classification, the maximin correlation approach (MCA)
provides the worst-case optimal solution by minimizing the maximum
misclassification risk through an iterative procedure. Despite the optimality,
the original MCA has drawbacks that have limited its wide applicability in
practice. That is, the MCA tends to be sensitive to outliers, cannot
effectively handle nonlinearities in datasets, and suffers from having high
computational complexity. To address these limitations, we propose an improved
solution, named regularized maximin correlation approach (R-MCA). We first
reformulate MCA as a quadratically constrained linear programming (QCLP)
problem, incorporate regularization by introducing slack variables in the
primal problem of the QCLP, and derive the corresponding Lagrangian dual. The
dual formulation enables us to apply the kernel trick to R-MCA so that it can
better handle nonlinearities. Our experimental results demonstrate that the
regularization and kernelization make the proposed R-MCA more robust and
accurate for various classification tasks than the original MCA. Furthermore,
when the data size or dimensionality grows, R-MCA runs substantially faster by
solving either the primal or dual (whichever has a smaller variable dimension)
of the QCLP.
Comment: Submitted to IEEE Access
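The quantity MCA optimizes can be sketched in a few lines of numpy. This is only the worst-case correlation objective and a class-mean baseline template; the iterative QCLP solver and the kernelization from the paper are not reproduced here:

```python
import numpy as np

def maximin_correlation(template, members):
    """Worst-case (minimum) correlation between a class template and
    the exemplars of that class's subclasses."""
    t = template / np.linalg.norm(template)
    M = members / np.linalg.norm(members, axis=1, keepdims=True)
    return float((M @ t).min())

def mean_template(members):
    # Baseline template: the class mean. MCA instead searches for the
    # template that maximizes the minimum correlation above.
    m = members.mean(axis=0)
    return m / np.linalg.norm(m)

subclasses = np.array([[1.0, 0.0], [0.0, 1.0]])   # two toy subclasses
worst = maximin_correlation(mean_template(subclasses), subclasses)
```

For these two orthogonal subclasses the mean template correlates equally (at 1/sqrt(2)) with both, whereas a template aligned with either single subclass has worst-case correlation 0; MCA makes this trade-off optimal in general.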