Hyper-Spectral Image Analysis with Partially-Latent Regression and Spatial Markov Dependencies
Hyper-spectral data can be analyzed to recover physical properties at large
planetary scales. This involves solving inverse problems, which can be
addressed within machine learning, with the advantage that, once a relationship
between physical parameters and spectra has been established in a data-driven
fashion, the learned relationship can be used to estimate physical parameters
for new hyper-spectral observations. Within this framework, we propose a
spatially-constrained and partially-latent regression method which maps
high-dimensional inputs (hyper-spectral images) onto low-dimensional responses
(physical parameters such as the local chemical composition of the soil). The
proposed regression model comprises two key features. Firstly, it combines a
Gaussian mixture of locally-linear mappings (GLLiM) with a partially-latent
response model. While the former makes high-dimensional regression tractable,
the latter makes it possible to deal with physical parameters that cannot be observed or,
more generally, with data contaminated by experimental artifacts that cannot be
explained with noise models. Secondly, spatial constraints are introduced in
the model through a Markov random field (MRF) prior which provides a spatial
structure to the Gaussian-mixture hidden variables. Experiments conducted on a
database composed of remotely sensed observations of Mars collected by the
Mars Express orbiter demonstrate the effectiveness of the
proposed model.
Comment: 12 pages, 4 figures, 3 tables
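The inverse-regression idea described above (a Gaussian mixture of locally-linear mappings used to recover low-dimensional parameters from high-dimensional spectra) can be sketched as follows. This is a minimal illustration on synthetic data, assuming plain Gaussian conditioning per mixture component; the paper's partially-latent responses and MRF prior are omitted, and all names, dimensions, and data are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic stand-in: low-dim parameter t -> high-dim "spectrum" x.
N, D = 500, 20
t = rng.uniform(0.0, 1.0, size=(N, 1))                  # physical parameter
A = rng.normal(size=(D, 1))                             # random mixing
x = np.tanh(t @ A.T) + 0.05 * rng.normal(size=(N, D))   # noisy nonlinear spectra

# Fit a GMM on the joint (x, t): each component induces one affine map.
gmm = GaussianMixture(n_components=5, covariance_type="full",
                      random_state=0).fit(np.hstack([x, t]))

def predict_t(x_new):
    """E[t | x]: responsibility-weighted Gaussian conditionals,
    i.e. a mixture of locally-linear (affine) inverse mappings."""
    x_new = np.atleast_2d(x_new)
    log_r = np.empty((x_new.shape[0], gmm.n_components))
    cond = []
    for k in range(gmm.n_components):
        mu, S = gmm.means_[k], gmm.covariances_[k]
        mu_x, mu_t = mu[:D], mu[D:]
        Sxx, Sxt = S[:D, :D], S[:D, D:]
        # Component responsibility from the marginal p(x | k).
        log_r[:, k] = (np.log(gmm.weights_[k])
                       + multivariate_normal.logpdf(x_new, mu_x, Sxx))
        # Conditional mean is affine in x: the locally-linear mapping.
        cond.append(mu_t + (x_new - mu_x) @ np.linalg.solve(Sxx, Sxt))
    r = np.exp(log_r - log_r.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    return sum(r[:, [k]] * cond[k] for k in range(gmm.n_components))

t_hat = predict_t(x)
print("mean abs error:", float(np.abs(t_hat - t).mean()))
```

Fitting the mixture on the joint (x, t) and conditioning on x is what keeps the high-dimensional regression tractable: each component only ever solves a linear system, never a global nonlinear fit.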
LWPR: A Scalable Method for Incremental Online Learning in High Dimensions
Locally weighted projection regression (LWPR) is a new algorithm for incremental nonlinear
function approximation in high-dimensional spaces with redundant and irrelevant input dimensions.
At its core, it employs nonparametric regression with locally linear models. In order to stay
computationally efficient and numerically robust, each local model performs the regression
analysis with a small number of univariate regressions in selected directions in input space, in
the spirit of partial least squares regression. We discuss when and how local learning techniques
can successfully work in high-dimensional spaces and compare various techniques for local
dimensionality reduction before finally deriving the LWPR algorithm. The properties of LWPR are
that it i) learns rapidly with second-order learning methods based on incremental training,
ii) uses statistically sound stochastic leave-one-out cross validation for learning without the
need to memorize training data, iii) adjusts its weighting kernels based only on local
information in order to minimize the danger of negative interference of incremental learning,
iv) has a computational complexity that is linear in the number of inputs, and v) can deal with
a large number of (possibly redundant) inputs, as shown in various empirical evaluations with up
to 50-dimensional data sets. For a probabilistic interpretation, predictive variance and
confidence intervals are derived. To our knowledge, LWPR is the first truly incremental,
spatially localized learning method that can successfully and efficiently operate in very
high-dimensional spaces.
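The core idea behind LWPR (predictions formed from local linear models weighted by receptive fields) can be sketched with a toy locally weighted regression. This is an assumption-laden simplification: the receptive-field centers and bandwidth are fixed here, whereas the actual algorithm adapts them incrementally and replaces the full weighted least-squares solve with partial-least-squares projections.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-3.0, 3.0, size=(200, 1))      # 1-D inputs for illustration
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

centers = np.linspace(-3.0, 3.0, 10)[:, None]  # fixed receptive-field centers
bandwidth = 0.5                                # fixed kernel width (assumed)

def lwr_predict(x_query):
    """Blend local linear fits, one per Gaussian receptive field."""
    preds, weights = [], []
    for c in centers:
        # Gaussian activation of this receptive field on the training data.
        w = np.exp(-0.5 * ((X - c) ** 2).sum(axis=1) / bandwidth**2)
        # Weighted least squares for the local model y ~ b0 + b1 * (x - c).
        Phi = np.hstack([np.ones_like(X), X - c])
        W = np.diag(w)
        beta = np.linalg.solve(Phi.T @ W @ Phi, Phi.T @ W @ y)
        # Activation of this field at the query points.
        wq = np.exp(-0.5 * ((x_query - c) ** 2) / bandwidth**2)
        preds.append(beta[0] + beta[1] * (x_query - c))
        weights.append(wq)
    preds, weights = np.array(preds), np.array(weights)
    # Normalized, activation-weighted blend of the local predictions.
    return (weights * preds).sum(axis=0) / weights.sum(axis=0)

print(lwr_predict(np.array([0.0, 1.5])))  # should track sin at the queries
```

The property that matters for LWPR is that each local model is cheap and only sees data its kernel activates, which is what makes incremental, linear-in-inputs updates possible in the full algorithm.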
Support vector machine for functional data classification
In many applications, input data are sampled functions taking their values in
infinite-dimensional spaces rather than standard vectors. This has important
consequences for data analysis algorithms and motivates modifications of them.
In fact, most of the traditional data analysis tools for regression,
classification and clustering have been adapted to functional inputs under the
general name of Functional Data Analysis (FDA). In this paper, we investigate
the use of Support Vector Machines (SVMs) for functional data analysis and we
focus on the problem of curve discrimination. SVMs are large-margin
classifiers based on implicit nonlinear mappings of the considered data into
high-dimensional spaces via kernels. We show how to define simple kernels that
take into account the functional nature of the data and lead to consistent
classification. Experiments conducted on real world data emphasize the benefit
of taking into account some functional aspects of the problems.
Comment: 13 pages
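One simple way a kernel can "take into account the functional nature of the data" is to compare curves through their derivatives rather than their raw sample values. The sketch below uses an RBF kernel on finite differences with scikit-learn's SVC; this is an illustrative choice of functional kernel, not necessarily the one studied in the paper, and the data, frequencies, and gamma value are all assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
grid = np.linspace(0.0, 1.0, 50)   # common sampling grid for all curves

def make_curves(freq, n):
    """Noisy sinusoids of a given frequency with random phases."""
    phase = rng.uniform(0.0, 2.0 * np.pi, size=(n, 1))
    return (np.sin(2.0 * np.pi * freq * grid + phase)
            + 0.05 * rng.normal(size=(n, 50)))

# Two classes of sampled functions differing in frequency.
X = np.vstack([make_curves(1, 60), make_curves(2, 60)])
y = np.repeat([0, 1], 60)

def derivative_rbf(A, B, gamma=1.0):
    """RBF kernel between first differences (discrete derivatives)
    of the sampled curves: a crude functional kernel."""
    dA, dB = np.diff(A, axis=1), np.diff(B, axis=1)
    d2 = ((dA[:, None, :] - dB[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

clf = SVC(kernel=derivative_rbf).fit(X, y)   # SVC accepts a callable kernel
print("train accuracy:", clf.score(X, y))
```

Because the kernel is a function of the curves' derivatives, it is sensitive to shape rather than to pointwise values, which is the kind of functional structure a plain vector RBF kernel ignores.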