Representation of Functional Data in Neural Networks
Functional Data Analysis (FDA) is an extension of traditional data analysis to functional data, such as spectra, temporal series, spatio-temporal images, and gesture recognition data. Functional data are rarely observed directly in practice; usually only a regular or irregular sampling is available. For this reason, some processing is needed in order to benefit from the smooth character of functional data in the analysis methods. This paper shows how to extend the Radial-Basis Function Network (RBFN) and Multi-Layer Perceptron (MLP) models to functional data inputs, in particular when the latter are known through lists of input-output pairs. Various possibilities for functional processing are discussed, including projection on smooth bases, Functional Principal Component Analysis, functional centering and reduction, and the use of differential operators. It is shown how to incorporate this functional processing into the RBFN and MLP models. The functional approach is illustrated on a benchmark of spectrometric data analysis.
Comment: Also available online from: http://www.sciencedirect.com/science/journal/0925231
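
A minimal sketch of the kind of functional preprocessing described above, assuming curves sampled on a common grid: functional centering and Functional Principal Component Analysis reduce each curve to a small score vector, which is then fed to a standard MLP. The data, variable names, and component count are illustrative assumptions, not taken from the paper.

    # Sketch: functional PCA as preprocessing for an MLP on sampled curves.
    # Assumes each row of X is one curve sampled on a common regular grid;
    # names (X, y, n_components) are illustrative, not from the paper.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def functional_pca(X, n_components=5):
        """Project centered sampled curves onto their leading principal functions."""
        mean_curve = X.mean(axis=0)
        Xc = X - mean_curve                      # functional centering
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        basis = Vt[:n_components]                # discretized principal functions
        scores = Xc @ basis.T                    # finite-dimensional FPCA scores
        return scores, basis, mean_curve

    # X: (n_curves, n_grid) sampled spectra, y: (n_curves,) responses (placeholders)
    X = np.random.rand(100, 200)
    y = np.random.rand(100)
    scores, basis, mean_curve = functional_pca(X, n_components=5)
    mlp = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000).fit(scores, y)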
Piecewise linear regularized solution paths
We consider the generic regularized optimization problem $\hat{\beta}(\lambda) = \arg\min_{\beta} \sum_{i=1}^{n} L(y_i, x_i^{\top}\beta) + \lambda J(\beta)$. Efron, Hastie, Johnstone and Tibshirani [Ann. Statist. 32 (2004) 407--499] have shown that for the LASSO--that is, if $L$ is squared error loss and $J(\beta)$ is the $\ell_1$ norm of $\beta$--the optimal coefficient path $\hat{\beta}(\lambda)$ is piecewise linear, that is, $\partial \hat{\beta}(\lambda)/\partial \lambda$ is piecewise constant. We derive a general characterization of the properties of (loss $L$, penalty $J$) pairs which give piecewise linear coefficient paths. Such pairs
allow for efficient generation of the full regularized coefficient paths. We
investigate the nature of efficient path following algorithms which arise. We
use our results to suggest robust versions of the LASSO for regression and
classification, and to develop new, efficient algorithms for existing problems
in the literature, including Mammen and van de Geer's locally adaptive
regression splines.
Comment: Published at http://dx.doi.org/10.1214/009053606000001370 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
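
As a concrete illustration of a piecewise linear path, the sketch below generates the full LASSO coefficient path with the LARS algorithm as implemented in scikit-learn; it shows the special case discussed above, not the paper's general (loss, penalty) path-following algorithm, and the data are synthetic placeholders.

    # Sketch: the LASSO coefficient path is piecewise linear in the regularization
    # parameter, so the whole path is determined by its breakpoints (LARS).
    import numpy as np
    from sklearn.linear_model import lars_path

    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 10))
    beta_true = np.zeros(10)
    beta_true[:3] = [2.0, -1.5, 1.0]
    y = X @ beta_true + 0.1 * rng.standard_normal(50)

    # alphas: breakpoints of the path; coefs[:, k]: coefficients at alphas[k].
    # Between consecutive breakpoints the coefficients vary linearly in alpha.
    alphas, active, coefs = lars_path(X, y, method="lasso")
    print(alphas.shape, coefs.shape)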
Periodic Splines and Gaussian Processes for the Resolution of Linear Inverse Problems
This paper deals with the resolution of inverse problems in a periodic setting or, in other terms, the reconstruction of periodic continuous-domain signals from their noisy measurements. We focus on two reconstruction paradigms: variational and statistical. In the variational approach, the reconstructed signal is the solution to an optimization problem that establishes a tradeoff between fidelity to the data and smoothness conditions via a quadratic regularization associated with a linear operator. In the statistical approach, the signal is modeled as a stationary random process defined from a Gaussian white noise and a whitening operator; one then looks for the optimal estimator in the mean-square sense. We give a generic form of the reconstructed signals for both approaches, allowing for a rigorous comparison of the two. We fully characterize the conditions under which the two formulations yield the same solution, which is a periodic spline in the case of sampling measurements. We also show through simulations that this equivalence between the two approaches remains valid for a broad class of problems. This extends the practical range of applicability of the variational method.
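
A minimal sketch of the variational paradigm in the periodic setting, assuming direct sampling (denoising) measurements on a uniform grid and a second-derivative regularization operator, so that the quadratic problem decouples across Fourier coefficients; the operator, grid, and regularization weight are illustrative choices rather than the paper's general formulation.

    # Sketch: periodic variational reconstruction with quadratic regularization,
    # solved coefficient-wise in the Fourier domain. Assumes identity sampling
    # on a uniform grid and an L = d^2/dt^2 regularization operator.
    import numpy as np

    def periodic_tikhonov_denoise(y, lam=1e-3):
        n = y.size
        omega = 2.0 * np.pi * np.fft.fftfreq(n)     # discrete frequencies
        L_hat = -omega ** 2                          # symbol of d^2/dt^2
        y_hat = np.fft.fft(y)
        # Minimizer of ||f - y||^2 + lam * ||L f||^2, per Fourier coefficient:
        f_hat = y_hat / (1.0 + lam * np.abs(L_hat) ** 2)
        return np.real(np.fft.ifft(f_hat))

    t = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
    y = np.sin(3 * t) + 0.2 * np.random.default_rng(1).standard_normal(t.size)
    f = periodic_tikhonov_denoise(y, lam=1e-3)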
L1 Control Theoretic Smoothing Splines
In this paper, we propose control theoretic smoothing splines with L1 optimality for reducing the number of parameters that describe the fitted curve as well as for removing outlier data. A control theoretic spline is a smoothing spline that is generated as an output of a given linear dynamical system. Conventional design requires exactly as many basis functions as data points, and the result is not robust against outliers. To solve these problems, we propose to use L1 optimality, that is, we use the L1 norm for the regularization term and/or the empirical risk term. The optimization is formulated as a convex problem, which can be solved efficiently with numerical optimization software. A numerical example shows the effectiveness of the proposed method.
Comment: Accepted for publication in IEEE Signal Processing Letters. 4 pages (two-column), 5 figures
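
A simplified sketch in the spirit of the L1 optimality described above: an L1 empirical risk (robust to outliers) combined with an L1 penalty on second differences (few effective parameters), posed as a convex program with cvxpy. This is a plain L1 curve-fitting analogue, not the paper's control theoretic spline construction, and all names and weights are illustrative.

    # Sketch: L1 empirical risk + L1 regularization for robust, sparse curve fitting.
    import numpy as np
    import cvxpy as cp

    t = np.linspace(0.0, 1.0, 100)
    y = np.sin(2 * np.pi * t)
    y[::17] += 3.0                                   # inject a few outliers

    D = np.diff(np.eye(t.size), n=2, axis=0)         # second-difference operator
    f = cp.Variable(t.size)
    lam = 5.0
    objective = cp.Minimize(cp.norm1(y - f) + lam * cp.norm1(D @ f))
    cp.Problem(objective).solve()
    fitted = f.value                                 # smoothed, outlier-robust curve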
Representing functional data in Reproducing Kernel Hilbert Spaces with applications to clustering and classification
Functional data are difficult to handle with many traditional statistical techniques, given their very high (or intrinsically infinite) dimensionality. The reason is that functional data are essentially functions, while most algorithms are designed to work with (low) finite-dimensional vectors. Within this context we propose techniques to obtain finite-dimensional representations of functional data. The key idea is to consider each functional curve as a point in a general function space and then project these points onto a Reproducing Kernel Hilbert Space with the aid of regularization theory. In this work we describe the projection method, analyze its theoretical properties, and propose a model selection procedure to select appropriate Reproducing Kernel Hilbert Spaces onto which to project the functional data.
Keywords: functional data, Reproducing Kernel Hilbert Spaces, regularization theory
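
A minimal sketch of the finite-dimensional representation idea, assuming curves sampled on a common grid: each curve is projected onto a Gaussian-kernel RKHS via a regularized linear system, and the resulting coefficient vectors are clustered. The kernel, bandwidth, and regularization weight are illustrative assumptions, not the paper's model selection procedure.

    # Sketch: represent each sampled curve by the coefficients of its regularized
    # RKHS projection (Gaussian kernel), then cluster the representations.
    import numpy as np
    from sklearn.cluster import KMeans

    def rkhs_coefficients(curve, t, sigma=0.1, lam=1e-2):
        """Coefficients of the regularized RKHS fit of one sampled curve."""
        K = np.exp(-(t[:, None] - t[None, :]) ** 2 / (2.0 * sigma ** 2))
        return np.linalg.solve(K + lam * np.eye(t.size), curve)

    t = np.linspace(0.0, 1.0, 50)
    rng = np.random.default_rng(2)
    curves = np.vstack(
        [np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size) for _ in range(20)]
        + [np.cos(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size) for _ in range(20)]
    )

    reps = np.array([rkhs_coefficients(c, t) for c in curves])   # finite-dim. points
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(reps)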