Generalized Linear Models for Geometrical Current predictors. An application to predict garment fit
The aim of this paper is to model an ordinal response variable in terms
of vector-valued functional data contained in a vector-valued RKHS. In particular,
we focus on the vector-valued RKHS obtained when a geometrical object (body) is
characterized by a current and on the ordinal regression model. A common way to
solve this problem in functional data analysis is to express the data in the orthonormal
basis given by the decomposition of the covariance operator. However, our data differ from the usual functional data setting in two important respects. On the one
hand, they are vector-valued functions, and on the other, they are functions in an
RKHS with a previously defined norm. We propose to use three different bases: the
orthonormal basis given by the kernel that defines the RKHS, a basis obtained from
decomposition of the integral operator defined using the covariance function, and a
third basis that combines the previous two. The three approaches are compared and
applied to an interesting problem: building a model to predict the fit of children’s
garment sizes, based on a 3D database of the Spanish child population. Our proposal
has been compared with alternative classifiers (Support Vector Machine and
k-NN), and with the classification method proposed in this work applied to
different characterizations of the objects (landmarks and multivariate
anthropometric measurements instead of currents); in all these cases the
alternatives yield worse results.
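The basis construction described in the abstract can be illustrated with a minimal sketch. This is an assumption-laden toy, not the authors' pipeline: each object is stood in for by a plain coefficient vector, the RKHS kernel is an RBF kernel, and the covariance-operator decomposition is approximated by kernel PCA on the centered Gram matrix. The resulting scores are what would then feed an ordinal regression model.

```python
import numpy as np

# Hypothetical sketch: build the Gram matrix of the RKHS kernel over
# the observations, center it, and diagonalize to obtain the basis
# coming from the empirical covariance (integral) operator.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))            # stand-in coefficient vectors (illustrative)

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian RBF kernel; the actual kernel on currents differs.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf_kernel(X, X)                    # Gram matrix of the chosen kernel

# Center the Gram matrix and take the leading eigenvectors: these play
# the role of the covariance-operator basis in the abstract.
n = K.shape[0]
H = np.eye(n) - np.ones((n, n)) / n
vals, vecs = np.linalg.eigh(H @ K @ H)
order = np.argsort(vals)[::-1][:3]      # keep 3 leading components
scores = vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))
print(scores.shape)                     # → (60, 3)
```

In practice the number of retained components and the kernel itself would be chosen by cross-validation against the ordinal fit model.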
Learning Sets with Separating Kernels
We consider the problem of learning a set from random samples. We show how
relevant geometric and topological properties of a set can be studied
analytically using concepts from the theory of reproducing kernel Hilbert
spaces. A new kind of reproducing kernel, that we call separating kernel, plays
a crucial role in our study and is analyzed in detail. We prove a new analytic
characterization of the support of a distribution, that naturally leads to a
family of provably consistent regularized learning algorithms and we discuss
the stability of these methods with respect to random sampling. Numerical
experiments show that the approach is competitive, and often better, than other
state-of-the-art techniques.
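The idea of learning a set from samples can be sketched with a standard kernel construction in the spirit of the abstract: declare a point inside the support when its feature-map image lies close to the span of the training features, measured by a regularized distance. Everything below (the RBF kernel, the regularization level, the test points) is an illustrative assumption, not the paper's separating-kernel algorithm.

```python
import numpy as np

# Samples drawn from the unknown set (here, the unit square).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))

def rbf(A, B, gamma=4.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf(X, X)
lam = 1e-3                               # regularization (assumed value)
n = K.shape[0]
Kinv = np.linalg.inv(K + n * lam * np.eye(n))

def distance2(x):
    """Squared regularized distance from phi(x) to span{phi(x_i)}."""
    kx = rbf(x[None, :], X)[0]
    return 1.0 - kx @ Kinv @ kx          # k(x, x) = 1 for the RBF kernel

inside = distance2(np.array([0.0, 0.0]))   # point surrounded by samples
outside = distance2(np.array([5.0, 5.0]))  # point far from every sample
print(inside < outside)                    # → True
```

Thresholding `distance2` then yields a set estimate; the consistency and stability questions studied in the paper concern how such estimates behave as the sample grows and the regularization shrinks.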