221 research outputs found
Supervised classification for a family of Gaussian functional models
In the framework of supervised classification (discrimination) for functional
data, it is shown that the optimal classification rule can be explicitly
obtained for a class of Gaussian processes with "triangular" covariance
functions. This explicit knowledge has two practical consequences. First, the
consistency of the well-known nearest neighbors classifier (which is not
guaranteed in problems with functional data) is established for the
indicated class of processes. Second, and more importantly, parametric and
nonparametric plug-in classifiers can be obtained by estimating the unknown
elements in the optimal rule. The performance of these new plug-in classifiers
is checked, with positive results, through a simulation study and a real data
example. Comment: 30 pages, 6 figures, 2 tables
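The plug-in idea above starts from a nearest neighbors rule applied directly to curves. As a minimal sketch (not the paper's implementation), a k-NN classifier for functional data can be built by discretizing each curve on a common grid and voting among the nearest curves under the L2 distance; all function and parameter names here are illustrative:

```python
import numpy as np

def knn_classify_curves(train_curves, train_labels, test_curves, k=5):
    """Classify discretized curves by k-nearest neighbors under the L2 distance.

    Each row of `train_curves`/`test_curves` is assumed to be one function
    sampled on the same common grid.
    """
    train_curves = np.asarray(train_curves, dtype=float)
    test_curves = np.asarray(test_curves, dtype=float)
    labels = np.asarray(train_labels)
    preds = []
    for x in test_curves:
        # Squared L2 distances to every training curve (Riemann-sum proxy
        # for the integral of the squared difference).
        d2 = np.sum((train_curves - x) ** 2, axis=1)
        nearest = labels[np.argsort(d2)[:k]]
        # Majority vote among the k nearest curves.
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])
    return np.array(preds)
```

On a toy example with two well-separated groups of curves, the rule recovers the group of each test curve; the paper's point is that such consistency is not automatic in infinite dimension, but does hold for the Gaussian class studied there.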
An optimal transportation approach for assessing almost stochastic order
When stochastic dominance does not hold, agreement with stochastic order
can be improved by suitably trimming both distributions. In this work we
consider the Wasserstein distance to stochastic order of these trimmed
versions. Our characterization of that distance naturally leads to a
Wasserstein-based index of disagreement with stochastic order. We provide
asymptotic results that allow testing a null hypothesis whose rejection
gives a statistical guarantee of almost stochastic dominance. We include a
simulation study showing good performance of the index under the normal
model.
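In one dimension, the squared Wasserstein (W2) distance between two samples is the mean squared difference of their quantile functions, which makes an empirical version of a disagreement index easy to sketch. The sketch below is an illustration under assumptions, not the paper's exact definitions: it trims a fraction `alpha` from each tail and reports the share of the squared W2 distance contributed by order violations:

```python
import numpy as np

def disagreement_index(x, y, alpha=0.05):
    """Illustrative Wasserstein-based index of disagreement with stochastic
    order between 1-D samples `x` and `y`, after symmetric alpha-trimming.

    Returns 0 when the trimmed samples are stochastically ordered
    (x <=_st y) and 1 when the order is fully reversed.  The name,
    trimming scheme, and normalization are assumptions for this sketch,
    not the paper's exact index.
    """
    t = np.linspace(1e-6, 1 - 1e-6, 512)
    t = t[(t >= alpha) & (t <= 1 - alpha)]      # symmetric alpha-trimming
    qx = np.quantile(np.asarray(x, float), t)   # empirical quantile functions
    qy = np.quantile(np.asarray(y, float), t)
    diff = qx - qy
    total = np.mean(diff ** 2)                  # squared trimmed W2 distance
    if total == 0.0:
        return 0.0                              # identical samples: no disagreement
    # Part of the distance due to violations of x <=_st y.
    violation = np.mean(np.clip(diff, 0.0, None) ** 2)
    return violation / total
```

For a pair of samples where every quantile of `x` lies below the corresponding quantile of `y`, the index is 0; swapping the samples drives it to 1.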
The DD-classifier in the functional setting
The Maximum Depth classifier was the first attempt to use data depths,
instead of multivariate raw data, to construct a classification rule.
Recently, the DD-classifier has solved several serious limitations of the
Maximum Depth classifier, but some issues still remain. This paper is
devoted to extending the DD-classifier in the following ways: first, to
surpass the limitation of the DD-classifier when more than two groups are
involved; second, to apply regular classification methods (such as nearest
neighbors, linear or quadratic classifiers, or recursive partitioning) to
DD-plots, obtaining useful insights through the diagnostics of these
methods; and third, to integrate different sources of information (data
depths or multivariate functional data) in a unified way in the
classification procedure. Moreover, as the DD-classifier trick is
especially useful in the functional framework, the paper also provides an
enhanced review of several functional data depths. A simulation study and
applications to some classical real datasets are also provided, showing
the power of the new proposal. Comment: 29 pages, 6 figures, 6 tables,
Supplemental R Code and Data
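The DD-plot idea is to map each observation to its depth with respect to each group and then classify in that low-dimensional depth space. As a minimal multivariate sketch (the paper works with functional depths; here Mahalanobis depth stands in as a simple choice, and all names are illustrative):

```python
import numpy as np

def mahalanobis_depth(points, sample):
    """Mahalanobis depth of each row of `points` w.r.t. `sample`:
    D(x) = 1 / (1 + (x - mu)^T S^{-1} (x - mu))."""
    mu = sample.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(sample, rowvar=False))
    centered = points - mu
    # Quadratic form (x - mu)^T S^{-1} (x - mu), one value per row.
    md2 = np.einsum('ij,jk,ik->i', centered, S_inv, centered)
    return 1.0 / (1.0 + md2)

def dd_coordinates(points, group0, group1):
    """Map observations to DD-plot coordinates:
    (depth w.r.t. group 0, depth w.r.t. group 1)."""
    return np.column_stack([mahalanobis_depth(points, group0),
                            mahalanobis_depth(points, group1)])

def dd_classify(points, group0, group1):
    """Assign each point to the group in which it is deeper, i.e. split the
    DD-plot along the diagonal (the rule induced by the Maximum Depth
    classifier).  Any standard classifier could instead be trained on the
    2-D DD coordinates, which is the DD-classifier extension discussed."""
    dd = dd_coordinates(points, group0, group1)
    return (dd[:, 1] > dd[:, 0]).astype(int)
```

Replacing the diagonal split with, say, a nearest neighbors or recursive-partitioning rule fitted on the DD coordinates is exactly the kind of extension the abstract describes, and it also yields the usual diagnostics of those methods.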