
    Supervised classification for a family of Gaussian functional models

    In the framework of supervised classification (discrimination) for functional data, it is shown that the optimal classification rule can be obtained explicitly for a class of Gaussian processes with "triangular" covariance functions. This explicit knowledge has two practical consequences. First, the consistency of the well-known nearest neighbors classifier (which is not guaranteed in general for functional data) is established for the indicated class of processes. Second, and more importantly, parametric and nonparametric plug-in classifiers can be obtained by estimating the unknown elements of the optimal rule. The performance of these new plug-in classifiers is checked, with positive results, through a simulation study and a real data example.
    Comment: 30 pages, 6 figures, 2 tables
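    To make the nearest-neighbors setting concrete, here is a minimal sketch (not the paper's code) of kNN classification for discretized curves, assuming each curve is observed on a common, equally spaced grid so that the $L_2$ distance between curves is approximated by the Euclidean distance between the grid vectors. The Brownian-motion toy data stand in for a Gaussian process whose covariance $\min(s,t)$ is of the "triangular" type; all names and parameters below are illustrative assumptions.

```python
# Hedged sketch: kNN for functional data via discretization on a common grid.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)                # common observation grid

def brownian(n):
    """Approximate Brownian paths: covariance min(s, t), a "triangular" case."""
    steps = rng.normal(scale=np.sqrt(1.0 / t.size), size=(n, t.size))
    return np.cumsum(steps, axis=1)

X0 = brownian(50)                             # class 0: the base process
X1 = brownian(50) + 0.8 * t                   # class 1: same process plus a drift
X, y = np.vstack([X0, X1]), np.repeat([0, 1], 50)

# Euclidean distance on the grid approximates the L2 distance between curves
clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print("training accuracy:", clf.score(X, y))
```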

    An optimal transportation approach for assessing almost stochastic order

    When stochastic dominance $F \leq_{st} G$ does not hold, agreement with stochastic order can be improved by suitably trimming both distributions. In this work we consider the $L_2$-Wasserstein distance, $\mathcal{W}_2$, from these trimmed versions to stochastic order. Our characterization of that distance naturally leads to a $\mathcal{W}_2$-based index of disagreement with stochastic order, $\varepsilon_{\mathcal{W}_2}(F,G)$. We provide asymptotic results that allow testing $H_0: \varepsilon_{\mathcal{W}_2}(F,G) \geq \varepsilon_0$ vs. $H_a: \varepsilon_{\mathcal{W}_2}(F,G) < \varepsilon_0$; rejection would give a statistical guarantee of almost stochastic dominance. A simulation study shows good performance of the index under the normal model.
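    As a hedged illustration of how such an index might be computed empirically: on the real line, $\mathcal{W}_2$ is the $L_2$ distance between quantile functions, and $F \leq_{st} G$ amounts to $F^{-1}(u) \leq G^{-1}(u)$ for all $u$. The sketch below takes the part of the squared quantile difference that violates this order, relative to the total squared $\mathcal{W}_2$ distance; the paper's exact definition of $\varepsilon_{\mathcal{W}_2}(F,G)$ (with trimming) may differ, so treat the function name and formula as assumptions.

```python
# Hypothetical empirical proxy for a W2-based index of disagreement
# with stochastic order; illustrative, not the paper's definition.
import numpy as np

def w2_disagreement_index(x, y, grid=1000):
    """Share of the squared W2 distance due to violations of F <=_st G."""
    u = (np.arange(grid) + 0.5) / grid                      # quantile levels
    qx, qy = np.quantile(x, u), np.quantile(y, u)           # empirical F^-1, G^-1
    violation = np.mean(np.clip(qx - qy, 0.0, None) ** 2)   # where F^-1 > G^-1
    total = np.mean((qx - qy) ** 2)                         # squared W2 distance
    return violation / total if total > 0 else 0.0

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 5000)      # sample from F
y = rng.normal(0.3, 1.0, 5000)      # sample from G, roughly dominating F
print(w2_disagreement_index(x, y))  # near 0: close to stochastic order F <=_st G
```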

    The DD^G-classifier in the functional setting

    The maximum depth classifier was the first attempt to use data depths, instead of the multivariate raw data, to construct a classification rule. Recently, the DD-classifier has solved several serious limitations of the maximum depth classifier, but some issues remain. This paper extends the DD-classifier in three ways: first, to overcome its limitation when more than two groups are involved; second, to apply regular classification methods (such as kNN, linear or quadratic classifiers, recursive partitioning, ...) to DD-plots, obtaining useful insights through the diagnostics of these methods; and third, to integrate different sources of information (data depths or multivariate functional data) into the classification procedure in a unified way. Moreover, since the DD-classifier trick is especially useful in the functional framework, an enhanced review of several functional data depths is also given. A simulation study and applications to classical real datasets show the power of the new proposal.
    Comment: 29 pages, 6 figures, 6 tables, Supplemental R Code and Data
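    The DD-plot construction the abstract refers to can be sketched in a few lines: map every observation to its depth with respect to each group, then train any standard classifier on those depth coordinates. The sketch below uses the multivariate Mahalanobis depth and kNN purely for illustration; the paper works with functional depths and several classifiers, so the depth choice here is an assumption, not the authors' method.

```python
# Illustrative DD-plot classifier: kNN on per-group depth coordinates.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def mahalanobis_depth(points, sample):
    """Mahalanobis depth D(x) = 1 / (1 + (x - mu)' S^-1 (x - mu))."""
    mu = sample.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(sample, rowvar=False))
    d = points - mu
    return 1.0 / (1.0 + np.einsum("ij,jk,ik->i", d, S_inv, d))

rng = np.random.default_rng(2)
g0 = rng.normal(0.0, 1.0, size=(100, 2))      # group 0
g1 = rng.normal(1.5, 1.0, size=(100, 2))      # group 1
X, y = np.vstack([g0, g1]), np.repeat([0, 1], 100)

# DD-plot coordinates: one depth per group, so G groups give a G-dimensional plot
dd = np.column_stack([mahalanobis_depth(X, g0), mahalanobis_depth(X, g1)])
clf = KNeighborsClassifier(n_neighbors=7).fit(dd, y)
print("training accuracy in depth space:", clf.score(dd, y))
```

    With G groups the same construction yields G depth coordinates per observation, which is the multi-group extension the abstract mentions.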