
    Kernel methods in machine learning

    We review machine learning methods employing positive definite kernels. These methods formulate learning and estimation problems in a reproducing kernel Hilbert space (RKHS) of functions defined on the data domain, expanded in terms of a kernel. Working in linear spaces of functions has the benefit of facilitating the construction and analysis of learning algorithms while at the same time allowing large classes of functions. The latter include nonlinear functions as well as functions defined on nonvectorial data. We cover a wide range of methods, from binary classifiers to sophisticated techniques for estimation with structured data.
    Comment: Published at http://dx.doi.org/10.1214/009053607000000677 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
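
    To make the RKHS formulation concrete, the following is a minimal sketch of one such method, kernel ridge regression with a Gaussian kernel, in plain NumPy. The kernel choice, the regularization parameter lam, and the function names are illustrative assumptions rather than anything prescribed by the review; the form of the estimator, f(x) = sum_i alpha_i k(x_i, x), is the one given by the representer theorem.

        import numpy as np

        def rbf_kernel(X, Y, gamma=1.0):
            # Positive definite Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2).
            sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
            return np.exp(-gamma * sq_dists)

        def fit_kernel_ridge(X, y, lam=1e-2, gamma=1.0):
            # Representer theorem: the RKHS minimizer is f(x) = sum_i alpha_i k(x_i, x)
            # with coefficients alpha = (K + lam * n * I)^{-1} y.
            n = X.shape[0]
            K = rbf_kernel(X, X, gamma)
            return np.linalg.solve(K + lam * n * np.eye(n), y)

        def predict_kernel_ridge(X_train, alpha, X_new, gamma=1.0):
            # Evaluate f at new points using only kernel evaluations.
            return rbf_kernel(X_new, X_train, gamma) @ alpha

    Because the data enter only through kernel evaluations, replacing rbf_kernel with a positive definite kernel on strings or graphs extends the same solver to nonvectorial data, which is the flexibility the abstract highlights.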

    Multiclass Learning with Simplex Coding

    In this paper we discuss a novel framework for multiclass learning, defined by a suitable coding/decoding strategy, namely the simplex coding, which allows a relaxation approach commonly used in binary classification to be generalized to multiple classes. In this framework, a relaxation error analysis can be developed without constraints on the considered hypothesis class. Moreover, we show that in this setting it is possible to derive the first provably consistent regularized method with training/tuning complexity that is independent of the number of classes. Tools from convex analysis are introduced that can be used beyond the scope of this paper.
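
    As a rough illustration of the coding/decoding idea, the sketch below builds a simplex code in one standard way (centering and renormalizing the standard basis; this construction and the names simplex_code/decode are assumptions, not necessarily the paper's exact recipe): the T class labels are mapped to T unit vectors with pairwise inner product -1/(T-1), the vertices of a regular simplex, and a vector-valued prediction is decoded to the class whose code vector it aligns with best.

        import numpy as np

        def simplex_code(T):
            # T unit vectors whose pairwise inner products all equal -1/(T-1):
            # center the standard basis of R^T and renormalize the rows, which
            # then span a (T-1)-dimensional subspace (a regular simplex).
            C = np.eye(T) - np.full((T, T), 1.0 / T)
            return C / np.linalg.norm(C, axis=1, keepdims=True)

        def decode(scores, C):
            # Assign each vector-valued prediction (a row of `scores`) to the
            # class whose code vector gives the largest inner product.
            return np.argmax(scores @ C.T, axis=1)

    For T = 2 the two code vectors are antipodal, recovering the usual plus/minus-one coding of binary classification, which is why the binary relaxation machinery carries over.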

    Feasible Adaptation Criteria for Hybrid Wavelet–Large Margin Classifiers

    In the context of signal classification, this paper assembles and compares criteria for easily judging the discrimination quality of a set of feature vectors. The quality measures are based on the assumption that a Support Vector Machine is used for the final classification, so the ultimate criterion is a large margin separating the two classes. We apply the criteria to control the feature extraction process for signal classification. Adaptive features related to the shape of the signals are extracted by wavelet filtering followed by a nonlinear map. So that many candidate features can be tested, the criteria are cheap to compute while still reliably predicting classification performance. We also present a novel approach for computing the radius of a set of points in feature space. The radius, in relation to the margin, forms the most commonly used error bound for Support Vector Machines. For isotropic kernels, the problem of radius computation can be reduced to a common Support Vector Machine classification problem.
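
    The radius in question is that of the smallest ball enclosing the mapped points in feature space, and the radius-margin bound scales with R^2 divided by the squared margin. The paper's SVM-based reduction for isotropic kernels is not reproduced here; the sketch below instead solves the classical dual quadratic program for R^2 directly from the kernel matrix (the solver choice and the function name are assumptions).

        import numpy as np
        from scipy.optimize import minimize

        def radius_squared(K):
            # Smallest enclosing ball in feature space via the standard dual QP:
            #   R^2 = max_beta  sum_i beta_i * K_ii - beta^T K beta
            #   subject to beta_i >= 0 and sum_i beta_i = 1.
            n = K.shape[0]
            diag = np.diag(K)
            objective = lambda b: b @ K @ b - diag @ b  # negated for minimization
            constraint = {"type": "eq", "fun": lambda b: b.sum() - 1.0}
            beta0 = np.full(n, 1.0 / n)
            res = minimize(objective, beta0, bounds=[(0.0, None)] * n,
                           constraints=[constraint])
            return -res.fun

    For an isotropic kernel with k(x, x) = 1 for every x, the linear term is constant, so R^2 = 1 - min_beta beta^T K beta; it is this simplification that lets the radius computation be recast as a standard SVM-style problem.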

    A Primer on Kernel Methods
