
    On $\ell_p$-Support Vector Machines and Multidimensional Kernels

    In this paper, we extend the methodology developed for Support Vector Machines (SVM) using the $\ell_2$-norm ($\ell_2$-SVM) to the more general case of $\ell_p$-norms with $p\ge 1$ ($\ell_p$-SVM). The resulting primal and dual problems are formulated as mathematical programming problems; namely, in the primal case, as a second-order cone optimization problem and, in the dual case, as a polynomial optimization problem involving homogeneous polynomials. Scalability of the primal problem is obtained via general transformations based on the expansion of functionals in Schauder spaces. The concept of kernel function, widely applied in $\ell_2$-SVM, is extended to the more general case by defining a new operator called the multidimensional kernel. This object gives rise to reformulations of the dual problems, in a transformed space of the original data, which are solved by a moment-SDP based approach. The results of some computational experiments on real-world datasets are presented, showing rather good behavior in terms of standard indicators such as the \textit{accuracy index} and the ability to classify new data.
    Comment: 27 pages, 2 figures, 2 tables
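
    For orientation, a minimal sketch of a soft-margin primal of the $\ell_p$-SVM type described above (a standard hinge-loss formulation; the paper's exact constants, normalization, and cone reformulation may differ):
    \[
    \min_{w,\,b,\,\xi}\ \frac{1}{p}\,\|w\|_p^p \;+\; C\sum_{i=1}^{n}\xi_i
    \qquad \text{s.t.}\quad y_i\,(w^\top x_i + b)\ \ge\ 1-\xi_i,\quad \xi_i\ \ge\ 0,\quad i=1,\dots,n,
    \]
    where $(x_i,y_i)$ with $y_i\in\{-1,+1\}$ are the training data, $C>0$ is the regularization trade-off, and $p=2$ recovers the usual $\ell_2$-SVM. For rational $p\ge 1$ the $\|w\|_p^p$ term is second-order cone representable, which is consistent with the primal being posed as a second-order cone optimization problem.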

    Optimal arrangements of hyperplanes for multiclass classification

    In this paper, we present a novel approach to constructing multiclass classifiers by means of arrangements of hyperplanes. We propose different mixed-integer (linear and nonlinear) programming formulations for the problem, using extensions of widely used measures for misclassified observations, for which the \textit{kernel trick} can be adapted to be applicable. Some dimensionality reduction and variable-fixing strategies are also developed for these models. An extensive battery of experiments has been run, revealing the power of our proposal compared with other previously proposed methodologies.
    Comment: 8 figures, 2 tables
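
    As a rough sketch of the underlying geometric idea (not the paper's exact mixed-integer formulation): with $m$ hyperplanes $\mathcal{H}_j=\{x:\ w_j^\top x + b_j = 0\}$, each point is mapped to the sign pattern of the cell of the arrangement in which it lies, and a class label is assigned to each cell; the hyperplanes and cell labels are chosen to minimize a misclassification measure over the training set:
    \[
    x \ \longmapsto\ s(x) \;=\; \big(\operatorname{sgn}(w_1^\top x + b_1),\,\dots,\,\operatorname{sgn}(w_m^\top x + b_m)\big)\in\{-1,+1\}^m,
    \qquad \hat{y}(x) \;=\; c\big(s(x)\big),
    \]
    where $c:\{-1,+1\}^m\to\{1,\dots,K\}$ labels the cells of the arrangement; in a mixed-integer model, the assignment $c$ and the side of each hyperplane on which every training point falls are encoded with binary variables.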