
    A Fast Revised Simplex Method for SVM Training

    Active set methods for training Support Vector Machines (SVMs) are advantageous because they enable incremental training and, as we show in this research, do not exhibit the exponentially increasing training times commonly associated with decomposition methods as the SVM training parameter C is increased or the classification difficulty grows. Previous implementations of the active set method must contend with singularities, especially those associated with the linear kernel, and must compute directions of infinite descent, which can be inefficient, particularly as C is increased. In this research, we propose a revised simplex method for quadratic programming that guarantees a non-singular sub-problem, and we show how it can be adapted to SVM training. © 2008 IEEE
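
    The abstract treats SVM training as a quadratic program whose box constraints are governed by the parameter C. As a purely illustrative sketch of that formulation (not the paper's revised simplex / active set method), the snippet below sets up the standard linear-kernel SVM dual and hands it to a generic off-the-shelf solver; the toy data, the value C = 10, and the choice of scipy's SLSQP solver are all assumptions made for the example.

    # Illustrative only: the linear-kernel SVM dual as a box-constrained QP,
    # solved with a generic solver rather than the paper's revised simplex method.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Toy data: two Gaussian blobs with labels in {-1, +1} (assumed for the sketch).
    X = np.vstack([rng.normal(-1.0, 0.5, size=(20, 2)),
                   rng.normal(+1.0, 0.5, size=(20, 2))])
    y = np.concatenate([-np.ones(20), np.ones(20)])

    C = 10.0                    # the training parameter discussed in the abstract
    K = X @ X.T                 # linear kernel Gram matrix
    Q = (y[:, None] * y[None, :]) * K

    def dual_objective(alpha):
        # Negated dual: minimize 0.5 a^T Q a - 1^T a (i.e. maximize the SVM dual)
        return 0.5 * alpha @ Q @ alpha - alpha.sum()

    def dual_gradient(alpha):
        return Q @ alpha - 1.0

    result = minimize(
        dual_objective,
        x0=np.zeros(len(y)),
        jac=dual_gradient,
        bounds=[(0.0, C)] * len(y),                            # 0 <= alpha_i <= C
        constraints=[{"type": "eq", "fun": lambda a: a @ y}],  # sum_i alpha_i y_i = 0
        method="SLSQP",
    )

    alpha = result.x
    w = (alpha * y) @ X         # primal weight vector recovered from the dual
    support = alpha > 1e-6
    # Bias from support vectors strictly inside the box (0 < alpha_i < C), if any.
    inside = support & (alpha < C - 1e-6)
    sel = inside if inside.any() else support
    b = np.mean(y[sel] - X[sel] @ w)
    print("support vectors:", support.sum(), "w =", w, "b =", round(b, 3))

    A dedicated active set or revised simplex solver exploits the structure of these box constraints directly, which is what the paper pursues; the generic solver above is only meant to make the underlying quadratic program concrete.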