
    Fuzzy Least Squares Twin Support Vector Machines

    Least Squares Twin Support Vector Machine (LST-SVM) has been shown to be an efficient and fast algorithm for binary classification. It combines the operating principles of Least Squares SVM (LS-SVM) and Twin SVM (T-SVM): it constructs two non-parallel hyperplanes (as in T-SVM) by solving two systems of linear equations (as in LS-SVM). Despite its efficiency, LST-SVM is still unable to cope with two features of real-world problems. First, in many real-world applications, labels of samples are not deterministic; they come naturally with associated membership degrees. Second, samples in real-world applications may not be equally important, and their importance degrees affect the classification. In this paper, we propose Fuzzy LST-SVM (FLST-SVM) to deal with these two characteristics of real-world data. Two models are introduced for FLST-SVM: the first builds crisp hyperplanes using the training samples and their corresponding membership degrees, while the second constructs fuzzy hyperplanes using the training samples and their membership degrees. Numerical evaluation of the proposed method on synthetic and real datasets demonstrates a significant improvement in the classification accuracy of FLST-SVM compared to well-known existing versions of SVM.
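
    To make the construction concrete, the sketch below fits a fuzzy-weighted linear LST-SVM, assuming that the membership degrees enter as diagonal weights on the two squared-error terms (one natural reading of the paper's first, crisp-hyperplane model); the function names are illustrative, not taken from the paper.

    import numpy as np

    def fuzzy_lstsvm_fit(A, B, s_A, s_B, c1=1.0, c2=1.0):
        """Fit two non-parallel hyperplanes for a fuzzy-weighted linear LST-SVM.
        A, B: row-wise samples of the two classes; s_A, s_B: membership degrees
        in (0, 1] used as sample weights; c1, c2: trade-off parameters.
        Illustrative sketch, not necessarily the paper's exact formulation."""
        e1 = np.ones((A.shape[0], 1))
        e2 = np.ones((B.shape[0], 1))
        H = np.hstack([A, e1])            # augmented matrix [A  e] of class +1
        G = np.hstack([B, e2])            # augmented matrix [B  e] of class -1
        SA = np.diag(s_A)                 # membership weights of class +1
        SB = np.diag(s_B)                 # membership weights of class -1

        # Hyperplane 1: close to class +1, pushed away from class -1.
        # Its weighted least-squares problem reduces to one linear system.
        M1 = H.T @ SA @ H + c1 * (G.T @ SB @ G)
        u1 = np.linalg.solve(M1, -c1 * (G.T @ SB @ e2))

        # Hyperplane 2: close to class -1, pushed away from class +1.
        M2 = G.T @ SB @ G + c2 * (H.T @ SA @ H)
        u2 = np.linalg.solve(M2, c2 * (H.T @ SA @ e1))

        return (u1[:-1], u1[-1]), (u2[:-1], u2[-1])   # (w1, b1), (w2, b2)

    def fuzzy_lstsvm_predict(X, planes):
        """Assign each row of X to the class whose hyperplane is nearer."""
        (w1, b1), (w2, b2) = planes
        d1 = np.abs(X @ w1 + b1).ravel() / np.linalg.norm(w1)
        d2 = np.abs(X @ w2 + b2).ravel() / np.linalg.norm(w2)
        return np.where(d1 <= d2, 1, -1)

    Each hyperplane comes from a single weighted linear system, which is what keeps training fast; at test time a sample is assigned to the class whose hyperplane is nearer.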

    A fuzzy based Lagrangian twin parametric-margin support vector machine (FLTPMSVM)

    © 2017 IEEE. In the spirit of the twin parametric-margin support vector machine (TPMSVM) and the support vector machine based on fuzzy membership values (FSVM), a new method termed the fuzzy-based Lagrangian twin parametric-margin support vector machine (FLTPMSVM) is proposed in this paper to reduce the effect of outliers. In FLTPMSVM, we assign a weight to each data sample on the basis of its fuzzy membership value, so that outliers have less influence. We also take the square of the 2-norm of the slack variables to make the objective function strongly convex, and we obtain the solution of FLTPMSVM via simple, linearly convergent iterative schemes instead of solving a pair of quadratic programming problems as in SVM, TWSVM, FTSVM and TPMSVM. No external optimization toolbox is required for FLTPMSVM. Numerical experiments on artificial as well as well-known real-world datasets show that the proposed FLTPMSVM achieves better generalization performance and lower training cost than the support vector machine, the twin support vector machine, the fuzzy twin support vector machine and the twin parametric-margin support vector machine.
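
    Two ingredients of this recipe can be sketched without reproducing the paper's exact matrices: a distance-to-class-centroid membership rule of the kind commonly used in fuzzy SVMs to downweight outliers, and an LSVM-style linearly convergent fixed-point iteration for a nonnegatively constrained, strictly convex quadratic program. Both are stated as assumptions; the dual matrices Q and r of FLTPMSVM itself are not constructed here, and the function names are hypothetical.

    import numpy as np

    def centroid_fuzzy_membership(X, delta=1e-6):
        """Fuzzy memberships for the samples of one class: points far from the
        class centroid (likely outliers) receive smaller weights.  This is the
        classic FSVM-style rule; the paper's membership function may differ."""
        centroid = X.mean(axis=0)
        dist = np.linalg.norm(X - centroid, axis=1)
        radius = dist.max() + delta           # class radius plus a small offset
        return 1.0 - dist / radius            # memberships in (0, 1]

    def lagrangian_qp_iteration(Q, r, gamma=None, tol=1e-8, max_iter=10000):
        """Linearly convergent fixed-point scheme for the nonnegatively
        constrained, strictly convex QP  min_a 0.5*a'Qa - r'a  s.t. a >= 0,
        i.e. the kind of simple iteration used in place of a QP solver."""
        if gamma is None:
            # Contraction holds for 0 < gamma < 2 * smallest eigenvalue of Q.
            gamma = 1.9 * np.linalg.eigvalsh(Q).min()
        Q_inv = np.linalg.inv(Q)
        a = np.zeros_like(r, dtype=float)
        for _ in range(max_iter):
            grad = Q @ a - r
            a_new = Q_inv @ (r + np.maximum(grad - gamma * a, 0.0))
            if np.linalg.norm(a_new - a) < tol:
                return a_new
            a = a_new
        return a

    The fixed point of the iteration satisfies the KKT conditions of the QP, so no quadratic-programming toolbox is needed, which is the practical point the abstract emphasizes.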

    Nonparallel support vector machines for pattern classification

    We propose a novel nonparallel classifier, called the nonparallel support vector machine (NPSVM), for binary classification. Our NPSVM is fully different from existing nonparallel classifiers, such as the generalized eigenvalue proximal support vector machine (GEPSVM) and the twin support vector machine (TWSVM), and has several distinct advantages: 1) its two primal problems implement the structural risk minimization principle; 2) the dual problems of these two primal problems have the same advantages as those of standard SVMs, so the kernel trick can be applied directly, whereas existing TWSVMs have to construct two further primal problems for the nonlinear case based on approximate kernel-generated surfaces, and their nonlinear problems do not degenerate to the linear case even when the linear kernel is used; 3) the dual problems have the same elegant formulation as those of standard SVMs and can be solved efficiently by a sequential minimal optimization (SMO) algorithm, whereas existing GEPSVMs and TWSVMs are not suitable for large-scale problems; 4) NPSVM has the same inherent sparseness as standard SVMs; 5) existing TWSVMs are special cases of NPSVM when its parameters are chosen appropriately. Experimental results on a wide range of datasets show the effectiveness of our method in both sparseness and classification accuracy, further confirming these conclusions. In some sense, NPSVM is a new starting point for nonparallel classifiers.
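
    The flavour of the NPSVM primal can be conveyed with a small linear-case sketch: points of the plane's own class are kept inside an epsilon-insensitive tube around the hyperplane (the source of the SVM-like sparseness), while points of the other class are pushed beyond the unit margin on the far side. The sketch below minimizes that composite objective by plain subgradient descent purely for illustration; the paper works with the dual and an SMO-type solver, and the function name and parameters here are assumptions.

    import numpy as np

    def npsvm_plane(A, B, C1=1.0, C2=1.0, eps=0.1, lr=0.01, n_iter=2000):
        """One NPSVM-style primal problem (linear case), solved by subgradient
        descent on  0.5*||w||^2
                    + C1 * sum_i max(|w.a_i + b| - eps, 0)    (own class A)
                    + C2 * sum_j max(0, 1 + (w.b_j + b))      (other class B).
        Illustrative only; the paper solves the dual with an SMO-type method."""
        n_features = A.shape[1]
        w, b = np.zeros(n_features), 0.0
        for _ in range(n_iter):
            fa = A @ w + b                      # scores of the plane's own class
            fb = B @ w + b                      # scores of the other class
            grad_w, grad_b = w.copy(), 0.0      # gradient of the 0.5*||w||^2 term
            # Epsilon-insensitive loss on class A (active outside the tube).
            active_a = np.abs(fa) > eps
            sgn = np.sign(fa[active_a])
            grad_w += C1 * (A[active_a].T @ sgn)
            grad_b += C1 * sgn.sum()
            # Hinge loss on class B (wants w.x + b <= -1).
            active_b = fb > -1.0
            grad_w += C2 * B[active_b].sum(axis=0)
            grad_b += C2 * active_b.sum()
            w -= lr * grad_w
            b -= lr * grad_b
        return w, b

    The second hyperplane is obtained by calling the same routine with the two classes swapped; a test point is then assigned to the class whose hyperplane is nearer, as in other nonparallel classifiers.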