
    Extending twin support vector machine classifier for multi-category classification problems

    © 2013 – IOS Press and the authors. All rights reserved. The twin support vector machine classifier (TWSVM), proposed by Jayadeva et al., was designed for binary classification problems. TWSVM not only overcomes the difficulty of handling exemplar imbalance in binary classification, but is also roughly four times faster to train than the classical support vector machine. This paper proposes one-versus-all twin support vector machine classifiers (OVA-TWSVM) for multi-category classification problems by exploiting the strengths of TWSVM. OVA-TWSVM extends TWSVM to k-category classification by constructing k TWSVMs: the ith TWSVM solves the Quadratic Programming Problems (QPPs) for the ith class only, yielding the ith nonparallel hyperplane corresponding to the ith class data. OVA-TWSVM thus uses the well-known one-versus-all (OVA) approach to build the corresponding twin support vector machine classifier. We analyze the efficiency of OVA-TWSVM theoretically, and perform experiments on both synthetic data sets and several benchmark data sets from the UCI machine learning repository. Both the theoretical analysis and the experimental results demonstrate that OVA-TWSVM can outperform the traditional OVA-SVM classifier. Further experimental comparisons with other multiclass classifiers show that comparable performance is achieved. This work is supported in part by the Fundamental Research Funds for the Central Universities of PR China under grant GK201102007, by the Natural Science Basic Research Plan in Shaanxi Province of China (Program No. 2010JM3004), by the Chinese Academy of Sciences under the Innovative Group Overseas Partnership Grant, and by the Natural Science Foundation of China Major International Joint Research Project (No. 71110107026).
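    The one-versus-all scheme described above can be sketched in a few lines of Python. Here a regularized least-squares scorer stands in for each per-class TWSVM QPP (a hypothetical simplification; function names are illustrative), so only the OVA wiring, training k scorers and predicting by the highest score, follows the abstract:

```python
import numpy as np

def fit_ova(X, y, n_classes, reg=1e-3):
    """One-versus-all training: one linear scorer per class.

    A regularized least-squares fit is a stand-in for the per-class
    TWSVM QPP described in the abstract; the OVA wiring is the point.
    """
    Xb = np.hstack([X, np.ones((len(X), 1))])      # append bias column
    W = np.zeros((n_classes, Xb.shape[1]))
    A = Xb.T @ Xb + reg * np.eye(Xb.shape[1])
    for i in range(n_classes):
        t = np.where(y == i, 1.0, -1.0)            # class i vs. the rest
        W[i] = np.linalg.solve(A, Xb.T @ t)
    return W

def predict_ova(W, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.argmax(Xb @ W.T, axis=1)             # highest score wins
```

    The same wrapper works unchanged if the least-squares scorer is swapped for any classifier that returns a real-valued score per class.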

    Nonparallel support vector machines for pattern classification

    We propose a novel nonparallel classifier, called the nonparallel support vector machine (NPSVM), for binary classification. NPSVM is fundamentally different from existing nonparallel classifiers, such as the generalized eigenvalue proximal support vector machine (GEPSVM) and the twin support vector machine (TWSVM), and has several distinct advantages: 1) its two primal problems implement the structural risk minimization principle; 2) the dual problems of these two primal problems have the same advantages as those of the standard SVM, so the kernel trick can be applied directly, whereas existing TWSVMs have to construct two further primal problems for the nonlinear case based on approximate kernel-generated surfaces, and their nonlinear problems do not degenerate to the linear case even when the linear kernel is used; 3) the dual problems have the same elegant formulation as those of the standard SVM and can therefore be solved efficiently by the sequential minimal optimization (SMO) algorithm, whereas existing GEPSVMs and TWSVMs are not suitable for large-scale problems; 4) it has the inherent sparseness of the standard SVM; 5) existing TWSVMs are special cases of NPSVM when its parameters are appropriately chosen. Experimental results on many data sets show the effectiveness of our method in both sparseness and classification accuracy, and thus further confirm these conclusions. In some sense, NPSVM is a new starting point for nonparallel classifiers.
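    The nonparallel-hyperplane idea that NPSVM and TWSVM share can be illustrated with the least-squares variant of TWSVM, which replaces the two QPPs with two linear solves. This is a simplification for illustration, not the NPSVM formulation itself, and the function names are assumptions:

```python
import numpy as np

def fit_lstwsvm(A, B, c=1.0, reg=1e-6):
    """Least-squares twin SVM sketch: two nonparallel hyperplanes.

    Plane 1 lies close to class A and roughly at unit distance from
    class B; plane 2 does the reverse. Each plane comes from one
    regularized linear system instead of a QPP.
    """
    E = np.hstack([A, np.ones((len(A), 1))])   # augmented class-A matrix [A e]
    F = np.hstack([B, np.ones((len(B), 1))])   # augmented class-B matrix [B e]
    I = np.eye(E.shape[1])
    # minimize 0.5*||E z1||^2 + 0.5*c*||F z1 + e||^2  ->  linear solve
    z1 = -c * np.linalg.solve(E.T @ E + c * F.T @ F + reg * I,
                              F.T @ np.ones(len(B)))
    # minimize 0.5*||F z2||^2 + 0.5*c*||E z2 - e||^2  ->  linear solve
    z2 = c * np.linalg.solve(F.T @ F + c * E.T @ E + reg * I,
                             E.T @ np.ones(len(A)))
    return z1, z2                              # each is [w; b]

def predict_twsvm(z1, z2, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    d1 = np.abs(Xb @ z1) / np.linalg.norm(z1[:-1])   # distance to plane 1
    d2 = np.abs(Xb @ z2) / np.linalg.norm(z2[:-1])   # distance to plane 2
    return np.where(d1 <= d2, 1, -1)           # the nearer plane decides
```

    The decision rule, assigning a point to the class whose hyperplane is nearer, is what distinguishes this family from single-hyperplane SVMs.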

    Machine Learning based Early Stage Identification of Liver Tumor using Ultrasound Images

    Liver cancer is one of the most malignant diseases, and its diagnosis is computationally expensive; this cost can be reduced by applying a machine learning algorithm to the diagnosis. Existing machine learning techniques use only color-based methods to classify images, which is not efficient, so texture-based classification is proposed for diagnosis. The input image is resized and pre-processed with Gaussian filters. Features are extracted by applying the Gray Level Co-occurrence Matrix (GLCM) and the Local Binary Pattern (LBP) to the pre-processed image. LBP is an efficient texture operator that labels the pixels of an image by thresholding the neighborhood of each pixel and treating the result as a binary number. The extracted features are classified by multi-class support vector machine (multi-SVM) and K-Nearest Neighbor (K-NN) algorithms. The advantage of combining SVM with KNN is that SVM handles a large number of feature values, whereas KNN accurately evaluates point values. The proposed techniques achieved higher precision, accuracy, sensitivity, and specificity than the existing method.
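    The LBP operator described above is straightforward to sketch in NumPy. This is a basic 8-neighbour, radius-1 version that thresholds each pixel's neighbourhood against the centre and packs the comparisons into a binary code; the function names are illustrative:

```python
import numpy as np

def lbp_image(img):
    """Basic 8-neighbour Local Binary Pattern.

    Each interior pixel is replaced by an 8-bit code: neighbours
    greater than or equal to the centre contribute their bit.
    """
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    c = img[1:-1, 1:-1]                        # interior (centre) pixels
    code = np.zeros_like(c, dtype=np.int64)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]  # shifted neighbours
        code |= (nb >= c).astype(np.int64) << bit
    return code

def lbp_histogram(img, bins=256):
    """Normalised LBP-code histogram, usable as a texture feature vector."""
    h = np.bincount(lbp_image(img).ravel(), minlength=bins)
    return h / h.sum()
```

    The resulting 256-bin histogram is the texture descriptor that would then be concatenated with GLCM statistics and fed to the classifiers.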

    Study on support vector machine as a classifier

    SVM [1], [2] is a learning method that treats data points as vectors in a feature space. We studied different types of Support Vector Machine (SVM) and examined their classification processes. We conducted 10-fold cross-validation experiments on LSSVM [7], [8] (Least Squares Support Vector Machine) and PSVM [9] (Proximal Support Vector Machine) using standard data sets. Finally, we propose a new algorithm, NPSVM (Non-Parallel Support Vector Machine), which is reformulated from the NPPC [12], [13] (Non-Parallel Plane Classifier). We observed that the cost function of the NPPC is affected by the additional constraint used for Euclidean distance classification, so we implicitly normalized the weight vectors instead of imposing that constraint, which yields a better-behaved cost function. The computational complexity of NPSVM is evaluated for both linear and non-linear kernels, and its 10-fold test results on standard data sets are compared with those of LSSVM and PSVM.
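    For reference, the LSSVM mentioned above admits a compact sketch: replacing the classical SVM's inequality constraints with equalities turns training into a single linear system rather than a QP. A linear kernel is assumed here and the names are illustrative:

```python
import numpy as np

def fit_lssvm(X, y, gamma=10.0):
    """Least Squares SVM sketch with a linear kernel.

    Training solves one bordered linear system:
        [ 0   1^T         ] [b]       [0]
        [ 1   K + I/gamma ] [alpha] = [y]
    instead of a quadratic program.
    """
    n = len(X)
    K = X @ X.T                                    # linear kernel matrix
    top = np.hstack([[0.0], np.ones(n)])
    bottom = np.hstack([np.ones((n, 1)), K + np.eye(n) / gamma])
    M = np.vstack([top, bottom])
    sol = np.linalg.solve(M, np.hstack([[0.0], y]))
    return sol[0], sol[1:]                         # bias b, coefficients alpha

def predict_lssvm(b, alpha, Xtrain, X):
    return np.sign(X @ Xtrain.T @ alpha + b)       # f(x) = sum_i a_i k(x_i, x) + b
```

    The equality constraints are what make LSSVM fast to train, at the cost of losing the sparseness of the standard SVM, which is one of the trade-offs compared in the study.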

    Study on proximal support vector machine as a classifier

    Proximal Support Vector Machine classifiers based on the Least Mean Square algorithm (LMS-SVM) are tools for the classification of binary data. They are based entirely on the theory of Proximal Support Vector Machine (PSVM) classifiers. PSVM classifies binary patterns by assigning them to the closer of two parallel planes that are pushed apart as far as possible. Its training time is shorter than that of earlier Support Vector Machines, but, owing to the presence of slack variables (the error vector), the classification accuracy of the Proximal Support Vector Machine is lower. We therefore update the adjustable weight vectors during the training phase so that all data points fall outside the region of separation and on the correct side of the hyperplane, while enlarging the width of the separable region. To implement this idea, the Least Mean Square (LMS) algorithm is used to modify the adjustable weight vectors. Here, the error is the minimum distance from the margin of the region of separation for data points that fall inside that region or are misclassified, and the distance from the separating hyperplane for data points that fall on its wrong side. This error is minimized by modifying the adjustable weight vectors. As the number of LMS iterations increases, the weight vector performs a random walk (Brownian motion) about the maximal-margin optimal hyperplane that minimizes the error. Experimental results show that the proposed method classifies binary patterns more accurately than classical Proximal Support Vector Machine classifiers.
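    The baseline PSVM that LMS-SVM builds on has a simple closed form, sketched below under the Fung–Mangasarian linear formulation (the names and parameter `nu` are illustrative):

```python
import numpy as np

def fit_psvm(X, y, nu=1.0):
    """Proximal SVM sketch: classify by proximity to two parallel planes.

    With E = [X  -e] and z = [w; gamma], PSVM reduces to the
    regularized least-squares solve
        z = (E'E + I/nu)^{-1} E' y,
    i.e. a single linear system instead of a QP.
    """
    E = np.hstack([X, -np.ones((len(X), 1))])
    z = np.linalg.solve(E.T @ E + np.eye(E.shape[1]) / nu, E.T @ y)
    return z[:-1], z[-1]                       # weights w, offset gamma

def predict_psvm(w, gamma, X):
    return np.sign(X @ w - gamma)              # side of the separating plane
```

    The closed-form solve is why PSVM trains quickly; the LMS-SVM idea in the abstract then iteratively adjusts these weights to push points out of the region of separation.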