VITALAS at TRECVID-2008
In this paper, we present our experiments on the high-level feature extraction task at TRECVID 2008. This is the first year of our participation in TRECVID, and our system adopts several popular approaches proposed by other groups in previous years. We propose two new low-level features: a Gabor texture descriptor and a Compact-SIFT codeword histogram. Our system uses the well-known LIBSVM package to train the SVM base classifiers. In the fusion step, several methods are employed, including voting, SVM-based fusion, HCRF, and Bootstrap Average AdaBoost (BAAB).
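As a rough illustration of the base-classifier-plus-fusion pipeline described above, here is a minimal Python sketch using scikit-learn's SVC (which wraps LIBSVM) and simple score-averaging fusion. The feature arrays and all names are placeholders, not the authors' actual descriptors or fusion code.

```python
# Sketch: one probabilistic SVM per low-level feature, fused by averaging
# the per-classifier concept scores. Data is synthetic, for shape only.
import numpy as np
from sklearn.svm import SVC

def train_base_classifiers(feature_sets, labels):
    """Train one probabilistic SVM per low-level feature (e.g. Gabor, SIFT)."""
    models = []
    for X in feature_sets:
        clf = SVC(kernel="rbf", probability=True)  # LIBSVM backend
        clf.fit(X, labels)
        models.append(clf)
    return models

def vote_fusion(models, feature_sets_test):
    """Late fusion: average the per-classifier concept probabilities."""
    scores = [m.predict_proba(X)[:, 1] for m, X in zip(models, feature_sets_test)]
    return np.mean(scores, axis=0)  # fused confidence per shot

# Usage with synthetic stand-ins for two descriptors
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 100)
gabor = rng.normal(size=(100, 48))       # stand-in for the Gabor descriptor
sift_hist = rng.normal(size=(100, 500))  # stand-in for the Compact-SIFT histogram
models = train_base_classifiers([gabor, sift_hist], y)
fused = vote_fusion(models, [gabor, sift_hist])
```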
Benchmarking least squares support vector machine classifiers
In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of equations in the dual space. While the SVM classifier has a large margin interpretation, the LS-SVM formulation is related in this paper to a ridge regression approach for classification with binary targets and to Fisher's linear discriminant analysis in the feature space. Multiclass categorization problems are represented by a set of binary classifiers using different output coding schemes. While regularization is used to control the effective number of parameters of the LS-SVM classifier, the sparseness property of SVMs is lost due to the choice of the 2-norm. Sparseness can be imposed in a second stage by gradually pruning the support value spectrum and optimizing the hyperparameters during the sparse approximation procedure. In this paper, twenty public domain benchmark datasets are used to evaluate the test set performance of LS-SVM classifiers with linear, polynomial and radial basis function (RBF) kernels. Both the SVM and LS-SVM classifier with RBF kernel in combination with standard cross-validation procedures for hyperparameter selection achieve comparable test set performances. These SVM and LS-SVM performances are consistently very good when compared to a variety of methods described in the literature including decision tree based algorithms, statistical algorithms and instance based learning methods. We show on ten UCI datasets that the LS-SVM sparse approximation procedure can be successfully applied.
Keywords: least squares support vector machines; multiclass support vector machines; sparse approximation; discriminant analysis; learning algorithms; classification; kernels
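The key computational point, that the least-squares cost reduces the dual problem to a single linear system rather than a QP, can be made concrete in a few lines. The following is a minimal NumPy sketch of the standard LS-SVM classifier with an RBF kernel; variable names and hyperparameter values are illustrative, not the paper's experimental setup.

```python
# Minimal LS-SVM classifier sketch: the least-squares cost turns the dual
# problem into one linear system, [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
# with Omega_kl = y_k * y_l * K(x_k, x_l).
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual linear system for bias b and support values alpha."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # b, alpha

def lssvm_predict(X_train, y, alpha, b, X_test, sigma=1.0):
    """Classify: sign( sum_k alpha_k y_k K(x, x_k) + b )."""
    return np.sign(rbf_kernel(X_test, X_train, sigma) @ (alpha * y) + b)

# Usage on toy two-class data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (30, 2)), rng.normal(1, 1, (30, 2))])
y = np.concatenate([-np.ones(30), np.ones(30)])
b, alpha = lssvm_train(X, y)
pred = lssvm_predict(X, y, alpha, b, X)
```

Sparseness can then be imposed, as the abstract describes, by repeatedly dropping the training points with the smallest |alpha| and re-solving the system.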
An investigation of genetic algorithm-based feature selection techniques applied to keystroke dynamics biometrics
With the continuous use of social networks, users can be exposed to online threats such as paedophilia. One way to investigate an alleged paedophile is to verify the legitimacy of the gender they claim. One possible technique for this is keystroke dynamics analysis. However, this technique can extract many attributes, and the presence of redundant and irrelevant attributes has a negative impact on classifier accuracy. This work therefore applies a wrapper approach to feature selection, using genetic algorithms together with KNN, SVM and Naive Bayes classifiers; a sketch of this pipeline follows below. The best result was obtained with the SVM classifier, which reached 90% accuracy and proved the most suitable choice for both datasets.
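As a concrete illustration of the wrapper approach, here is a minimal sketch of GA-based feature selection: chromosomes are feature bit-masks and fitness is the cross-validated accuracy of the wrapped classifier. The dataset, classifier settings and GA parameters below are illustrative stand-ins, not those of the study.

```python
# Sketch: genetic-algorithm wrapper feature selection. Each individual is a
# boolean mask over features; fitness is 3-fold CV accuracy on the masked data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)

def fitness(mask, X, y, clf):
    if not mask.any():
        return 0.0
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

def ga_select(X, y, clf, pop_size=20, generations=30, p_mut=0.05):
    n_feat = X.shape[1]
    pop = rng.random((pop_size, n_feat)) < 0.5  # random initial masks
    for _ in range(generations):
        fits = np.array([fitness(ind, X, y, clf) for ind in pop])
        # Tournament selection: pick the fitter of two random individuals
        parents = pop[[max(rng.choice(pop_size, 2), key=lambda i: fits[i])
                       for _ in range(pop_size)]]
        # One-point crossover between consecutive parent pairs
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            cut = rng.integers(1, n_feat)
            children[i, cut:], children[i + 1, cut:] = \
                parents[i + 1, cut:].copy(), parents[i, cut:].copy()
        # Bit-flip mutation
        children ^= rng.random(children.shape) < p_mut
        pop = children
    fits = np.array([fitness(ind, X, y, clf) for ind in pop])
    return pop[fits.argmax()]

# Usage with synthetic keystroke-like timing features
X = rng.normal(size=(120, 30))
y = rng.integers(0, 2, 120)
best_mask = ga_select(X, y, KNeighborsClassifier(n_neighbors=3))
print(best_mask.sum(), "features selected")
```

The same loop works with an SVM or Naive Bayes classifier by swapping the `clf` argument, which is exactly what makes the wrapper approach classifier-agnostic.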
Decision Boundaries and Classification Performance of SVM and KNN Classifiers for 2-Dimensional Datasets
Support Vector Machines (SVM) and k-Nearest Neighbours (k-NN) are two of the most popular classifiers in machine learning. In this paper, we study the generalization performance of the two classifiers by visualizing the decision boundary of each when applied to a two-dimensional (2-D) dataset. Four different 2-D datasets were used in this study: the eigenpostures of humans (EPHuman), breast cancer (BCancer), the Swiss roll (SRoll) and Twinpeaks (Tpeaks). The results confirm the superior generalization performance of the SVM classifier, which produced a lower classification error rate than the k-NN classifier during training for binary classification on all four 2-D datasets. This is evident in, and can be clearly visualized through, the plots depicting the decision boundaries of the binary classification task.
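The visualization approach lends itself to a short sketch: train both classifiers on a 2-D dataset, evaluate them on a dense grid, and draw the resulting decision regions. The two-moons data below is a stand-in for the paper's four datasets.

```python
# Sketch: SVM vs k-NN decision boundaries on a 2-D dataset, drawn by
# classifying every point of a dense grid and shading the regions.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_moons
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.25, random_state=0)
xx, yy = np.meshgrid(np.linspace(X[:, 0].min() - .5, X[:, 0].max() + .5, 300),
                     np.linspace(X[:, 1].min() - .5, X[:, 1].max() + .5, 300))
grid = np.c_[xx.ravel(), yy.ravel()]

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, clf, name in zip(axes,
                         [SVC(kernel="rbf"), KNeighborsClassifier(5)],
                         ["SVM (RBF)", "k-NN (k=5)"]):
    clf.fit(X, y)
    Z = clf.predict(grid).reshape(xx.shape)
    ax.contourf(xx, yy, Z, alpha=0.3)         # decision regions
    ax.scatter(X[:, 0], X[:, 1], c=y, s=10)   # training points
    err = 1 - clf.score(X, y)                 # training error rate
    ax.set_title(f"{name}, train error {err:.2%}")
plt.show()
```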
