5 research outputs found

    Solving SVM model selection problem using ACOR and IACOR

    Ant Colony Optimization (ACO) has been used to solve the Support Vector Machine (SVM) model selection problem. ACO was originally designed for discrete optimization problems. In applying ACO to optimize SVM parameters, which are continuous variables, the continuous values must first be discretized. This discretization results in a loss of information and hence affects classification accuracy. To enhance SVM performance and avoid the discretization problem, this study proposes two algorithms that optimize SVM parameters using Continuous ACO (ACOR) and Incremental Continuous Ant Colony Optimization (IACOR) without the need to discretize the continuous SVM parameter values. Eight datasets from UCI were used to evaluate the credibility of the proposed integrated algorithms in terms of classification accuracy and feature subset size. Promising results were obtained when compared with the grid search technique, GA with feature chromosome-SVM, PSO-SVM, and GA-SVM. Results also show that IACOR-SVM outperforms ACOR-SVM in terms of classification accuracy.
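    The abstract does not give implementation details, so the following is only a minimal sketch of the general ACOR mechanism it refers to (a solution archive with rank-weighted Gaussian sampling) applied to tuning SVM's C and gamma by cross-validation. The dataset, parameter ranges, archive size and the ACOR constants q and xi below are assumptions made for illustration, not values from the papers.

    # Illustrative sketch (not the authors' code) of ACOR-style continuous
    # optimisation applied to SVM hyper-parameter tuning.
    import numpy as np
    from sklearn.datasets import load_breast_cancer      # stand-in for a UCI dataset
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X, y = load_breast_cancer(return_X_y=True)

    # Search space: log2(C) and log2(gamma), kept continuous (no discretisation).
    LOW, HIGH = np.array([-5.0, -15.0]), np.array([15.0, 3.0])

    def fitness(sol):
        # Mean 5-fold cross-validation accuracy for a (log2 C, log2 gamma) pair.
        C, gamma = 2.0 ** sol
        clf = make_pipeline(StandardScaler(), SVC(C=C, gamma=gamma))
        return cross_val_score(clf, X, y, cv=5).mean()

    K, M, Q, XI, ITERS = 10, 5, 0.1, 0.85, 15   # archive size, ants, ACOR constants

    # Initialise the solution archive with random points and evaluate them.
    archive = rng.uniform(LOW, HIGH, size=(K, 2))
    scores = np.array([fitness(s) for s in archive])

    for _ in range(ITERS):
        order = np.argsort(-scores)              # best solution first
        archive, scores = archive[order], scores[order]

        # Rank-based weights (Gaussian kernel over ranks), as in ACOR.
        ranks = np.arange(1, K + 1)
        w = np.exp(-((ranks - 1) ** 2) / (2 * (Q * K) ** 2))
        p = w / w.sum()

        new_sols = []
        for _ant in range(M):
            l = rng.choice(K, p=p)               # pick a guiding archive solution
            # Per-dimension std dev: average distance to the other archive members.
            sigma = XI * np.abs(archive - archive[l]).sum(axis=0) / (K - 1)
            new_sols.append(np.clip(rng.normal(archive[l], sigma), LOW, HIGH))

        new_scores = np.array([fitness(s) for s in new_sols])

        # Archive update: merge old and new solutions, keep the best K.
        archive = np.vstack([archive, new_sols])
        scores = np.concatenate([scores, new_scores])
        keep = np.argsort(-scores)[:K]
        archive, scores = archive[keep], scores[keep]

    best = archive[np.argmax(scores)]
    print(f"best log2(C)={best[0]:.2f}, log2(gamma)={best[1]:.2f}, "
          f"CV accuracy={scores.max():.4f}")

    Because the (log2 C, log2 gamma) candidates stay continuous throughout, no discretization step is required, which is the advantage the abstract claims over grid-style tuning.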

    Incremental continuous ant colony optimization technique for support vector machine model selection problem

    Ant Colony Optimization has been used to solve the Support Vector Machine model selection problem. Ant Colony Optimization was originally designed for discrete optimization problems. In applying Ant Colony Optimization to optimize Support Vector Machine parameters, which are continuous variables, the continuous values must first be discretized. This discretization results in a loss of information and hence affects classification accuracy and search time. This study proposes an algorithm that optimizes Support Vector Machine parameters using Incremental Continuous Ant Colony Optimization without the need to discretize the continuous parameter values. Seven datasets from UCI were used to evaluate the credibility of the proposed hybrid algorithm in terms of classification accuracy. Promising results were obtained when compared with the grid search technique.
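    For contrast, the grid-search baseline these abstracts compare against tunes the same two parameters over a fixed, discretized set of candidate values. The sketch below is only an assumed example of such a baseline; the dataset and the grid itself are illustrative, not taken from the papers.

    from sklearn.datasets import load_breast_cancer      # stand-in for a UCI dataset
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)
    pipe = make_pipeline(StandardScaler(), SVC())

    # Discretized candidate values for C and gamma (an assumed example grid).
    param_grid = {
        "svc__C":     [2.0 ** e for e in range(-5, 16, 2)],
        "svc__gamma": [2.0 ** e for e in range(-15, 4, 2)],
    }
    search = GridSearchCV(pipe, param_grid, cv=5).fit(X, y)
    print(search.best_params_, search.best_score_)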

    Integrated ACOR/IACOMV-R-SVM Algorithm

    A current direction for ACO is to optimize continuous and mixed (discrete and continuous) variables so that problems with various types of data can be solved. Support Vector Machine (SVM), which originates from a statistical approach, is a present-day classification technique. The main problems in applying SVM are selecting the feature subset and tuning the parameters. Discretizing the continuous parameter values is the most common approach to tuning SVM parameters; this process results in a loss of information, which affects classification accuracy. This paper presents two algorithms for tuning SVM parameters and selecting the feature subset. The first algorithm, ACOR-SVM, tunes the SVM parameters, while the second, IACOMV-R-SVM, simultaneously tunes the SVM parameters and selects the feature subset. Three benchmark UCI datasets were used in the experiments to validate the performance of the proposed algorithms. The results show that the proposed algorithms perform well compared with other approaches.
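    As an illustration of what tuning parameters and selecting features simultaneously can mean in practice (this is not the authors' IACOMV-R-SVM), the sketch below scores one mixed candidate solution: two continuous SVM parameters plus a binary feature mask. The dataset, the cross-validation setup and the small subset-size penalty are assumptions made for the example.

    import numpy as np
    from sklearn.datasets import load_breast_cancer      # stand-in for a UCI dataset
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)

    def mixed_fitness(log2_C, log2_gamma, feature_mask, penalty=1e-3):
        # Cross-validation accuracy on the selected features, lightly penalised
        # by the number of features kept, so smaller subsets are preferred.
        mask = np.asarray(feature_mask, dtype=bool)
        if not mask.any():                    # reject candidates with no features
            return 0.0
        clf = make_pipeline(StandardScaler(),
                            SVC(C=2.0 ** log2_C, gamma=2.0 ** log2_gamma))
        acc = cross_val_score(clf, X[:, mask], y, cv=5).mean()
        return acc - penalty * mask.sum()

    # One example candidate: assumed parameter values and a random feature mask.
    rng = np.random.default_rng(0)
    mask = rng.integers(0, 2, size=X.shape[1])
    print(mixed_fitness(3.0, -7.0, mask))

    A mixed-variable ACO algorithm would generate and update such candidates (continuous dimensions for the parameters, discrete dimensions for the mask) and use this single fitness value to drive both decisions at once.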

    Incremental continuous ant colony optimization for tuning support vector machine’s parameters

    Support Vector Machines are considered excellent pattern classification techniques. Classifying a pattern with high accuracy depends mainly on tuning the Support Vector Machine parameters, namely the generalization error parameter and the kernel function parameter. Tuning these parameters is a complex process, and Ant Colony Optimization can be used to overcome this difficulty. Ant Colony Optimization originally deals with discrete optimization problems; hence, in applying Ant Colony Optimization to optimize Support Vector Machine parameters, which are continuous in nature, the values have to be discretized. The discretization process results in a loss of information and, hence, affects classification accuracy and search time. This paper presents an algorithm that optimizes Support Vector Machine parameters using Incremental Continuous Ant Colony Optimization without the need to discretize continuous values. Eight datasets from UCI were used to evaluate the performance of the proposed algorithm. The proposed algorithm demonstrates its credibility in terms of classification accuracy when compared with grid search techniques, GA with feature chromosome-SVM, PSO-SVM, and GA-SVM. Experimental results also show promising performance in terms of classification accuracy and feature subset size.

    Hybrid ACO and SVM algorithm for pattern classification

    Ant Colony Optimization (ACO) is a metaheuristic algorithm that can be used to solve a variety of combinatorial optimization problems. A new direction for ACO is to optimize continuous and mixed (discrete and continuous) variables. Support Vector Machine (SVM) is a pattern classification approach that originated from statistical approaches. However, SVM suffers from two main problems: feature subset selection and parameter tuning. Most approaches to tuning SVM parameters discretize the continuous parameter values, which has a negative effect on classification performance. This study presents four algorithms for tuning the SVM parameters and selecting the feature subset, improving SVM classification accuracy with a smaller feature subset. This is achieved by performing the SVM parameter tuning and feature subset selection processes simultaneously. Hybrid algorithms combining ACO and SVM techniques were proposed. The first two algorithms, ACOR-SVM and IACOR-SVM, tune the SVM parameters, while the second two, ACOMV-R-SVM and IACOMV-R-SVM, tune the SVM parameters and select the feature subset simultaneously. Ten benchmark datasets from the University of California, Irvine, were used in the experiments to validate the performance of the proposed algorithms. Experimental results of the proposed algorithms are better than those of other approaches in terms of classification accuracy and feature subset size. The average classification accuracies for the ACOR-SVM, IACOR-SVM, ACOMV-R and IACOMV-R algorithms are 94.73%, 95.86%, 97.37% and 98.1%, respectively. The average feature subset size is eight for the ACOR-SVM and IACOR-SVM algorithms and four for the ACOMV-R and IACOMV-R algorithms. This study contributes a new direction for ACO that can deal with continuous and mixed-variable optimization.