Feature subset selection in large dimensionality domains

Abstract

Searching for an optimal feature subset from a high dimensional feature space is known to be an NP-complete problem. We present a hybrid algorithm, SAGA, for this task. SAGA combines the ability of Simulated Annealing to avoid becoming trapped in a local minimum, the very high rate of convergence of the crossover operator of Genetic Algorithms, the strong local search ability of greedy algorithms, and the high computational efficiency of Generalized Regression Neural Networks. We compare the performance over time of SAGA and well-known algorithms on synthetic and real datasets. The results show that SAGA outperforms existing algorithms.
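As a rough illustration of how such a hybrid might be organized, the sketch below combines a GA-style crossover, an SA-style acceptance rule with a cooling temperature, and a greedy replacement step over a population of candidate feature subsets. The structure, parameter values, and the evaluate_subset placeholder (standing in for the GRNN-based fitness evaluation described in the abstract) are assumptions made for illustration, not the authors' implementation.

```python
import math
import random

def evaluate_subset(subset, n_features):
    # Placeholder fitness function. In SAGA the abstract describes a
    # Generalized Regression Neural Network as the evaluator; here a toy
    # score keeps the sketch self-contained: reward even-indexed features
    # and penalize subset size (purely illustrative).
    if not subset:
        return 0.0
    relevance = sum(1.0 for f in subset if f % 2 == 0)
    return relevance / len(subset) - 0.01 * len(subset)

def mutate(subset, n_features):
    """Flip one randomly chosen feature in or out of the subset (SA-style move)."""
    candidate = set(subset)
    f = random.randrange(n_features)
    if f in candidate:
        candidate.discard(f)
    else:
        candidate.add(f)
    return candidate

def crossover(parent_a, parent_b, n_features):
    """Uniform crossover on the feature-inclusion mask (GA-style recombination)."""
    child = set()
    for f in range(n_features):
        source = parent_a if random.random() < 0.5 else parent_b
        if f in source:
            child.add(f)
    return child

def hybrid_feature_selection(n_features, pop_size=10, iterations=200,
                             temp=1.0, cooling=0.99):
    # Start from a random population of feature subsets.
    population = [set(random.sample(range(n_features), random.randrange(1, n_features)))
                  for _ in range(pop_size)]
    best = max(population, key=lambda s: evaluate_subset(s, n_features))
    for _ in range(iterations):
        # GA step: recombine two randomly chosen parents.
        a, b = random.sample(population, 2)
        child = crossover(a, b, n_features)
        # SA step: perturb the child and accept a worse candidate with a
        # temperature-dependent probability, to escape local minima.
        candidate = mutate(child, n_features)
        delta = evaluate_subset(candidate, n_features) - evaluate_subset(child, n_features)
        if delta > 0 or random.random() < math.exp(delta / max(temp, 1e-9)):
            child = candidate
        # Greedy step: the child displaces the worst member if it scores higher.
        worst = min(population, key=lambda s: evaluate_subset(s, n_features))
        if evaluate_subset(child, n_features) > evaluate_subset(worst, n_features):
            population.remove(worst)
            population.append(child)
        if evaluate_subset(child, n_features) > evaluate_subset(best, n_features):
            best = child
        temp *= cooling
    return best

if __name__ == "__main__":
    random.seed(0)
    selected = hybrid_feature_selection(n_features=20)
    print("Selected features:", sorted(selected))
```

In a real setting the placeholder score would be replaced by a cross-validated model evaluation (e.g. the GRNN-based criterion the paper uses), which is where the bulk of the computational cost lies.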
