2 research outputs found

    Regularization and Kernel Methods for Classification Algorithms (Regularización y métodos Kernel para algoritmos de clasificación)

    This work results from a study of regularization techniques and kernel methods for classification algorithms. The Regularized Least Squares Classification (RLSC) method requires the solution of a single system of linear equations, which is advantageous in computational terms and makes it a simple method for this important task. This work focuses on automating the selection of both the regularization and the kernel parameters using Generalized Cross-Validation (GCV), for the linear and non-linear versions of RLSC. We evaluate the accuracy of the classifier and compare it against other well-known methods from the literature, obtaining significant results in terms of performance and algorithmic cost.
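    The abstract's two ingredients — RLSC reducing training to one linear system, and GCV choosing the regularization parameter — can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the toy data, the grid of candidate parameters, and the function names are assumptions, and only the regularization parameter (not the kernel parameter) is tuned here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy data: two well-separated Gaussian blobs, labels in {-1, +1}
    X = np.vstack([rng.normal(-2.0, 0.5, (40, 2)), rng.normal(2.0, 0.5, (40, 2))])
    y = np.concatenate([-np.ones(40), np.ones(40)])

    def rlsc_fit(K, y, lam):
        """Fit RLSC by solving the single linear system (K + lam*m*I) c = y."""
        m = K.shape[0]
        return np.linalg.solve(K + lam * m * np.eye(m), y)

    def gcv(K, y, lam):
        """GCV score: m * ||(I - A)y||^2 / tr(I - A)^2, with smoother A = K (K + lam*m*I)^{-1}."""
        m = K.shape[0]
        A = K @ np.linalg.inv(K + lam * m * np.eye(m))
        r = y - A @ y
        return m * (r @ r) / np.trace(np.eye(m) - A) ** 2

    K = X @ X.T                      # linear kernel (linear version of RLSC)
    lams = 10.0 ** np.arange(-6, 1)  # assumed grid of regularization candidates
    best = min(lams, key=lambda l: gcv(K, y, l))  # automatic selection via GCV
    c = rlsc_fit(K, y, best)
    acc = np.mean(np.sign(K @ c) == y)  # training accuracy of the classifier
    print(best, acc)
    ```

    Swapping `K` for a nonlinear kernel matrix gives the non-linear version; GCV could then be minimized jointly over the regularization and kernel parameters.
    
    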

    Efficient Regularized Least Squares Classification

    Kernel-based regularized least squares (RLS) algorithms are a promising technique for classification. RLS minimizes a regularized functional directly in a reproducing kernel Hilbert space defined by a kernel. In contrast, support vector machines (SVMs) implement the structural risk minimization principle and use the kernel trick to extend it to the nonlinear case. While both have a sound mathematical foundation, RLS is strikingly simple. On the other hand, SVMs in general have a sparse representation of the solution. In this paper, we introduce a very fast version of the RLS algorithm while maintaining the achievable level of performance. The proposed new algorithm computes solutions in O(m) time and O(1) space, where m is the number of training points. We demonstrate the efficacy of our very fast RLS algorithm using a number of (both real and simulated) data sets.
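    To make concrete what "minimizes a regularized functional directly in a reproducing kernel Hilbert space" means: by the representer theorem, the minimizer of (1/m) Σ (f(xᵢ) − yᵢ)² + λ‖f‖²_K has the form f(x) = Σᵢ cᵢ k(xᵢ, x), with c obtained from one linear solve. The sketch below shows this standard kernel RLS construction on XOR-style data; it is not the paper's O(m)-time algorithm, and the Gaussian kernel, its width, and λ are assumptions chosen for illustration.

    ```python
    import numpy as np

    def rbf_kernel(X, Z, gamma=1.0):
        """Gaussian (RBF) kernel matrix between the rows of X and Z."""
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def kernel_rls_fit(X, y, lam, gamma):
        """Representer-theorem solution of kernel RLS: c = (K + lam*m*I)^{-1} y."""
        m = len(y)
        K = rbf_kernel(X, X, gamma)
        return np.linalg.solve(K + lam * m * np.eye(m), y)

    # XOR data: not linearly separable, so the kernel trick is essential
    X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
    y = np.array([-1.0, 1.0, 1.0, -1.0])

    c = kernel_rls_fit(X, y, lam=1e-6, gamma=2.0)
    pred = np.sign(rbf_kernel(X, X, 2.0) @ c)  # classify by the sign of f(x)
    print(pred.tolist())
    ```

    The plain solve above costs O(m³) time and O(m²) space for the kernel matrix, which is exactly the bottleneck the paper's O(m)-time, O(1)-space algorithm targets.
    
    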