Fβ support vector machines


We introduce in this paper Fβ SVMs, a new parametrization of support vector machines. It allows an SVM to be optimized in terms of Fβ, a classical information retrieval criterion, instead of the usual classification rate. Experiments illustrate the advantages of this approach over the traditional 2-norm soft-margin SVM when precision and recall are of unequal importance. An automatic model selection procedure based on the generalization Fβ score is introduced. It relies on the results of Chapelle, Vapnik et al. (2002) on the use of gradient-based techniques in SVM model selection. The derivatives of an Fβ loss function with respect to the hyperparameter C and the width σ of a Gaussian kernel are formally defined. The model is then selected by performing a gradient descent of the Fβ loss function over the set of hyperparameters. Experiments on artificial and real-life data show the benefits of this method when the Fβ score is considered.
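The Fβ criterion combines precision and recall into a single score, with β weighting their relative importance. A minimal sketch of the standard formula (the function name and defaults here are illustrative, not taken from the paper):

```python
def f_beta(precision, recall, beta=1.0):
    """F_beta = (1 + beta^2) * P * R / (beta^2 * P + R).

    beta > 1 weights recall more heavily; beta < 1 favors precision;
    beta = 1 gives the usual harmonic-mean F1 score.
    """
    if precision == 0.0 and recall == 0.0:
        return 0.0  # convention: undefined case scored as 0
    b2 = beta * beta
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)
```

With β = 1 and equal precision and recall p, the score reduces to p itself, which is why the classification-rate view and the Fβ view only diverge when precision and recall differ.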

oai:dial.uclouvain.be:boreal:67945 (last updated on 5/14/2016)

This paper was published in DIAL UCLouvain.
