Sparse Support Vector Infinite Push

Abstract

In this paper, we address the problem of embedded feature selection for ranking on top of the list problems. We pose this problem as a regularized empirical risk minimization with a $p$-norm push loss function ($p=\infty$) and sparsity-inducing regularizers. We tackle the issues raised by this challenging optimization problem by means of an alternating direction method of multipliers (ADMM) algorithm built upon the proximal operators of the loss function and the regularizer. Our main technical contribution is thus to provide a numerical scheme for computing the proximal operator of the infinite push loss function. Experimental results on toy, DNA microarray and BCI problems show that our novel algorithm compares favorably to competitors for ranking on top while using fewer variables in the scoring function.

Comment: Appears in Proceedings of the 29th International Conference on Machine Learning (ICML 2012).
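To make the ADMM-with-proximal-operators structure mentioned in the abstract concrete, the sketch below shows a generic proximal ADMM for a sparsity-regularized objective. It is only an illustration under assumptions: a squared-error loss stands in for the infinite push loss (whose proximal operator is the paper's actual contribution), and all function names and parameters are hypothetical, not the authors' code.

```python
# Minimal sketch of the ADMM splitting pattern described in the abstract:
#   minimize  f(w) + lam * ||w||_1   via the consensus splitting w = z.
# A squared-error loss is used here purely so the sketch runs end to end;
# in the paper, the loss proximal step would invoke the infinite push loss prox.
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def admm_sparse(X, y, lam=0.1, rho=1.0, n_iter=200):
    n, d = X.shape
    w = np.zeros(d)   # variable handled by the loss proximal step
    z = np.zeros(d)   # variable handled by the regularizer proximal step
    u = np.zeros(d)   # scaled dual variable
    # Pre-factorize the loss proximal subproblem (closed form for squared loss).
    A = X.T @ X + rho * np.eye(d)
    Xty = X.T @ y
    for _ in range(n_iter):
        # "Loss prox": argmin_w 0.5*||Xw - y||^2 + (rho/2)*||w - z + u||^2.
        w = np.linalg.solve(A, Xty + rho * (z - u))
        # "Regularizer prox": soft-thresholding induces sparsity in the scorer.
        z = soft_threshold(w + u, lam / rho)
        # Dual update on the consensus constraint w = z.
        u = u + w - z
    return z  # sparse weight vector of the linear scoring function

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 20))
    w_true = np.zeros(20); w_true[:3] = [2.0, -1.5, 1.0]
    y = X @ w_true + 0.1 * rng.normal(size=50)
    print(np.round(admm_sparse(X, y, lam=5.0), 2))
```

The point of the sketch is the alternation itself: one proximal step for the loss, one for the sparsity-inducing regularizer, and a dual update tying them together, which is the structure the abstract attributes to the proposed method.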
