Fast projections onto mixed-norm balls with applications
Joint sparsity offers powerful structural cues for feature selection,
especially for variables that are expected to demonstrate a "grouped" behavior.
Such behavior is commonly modeled via group-lasso, multitask lasso, and related
methods where feature selection is effected via mixed-norms. Several mixed-norm
based sparse models have received substantial attention, and for some cases
efficient algorithms are also available. Surprisingly, several constrained
sparse models seem to be lacking scalable algorithms. We address this
deficiency by presenting batch and online (stochastic-gradient) optimization
methods, both of which rely on efficient projections onto mixed-norm balls. We
illustrate our methods by applying them to the multitask lasso. We conclude by
mentioning some open problems.
Comment: Preprint of paper under review
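As a concrete instance of the kind of projection the abstract refers to, the sketch below projects a vector onto an ℓ2,1-norm (group-lasso style) ball. The specific mixed norm and algorithm here are assumptions for illustration, not necessarily the paper's: the projection reduces to an ℓ1-ball projection of the vector of group norms (Duchi et al.'s sort-based scheme), followed by a per-group rescaling.

```python
import numpy as np

def proj_l1_ball(v, r):
    """Euclidean projection of a nonnegative vector v onto the l1 ball of radius r."""
    if v.sum() <= r:
        return v.copy()
    u = np.sort(v)[::-1]                       # sort in decreasing order
    css = np.cumsum(u)
    # Largest index rho with u[rho] * (rho+1) > css[rho] - r (Duchi et al. 2008)
    rho = np.nonzero(u * np.arange(1, len(u) + 1) > (css - r))[0][-1]
    theta = (css[rho] - r) / (rho + 1.0)       # soft-threshold level
    return np.maximum(v - theta, 0.0)

def proj_l21_ball(x, groups, r):
    """Project x onto {y : sum_g ||y_g||_2 <= r}, groups = list of index arrays."""
    norms = np.array([np.linalg.norm(x[g]) for g in groups])
    if norms.sum() <= r:
        return x.copy()
    m = proj_l1_ball(norms, r)                 # shrink the vector of group norms
    y = np.empty_like(x)
    for g, ng, mg in zip(groups, norms, m):
        y[g] = x[g] * (mg / ng) if ng > 0 else 0.0   # rescale each group
    return y
```

The reduction works because, with the group norms fixed, the optimal projection keeps each group's direction; optimizing over the norms alone is exactly an ℓ1-ball projection.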
Top-k Multiclass SVM
Class ambiguity is typical in image classification problems with a large
number of classes. When classes are difficult to discriminate, it makes sense
to allow k guesses and evaluate classifiers based on the top-k error instead of
the standard zero-one loss. We propose top-k multiclass SVM as a direct method
to optimize for top-k performance. Our generalization of the well-known
multiclass SVM is based on a tight convex upper bound of the top-k error. We
propose a fast optimization scheme based on an efficient projection onto the
top-k simplex, which is of independent interest. Experiments on five datasets show
consistent improvements in top-k accuracy compared to various baselines.
Comment: NIPS 2015
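To make the evaluation criterion concrete, the sketch below computes the top-k error the abstract describes: a prediction counts as correct if the true label appears among the k highest-scoring classes. The function name and array layout are illustrative assumptions, not the paper's code.

```python
import numpy as np

def topk_error(scores, labels, k):
    """Top-k error: fraction of samples whose true label is NOT in the
    k highest-scoring classes. scores: (n, C), labels: (n,) class indices."""
    topk = np.argsort(-scores, axis=1)[:, :k]          # indices of k best classes
    hits = np.any(topk == labels[:, None], axis=1)     # true label among them?
    return 1.0 - hits.mean()
```

With k = 1 this reduces to the standard zero-one error; allowing k guesses can only lower it, which is why ambiguous classes benefit from optimizing top-k performance directly.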