Optimizing 0/1 Loss for Perceptrons by Random Coordinate Descent
The 0/1 loss is an important cost function for perceptrons. Nevertheless, it cannot be easily minimized by most existing perceptron learning algorithms. In this paper, we propose a family of random coordinate descent algorithms that directly minimize the 0/1 loss for perceptrons, and prove their convergence. Our algorithms are computationally efficient, and usually achieve the lowest 0/1 loss among competing algorithms. These advantages make them favorable for nonseparable real-world problems. Experiments show that our algorithms are especially useful for ensemble learning, and achieve the lowest test error on many complex data sets when coupled with AdaBoost.
Perceptron learning with random coordinate descent
A perceptron is a linear threshold classifier that separates examples with a hyperplane. It is perhaps the simplest learning model that is used standalone. In this paper, we propose a family of random coordinate descent algorithms for perceptron learning on binary classification problems. Unlike most perceptron learning algorithms, which require smooth cost functions, our algorithms directly minimize the training error, and usually achieve the lowest training error among competing algorithms. The algorithms are also computationally efficient. These advantages make them favorable for both standalone use and ensemble learning on problems that are not linearly separable. Experiments show that our algorithms work very well with AdaBoost, and achieve the lowest test errors on half of the datasets.
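The abstracts above describe random coordinate descent on the 0/1 loss only at a high level. As a rough illustration of the idea, here is a minimal sketch: pick a random direction, then do an exact line search over the step size. Because the 0/1 loss is piecewise constant along any line, only the breakpoints where some example's margin crosses zero need to be checked. The function names and the Gaussian direction-sampling scheme are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def zero_one_loss(w, X, y):
    # Fraction of examples misclassified by sign(X @ w).
    return np.mean(np.sign(X @ w) != y)

def rcd_perceptron(X, y, n_iters=200, seed=0):
    # Sketch of random coordinate/direction descent on the 0/1 loss:
    # sample a random direction, enumerate the step sizes at which any
    # example's margin changes sign, and keep the best candidate found.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    w[0] = 1.0  # arbitrary nonzero starting weight vector
    best = zero_one_loss(w, X, y)
    for _ in range(n_iters):
        direction = rng.standard_normal(d)  # assumed sampling scheme
        num = X @ w
        denom = X @ direction
        # Breakpoints t where (w + t * direction) puts example i exactly
        # on the decision boundary: num[i] + t * denom[i] = 0.
        with np.errstate(divide="ignore", invalid="ignore"):
            ts = -num / denom
        ts = ts[np.isfinite(ts)]
        # Probe just to either side of each breakpoint (and t = 0).
        for t in np.concatenate(([0.0], ts + 1e-8, ts - 1e-8)):
            cand = w + t * direction
            loss = zero_one_loss(cand, X, y)
            if loss < best:  # accept only strict improvements
                best, w = loss, cand
    return w, best
```

Since updates are accepted only when the 0/1 loss strictly decreases, the training error is nonincreasing over iterations, which is the property the abstracts emphasize for nonseparable data.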
New multicategory boosting algorithms based on multicategory Fisher-consistent losses
Fisher-consistent loss functions play a fundamental role in the construction of successful binary margin-based classifiers. In this paper we establish the Fisher-consistency condition for multicategory classification problems. Our approach uses the margin vector concept, which can be regarded as a multicategory generalization of the binary margin. We characterize a wide class of smooth convex loss functions that are Fisher-consistent for multicategory classification. We then consider using the margin-vector-based loss functions to derive multicategory boosting algorithms. In particular, we derive two new multicategory boosting algorithms by using the exponential and logistic regression losses.

Comment: Published at http://dx.doi.org/10.1214/08-AOAS198 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
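The margin-vector framework in the abstract can be made concrete with a short sketch; the exact conditions and losses used in the paper may differ from this common formulation. For a $K$-class problem, the classifier is a vector of margins with a sum-to-zero constraint,

\[
f(x) = \big(f_1(x), \dots, f_K(x)\big), \qquad \sum_{j=1}^{K} f_j(x) = 0,
\]

and one minimizes the expected loss of the true-class margin, $\mathbb{E}\!\left[\phi\big(f_Y(X)\big)\right]$, with, e.g., the exponential loss $\phi(t) = e^{-t}$ or the logistic loss $\phi(t) = \log\big(1 + e^{-t}\big)$. The loss $\phi$ is Fisher-consistent if the population minimizer $f^{*}$ satisfies

\[
\arg\max_{j} f_j^{*}(x) = \arg\max_{j} P(Y = j \mid X = x),
\]

i.e., classifying by the largest margin component recovers the Bayes rule, which is the sense in which the binary margin $y f(x)$ is being generalized.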