Estimating conditional quantiles with the help of the pinball loss
The so-called pinball loss for estimating conditional quantiles is a
well-known tool in both statistics and machine learning. So far, however,
little work has been done to quantify the efficiency of this tool for
nonparametric approaches. We fill this gap by establishing inequalities that
describe how close approximate pinball risk minimizers are to the corresponding
conditional quantile. These inequalities, which hold under mild assumptions on
the data-generating distribution, are then used to establish so-called variance
bounds, which recently turned out to play an important role in the statistical
analysis of (regularized) empirical risk minimization approaches. Finally, we
use both types of inequalities to establish an oracle inequality for support
vector machines that use the pinball loss. The resulting learning rates are
minimax optimal under standard regularity assumptions on the conditional
quantile.

Comment: Published at http://dx.doi.org/10.3150/10-BEJ267 in Bernoulli
(http://isi.cbs.nl/bernoulli/) by the International Statistical
Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
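For reference, here is a minimal NumPy sketch (not from the paper) of the pinball loss itself, using its standard definition L_tau(y, t) = tau (y - t) for y >= t and (1 - tau)(t - y) otherwise. The sample values, the grid search, and tau = 0.5 are illustrative; the snippet only demonstrates the basic fact behind the paper's setting, namely that the empirical pinball risk is minimized at the sample tau-quantile.

```python
import numpy as np

def pinball_loss(y, t, tau):
    # Pinball loss L_tau(y, t): weight tau on under-prediction (y >= t),
    # weight 1 - tau on over-prediction (y < t).
    r = y - t
    return np.where(r >= 0, tau * r, (tau - 1) * r)

# The empirical pinball risk of a constant predictor t is minimized at
# the empirical tau-quantile of the sample (illustrative data below).
y = np.array([1.0, 2.0, 3.0, 10.0, 20.0])
tau = 0.5
grid = np.linspace(0.0, 20.0, 2001)
risks = np.array([pinball_loss(y, t, tau).mean() for t in grid])
print(grid[risks.argmin()])  # ~3.0, the empirical median of y
```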
Fast rates for support vector machines using Gaussian kernels
For binary classification we establish learning rates up to the order of
$n^{-1}$ for support vector machines (SVMs) with hinge loss and Gaussian RBF
kernels. These rates are in terms of two assumptions on the considered
distributions: Tsybakov's noise assumption to establish a small estimation
error, and a new geometric noise condition which is used to bound the
approximation error. Unlike previously proposed concepts for bounding the
approximation error, the geometric noise assumption does not employ any
smoothness assumption.

Comment: Published at http://dx.doi.org/10.1214/009053606000001226 in the
Annals of Statistics (http://www.imstat.org/aos/) by the Institute of
Mathematical Statistics (http://www.imstat.org).
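As a concrete instance of the setting these rates apply to (not the paper's construction), the following scikit-learn sketch fits a hinge-loss SVM classifier with a Gaussian RBF kernel; the synthetic data set and the values of gamma and C are illustrative.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1.0).astype(int)  # nonlinear boundary

# SVC solves the hinge-loss SVM; kernel="rbf" uses the Gaussian kernel
# k(x, x') = exp(-gamma * ||x - x'||^2). gamma and C are illustrative.
clf = SVC(kernel="rbf", gamma=1.0, C=1.0)
clf.fit(X, y)
print(clf.score(X, y))  # training accuracy of the fitted classifier
```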
- …