An Identity for Kernel Ridge Regression
This paper derives an identity connecting the square loss of ridge regression
in on-line mode with the loss of the retrospectively best regressor. Some
corollaries about the properties of the cumulative loss of on-line ridge
regression are also obtained.
Comment: 35 pages; extended version of ALT 2010 paper (Proceedings of ALT
2010, LNCS 6331, Springer, 2010).
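The on-line protocol the abstract refers to can be sketched as follows: at each step the learner predicts with the ridge solution fitted to all previously seen pairs, then observes the true outcome. This is a minimal illustration of that protocol, not the paper's derivation; the variable names and the regularization parameter `a` are assumptions.

```python
import numpy as np

def online_ridge(xs, ys, a=1.0):
    """On-line ridge regression: before seeing y_t, predict with the
    ridge regressor fitted to all earlier (x, y) pairs.

    xs: array of shape (T, d); ys: array of shape (T,);
    a: ridge regularization constant (notation here is illustrative).
    """
    d = xs.shape[1]
    A = a * np.eye(d)      # a*I + sum of outer products x x^T seen so far
    b = np.zeros(d)        # sum of y * x seen so far
    preds = np.empty(len(ys))
    for t, (x, y) in enumerate(zip(xs, ys)):
        preds[t] = x @ np.linalg.solve(A, b)  # predict before seeing y_t
        A += np.outer(x, x)                   # then update with the new pair
        b += y * x
    return preds
```

The cumulative square loss `sum((preds - ys)**2)` of this strategy is the quantity the paper's identity relates to the loss of the retrospectively best regressor.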
Online Regression Competitive with Changing Predictors
This paper deals with the problem of making predictions in
the online mode of learning, where the dependence of the outcome y_t on
the signal x_t can change with time. The Aggregating Algorithm (AA)
is a technique that optimally merges experts from a pool, so that the
resulting strategy suffers a cumulative loss that is almost as good as that
of the best expert in the pool. We apply the AA to the case where the
experts are all the linear predictors that can change with time. KAARCh
is the kernel version of the resulting algorithm. In the kernel case, the
experts are all the decision rules in some reproducing kernel Hilbert space
that can change over time. We show that KAARCh suffers a cumulative
square loss that is almost as good as that of any expert that does not
change very rapidly.
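The merging step described above can be illustrated with a simplified exponentially weighted scheme over a finite expert pool. This is only a stand-in for the Aggregating Algorithm (the AA uses a more refined substitution function for square loss, and KAARCh works with an infinite pool in an RKHS); the function name and learning rate `eta` are assumptions for the sketch.

```python
import numpy as np

def merge_experts(expert_preds, ys, eta=0.5):
    """Weighted-average merging of a finite pool of experts under
    square loss (a simplified illustration of AA-style merging).

    expert_preds: array of shape (T, n_experts) with each expert's
    prediction at each step; ys: array of shape (T,) of true outcomes.
    """
    T, n = expert_preds.shape
    losses = np.zeros(n)                 # cumulative square loss per expert
    preds = np.empty(T)
    for t in range(T):
        # Exponential weights, shifted by the minimum loss for stability.
        w = np.exp(-eta * (losses - losses.min()))
        w /= w.sum()
        preds[t] = w @ expert_preds[t]   # merged prediction for this step
        losses += (expert_preds[t] - ys[t]) ** 2
    return preds
```

As the abstract states for the full algorithm, the merged strategy's cumulative loss tracks that of the best expert in the pool; with this sketch, an expert that predicts perfectly quickly dominates the weights.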