Optimal Rates for Regularized Least Squares Regression

By Ingo Steinwart, Don Hush and Clint Scovel

Abstract

We establish a new oracle inequality for kernel-based, regularized least squares regression methods, which uses the eigenvalues of the associated integral operator as a complexity measure. We then use this oracle inequality to derive learning rates for these methods. Here, it turns out that these rates are independent of the exponent of the regularization term. Finally, we show that our learning rates are asymptotically optimal whenever, e.g., the kernel is continuous and the input space is a compact metric space.
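The abstract does not spell out the estimator, but the standard formulation of kernel-based regularized least squares over a reproducing kernel Hilbert space H is, as a hedged sketch,

    f_{D,lambda} = argmin_{f in H}  lambda * ||f||_H^q  +  (1/n) * sum_{i=1}^n (f(x_i) - y_i)^2,

where q is the exponent of the regularization term; the abstract's point is that the learning rates do not depend on q. For the classical case q = 2 (kernel ridge regression), the representer theorem reduces the minimization to a linear system. Below is a minimal, illustrative Python sketch of that case; the Gaussian kernel, the helper names, and all parameter values are assumptions chosen for illustration, not taken from the paper.

import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of A and the rows of B.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def fit_krr(X, y, lam, gamma=1.0):
    # Classical q = 2 case: the representer theorem gives f = sum_i alpha_i k(x_i, .)
    # with alpha solving (K + n * lam * I) alpha = y.
    n = X.shape[0]
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def predict_krr(X_train, alpha, X_test, gamma=1.0):
    return gaussian_kernel(X_test, X_train, gamma) @ alpha

# Hypothetical usage on noisy 1-D data.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 1))
y = np.sin(2.0 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
alpha = fit_krr(X, y, lam=1e-3, gamma=10.0)
print(predict_krr(X, alpha, np.linspace(0.0, 1.0, 5)[:, None], gamma=10.0))

# The eigenvalues of K / n are the empirical counterparts of the eigenvalues of
# the integral operator that the paper uses as a complexity measure.
K = gaussian_kernel(X, X, gamma=10.0)
print(np.sort(np.linalg.eigvalsh(K / X.shape[0]))[::-1][:5])

The faster these eigenvalues decay, the smaller the effective complexity of the hypothesis space, which is what drives the rates derived from the oracle inequality.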

Year: 2009
OAI identifier: oai:CiteSeerX.psu:10.1.1.162.3075
Provided by: CiteSeerX
Download PDF:
Sorry, we are unable to provide the full text, but you may find it at the following location(s):
  • http://citeseerx.ist.psu.edu/v... (external link)
  • http://www.cs.mcgill.ca/~colt2... (external link)

