Learnability of Gaussians with flexible variances
Authors
Yiming Ying
Ding-Xuan Zhou
Publication date
22 July 2013
Publisher
Microtome Publishing
Abstract
Copyright © 2007 Yiming Ying and Ding-Xuan Zhou.
Gaussian kernels with flexible variances provide a rich family of Mercer kernels for learning algorithms. We show that the union of the unit balls of reproducing kernel Hilbert spaces generated by Gaussian kernels with flexible variances is a uniform Glivenko-Cantelli (uGC) class. This result confirms a conjecture concerning learnability of Gaussian kernels and verifies the uniform convergence of many learning algorithms involving Gaussians with changing variances. Rademacher averages and empirical covering numbers are used to estimate sample errors of multi-kernel regularization schemes associated with general loss functions. It is then shown that the regularization error associated with the least square loss and the Gaussian kernels can be greatly improved when flexible variances are allowed. Finally, for regularization schemes generated by Gaussian kernels with flexible variances we present explicit learning rates for regression with least square loss and classification with hinge loss.
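The abstract concerns regularization schemes in which the Gaussian variance is not fixed in advance but chosen along with the estimator. Below is a minimal sketch of that idea for the least-square-loss case: regularized least squares with a Gaussian kernel, where the variance is selected from a candidate grid by validation error. This is an illustration only, not the paper's method — the paper analyzes schemes that optimize over the variance inside the regularization functional, and the function names, the 2*sigma**2 bandwidth convention, the grid of variances, and the validation split are all assumptions introduced here for the sketch.

import numpy as np

def gaussian_kernel(X, Y, sigma):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2)).
    # The 2*sigma^2 scaling is one common convention, not the paper's notation.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-sq / (2.0 * sigma**2))

def krls_fit(X, y, sigma, lam):
    # Kernel regularized least squares: solve (K + lam * n * I) alpha = y
    # for the representer coefficients alpha.
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def fit_flexible_variance(X, y, X_val, y_val, sigmas, lam):
    # Hypothetical stand-in for a flexible-variance scheme: fit one
    # regularized estimator per candidate sigma and keep the one with
    # the smallest held-out squared error.
    best = None
    for sigma in sigmas:
        alpha = krls_fit(X, y, sigma, lam)
        pred = gaussian_kernel(X_val, X, sigma) @ alpha
        err = np.mean((pred - y_val) ** 2)
        if best is None or err < best[0]:
            best = (err, sigma, alpha)
    return best  # (validation error, chosen variance, coefficients)

The paper's uGC result is what justifies this kind of search: uniform convergence holds over the union of the unit balls across all variances, so estimators that range over many Gaussian kernels at once still generalize.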
Repository
Open Research Exeter (supporting member)
OAI identifier
oai:ore.exeter.ac.uk:10871/119...
Last updated
06/08/2013