Improved Vapnik Cervonenkis bounds
We give a new proof of VC bounds that avoids symmetrization and uses a shadow
sample of arbitrary size. We also improve the variance term. This yields better
constants, as illustrated by numerical examples. Moreover, our bounds still hold
for independent but non-identically distributed random variables.
Keywords: Statistical learning theory, PAC-Bayesian theorems, VC dimension
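For context, the kind of bound being improved can be sketched numerically. The snippet below evaluates one common textbook form of the classical symmetrization-based VC deviation bound; the exact constants vary across references, and this form, along with the helper name `vc_deviation_bound`, is an illustrative assumption, not the improved bound of the paper.

```python
import math

def vc_deviation_bound(n, d, delta):
    """One common textbook form of the classical VC bound:
    with probability at least 1 - delta, for a class of VC dimension d
    and an i.i.d. sample of size n,
        sup_f |P f - P_n f| <= sqrt(8 * (d * log(2*e*n/d) + log(4/delta)) / n).
    Constants differ between references; this is only illustrative."""
    return math.sqrt(8.0 * (d * math.log(2 * math.e * n / d)
                            + math.log(4.0 / delta)) / n)

# The bound shrinks roughly like sqrt(log(n) / n) as the sample grows.
b_small = vc_deviation_bound(10_000, 10, 0.05)
b_large = vc_deviation_bound(100_000, 10, 0.05)
```

Evaluating the bound at moderate sample sizes shows why the constants matter in practice: multiplicative factors of this size dominate the bound until n is quite large.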