Gaussian Process Regression with Mismatched Models

Abstract

Learning curves for Gaussian process regression are well understood when the 'student' model happens to match the 'teacher' (the true data-generating process). I derive approximations to the learning curves for the more generic case of mismatched models, and find very rich behaviour: for large input space dimensionality, where the results become exact, there are universal (student-independent) plateaux in the learning curve, with transitions in between that can exhibit arbitrarily many over-fitting maxima. In lower dimensions, plateaux also appear, and the asymptotic decay of the learning curve becomes strongly student-dependent. All predictions are confirmed by simulations.

Comment: 7 pages, style file nips01e.sty included
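The student/teacher setup summarized above can be illustrated with a simple Monte Carlo simulation of a learning curve (this is a minimal sketch of the simulation idea, not the paper's analytical approximation): a "teacher" GP with one covariance function generates noisy data, a mismatched "student" GP with a different covariance function is used for regression, and the generalization error is averaged over datasets for increasing training set sizes. The kernel choices, length scales, and noise levels below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(x1, x2, ell):
    """Squared-exponential covariance on 1-d inputs (teacher prior here)."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def ou(x1, x2, ell):
    """Ornstein-Uhlenbeck covariance: a rougher, mismatched student prior."""
    d = np.abs(x1[:, None] - x2[None, :])
    return np.exp(-d / ell)

def gen_error(n, n_test=200, noise=0.1, ell_teacher=0.3, ell_student=0.3):
    """One sample of the student's generalization error for n training points."""
    x_train = rng.uniform(0.0, 1.0, n)
    x_test = rng.uniform(0.0, 1.0, n_test)
    x_all = np.concatenate([x_train, x_test])
    # Draw the teacher function jointly at training and test inputs.
    K_t = rbf(x_all, x_all, ell_teacher) + 1e-10 * np.eye(n + n_test)
    f = rng.multivariate_normal(np.zeros(n + n_test), K_t)
    y = f[:n] + noise * rng.standard_normal(n)  # noisy training targets
    # Student posterior mean computed with the *mismatched* covariance.
    K_s = ou(x_train, x_train, ell_student) + noise**2 * np.eye(n)
    k_star = ou(x_test, x_train, ell_student)
    pred = k_star @ np.linalg.solve(K_s, y)
    return np.mean((pred - f[n:]) ** 2)

# Learning curve: generalization error averaged over independent datasets.
for n in [2, 5, 10, 20, 50, 100]:
    eps = np.mean([gen_error(n) for _ in range(50)])
    print(f"n = {n:3d}   generalization error ~ {eps:.4f}")
```

Swapping the student covariance for the teacher's own recovers the matched case studied in earlier learning-curve work, which is the baseline the abstract contrasts against.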
