
Dynamics of Supervised Learning with Restricted Training Sets

Abstract

We study the dynamics of supervised learning in layered neural networks, in the regime where the size $p$ of the training set is proportional to the number $N$ of inputs. Here the local fields are no longer described by Gaussian probability distributions. We show how dynamical replica theory can be used to predict the evolution of macroscopic observables, including the relevant performance measures, incorporating the old formalism in the limit $\alpha = p/N \to \infty$ as a special case. For simplicity we restrict ourselves to single-layer networks and realizable tasks.

Comment: 36 pages, latex2e, 12 eps figures (to be published in: Proc. Newton Inst. Workshop on On-Line Learning '97)
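As a brief sketch of the setting described above (the notation $\mathbf{J}$ for the weight vector and $\boldsymbol{\xi}^\mu$ for training inputs is assumed standard single-layer-network notation, not taken from this abstract):

```latex
% Scaling regime: training-set size proportional to the input dimension,
%   \alpha = \frac{p}{N}, \qquad \alpha \text{ finite as } N \to \infty .
% The local fields of a single-layer network on the p training examples,
%   h^\mu = \mathbf{J} \cdot \boldsymbol{\xi}^\mu, \qquad \mu = 1, \dots, p ,
% are then no longer Gaussian-distributed, since the weights acquire
% correlations with the restricted training set; the Gaussian (old)
% formalism is recovered only in the limit \alpha \to \infty .
```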
