
    Dynamics of Supervised Learning with Restricted Training Sets and Noisy Teachers

We generalize a recent formalism describing the dynamics of supervised learning in layered neural networks, in the regime where data recycling is inevitable, to the case of noisy teachers. Our theory generates predictions for the evolution in time of the training and generalization errors, and extends the class of mathematically solvable learning processes in large neural networks to the complicated situations where overfitting occurs.

1 Introduction

Tools from statistical mechanics have been used successfully over the last decade to study the dynamics of learning in large layered neural networks (for a review see e.g. [1] or [2]). The simplest mathematical theories result upon assuming the data set to be much larger than the number of weight updates made, which rules out recycling and ensures that any distribution of relevance will be Gaussian. Unfortunately, this regime is also not the most relevant one, either for applications of neural networks or in terms of mathematical interest…
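To make the contrast concrete, here is a minimal numerical sketch (not the paper's formalism) of the restricted-training-set regime it studies: a linear student trained by gradient descent on a fixed set of p = alpha * N examples labelled by a noisy teacher, so that every weight update recycles the same data. All variable names and parameter values (N, alpha, sigma, eta) are illustrative assumptions; with label noise, the training error keeps falling while the generalization error can eventually rise, which is the overfitting behaviour the theory aims to capture.

# Illustrative sketch only: gradient descent on a fixed (recycled) training
# set generated by a noisy teacher. Parameter values are assumptions chosen
# to make the overfitting regime visible, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

N = 100                                    # input dimension
alpha = 0.5                                # examples per weight: p = alpha * N
p = int(alpha * N)
sigma = 0.3                                # teacher output noise level
eta = 0.05                                 # learning rate
steps = 2000

B = rng.standard_normal(N) / np.sqrt(N)    # teacher weights
X = rng.standard_normal((p, N))            # restricted training set (recycled)
y = X @ B + sigma * rng.standard_normal(p) # noisy teacher labels

X_test = rng.standard_normal((5000, N))    # fresh examples, never trained on
y_test = X_test @ B                        # clean teacher outputs

J = np.zeros(N)                            # student weights

for t in range(steps):
    # full-batch gradient of the mean squared error over the fixed set
    grad = (X @ J - y) @ X / p
    J -= eta * grad
    if t % 200 == 0:
        e_train = np.mean((X @ J - y) ** 2)        # training error
        e_gen = np.mean((X_test @ J - y_test) ** 2)  # generalization error
        print(f"t={t:5d}  E_train={e_train:.4f}  E_gen={e_gen:.4f}")

In the opposite regime that the simplest theories assume, each update would draw a brand-new example (replace X, y inside the loop with freshly sampled data), no recycling occurs, and the two error curves stay close together.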