
On the effect of dynamic adjustment of recurrent network parameters on learning

Abstract

The thesis examines sequential learning in a recurrent neural network model derived from the work of M. I. Jordan and J. L. Elman. In each of three experiments, different network parameters are systematically altered across a series of simulations. Each simulation measures learning ability for a specific network configuration, and the simulation results are consolidated to summarize each parameter's significance in the learning process.
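The abstract itself contains no code, but the architecture it builds on can be illustrated. Below is a minimal sketch of an Elman-style simple recurrent network, in which the hidden state is copied to context units and fed back on the next time step; all layer sizes, weight scales, and names here are illustrative assumptions, not values from the thesis.

```python
import numpy as np

class ElmanNetwork:
    """Elman-style simple recurrent network: the hidden state acts as
    context units, fed back into the hidden layer at the next step."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        # Small random initial weights; sizes are illustrative only.
        self.W_xh = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.W_hh = rng.normal(0.0, 0.1, (n_hidden, n_hidden))
        self.W_hy = rng.normal(0.0, 0.1, (n_out, n_hidden))
        self.b_h = np.zeros(n_hidden)
        self.b_y = np.zeros(n_out)

    def forward(self, xs):
        """Run a sequence of input vectors; return one output per step."""
        h = np.zeros(self.W_hh.shape[0])  # context units start at zero
        ys = []
        for x in xs:
            # New hidden state depends on the input and the prior context.
            h = np.tanh(self.W_xh @ x + self.W_hh @ h + self.b_h)
            ys.append(self.W_hy @ h + self.b_y)
        return np.array(ys)

net = ElmanNetwork(n_in=3, n_hidden=8, n_out=2)
seq = np.ones((5, 3))        # a 5-step sequence of 3-dimensional inputs
outputs = net.forward(seq)
print(outputs.shape)         # one 2-dimensional output per time step
```

Parameters such as the hidden-layer size `n_hidden` or the weight scale are the kind of configuration values the thesis's simulations vary; a Jordan-style variant would instead feed the *output* back as context.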