© GRACM
A SPECTRAL VERSION OF PERRY’S CONJUGATE GRADIENT METHOD FOR NEURAL NETWORK TRAINING

By D. G. Sotiropoulos, A. E. Kostopoulos and T. N. Grapsa

Abstract

In this work, an efficient training algorithm for feedforward neural networks is presented. It is based on a scaled version of the conjugate gradient method suggested by Perry, which employs the spectral steplength of Barzilai and Borwein; this steplength incorporates second-order information without estimating the Hessian matrix. The learning rate is automatically adapted at each epoch, using the conjugate gradient values and the learning rate of the previous epoch. In addition, a new acceptability criterion for the learning rate, based on non-monotone Wolfe conditions, is utilized. The efficiency of the training algorithm is demonstrated on standard tests, including XOR, 3-bit parity, font learning and function approximation problems.
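The Barzilai–Borwein spectral steplength mentioned in the abstract can be illustrated in isolation. The sketch below applies it to plain gradient descent on a convex quadratic; it is only an assumption-laden illustration of that one ingredient, not the paper's full algorithm, which combines the steplength with Perry's conjugate gradient direction and a non-monotone Wolfe acceptability test. The function names and the initial steplength are hypothetical choices.

```python
import numpy as np

def grad(x, A):
    """Gradient of the test objective f(x) = 0.5 * x^T A x."""
    return A @ x

def bb_gradient_descent(x0, A, alpha0=1e-3, iters=50):
    """Gradient descent with the Barzilai-Borwein (BB1) spectral steplength.

    The steplength alpha_k = (s^T s) / (s^T y), with s the step difference
    and y the gradient difference, captures curvature (second-order)
    information without ever forming the Hessian matrix.
    """
    x = x0.astype(float)
    g = grad(x, A)
    alpha = alpha0                      # initial steplength (hypothetical choice)
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new, A)
        s = x_new - x                   # s_k = x_{k+1} - x_k
        y = g_new - g                   # y_k = g_{k+1} - g_k
        sy = s @ y
        if abs(sy) > 1e-12:             # guard against division by zero
            alpha = (s @ s) / sy        # BB1 spectral steplength
        x, g = x_new, g_new
    return x

# Ill-conditioned 2-D quadratic: the BB steplength adapts to the curvature.
A = np.diag([1.0, 10.0])
x_star = bb_gradient_descent(np.array([1.0, 1.0]), A)
print(np.linalg.norm(grad(x_star, A)))  # gradient norm is near zero at the minimizer
```

In a training context, the same ratio would be computed from successive weight vectors and error gradients of the network at each epoch.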

Topics: back propagation, supervised training, conjugate gradient methods, Perry’s method, spectral steplength, non-monotone Wolfe conditions
Year: 2002
OAI identifier: oai:CiteSeerX.psu:10.1.1.305.515
Provided by: CiteSeerX
Sorry, we are unable to provide the full text but you may find it at the following location(s):
  • http://citeseerx.ist.psu.edu/v... (external link)
  • http://www.math.upatras.gr/~dg... (external link)

