Abstract—In this paper, we evaluate the performance of descent conjugate gradient methods and propose a new algorithm for training recurrent neural networks. The proposed algorithm preserves the advantages of classical conjugate gradient methods while avoiding the usually inefficient restarts. Simulation results are presented for three different recurrent neural network architectures on a variety of benchmarks.

Index Terms—Recurrent neural networks, descent spectral conjugate gradient methods, sufficient descent property, performance profiles.
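To make the restart issue concrete, the following is a minimal sketch of a generic nonlinear conjugate gradient loop with the Polak–Ribière+ direction update, where clipping the momentum coefficient at zero implicitly falls back to steepest descent instead of forcing an explicit restart. This is an illustration on a toy quadratic objective, not the paper's algorithm; the objective, step-size rule, and update variant are assumptions for the example.

```python
import numpy as np

# Toy quadratic objective 0.5 w^T A w - b^T w standing in for an RNN loss
# (assumption for illustration; any smooth objective with a gradient works).
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, 2.0])

def grad(w):
    return A @ w - b

w = np.zeros(2)
g = grad(w)
d = -g                                  # first direction: steepest descent
for _ in range(10):
    if np.linalg.norm(g) < 1e-12:       # converged
        break
    # Exact line search along d (possible here because f is quadratic).
    alpha = -(g @ d) / (d @ A @ d)
    w_new = w + alpha * d
    g_new = grad(w_new)
    # Polak-Ribiere+ coefficient; max(., 0) replaces an explicit restart:
    # whenever beta would be negative, the new direction reduces to -g_new.
    beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
    d = -g_new + beta * d
    w, g = w_new, g_new
```

On a quadratic with exact line searches this reduces to linear CG and converges in at most two steps here; in practice a line search satisfying the Wolfe conditions would replace the closed-form step.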