Evolving neural networks using matrix grammars

By Ian Roberts and Andrew Hunter


Methods of evolving Neural Networks using Matrix Grammars are described. Because these methods generate network architectures structurally, reusing symbols to describe sub-sections of the architecture, they tend to produce well-structured networks and are suitable for similarly well-structured problems. Methods which generate the architecture only, and methods which also generate the weights, are described. Evolution is combined with backpropagation training. The techniques are compared with previously published work, and show several distinct advantages. The main advantage of all the methods is the ability to overcome the Genetic Algorithm scaling problem. The inclusion of weights gives better convergence. The Matrix Grammars presented here separate the evolution of weights and architecture further than previous methods do, widening the search space. The suitability of the techniques for more substantial problems is discussed. We also show how large improvements can be achieved by progressive evolution: the pretraining of the population on related, simpler problems.

Topics: G400 Computer Science
Publisher: University of Sunderland
Year: 1999
OAI identifier: oai:eprints.lincoln.ac.uk:3387