
Associative learning on a continuum in evolved dynamical networks

By Eduardo Izquierdo, Inman Harvey and Randall D Beer


This article extends previous work on evolving learning without synaptic plasticity from discrete tasks to continuous tasks. Continuous-time recurrent neural networks without synaptic plasticity are artificially evolved on an associative learning task. The task consists of associating paired stimuli: temperature and food. The temperature to be associated can be either drawn from a discrete set or allowed to range over a continuum of values. We address two questions: Can the learning-without-synaptic-plasticity approach be extended to continuous tasks? And if so, how does learning without synaptic plasticity work in the evolved circuits? Analysis of the circuits most successful at learning discrete stimuli reveals finite state machine (FSM)-like internal dynamics. However, when the task is modified to require learning stimuli over the full continuum, it is not possible to extract an FSM from the internal dynamics. In this case, a continuous state machine is extracted instead.
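As a concrete illustration of the model class the abstract names (a sketch, not the authors' implementation), the standard continuous-time recurrent neural network (CTRNN) obeys tau_i * dy_i/dt = -y_i + sum_j w_ji * sigma(y_j + theta_j) + I_i, with fixed weights; any "learning" must therefore arise from the network's internal state dynamics rather than weight changes. All parameter values below are arbitrary placeholders:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ctrnn_step(y, w, theta, tau, I, dt=0.01):
    """One Euler-integration step of the standard CTRNN equation.

    The weight matrix w is fixed throughout: there is no synaptic
    plasticity, so stimulus associations can only be stored in the
    trajectory of the state vector y.
    """
    dydt = (-y + w.T @ sigmoid(y + theta) + I) / tau
    return y + dt * dydt

# Tiny 3-neuron network with illustrative (made-up) parameters.
rng = np.random.default_rng(0)
n = 3
y = np.zeros(n)                 # neuron states
w = rng.normal(size=(n, n))     # fixed synaptic weights
theta = np.zeros(n)             # biases
tau = np.ones(n)                # time constants
I = np.array([0.5, 0.0, 0.0])   # external input, e.g. a temperature signal

for _ in range(1000):
    y = ctrnn_step(y, w, theta, tau, I)
```

In the evolution-of-learning setting the abstract describes, parameters such as w, theta, and tau would be set by an evolutionary algorithm that scores each network on the temperature-food association task.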

Year: 2008