Abstract — The experiments and results presented in this paper lie at the crossroads of two lines of research: increasing the storage capacity of recurrent neural networks as far as possible, and studying how this increase affects the dynamical regimes the network adopts in order to support such large-scale storage. Seminal observations by Skarda and Freeman on the olfactory bulb of rabbits during cognitive tasks suggested locating the basal state of behavior in the network's spatiotemporal dynamics. Following the same idea, information is stored here in the network's dynamical attractors. Two learning algorithms are discussed and compared: first, an iterative supervised Hebbian algorithm in which the information to be stored is fully specified; second, an iterative unsupervised Hebbian algorithm in which the network must categorize external stimuli by building its own internal representations. The two algorithms are compared on two kinds of measures: their encoding capacity and robustness to noise, and the type of chaotic dynamics that arises (or not) when ambiguous stimuli are presented to the network. A recurring problem when coding information in cyclic attractors is finding an efficient way to retrieve the stored information. This paper proposes to read out the network's internal representations through phase synchrony, which makes it possible to rely on static synchronization patterns rather than on the cyclic attractors themselves. This mechanism also makes it easy to embed the network in a more complex architecture.
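To make the attractor-storage idea concrete, the following minimal sketch shows the classical outer-product Hebbian rule storing patterns as fixed-point attractors of a Hopfield-style binary recurrent network. This is an illustrative assumption, not the paper's actual iterative algorithms, whose cyclic and chaotic attractors go well beyond this static scheme; the patterns and function names here are invented for the example.

```python
import numpy as np

# Hedged sketch only: classical Hopfield-style storage via an outer-product
# Hebbian rule, illustrating "information stored in attractors". The paper's
# iterative supervised/unsupervised algorithms are more elaborate.

def hebbian_weights(patterns: np.ndarray) -> np.ndarray:
    """Outer-product Hebbian rule; self-connections are zeroed."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w: np.ndarray, state: np.ndarray, sweeps: int = 10) -> np.ndarray:
    """Asynchronous sign updates until the state stops changing
    (i.e., until the dynamics settles into a fixed-point attractor)."""
    state = state.copy()
    for _ in range(sweeps):
        changed = False
        for i in range(len(state)):
            s = 1.0 if w[i] @ state >= 0 else -1.0
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:
            break
    return state

# three mutually orthogonal +/-1 patterns (Walsh functions on 8 units),
# so each stored pattern is provably an exact fixed point of the dynamics
patterns = np.array([
    [ 1,  1,  1,  1,  1,  1,  1,  1],
    [ 1,  1,  1,  1, -1, -1, -1, -1],
    [ 1,  1, -1, -1,  1,  1, -1, -1],
], dtype=float)
w = hebbian_weights(patterns)

# corrupt one stored pattern by one bit; the dynamics cleans it up
noisy = patterns[1].copy()
noisy[0] = -noisy[0]
print(np.array_equal(recall(w, noisy), patterns[1]))  # True
```

With orthogonal patterns, `w @ p` is proportional to `p` itself, so each stored pattern is a fixed point, and a mildly corrupted input falls back into the corresponding attractor basin. The retrieval problem the abstract raises arises precisely because this fixed-point readout no longer applies once information lives in cyclic attractors, motivating the phase-synchrony mechanism.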