
Probing the basins of attraction of a recurrent neural network

Abstract

A recurrent neural network is considered that can retrieve a collection of patterns, as well as slightly perturbed versions of this `pure' set of patterns, via fixed points of its dynamics. By replacing the set of dynamical constraints, i.e., the fixed-point equations, by an extended collection of fixed-point-like equations, analytical expressions are found for the weights w_ij(b) of the net, which depend on a certain parameter b. This so-called basin parameter b is such that for b=0 there are, a priori, no perturbed patterns to be recognized by the net. It is shown by a numerical study, via probing sets, that a net constructed to recognize perturbed patterns, i.e., with connections w_ij(b) for b unequal to zero, possesses larger basins of attraction than a net made with the help of the pure set of patterns alone, i.e., with connections w_ij(b=0). The mathematical results obtained can, in principle, be realized by an actual, biological neural net.

Comment: 17 pages, LaTeX, 2 figures
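The probing procedure described above can be sketched in a few lines. The snippet below is a hypothetical illustration only: it uses a standard Hopfield-style net with Hebbian connections (the paper's analytical weights w_ij(b) are not reproduced here), and the function names `retrieve` and `probe_basin` are invented for this sketch. It perturbs a stored pattern at a chosen number of sites and checks whether the dynamics flow back to the pure pattern, i.e., whether the probe lies inside the pattern's basin of attraction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N binary (+-1) neurons storing P random patterns
# with Hebbian weights, NOT the paper's w_ij(b).
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))

# Symmetric Hebbian connections with zero self-coupling.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def retrieve(state, steps=20):
    """Synchronous sign-updates until a fixed point or the step limit."""
    state = state.astype(float)
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1.0        # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

def probe_basin(pattern, flips):
    """Flip `flips` randomly chosen sites and test retrieval of the pure pattern."""
    probe = pattern.copy()
    idx = rng.choice(N, size=flips, replace=False)
    probe[idx] *= -1
    return np.array_equal(retrieve(probe), pattern.astype(float))

# Probe each stored pattern with 10% of its sites perturbed.
recovered = sum(probe_basin(p, flips=10) for p in patterns)
print(f"{recovered}/{P} patterns retrieved from perturbed probes")
```

Sweeping `flips` upward until retrieval starts to fail gives a rough numerical estimate of basin size, which is the kind of comparison the abstract describes between the b=0 and b-unequal-zero nets.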
