On the basis of the general form of the energy needed to adapt the
connection strengths of a network in which learning takes place, a local
learning rule is found for the changes of the weights. This biologically
realizable learning rule turns out to comply with Hebb's neurophysiological
postulate, but it is not of the form of any of the learning rules proposed in
the literature.
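As a purely illustrative sketch (the precise rule is derived in the body of
the paper; the symbols below are generic and not the paper's notation), a
local learning rule of the Hebbian type prescribes a weight change of the form
\[
  \Delta w_{ij} = \eta \, x_i x_j ,
\]
where $w_{ij}$ is the strength of the synapse connecting neuron $j$ to neuron
$i$, $x_i$ and $x_j$ are the activities of the two neurons joined by that
synapse, and $\eta$ is a learning rate. Locality means that the change of a
weight depends only on quantities available at the synapse itself.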
It is shown that, if the same finite set of patterns is presented to the
network over and over again, the weights of the synapses converge to finite
values.
Furthermore, it is proved that the final values reached in this biologically
realizable limit are the same as those found via a mathematical approach to
the problem of determining the weights of a partially connected neural network
that can store a collection of patterns. The mathematical solution is obtained
via a modified version of the so-called pseudo-inverse method, and has as its
basic ingredient the inverse of a reduced correlation matrix, rather than the
usual correlation matrix. Thus, a biological network might realize the final
results of the mathematician through the energetically economical rule for the
adaptation of the synapses found in this article.
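As a hedged sketch of the standard (unmodified) pseudo-inverse construction,
with $\xi^{\mu}$, $\mu = 1, \dots, p$, denoting the patterns to be stored in a
fully connected network of $N$ neurons (generic symbols, not necessarily the
paper's notation), the weights read
\[
  w_{ij} = \frac{1}{N} \sum_{\mu, \nu = 1}^{p}
           \xi_i^{\mu} \, (C^{-1})_{\mu\nu} \, \xi_j^{\nu} ,
  \qquad
  C_{\mu\nu} = \frac{1}{N} \sum_{k=1}^{N} \xi_k^{\mu} \xi_k^{\nu} ,
\]
where $C$ is the correlation matrix of the patterns. The modification referred
to above replaces $C$ by a reduced correlation matrix suited to the partial
connectivity; its exact definition is given in the paper itself.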