THE SLLN FOR THE FREE-ENERGY OF A CLASS OF NEURAL NETWORKS

Abstract

We first show the self-averaging property, in the sense of almost sure convergence, for the free energy of the spin glass model and of the Hopfield model with an infinite number of patterns. We then prove the strong law of large numbers (SLLN) for the free energy of the Hopfield-type model with a finite number of patterns. Here "Hopfield-type" means that the interaction among neurons is of higher order and that the patterns embedded in the neural network are assumed to be i.i.d. random variables, rather than variables taking only the values +1 and -1; in particular, the model with weighted patterns is included. The SLLN for the free energy of the Little model is also proved, and the rate of convergence in the above two cases is estimated.
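For orientation, the following is a minimal sketch of the quantities involved, assuming the standard pairwise Hopfield conventions; the higher-order interactions and general pattern distributions treated in the paper modify the Hamiltonian but not the definitions below.

    H_N(\sigma) = -\frac{1}{2N} \sum_{\mu=1}^{M} \sum_{i,j=1}^{N} \xi_i^{\mu} \xi_j^{\mu} \, \sigma_i \sigma_j,
    \qquad
    f_N = \frac{1}{N} \log Z_N,
    \qquad
    Z_N = \sum_{\sigma \in \{-1,+1\}^N} e^{-\beta H_N(\sigma)},

where \sigma_i = \pm 1 are the neuron states, \xi^{\mu} are the stored patterns, and \beta is the inverse temperature. Self-averaging in the sense of almost sure convergence means f_N - \mathbb{E} f_N \to 0 a.s., and the SLLN for the free energy asserts that f_N converges a.s. to the limit of \mathbb{E} f_N as N \to \infty.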
