Generalised scheme for optimal learning in recurrent neural networks

Abstract

A new learning scheme is proposed for neural network architectures such as the Hopfield network and the bidirectional associative memory. The scheme, which replaces the commonly used learning rules, follows from a proof that learning in these architectures is equivalent to learning in the two-state perceptron. Consequently, optimal learning algorithms for the perceptron can be applied directly to these architectures. Similar results are established for learning in the multistate perceptron, thereby leading to an optimal learning algorithm. Experimental results are provided to show the superiority of the proposed method.
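The reduction underlying the abstract can be illustrated with a minimal sketch: each neuron of a Hopfield network, together with its incoming weights, can be treated as an independent two-state perceptron whose targets are the neuron's own values in the stored patterns, so that any perceptron learning rule makes the stored patterns fixed points. This is an assumed illustration using the classical fixed-increment perceptron rule, not the paper's specific algorithm; all function names here are hypothetical.

```python
def sign(x):
    # Two-state output convention: sign(0) is taken as +1.
    return 1 if x >= 0 else -1

def train_hopfield_via_perceptron(patterns, n, epochs=100, lr=1.0):
    """Train each row of the weight matrix W as an independent perceptron
    so that every stored pattern p satisfies the stability condition
    sign(sum_j W[i][j] * p[j]) == p[i] for all neurons i."""
    W = [[0.0] * n for _ in range(n)]
    for _ in range(epochs):
        converged = True
        for p in patterns:
            for i in range(n):
                # Local field at neuron i (no self-connection).
                h = sum(W[i][j] * p[j] for j in range(n) if j != i)
                if sign(h) != p[i]:
                    converged = False
                    # Fixed-increment perceptron update on row i only.
                    for j in range(n):
                        if j != i:
                            W[i][j] += lr * p[i] * p[j]
        if converged:
            break
    return W

def recall(W, state, steps=10):
    # Synchronous update; a stored pattern should be a fixed point.
    n = len(state)
    s = list(state)
    for _ in range(steps):
        s = [sign(sum(W[i][j] * s[j] for j in range(n))) for i in range(n)]
    return s
```

Because each weight row is updated independently, any perceptron training algorithm (including optimal ones) can be substituted for the inner update without changing the overall structure, which is the point the abstract makes.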