    From Bidirectional Associative Memory to a noise-tolerant, robust Protein Processor Associative Memory

    Abstract: The Protein Processor Associative Memory (PPAM) is a novel architecture for learning associations incrementally and online, and for performing fast, reliable, scalable hetero-associative recall. This paper presents a comparison of the PPAM with the Bidirectional Associative Memory (BAM), trained both with Kosko's original algorithm and with the more popular Pseudo-Relaxation Learning Algorithm for BAM (PRLAB). It also compares the PPAM with a more recent associative memory architecture called SOIAM. Results of training for object avoidance are presented from simulations using Player/Stage and are verified by actual implementations on the E-Puck mobile robot. Finally, we show how the PPAM is capable of achieving an increase in performance without using the typical weighted-sum arithmetic operations, or indeed any arithmetic operations at all.
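    Kosko's original training algorithm, mentioned in the abstract above, stores bipolar pattern pairs as a sum of outer products and recalls them by passing thresholded activity back and forth between the two layers until a fixed point is reached. A minimal sketch in Python with NumPy (the pattern pairs here are invented for illustration):

```python
import numpy as np

# Kosko's outer-product rule: W = sum_k x_k y_k^T over bipolar pattern pairs.
X = np.array([[1, -1, 1, -1], [1, 1, -1, -1]])   # input patterns (bipolar)
Y = np.array([[1, -1, 1], [-1, 1, 1]])            # associated output patterns
W = X.T @ Y                                        # 4 x 3 weight matrix

def threshold(v, prev):
    # Threshold signal function; keep the previous state where net input is zero.
    return np.where(v > 0, 1, np.where(v < 0, -1, prev))

def recall(x, steps=10):
    # Bidirectional recall: alternate layer updates until y stops changing.
    y = threshold(x @ W, np.ones(W.shape[1], dtype=int))
    for _ in range(steps):
        x = threshold(y @ W.T, x)
        y_new = threshold(x @ W, y)
        if np.array_equal(y_new, y):
            break
        y = y_new
    return y

print(recall(np.array([1, -1, 1, -1])))  # → [ 1 -1  1], the stored associate
```

    Because the two stored inputs in this toy example are orthogonal, a single forward pass already lands on the stored associate; algorithms such as PRLAB were proposed precisely because the plain outer-product rule does not guarantee recall for correlated pattern pairs.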

    Stability in N-Layer recurrent neural networks

    Starting from the theory developed by Hopfield, Cohen-Grossberg and Kosko, the study of associative memories is extended to N-layer recurrent neural networks. The stability of different multilayer networks is demonstrated under specified bounding hypotheses. The analysis involves theorems for the additive as well as the multiplicative models, for both continuous and discrete N-layer networks. These demonstrations are based on continuous and discrete Liapunov theory. The thesis develops autoassociative and heteroassociative memories and points out the link between all recurrent networks of this type. The discrete case is analyzed using the threshold signal function as the activation function. A general approach for studying the stability and convergence of multilayer recurrent networks is developed.
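    The discrete Liapunov argument referred to above can be made concrete for the simplest two-layer (BAM) case: with the energy E(x, y) = -x^T W y, every threshold-signal update of either layer leaves the energy unchanged or lowers it, which forces convergence to a fixed point. A small numerical check in Python with NumPy (the stored pairs and the noisy probe are invented for illustration):

```python
import numpy as np

# Two-layer discrete BAM with energy E(x, y) = -x^T W y. Under threshold-signal
# updates the energy is non-increasing, which is the core of the Liapunov proof.
X = np.array([[1, -1, 1, -1, 1], [-1, -1, 1, 1, 1]])  # stored input patterns
Y = np.array([[1, 1, -1], [-1, 1, 1]])                 # stored output patterns
W = X.T @ Y

def threshold(v, prev):
    # Threshold signal function; keep the previous state where net input is zero.
    return np.where(v > 0, 1, np.where(v < 0, -1, prev))

def energy(x, y):
    return -float(x @ W @ y)

x = np.array([1, -1, -1, -1, 1])   # noisy probe near the first stored pattern
y = np.ones(3, dtype=int)
energies = [energy(x, y)]
for _ in range(5):
    y = threshold(x @ W, y)        # update the y-layer
    energies.append(energy(x, y))
    x = threshold(y @ W.T, x)      # update the x-layer
    energies.append(energy(x, y))

# The energy never increases along the recall trajectory.
assert all(e2 <= e1 for e1, e2 in zip(energies, energies[1:]))
```

    The same monotone-energy argument, with a suitably generalized energy function, is what the thesis extends from this two-layer case to N-layer additive and multiplicative networks.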