15 research outputs found

    Recurrent correlation associative memories

    A model for a class of high-capacity associative memories is presented. Since they are based on two-layer recurrent neural networks and their operations depend on the correlation measure, these associative memories are called recurrent correlation associative memories (RCAMs). The RCAMs are shown to be asymptotically stable in both synchronous and asynchronous (sequential) update modes as long as their weighting functions are continuous and monotone nondecreasing. In particular, a high-capacity RCAM named the exponential correlation associative memory (ECAM) is proposed. The asymptotic storage capacity of the ECAM scales exponentially with the length of memory patterns, and it meets the ultimate upper bound for the capacity of associative memories. The asymptotic storage capacity of the ECAM with limited dynamic range in its exponentiation nodes is found to be proportional to that dynamic range. Design and fabrication of a 3-mm CMOS ECAM chip is reported. The prototype chip can store 32 24-bit memory patterns, and its speed exceeds one associative recall operation every 3 µs. An application of the ECAM chip to vector quantization is also described.
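
    As a rough illustration of the recall scheme described above, the sketch below implements a bipolar ECAM-style recall in NumPy: the current state is correlated with every stored pattern, the correlations are passed through an exponential weighting function, and the weighted sum of the stored patterns is thresholded to produce the next state. The base of the exponential, the pattern sizes, and the synchronous update loop are illustrative assumptions rather than parameters taken from the paper or the chip.

```python
import numpy as np

def ecam_recall(patterns, probe, a=2.0, max_iters=20):
    """Synchronous recall sketch for an exponential correlation associative memory.

    patterns : (m, n) array of stored bipolar (+1/-1) memory patterns
    probe    : (n,) bipolar input pattern, possibly corrupted by noise
    a        : assumed base of the exponential weighting function
    """
    x = probe.copy()
    for _ in range(max_iters):
        corr = patterns @ x                        # correlation with each stored pattern
        weights = np.power(a, corr.astype(float))  # exponential weighting favours the closest pattern
        x_new = np.sign(weights @ patterns)        # weighted sum, thresholded back to +1/-1
        x_new[x_new == 0] = 1                      # break ties deterministically
        if np.array_equal(x_new, x):               # fixed point reached
            return x_new
        x = x_new
    return x

# Hypothetical usage: 32 random 24-bit patterns, recall from a probe with three flipped bits.
rng = np.random.default_rng(0)
mem = rng.choice([-1, 1], size=(32, 24))
noisy = mem[0].copy()
noisy[:3] *= -1
print(np.array_equal(ecam_recall(mem, noisy), mem[0]))
```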

    A study of pattern recovery in recurrent correlation associative memories

    In this paper, we analyze the recurrent correlation associative memory (RCAM) model of Chiueh and Goodman. This is an associative memory in which stored binary memory patterns are recalled via an iterative update rule. The update of the individual pattern bits is controlled by an excitation function, which takes as its argument the inner product between the stored memory patterns and the input pattern. Our aim is to analyze the dynamics of pattern recall when the input patterns are corrupted by noise of a relatively unrestricted class. We make three contributions. First, we show how to identify the excitation function which maximizes the separation (the Fisher discriminant) between the uncorrupted realization of the noisy input pattern and the remaining patterns residing in the memory. Moreover, we show that the excitation function which gives maximum separation is exponential when the input bit errors follow a binomial distribution. Our second contribution is to develop an expression for the expectation value of the bit-error probability on the input pattern after one iteration. We show how to identify the excitation function which minimizes this bit-error probability. However, there is no closed-form solution, and the excitation function must be recovered numerically. The relationship between the excitation functions resulting from the two approaches is examined for a binomial distribution of bit errors. The final contribution is to develop a semiempirical approach to modeling the dynamics of the RCAM. This provides a numerical means of predicting the recall error rate of the memory. It also allows us to develop an expression for the storage capacity at a given recall error rate.
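
    The paper's treatment of the one-step bit-error probability is analytical; the snippet below is only a numerical stand-in that estimates the same quantity by Monte Carlo simulation for an exponential excitation function under independent (binomial) bit flips. The pattern dimensions, the base of the exponential, and the number of trials are assumptions made for illustration.

```python
import numpy as np

def one_step_bit_error(patterns, flip_prob, a=2.0, trials=200, rng=None):
    """Estimate the bit-error rate after one RCAM update with an exponential
    excitation function, when input bits are flipped independently with
    probability flip_prob (a binomial noise model)."""
    rng = rng if rng is not None else np.random.default_rng()
    m, n = patterns.shape
    errors = 0
    for _ in range(trials):
        target = patterns[rng.integers(m)]          # pick a stored pattern at random
        flips = rng.random(n) < flip_prob           # independent bit flips
        probe = np.where(flips, -target, target)
        corr = patterns @ probe                     # inner products with all stored patterns
        weights = np.power(a, corr.astype(float))   # exponential excitation
        update = np.sign(weights @ patterns)        # one synchronous update
        update[update == 0] = 1
        errors += np.count_nonzero(update != target)
    return errors / (trials * n)

rng = np.random.default_rng(1)
mem = rng.choice([-1, 1], size=(16, 64))
for p in (0.05, 0.15, 0.25):
    print(p, one_step_bit_error(mem, p, rng=rng))
```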

    An Introduction to Quaternion-Valued Recurrent Projection Neural Networks

    Hypercomplex-valued neural networks, including quaternion-valued neural networks, can treat multi-dimensional data as a single entity. In this paper, we introduce the quaternion-valued recurrent projection neural networks (QRPNNs). Briefly, QRPNNs are obtained by combining non-local projection learning with quaternion-valued recurrent correlation neural networks (QRCNNs). We show that QRPNNs overcome the cross-talk problem of QRCNNs. Thus, they are appropriate for implementing associative memories. Furthermore, computational experiments reveal that QRPNNs exhibit greater storage capacity and noise tolerance than their corresponding QRCNNs. Comment: Accepted for publication in the Proceedings of the 8th Brazilian Conference on Intelligent Systems (BRACIS 2019), October 15-18, 2019, Salvador, BA, Brazil.
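
    The abstract above does not spell out the projection step, so the sketch below should be read as an assumption-laden, real-valued analogue rather than the quaternion-valued QRPNN of the paper: the stored patterns U are paired with a matrix V built from the pseudo-inverse, so that the inner products fed to the exponential excitation are free of cross-talk between stored patterns (V.T @ U is the identity). The excitation steepness, pattern sizes, and noise level are also assumptions.

```python
import numpy as np

def projection_recall(U, probe, alpha=10.0, max_iters=20):
    """Recurrent recall using projection-based weights (real-valued sketch only).

    U     : (n, m) matrix whose columns are linearly independent bipolar patterns
    probe : (n,) bipolar input pattern
    alpha : assumed steepness of the exponential excitation function
    """
    # Projection learning: V.T @ U == I, so each stored pattern excites only its own weight.
    V = U @ np.linalg.inv(U.T @ U)
    x = probe.astype(float)
    for _ in range(max_iters):
        weights = np.exp(alpha * (V.T @ x))   # cross-talk-free excitations
        x_new = np.sign(U @ weights)          # recombine the stored patterns and threshold
        x_new[x_new == 0] = 1.0
        if np.array_equal(x_new, x):          # fixed point reached
            return x_new
        x = x_new
    return x

# Hypothetical usage: 8 random 64-bit patterns, probe with ten flipped bits.
rng = np.random.default_rng(2)
U = rng.choice([-1.0, 1.0], size=(64, 8))
noisy = U[:, 0].copy()
noisy[:10] *= -1
print(np.array_equal(projection_recall(U, noisy), U[:, 0]))
```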

    Exponential fuzzy associative memories with application in classification

    Advisor: Marcos Eduardo Ribeiro do Valle Mesquita. Doctoral thesis, Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica.
    Abstract: Associative memories are mathematical models whose main objective is to store and recall information by association. Such models are designed to store a finite set of pairs, called the fundamental memory set, and they must exhibit a certain tolerance to noise, that is, they should be able to retrieve stored information even from an incomplete or corrupted version of a memorized item. The recurrent correlation associative memories (RCAMs), introduced by Chiueh and Goodman, present large storage capacity and excellent noise tolerance. However, RCAMs are designed to store and retrieve bipolar patterns. The generalized recurrent exponential fuzzy associative memories (GRE-FAMs) can be seen as a generalized version of RCAMs capable of storing and retrieving fuzzy sets. In this thesis, we introduce the generalized exponential bidirectional fuzzy associative memories (GEB-FAMs), an extension of GRE-FAMs to the heteroassociative case. Since GEB-FAMs are based on a similarity measure, we conducted a study of several similarity measures from the literature, including cardinality-based similarity measures and the structural similarity index (SSIM). Furthermore, we show that GEB-FAMs exhibit optimal storage capacity, and we present a characterization of the output of a single-step GEB-FAM when one of its parameters tends to infinity. In computational experiments, however, good results were obtained by a single-step GEB-FAM with parameter values in the interval [1, 10]. As the dynamics of the GEB-FAMs are still not fully understood, this fact led to a more detailed study of the single-step GEB-FAMs, referred to as fuzzy kernel associative memories (fuzzy-KAMs). We interpret this model using a fuzzy kernel and propose to adjust its parameter using the concept of entropy. We also present two approaches to pattern classification using fuzzy-KAMs. Finally, we describe the computational experiments used to evaluate the performance of these approaches in classification and face recognition problems. In most of the experiments, for both types of problems, the classifiers based on the proposed approaches performed satisfactorily and competitively with other models from the literature, which shows the versatility of these approaches.
    Doctorate in Applied Mathematics. Funding: 2015/00745-1, CAPES, FAPES.
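
    To make the single-step recall concrete, here is a minimal sketch of exponential fuzzy associative recall: similarities between the probe and the stored input fuzzy sets are passed through an exponential, normalized, and used to blend the stored output fuzzy sets. The Jaccard-style cardinality-based similarity, the value of the exponential parameter, and the toy data are assumptions, and the sketch omits the additional "generalized" weighting that distinguishes GRE-FAMs and GEB-FAMs from a plain exponential recall.

```python
import numpy as np

def fuzzy_jaccard(a, b):
    """Cardinality-based similarity between two fuzzy sets on the same universe."""
    return np.minimum(a, b).sum() / np.maximum(a, b).sum()

def single_step_exponential_fam(inputs, outputs, probe, alpha=5.0):
    """One-step exponential fuzzy associative recall (simplified sketch).

    inputs  : (p, n) array; rows are stored input fuzzy sets with memberships in [0, 1]
    outputs : (p, m) array; rows are the associated output fuzzy sets
    probe   : (n,) fuzzy set to be recognized
    alpha   : assumed exponential parameter (the thesis reports good results in [1, 10])
    """
    sims = np.array([fuzzy_jaccard(probe, a) for a in inputs])
    weights = np.exp(alpha * sims)
    weights /= weights.sum()              # convex combination of the stored outputs
    return weights @ outputs

# Hypothetical usage: three stored associations over small universes.
A = np.array([[0.9, 0.1, 0.0, 0.3],
              [0.2, 0.8, 0.7, 0.1],
              [0.0, 0.4, 0.9, 0.9]])
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])
print(single_step_exponential_fam(A, B, np.array([0.8, 0.2, 0.1, 0.3])))
```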

    A comparative study on associative memories with emphasis on morphological associative memories

    Advisor: Peter Sussner. Master's thesis, Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica.
    Abstract: Associative neural memories are models of biological phenomena that allow for the storage of pattern associations and the retrieval of the desired output pattern upon presentation of a possibly noisy or incomplete version of an input pattern. There are several models of neural associative memories in the literature; however, few works compare the various proposals. In this thesis, we present a systematic comparison of the performance of some of the most widely known models of neural associative memories. This comparison is based on the following criteria: storage capacity, distribution of the information over the synaptic weights, basin of attraction, number of spurious memories, and computational effort. The thesis places special emphasis on morphological associative memories, whose mathematical foundations lie in mathematical morphology and image algebra.
    Master's degree in Applied Mathematics.
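
    Since morphological associative memories receive special emphasis here, a minimal sketch of the min/max-plus construction they are usually associated with may help: the autoassociative memory W_XX records, for every pair of coordinates, the minimum difference observed across the stored patterns, and recall is a max-plus matrix product. The gray-scale toy data and the erosion below are illustrative assumptions.

```python
import numpy as np

def morphological_memory_W(X):
    """Autoassociative morphological memory W_XX for patterns stored as rows of X:
    w_ij = min over stored patterns of (x_i - x_j)."""
    diffs = X[:, :, None] - X[:, None, :]   # element [k, i, j] = x^k_i - x^k_j
    return diffs.min(axis=0)

def maxplus_recall(W, x):
    """Max-plus product: output_i = max_j (w_ij + x_j)."""
    return (W + x[None, :]).max(axis=1)

# Stored gray-scale patterns (rows); W_XX recalls every stored pattern exactly.
X = np.array([[3., 7., 1., 4.],
              [6., 2., 5., 0.],
              [1., 8., 3., 2.]])
W = morphological_memory_W(X)
print(np.allclose(maxplus_recall(W, X[0]), X[0]))   # True

# An eroded (component-wise decreased) input is pulled back toward the stored
# pattern but never past it; the dual memory M_XX (max in the construction,
# min-plus recall) plays the symmetric role for dilative noise.
eroded = X[0] - np.array([0., 3., 0., 1.])
print(maxplus_recall(W, eroded))
```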