
    Hierarchical Associative Memory Based on Oscillatory Neural Network

    In this thesis we explore algorithms and develop architectures based on emerging nano-device technologies for cognitive computing tasks such as recognition, classification, and vision. In particular, we focus on pattern matching in high-dimensional vector spaces to address the nearest-neighbor search problem. Recent progress in nanotechnology provides us with novel nano-devices whose special nonlinear response characteristics fit cognitive tasks better than general-purpose computing. We build an associative memory (AM) by weakly coupling nano-oscillators into an oscillatory neural network and design a hierarchical tree structure to organize groups of AM units. For hierarchical recognition, we first examine an architecture in which image patterns are partitioned into different receptive fields and processed by individual AM units at lower levels, then abstracted using sparse coding techniques for recognition at higher levels. A second, tree-structured model is developed as a more scalable AM architecture for large data sets. In this model, patterns are classified by hierarchical k-means clustering and organized into hierarchical clusters; recognition then proceeds by comparing input patterns with the centroids identified during clustering. The tree is explored in a "depth-only" manner until the closest image pattern is output. We also extend this search technique to incorporate a branch-and-bound algorithm (a sketch of both search strategies follows below). The models and corresponding algorithms are tested on two standard face recognition data sets. We show that the depth-only hierarchical model is strongly data-set dependent, achieving 97% or 67% of the recognition performance of a single large associative memory, while the branch-and-bound search increases search time by only a factor of two compared to the depth-only search.
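    The following is a minimal Python sketch of the two search strategies over a hierarchical k-means tree. All specifics here are illustrative assumptions: the thesis realizes the pattern comparisons with oscillator-based AM units, whereas this sketch uses explicit Euclidean distances, and the tree layout (per-node centroid, radius, and leaf patterns) is chosen only for clarity.

        import numpy as np

        class Node:
            """One AM unit in a hierarchical k-means tree (illustrative layout)."""
            def __init__(self, centroid, radius=0.0, children=(), patterns=()):
                self.centroid = np.asarray(centroid)  # cluster centroid at this level
                self.radius = radius                  # max distance from centroid to any member
                self.children = list(children)        # sub-clusters (empty at a leaf)
                self.patterns = list(patterns)        # stored patterns (leaves only)

        def depth_only(node, query):
            """Greedy descent: at each level, follow only the closest centroid."""
            while node.children:
                node = min(node.children,
                           key=lambda c: np.linalg.norm(query - c.centroid))
            return min(node.patterns, key=lambda p: np.linalg.norm(query - p))

        def branch_and_bound(node, query, best=None, bound=np.inf):
            """Backtracking descent: also revisit branches that could still
            contain a pattern closer than the best match found so far."""
            if not node.children:
                for p in node.patterns:
                    d = np.linalg.norm(query - p)
                    if d < bound:
                        best, bound = p, d
                return best, bound
            for child in sorted(node.children,
                                key=lambda c: np.linalg.norm(query - c.centroid)):
                # prune: by the triangle inequality, no member of this cluster
                # can be closer to the query than this lower bound
                if np.linalg.norm(query - child.centroid) - child.radius >= bound:
                    continue
                best, bound = branch_and_bound(child, query, best, bound)
            return best, bound

    Depth-only search visits a single root-to-leaf path, which explains both its speed and its data-set sensitivity (a wrong turn near the root cannot be undone); in this sketch, branch-and-bound returns the exact nearest stored pattern at the cost of some backtracking.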

    The storage of semantic memories in the cortex: a computational study

    The main object of this thesis is the design of structured distributed memories for the purpose of studying their storage and retrieval properties in large-scale cortical autoassociative networks. To this end, an autoassociative network of Potts units, coupled via tensor connections, has been proposed and analyzed as an effective model of an extensive cortical network with distinct short- and long-range synaptic connections. Recently, we have clarified in what sense it can be regarded as an effective model. While the fully connected (FC) and the very sparsely connected, that is, highly diluted (HD) limits of the model have been thoroughly analyzed, the realistic case of intermediate partial connectivity has simply been assumed to interpolate between the FC and HD cases. In this thesis, we first study the storage capacity of the Potts network with such intermediate connectivity. We corroborate the outcome of the analysis by showing that the resulting mean-field equations are consistent with the FC and HD equations in the appropriate limits. The mean-field equations are derived only for randomly diluted connectivity (RD). Through simulations, we also study symmetric dilution (SD) and state-dependent random dilution (SDRD). We find that the Potts network has a higher capacity for symmetric than for random dilution. We then turn to the core question: how to use a model originally conceived for the storage of p unrelated patterns of activity in order to study semantic memory, which is organized in terms of the relations between the facts and attributes of real-world knowledge. To proceed, we first formulate a mathematical model for generating patterns with correlations, as an extension of a hierarchical procedure for generating ultrametrically organized patterns. The model ascribes the correlations between patterns to the influence of underlying "factors": if many factors act with comparable strength, their influences balance out and correlations are low, whereas if a few factors dominate, which in the model occurs for increasing values of a control parameter ζ, correlations between memory patterns can become much stronger (a sketch of this generative procedure follows below). We show that the extension allows for correlations between patterns that are neither trivial (as in the random case) nor a plain tree (as in the ultrametric case), but that are highly sensitive to the values of the correlation parameters that we define. Next, we study the storage capacity of the Potts network when the patterns are correlated by way of our algorithm. We show that fewer correlated patterns can be stored and retrieved than random ones, and that the higher the degree of correlation, the lower the capacity. We find that the mean-field equations yielding the storage capacity differ from those obtained with uncorrelated patterns only through an additional term in the noise, proportional to the number of learned patterns p and to the difference between the average correlation among correlated patterns and that among independently generated patterns of the same sparsity. Of particular interest is the role played by the parameter we have introduced, ζ, which controls the strength of the influences of different factors (the "parents") in generating the memory patterns (the "children"). In particular, we find that for high values of ζ, such that only a handful of parents are effective, the network exhibits correlated retrieval, in which the network, though unable to retrieve the cued pattern, settles into a configuration with high overlap with another pattern.
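    Below is a minimal Python sketch of such a factor-based generative scheme. Everything specific here is an assumption made for illustration: the thesis generates multi-state Potts patterns with its own weighting of parents, while this sketch uses binary units, an exponential dominance profile exp(-ζ·π) over parents π, and a top-k threshold to fix the sparsity.

        import numpy as np

        rng = np.random.default_rng(0)

        def generate_children(n_parents=10, n_children=50, n_units=100,
                              sparsity=0.2, zeta=1.0):
            """Generate correlated 'child' patterns from random 'parent' factors.
            Parent pi is weighted by exp(-zeta * pi): for zeta near 0, many
            parents contribute comparably and correlations stay low; for large
            zeta a few parents dominate and children become strongly correlated."""
            parents = rng.random((n_parents, n_units))      # random factors
            weights = np.exp(-zeta * np.arange(n_parents))  # dominance profile
            k = int(sparsity * n_units)                     # active units per pattern
            children = []
            for _ in range(n_children):
                gains = rng.random(n_parents) * weights     # this child's mix of parents
                field = gains @ parents + 0.01 * rng.random(n_units)  # field plus small noise
                child = np.zeros(n_units, dtype=int)
                child[np.argsort(field)[-k:]] = 1           # keep the k strongest units
                children.append(child)
            return np.array(children)

        # Average pairwise overlap (number of co-active units) grows with zeta:
        for z in (0.0, 1.0, 5.0):
            C = generate_children(zeta=z)
            pairs = np.triu_indices(len(C), 1)
            print(z, (C @ C.T)[pairs].mean())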
This behavior of the network can be interpreted as reflecting the semantic structure of the correlations: even after capacity collapse, what the network can still do is recognize the strongest features associated with the pattern. This observation is quantified using the mutual information between the cued pattern and the configuration the network settles into after the retrieval dynamics (a minimal estimator is sketched below). This information is found to increase abruptly from zero to a non-zero value as the parameter ζ is increased, akin to a phase transition. Two alternative phases are thus identified, separated by a critical value ζ_c: above ζ_c, memories form clusters, such that while the specifics of the cued pattern cannot be retrieved, some of the structure informing the cluster of memories can still be retrieved. In a final short chapter, we attempt to understand the implications of having stored correlated memories for latching dynamics, the spontaneous behavior that has been proposed to be an emergent property of large cortical networks, beyond the simple cued-retrieval paradigm. Progress made in this direction, studying the Potts network, has so far focused on uncorrelated memories. Introducing correlations, we find a rich phase space of behaviors, from sequential retrieval of memories to parallel retrieval of clusters of highly correlated memories and to oscillations, depending on the various correlation parameters. The parameters of our algorithm may be found to emerge as critical control parameters, corresponding to the statistical features of human semantic memory that are most important in determining the dynamics of our trains of thought.
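As a companion to the information measure above, here is a hedged Python sketch of a plug-in estimator of the mutual information (in bits) between the unit states of the cued pattern and those of the final network configuration. The function name and the assumption that both configurations are given as equal-length vectors of discrete Potts states are illustrative; the thesis may define and compute the information differently.

    import numpy as np

    def mutual_information(cued, final):
        """Plug-in estimate of I(cued; final) in bits, treating each unit
        as one joint sample of (cued state, final state)."""
        cued, final = np.asarray(cued), np.asarray(final)
        states = sorted(set(cued) | set(final))
        mi = 0.0
        for a in states:
            for b in states:
                p_ab = np.mean((cued == a) & (final == b))  # joint probability
                p_a, p_b = np.mean(cued == a), np.mean(final == b)
                if p_ab > 0:
                    mi += p_ab * np.log2(p_ab / (p_a * p_b))
        return mi

    rng = np.random.default_rng(1)
    x = rng.integers(0, 3, 1000)                    # Potts-like units, 3 states
    print(mutual_information(x, x))                 # high: identical configurations
    print(mutual_information(x, rng.integers(0, 3, 1000)))  # near zero: unrelated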

    Functional segregation of hippocampal subdivisions in learning and memory

    The hippocampus is well known for its function in declarative memories, especially episodic memories and spatial navigation. Considering the strikingly different features along its longitudinal axis, from dorsal to ventral hippocampus, it has been proposed that hippocampal subdivisions might have distinct functional roles. Among several hypotheses, a particularly prominent one is that the dorsal hippocampus is required for cognitive functions, while the ventral hippocampus is involved in emotional learning and stress responses; their precise roles in learning and memory, however, have remained controversial. In this thesis, I further explore the idea of a functional segregation, focusing on the roles of the dorsal and ventral hippocampus in different types of declarative memories. To this end, I use chemogenetic silencing to locally interfere with subdivision function in reinforced and incidental learning, at various time points after memory acquisition and at memory retrieval. First, I compare the functions of the dorsal and ventral hippocampus in single-trial learning. Then I address their roles in the formation of associations to previously acquired memories. Moreover, applying chemogenetic silencing together with powerful, recently developed techniques for genetically targeting learning-related neuronal populations, I study the localization of single-trial and association memories within the hippocampus. I show that in all hippocampus-dependent tasks both the dorsal and the ventral hippocampus are required, but with distinct contributions and irrespective of emotional relevance. Specifically, the ventral hippocampus is involved in forming and recalling primary associations, whereas the dorsal hippocampus is particularly important during a window of about 5 h after new learning, during which it recalls memories and forms secondary associations on top of previously acquired memories. The subdivisions thereby provide a mechanism to recall previously acquired memories and to form associations to them without interference between memories, while allowing the distinct memory components to be used independently. In a supplementary part, I have started to investigate the function of the transversal hippocampal axis, in particular the dentate gyrus, in association learning. This study provides a first insight into a possible mechanism that might shape memory assemblies to form associations.