
A Neural Model of Episodic and Semantic Spatiotemporal Memory

By Gerard J. Rinkus

Abstract

A neural network model is proposed that forms sparse memory traces of spatiotemporal events given single occurrences of those events. The traces are distributed in that each individual cell and synapse participates in numerous traces. This sharing of representational substrate provides the basis for similarity-based generalization and thus semantic memory. Simulation results demonstrate that similar spatiotemporal patterns map to similar traces. The model achieves this property by measuring the degree of match, G, between the current input pattern on each time slice and the expected input given the preceding time slices (i.e., temporal context), and then adding an amount of noise, inversely proportional to G, to the process of choosing the internal representation for the current time slice. Thus, if G is small, indicating novelty, much noise is added and the resulting internal representation of the current input pattern has low overlap with any preexisting representations of time slices. If G is large, indicating a familiar event, very little noise is added, resulting in reactivation of all or most of the preexisting representation of the input pattern.
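The G-modulated noise mechanism described in the abstract can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the author's implementation: the layer sizes, the binary-overlap definition of G, and the top-k selection of the sparse code are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CELLS = 200    # cells in the internal (coding) layer (assumed size)
CODE_SIZE = 20   # number of active cells in a sparse trace (assumed)

def match_degree(actual, expected):
    """G: overlap between the actual binary input and the input
    expected from temporal context, normalized to [0, 1]."""
    denom = max(int(expected.sum()), 1)
    return float((actual & expected).sum()) / denom

def choose_representation(deterministic_scores, G):
    """Choose a sparse internal code for the current time slice.
    Noise added to cell scores is inversely related to G:
    low G (novel input)    -> much noise  -> mostly new code;
    high G (familiar input) -> little noise -> old code reactivated."""
    noise = rng.random(N_CELLS) * (1.0 - G)
    scores = deterministic_scores + noise
    # The CODE_SIZE highest-scoring cells form the active trace.
    return set(np.argsort(scores)[-CODE_SIZE:])

# Example: cells 0..19 carry the stored (expected) representation.
base = np.zeros(N_CELLS)
base[:CODE_SIZE] = 1.0

familiar = choose_representation(base, G=1.0)  # no noise: old trace wins
novel = choose_representation(base, G=0.0)     # full noise: new cells can win
```

With G = 1.0 the noise term vanishes and the previously stored cells are reactivated exactly; with G = 0.0 the uniform noise lets other cells outcompete weakly supported stored cells, so the chosen code overlaps the old trace only partially, giving the novel event a largely distinct representation.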

Topics: Neural Nets, Artificial Intelligence
Publisher: Lawrence Erlbaum Associates
Year: 2004
OAI identifier: oai:cogprints.org:3580
Sorry, we are unable to provide the full text but you may find it at the following location(s):
  • http://cogprints.org/3580/1/Co... (external link)
  • http://cogprints.org/3580/ (external link)