
    Noise facilitation in associative memories of exponential capacity

    Recent advances in associative memory design through structured pattern sets and graph-based inference algorithms have allowed reliable learning and recall of an exponential number of patterns. Although these designs correct external errors in recall, they assume neurons that compute noiselessly, in contrast to the highly variable neurons in brain regions thought to operate associatively, such as the hippocampus and olfactory cortex. Here we consider associative memories with noisy internal computations and analytically characterize their performance. As long as the internal noise level is below a specified threshold, the error probability in the recall phase can be made exceedingly small. More surprisingly, we show that internal noise actually improves the performance of the recall phase while the pattern retrieval capacity remains intact: the number of stored patterns does not decrease with noise (up to a threshold). Computational experiments lend additional support to our theoretical analysis. This work suggests a functional benefit to noisy neurons in biological neuronal networks.
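    The recall-with-internal-noise setting can be illustrated with a toy simulation. The sketch below is a minimal Hopfield-style stand-in, not the structured-pattern, graph-based design the paper analyzes; all sizes, noise levels, and update counts are illustrative assumptions.

```python
# Minimal sketch (assumed stand-in): a Hopfield-style memory with noisy
# internal computations, not the paper's structured-pattern, graph-based
# design. All sizes and noise levels are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                         # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N        # Hebbian weights
np.fill_diagonal(W, 0.0)

def recall(probe, sigma, steps=2000):
    """Asynchronous updates; each neuron adds zero-mean internal noise
    (std sigma) to its net input before thresholding."""
    s = probe.copy()
    for _ in range(steps):
        i = rng.integers(N)
        h = W[i] @ s + sigma * rng.standard_normal()
        s[i] = 1 if h >= 0 else -1
    return s

target = patterns[0]
probe = target.copy()
probe[rng.choice(N, N // 10, replace=False)] *= -1   # 10% external errors

for sigma in [0.0, 0.1, 0.3, 1.0]:
    m = np.mean([recall(probe, sigma) @ target / N for _ in range(5)])
    print(f"internal noise std {sigma:.1f}: mean recall overlap {m:.3f}")
```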

    On palimpsests in neural memory: an information theory viewpoint

    The finite capacity of neural memory and the reconsolidation phenomenon suggest it is important to be able to update stored information as in a palimpsest, where new information overwrites old information. Moreover, changing information in memory is metabolically costly. In this paper, we suggest that information-theoretic approaches may inform the fundamental limits in constructing such a memory system. In particular, we define malleable coding, which considers not only representation length but also ease of representation update, thereby encouraging some form of recycling to convert an old codeword into a new one. Malleability cost is the difficulty of synchronizing compressed versions, and malleable codes are of particular interest when representing information and modifying the representation are both expensive. We examine the tradeoff between compression efficiency and malleability cost under a malleability metric defined with respect to a string edit distance. This introduces a metric topology to the compressed domain. We characterize the exact set of achievable rates and malleability as the solution of a subgraph isomorphism problem. This is all done within the optimization approach to biology framework.
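    The core tension (compression efficiency versus ease of update) can be made concrete with a small experiment: a single-character source edit, which costs edit distance 1 in the raw domain, typically forces many changes in a conventionally compressed representation. The sketch below uses zlib as an assumed stand-in compressor; it illustrates the malleability cost only, not the paper's rate/malleability characterization via subgraph isomorphism.

```python
# Small experiment: one source edit is cheap in the raw domain but expensive
# in the compressed domain under an edit-distance metric. zlib is a stand-in
# compressor chosen for illustration.
import zlib

def edit_distance(a: bytes, b: bytes) -> int:
    """Plain Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

old = b"the quick brown fox jumps over the lazy dog " * 20
new = old.replace(b"lazy", b"hazy", 1)      # a single small source edit

print("source edit distance:    ", edit_distance(old, new))
print("compressed edit distance:", edit_distance(zlib.compress(old),
                                                 zlib.compress(new)))
print("compressed sizes:", len(zlib.compress(old)), len(zlib.compress(new)))
```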

    Contributions of synaptic filters to models of synaptically stored memory

    The question of how neural systems encode memories in one shot without immediately disrupting previously stored information has puzzled theoretical neuroscientists for years, and it is the central topic of this thesis. Previous work on this topic has proposed that synapses probabilistically update in response to plasticity-inducing stimuli to effectively delay the degradation of old memories in the face of ongoing memory storage. Indeed, experiments have shown that synapses do not immediately respond to plasticity-inducing stimuli, since these must be presented many times before synaptic plasticity is expressed. Such a delay could be due to the stochastic nature of synaptic plasticity, or perhaps because induction signals are integrated before overt strength changes occur. The latter approach has previously been applied to control fluctuations in neural development by low-pass filtering induction signals before plasticity is expressed. In this thesis we consider memory dynamics in a mathematical model with synapses that integrate plasticity induction signals to a threshold before expressing plasticity. We report novel recall dynamics and considerable improvements in memory lifetimes relative to a prominent model of synaptically stored memory. With integrating synapses the memory trace initially rises before reaching a maximum and then falls. The memory signal dissociates into separate oblivescence and reminiscence components, with reminiscence initially dominating recall. Furthermore, we find that integrating synapses possess natural timescales that can be used to consider the transition to late-phase plasticity under spaced repetition patterns known to lead to optimal storage conditions. We find that threshold-crossing statistics differentiate between massed and spaced memory repetition patterns. However, isolated integrative synapses obtain an insufficient statistical sample to detect the stimulation pattern within a few memory repetitions. We extend the model to consider the cooperation of well-known intracellular signalling pathways in detecting storage conditions by utilizing the profile of postsynaptic depolarization. We find that neuron-wide signalling and local synaptic signals can be combined to detect optimal storage conditions that lead to stable forms of plasticity in a synapse-specific manner. These models can be further extended to consider heterosynaptic and neuromodulatory interactions for late-phase plasticity.
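    The integrate-to-threshold idea is straightforward to simulate. Below is a minimal sketch, assuming binary synapses whose hidden counters accumulate ±1 induction signals and express plasticity only at a threshold theta; with theta > 1 the tracked memory trace can rise before it decays, echoing the recall dynamics described above. All parameters are illustrative assumptions, not the thesis's fitted model.

```python
# Toy sketch of an integrate-to-threshold ("filter") synapse: each binary
# synapse accumulates +/-1 plasticity induction signals in a hidden counter
# and expresses a strength change only when the counter magnitude reaches
# theta. theta=1 behaves like an unfiltered synapse.
import numpy as np

rng = np.random.default_rng(1)
N, T = 5000, 400                       # synapses, memories stored in sequence

def simulate(theta):
    w = rng.choice([-1, 1], size=N)    # binary synaptic strengths
    c = np.zeros(N, dtype=int)         # hidden induction counters
    tracked = rng.choice([-1, 1], size=N)   # strengths memory 0 tries to write
    signal = []
    for t in range(T):
        x = tracked if t == 0 else rng.choice([-1, 1], size=N)
        c += x                          # integrate the induction signal
        hit = np.abs(c) >= theta
        w[hit] = np.sign(c[hit])        # express plasticity at threshold...
        c[hit] = 0                      # ...then reset the filter
        signal.append(w @ tracked / N)  # memory 0's trace in the strengths
    return signal

for theta in [1, 5]:
    s = simulate(theta)
    print(f"theta={theta}: trace after 1/50/200 memories:",
          [round(s[i], 3) for i in (0, 49, 199)])
```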

    A General Hippocampal Computational Model Combining Episodic and Spatial Memory in a Spiking Model

    The hippocampus, in humans and rats, plays crucial roles in spatial tasks and in nonspatial tasks involving episodic-type memory. This thesis presents a novel computational model of the hippocampus (CA1, CA3 and dentate gyrus) which creates a framework where spatial memory and episodic memory are explained together. This general model follows the approach in which the memory function of the rodent hippocampus is seen as a “memory space” rather than a “spatial memory”. The innovations of the model centre on its adherence to detailed hippocampal architectural constraints and its use of spiking networks to represent all hippocampal subfields. The model does not require stable attractor states to produce a robust memory system capable of pattern separation and pattern completion. In this hippocampal theory, information is represented and processed in the form of activity patterns: instead of assuming firing-rate coding, the model assumes that information is coded in the activation of specific constellations of neurons. This coding mechanism, combined with the use of spiking neurons, raises many questions about how information is transferred, processed and stored in the different hippocampal subfields. This thesis explores which mechanisms are available in the hippocampus to achieve such control, and produces a detailed model which is biologically realistic and capable of explaining how several computational components can work together to produce the emergent functional properties of the hippocampus. In this hippocampal theory, precise explanations are given for why mossy fibres are important for storage but not recall, for the functional role of the mossy cells (excitatory interneurons) in the dentate gyrus, for why firing fields can be asymmetric with the firing peak closer to the end of the field, and for which features are used to produce “place fields”, among others. An important property of this hippocampal model is that the memory system provided by CA3 is a palimpsest memory: after saturation, the number of patterns that can be recalled is independent of the number of patterns engraved in the recurrent network. In parallel with the development of the hippocampal computational model, a simulation environment was created. This simulation environment was tailored to the needs and assumptions of the hippocampal model and represents an important component of this thesis.
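    Two of the emergent properties discussed, pattern separation and pattern completion, can be sketched without spiking neurons. The toy below is an assumption-laden simplification, not the thesis's model: a sparse random expansion stands in for dentate-gyrus-like recoding, and Hebbian autoassociation with k-winners-take-all recall stands in for CA3. Sizes and sparsity are illustrative.

```python
# Schematic sketch, assuming binary activity patterns rather than the
# thesis's spiking networks: pattern separation via a sparse random
# expansion (dentate-gyrus-like) and pattern completion via Hebbian
# autoassociation with k-winners-take-all recall (CA3-like).
import numpy as np

rng = np.random.default_rng(2)
n_ec, n_dg, k = 100, 1000, 50          # input units, DG/CA3 units, active units
proj = rng.standard_normal((n_dg, n_ec))

def separate(x):
    """Sparse recoding: 0/1 vector with the k most driven units active."""
    y = np.zeros(n_dg)
    y[np.argsort(proj @ x)[-k:]] = 1.0
    return y

def shared(a, b):
    """Fraction of shared active units."""
    return float(a @ b) / k

# Pattern separation: two nearly identical inputs get distinct sparse codes.
a = rng.choice([-1.0, 1.0], size=n_ec)
b = a.copy()
b[rng.choice(n_ec, 5, replace=False)] *= -1
print("input agreement:", np.mean(a == b))
print("DG code overlap:", shared(separate(a), separate(b)))

# Pattern completion: store sparse codes with Hebbian weights; recall a
# degraded cue with one step of k-winners-take-all dynamics.
pats = [separate(rng.choice([-1.0, 1.0], size=n_ec)) for _ in range(5)]
W = sum(np.outer(p, p) for p in pats)
cue = pats[0].copy()
cue[rng.choice(np.flatnonzero(cue), 25, replace=False)] = 0.0  # lose half
recalled = np.zeros(n_dg)
recalled[np.argsort(W @ cue)[-k:]] = 1.0
print("cue overlap:     ", shared(cue, pats[0]))
print("recalled overlap:", shared(recalled, pats[0]))
```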

    How Memory Conforms to Brain Development

    Nature exhibits countless examples of adaptive networks, whose topology evolves continually, coupled to the activity generated by its function. The brain is an illustrative example of a system in which a dynamic complex network develops by the generation and pruning of synaptic contacts between neurons while memories are acquired and consolidated. Here, we consider a recently proposed model of the developing brain to study how the mechanisms responsible for the evolution of brain structure affect, and are affected by, memory storage processes. Following recent experimental observations, we assume that the basic rules for adding and removing synapses depend on local synaptic currents at the respective neurons, in addition to global mechanisms depending on the mean connectivity. In this way a feedback loop between “form” and “function” spontaneously emerges that influences the ability of the system to optimally store and retrieve sensory information in patterns of brain activity, or memories. In particular, we report that, as a consequence of such a feedback loop, oscillations in the activity of the system among the memorized patterns can occur, depending on parameters, reminiscent of dynamical processes in the mind. Such oscillations have their origin in the destabilization of memory attractors due to the pruning dynamics, which induces a kind of structural disorder or noise in the system on a long timescale. This constantly modifies the synaptic disorder induced by the interference among the many patterns of activity memorized in the system. This intriguing oscillatory behavior is associated only with long-term synaptic mechanisms acting during network evolution; it does not depend on short-term synaptic processes, which other studies assume but which are not present in our model. Financial support came from the Spanish Ministry of Science and Technology and the Agencia Española de Investigación (AEI) under grant FIS2017-84256-P (FEDER funds), and from the Obra Social La Caixa (ID 100010434, code LCF/BQ/ES15/10360004). This study was also partially financed by the Consejería de Conocimiento, Investigación y Universidad, Junta de Andalucía, and the European Regional Development Fund (ERDF), reference SOMM17/6105/UGR.
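    The form/function feedback loop can be wired up in a few lines. The sketch below is a qualitative stand-in for the paper's model, assuming a diluted Hopfield network: synapses onto neurons with weak local currents are pruned preferentially, the same number regrow at random (keeping mean connectivity), and retrieval dynamics run in between. Whether the retrieved memory destabilizes and activity wanders among patterns, as reported above, depends on parameters; all values here are illustrative.

```python
# Qualitative sketch of the form/function loop in a diluted Hopfield network
# (an assumed simplification, not the paper's exact model).
import numpy as np

rng = np.random.default_rng(3)
N, P, PRUNE = 300, 3, 200
patterns = rng.choice([-1, 1], size=(P, N))
J = (patterns.T @ patterns) / N              # Hebbian couplings
np.fill_diagonal(J, 0.0)
mask = rng.random((N, N)) < 0.5              # diluted connectivity
np.fill_diagonal(mask, False)

s = patterns[0].copy()                       # start in memory 0's basin
for epoch in range(30):
    for _ in range(5 * N):                   # asynchronous retrieval dynamics
        i = rng.integers(N)
        s[i] = 1 if (J[i] * mask[i]) @ s >= 0 else -1
    # function -> form: prune, preferring synapses onto low-|current| neurons
    I = np.abs((J * mask) @ s)               # local current magnitudes
    on = np.argwhere(mask)
    w = I.max() - I[on[:, 0]] + 1e-9         # weaker current -> higher chance
    prune = on[rng.choice(len(on), PRUNE, replace=False, p=w / w.sum())]
    mask[prune[:, 0], prune[:, 1]] = False
    # form -> function: regrow the same number of synapses at random
    off = np.argwhere(~mask & ~np.eye(N, dtype=bool))
    grow = off[rng.choice(len(off), PRUNE, replace=False)]
    mask[grow[:, 0], grow[:, 1]] = True
    if epoch % 5 == 0:                       # overlap with each stored memory
        print(epoch, np.round(patterns @ s / N, 2))
```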