
    Dual coding with STDP in a spiking recurrent neural network model of the hippocampus.

    The firing rate of single neurons in the mammalian hippocampus has been demonstrated to encode a range of spatial and non-spatial stimuli. It has also been demonstrated that the phase of firing, with respect to the theta oscillation that dominates the hippocampal EEG during stereotyped learning behaviour, correlates with an animal's spatial location. These findings have led to the hypothesis that the hippocampus operates using a dual (rate and temporal) coding system. To investigate the phenomenon of dual coding in the hippocampus, we examine a spiking recurrent network model with theta-coded neural dynamics and an STDP rule that mediates rate-coded Hebbian learning when pre- and post-synaptic firing is stochastic. We demonstrate that this plasticity rule can generate both symmetric and asymmetric connections between neurons that fire at concurrent or successive theta phases, respectively, and subsequently produce both pattern completion and sequence prediction from partial cues. This unifies previously disparate auto- and hetero-associative network models of hippocampal function and provides them with a firmer basis in modern neurobiology. Furthermore, the encoding and reactivation of activity in mutually exciting Hebbian cell assemblies demonstrated here is believed to represent a fundamental mechanism of cognitive processing in the brain.
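    As an illustration of the kind of rule described above, the sketch below applies a standard pair-based STDP kernel to theta-locked spike trains. It is a minimal stand-in, not the paper's model, and all parameter values (theta period, learning amplitudes, time constants) are illustrative assumptions. The point is only that coincident firing on each theta cycle yields roughly equal weight changes in the two directions (a symmetric connection), whereas firing at successive theta phases yields a strong forward and a weak or negative backward change (an asymmetric connection).

```python
import numpy as np

THETA_PERIOD = 125.0               # ms, ~8 Hz theta (illustrative)
A_PLUS, A_MINUS = 0.010, 0.010     # potentiation / depression amplitudes (illustrative)
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # ms, STDP time constants (illustrative)

def stdp_dw(pre_spikes, post_spikes):
    """Net weight change from all pre/post spike pairs under a pair-based rule."""
    dw = 0.0
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            dt = t_post - t_pre
            if dt > 0:
                dw += A_PLUS * np.exp(-dt / TAU_PLUS)    # pre before post: potentiation
            elif dt < 0:
                dw -= A_MINUS * np.exp(dt / TAU_MINUS)   # post before pre: depression
    return dw

def theta_locked_train(phase_ms, n_cycles=20, jitter=2.0, seed=0):
    """One spike per theta cycle at a fixed phase, with small stochastic jitter."""
    rng = np.random.default_rng(seed)
    return np.arange(n_cycles) * THETA_PERIOD + phase_ms + rng.normal(0.0, jitter, n_cycles)

# Cells A and B fire at the same theta phase: weight changes are roughly equal
# in both directions (symmetric connection).
a = theta_locked_train(30.0, seed=1)
b = theta_locked_train(30.0, seed=2)
print("same phase:        A->B %+.3f   B->A %+.3f" % (stdp_dw(a, b), stdp_dw(b, a)))

# Cell C fires one phase step later than A: the forward weight grows while the
# backward one does not (asymmetric connection).
c = theta_locked_train(55.0, seed=3)
print("successive phases: A->C %+.3f   C->A %+.3f" % (stdp_dw(a, c), stdp_dw(c, a)))
```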

    IST Austria Thesis

    CA3 pyramidal neurons are thought to play a key role in memory storage and pattern completion through activity-dependent synaptic plasticity at CA3-CA3 recurrent excitatory synapses. To examine the induction rules of synaptic plasticity at CA3-CA3 synapses, we performed whole-cell patch-clamp recordings in acute hippocampal slices from rats (postnatal days 21-24) at room temperature. Compound excitatory postsynaptic potentials (EPSPs) were recorded by tract stimulation in stratum oriens in the presence of 10 µM gabazine. High-frequency stimulation (HFS) induced N-methyl-D-aspartate (NMDA) receptor-dependent long-term potentiation (LTP). Although LTP induced by HFS did not require postsynaptic spikes, it was blocked by Na+-channel blockers, suggesting that local active processes (e.g. dendritic spikes) may contribute to LTP induction without requiring a somatic action potential (AP). We next examined the properties of spike timing-dependent plasticity (STDP) at CA3-CA3 synapses. Unexpectedly, low-frequency pairing of EPSPs and backpropagated action potentials (bAPs) induced LTP, independent of temporal order. The STDP curve was symmetric and broad, with a half-width of ~150 ms. Consistent with these specific STDP induction properties, post-presynaptic sequences led to a supralinear summation of spine [Ca2+] transients. Furthermore, in autoassociative network models, storage and recall were substantially more robust with symmetric than with asymmetric STDP rules. In conclusion, we found associative forms of LTP at CA3-CA3 recurrent collateral synapses with distinct induction rules. LTP induced by HFS may be associated with dendritic spikes. In contrast, low-frequency pairing of pre- and postsynaptic activity induced LTP only if EPSPs and APs were temporally close. Together, these induction mechanisms of synaptic plasticity may contribute to memory storage in the CA3-CA3 microcircuit at different ranges of activity.
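    To make the symmetric-versus-asymmetric distinction concrete, the following sketch compares a broad Gaussian STDP window (half-width of roughly 150 ms, as reported above) with a classic antisymmetric exponential window, and measures how symmetric the resulting recurrent weight matrix is when a small assembly fires in sequence. It is a hedged illustration with made-up parameters, not the analysis used in the thesis.

```python
import numpy as np

def symmetric_kernel(dt, a=0.01, half_width=150.0):
    """Broad Gaussian STDP window; sigma is chosen so the full width at half
    maximum matches the given half-width (illustrative)."""
    sigma = half_width / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return a * np.exp(-dt**2 / (2.0 * sigma**2))

def asymmetric_kernel(dt, a=0.01, tau=20.0):
    """Classic antisymmetric exponential STDP window."""
    return a * np.exp(-dt / tau) if dt > 0 else -a * np.exp(dt / tau)

def weight_matrix(spike_times, kernel):
    """Pairwise weight changes when each neuron fires once at the given time."""
    n = len(spike_times)
    w = np.zeros((n, n))
    for i in range(n):            # presynaptic neuron
        for j in range(n):        # postsynaptic neuron
            if i != j:
                w[i, j] = kernel(spike_times[j] - spike_times[i])
    return w

# Five recurrently connected cells fire in sequence, 30 ms apart.
times = np.arange(5) * 30.0
asymmetry = lambda w: np.abs(w - w.T).sum() / np.abs(w).sum()
print("asymmetry index, symmetric rule:  %.2f" % asymmetry(weight_matrix(times, symmetric_kernel)))
print("asymmetry index, asymmetric rule: %.2f" % asymmetry(weight_matrix(times, asymmetric_kernel)))
```

    With the symmetric window the weight matrix is (nearly) equal to its transpose, which is what an autoassociative attractor network requires; the antisymmetric window instead produces directed, sequence-like weights.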

    Synaptic Plasticity and Hebbian Cell Assemblies

    Synaptic dynamics are critical to the function of neuronal circuits on multiple timescales. In the first part of this dissertation, I tested the roles of action potential timing and NMDA receptor composition in long-term modifications to synaptic efficacy. In a computational model I showed that the dynamics of the postsynaptic [Ca2+] time course can be used to map the timing of pre- and postsynaptic action potentials onto experimentally observed changes in synaptic strength. Using dual patch-clamp recordings from cultured hippocampal neurons, I found that NMDAR subtypes can map combinations of pre- and postsynaptic action potentials onto either long-term potentiation (LTP) or depression (LTD). LTP and LTD could even be evoked by the same stimuli, and in such cases the plasticity outcome was determined by the availability of NMDAR subtypes. The expression of LTD was increasingly presynaptic as synaptic connections became more developed. Finally, I found that spike-timing-dependent potentiability is history-dependent, with a non-linear relationship to the number of pre- and postsynaptic action potentials. After LTP induction, subsequent potentiability recovered on a timescale of minutes, and was dependent on the duration of the previous induction. While activity-dependent plasticity is putatively involved in circuit development, I found that it was not required to produce small networks capable of exhibiting rhythmic persistent activity patterns called reverberations. However, positive synaptic scaling produced by network inactivity yielded increased quantal synaptic amplitudes, connectivity, and potentiability, all favoring reverberation. These data suggest that chronic inactivity upregulates synaptic efficacy by both quantal amplification and by the addition of silent synapses, the latter of which are rapidly activated by reverberation. Reverberation in previously inactivated networks also resulted in activity-dependent outbreaks of spontaneous network activity. Applying a model of short-term synaptic dynamics to the network level, I argue that these experimental observations can be explained by the interaction between presynaptic calcium dynamics and short-term synaptic depression on multiple timescales. Together, the experiments and modeling indicate that ongoing activity, synaptic scaling and metaplasticity are required to endow networks with a level of synaptic connectivity and potentiability that supports stimulus-evoked persistent activity patterns but avoids spontaneous activity.
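    The mapping from spike-pair timing to plasticity via the postsynaptic [Ca2+] time course can be sketched with a simple calcium-control rule in the spirit of Shouval et al. (2002); the thresholds, amplitudes and time constants below are illustrative assumptions, not the dissertation's fitted values.

```python
import numpy as np

TAU_CA = 50.0                    # ms, decay of the spine Ca2+ transient (illustrative)
THETA_D, THETA_P = 0.35, 0.55    # depression and potentiation thresholds (illustrative)

def peak_calcium(dt):
    """Peak spine [Ca2+] for a pairing interval dt = t_post - t_pre (ms).
    Causal pairings (dt > 0) relieve the NMDAR Mg2+ block while glutamate is
    still bound, giving the largest transients; anti-causal pairings give a
    smaller residual signal. The shapes here are illustrative."""
    if dt >= 0:
        return 0.3 + 0.7 * np.exp(-dt / TAU_CA)
    return 0.3 + 0.25 * np.exp(dt / TAU_CA)

def outcome(dt):
    """Two calcium thresholds map the peak transient onto the plasticity outcome."""
    ca = peak_calcium(dt)
    if ca >= THETA_P:
        return "LTP"
    if ca >= THETA_D:
        return "LTD"
    return "no change"

for dt in (-100, -20, 10, 60, 200):
    print("dt = %+4d ms -> peak Ca = %.2f -> %s" % (dt, peak_calcium(dt), outcome(dt)))
```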

    Functional Brain Oscillations: How Oscillations Facilitate Information Representation and Code Memories

    The overall aim of the modelling work within this thesis is to lend theoretical evidence to empirical findings from the brain oscillations literature. We therefore hope to solidify and expand the notion that precise spike timing through oscillatory mechanisms facilitates communication, learning, information processing and information representation within the brain. The primary hypothesis of this thesis is that it can be shown computationally that neural de-synchronisations can allow information content to emerge. We do this using two neural network models, the first of which shows how differential rates of neuronal firing can indicate when a single item is being actively represented. The second model expands this notion by creating a complementary timing mechanism, thus enabling the emergence of qualitative temporal information when a pattern of items is being actively represented. The secondary hypothesis of this thesis is that it can also be shown computationally that oscillations might play a functional role in learning. Both of the models presented within this thesis propose a sparsely coded and fast-learning hippocampal region that engages in the binding of novel episodic information. The first model demonstrates how active cortical representations enable learning to occur in their hippocampal counterparts via a phase-dependent learning rule. The second model expands this notion, creating hierarchical temporal sequences to encode the relative temporal position of cortical representations. We demonstrate in both of these models how cortical brain oscillations might provide a gating function to the representation of information, whilst complementary hippocampal oscillations might provide distinct phasic reference points for learning.
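    A phase-dependent learning rule of the kind mentioned above can be sketched as a Hebbian update gated by the phase of an ongoing oscillation; the window boundaries, learning rate, and example patterns below are illustrative assumptions rather than the thesis' actual equations.

```python
import numpy as np

RATE = 0.05                         # learning rate (illustrative)
PHASE_WINDOW = (0.0, np.pi / 2.0)   # permissive phase window (illustrative)

def phase_gate(phase):
    """1 inside the permissive oscillation-phase window, 0 elsewhere."""
    lo, hi = PHASE_WINDOW
    return 1.0 if lo <= (phase % (2.0 * np.pi)) < hi else 0.0

def phase_gated_update(w, pre, post, phase):
    """Hebbian weight update that is only applied at permissive phases."""
    return w + RATE * phase_gate(phase) * np.outer(post, pre)

pre = np.zeros(20)
pre[[1, 4, 7, 12]] = 1.0     # sparse cortical input pattern (illustrative)
post = np.zeros(10)
post[[2, 5]] = 1.0           # sparse hippocampal pattern (illustrative)
w = np.zeros((10, 20))

w = phase_gated_update(w, pre, post, phase=np.pi / 4.0)   # inside the window: weights change
w = phase_gated_update(w, pre, post, phase=np.pi)         # outside the window: no change
print("synapses strengthened by the gated updates:", int((w > 0).sum()))
```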

    What is memory? The present state of the engram

    The mechanism of memory remains one of the great unsolved problems of biology. Grappling with the question more than a hundred years ago, the German zoologist Richard Semon formulated the concept of the engram, lasting connections in the brain that result from simultaneous "excitations", whose precise physical nature and consequences were out of reach of the biology of his day. Neuroscientists now have the knowledge and tools to tackle this question, however, and this Forum brings together leading contemporary views on the mechanisms of memory and what the engram means today.

    Improving Associative Memory in a Network of Spiking Neurons

    In this thesis we use computational neural network models to examine the dynamics and functionality of the CA3 region of the mammalian hippocampus. The emphasis of the project is to investigate how the dynamic control structures provided by inhibitory circuitry and cellular modification may affect the CA3 region during the recall of previously stored information. The CA3 region is commonly thought to work as a recurrent auto-associative neural network due to the neurophysiological characteristics found there, such as recurrent collaterals, strong and sparse synapses from external inputs, and plasticity between coactive cells. Associative memory models have been developed using various configurations of mathematical artificial neural networks, first developed over 40 years ago. Within these models we can store information via changes in the strength of connections between simplified two-state model neurons. These memories can be recalled when a cue (noisy or partial) is instantiated upon the net. The type of information they can store is quite limited due to restrictions caused by the simplicity of the hard-limiting nodes, which are commonly associated with a binary activation threshold. We build a much more biologically plausible model with complex spiking cell models and with realistic synaptic properties between cells. This model is based upon some of the many details we now know of the neuronal circuitry of the CA3 region. We implemented the model in computer software using Neuron and Matlab and tested it by running simulations of storage and recall in the network. By building this model we gain new insights into how different types of neurons, and the complex circuits they form, actually work. The mammalian brain consists of complex resistive-capacitive electrical circuitry which is formed by the interconnection of large numbers of neurons. A principal cell type is the pyramidal cell within the cortex, which is the main information processor in our neural networks. Pyramidal cells are surrounded by diverse populations of interneurons, which are proportionally fewer in number than the pyramidal cells and which form connections with pyramidal cells and other inhibitory cells. By building detailed computational models of recurrent neural circuitry we explore how these microcircuits of interneurons control the flow of information through pyramidal cells and regulate the efficacy of the network. We also explore the effect of cellular modification due to neuronal activity and the effect of incorporating spatially dependent connectivity on the network during recall of previously stored information. In particular we implement a spiking neural network proposed by Sommer and Wennekers (2001). We consider methods for improving associative memory recall inspired by the work of Graham and Willshaw (1995), who applied mathematical transforms to an artificial neural network to improve the recall quality within the network. The networks tested contain either 100 or 1000 pyramidal cells with 10% connectivity applied and a partial cue instantiated, and with a global pseudo-inhibition. We investigate three methods. Firstly, we apply localised disynaptic inhibition, which proportionalises the excitatory postsynaptic potentials and provides a fast-acting reversal potential that should help to reduce the variability in signal propagation between cells and provide further inhibition to help synchronise the network activity. Secondly, we add a persistent sodium channel to the cell body, which non-linearises the activation threshold: beyond a given membrane potential the amplitude of the excitatory postsynaptic potential (EPSP) is boosted, pushing cells that receive slightly more excitation (most likely high units) over the firing threshold. Finally, we implement spatial characteristics of the dendritic tree, which allows a greater probability that a modified synapse exists after 10% random connectivity has been applied throughout the network. We apply spatial characteristics by scaling the conductance weights of excitatory synapses, which simulates the loss of potential at synapses in the outer dendritic regions due to increased resistance. To further increase the biological plausibility of the network we remove the pseudo-inhibition and apply realistic basket cell models in differing configurations of a global inhibitory circuit. The networks are configured with: a single basket cell providing feedback inhibition; 10% basket cells providing feedback inhibition, where 10 pyramidal cells connect to each basket cell; and 100% basket cells providing feedback inhibition. These networks are compared and contrasted for recall quality and their effect on network behaviour. We have found promising results from applying biologically plausible recall strategies and network configurations, which suggest that inhibition and cellular dynamics play a pivotal role in learning and memory.
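    The auto-associative storage and partial-cue recall that the spiking model is built around can be sketched at the abstract level with a Willshaw-style binary network. The sketch below uses clipped Hebbian weights and a k-winners-take-all recall step as a stand-in for the inhibition-set firing threshold; network size, pattern sparsity, and cue size are chosen purely for illustration and are not the thesis' parameters.

```python
import numpy as np

rng = np.random.default_rng(42)
N, ACTIVE, N_PATTERNS = 1000, 50, 20    # network size, pattern sparsity, load (illustrative)

# Store sparse binary patterns with clipped (0/1) Hebbian recurrent weights.
patterns = np.zeros((N_PATTERNS, N), dtype=int)
for p in patterns:
    p[rng.choice(N, ACTIVE, replace=False)] = 1
W = np.clip(patterns.T @ patterns, 0, 1)
np.fill_diagonal(W, 0)

def recall(cue, k=ACTIVE):
    """One recall step: compute dendritic sums, then let the k highest-input
    units fire (a stand-in for the inhibition-set threshold in the spiking net)."""
    sums = W @ cue
    out = np.zeros(N, dtype=int)
    out[np.argsort(sums)[-k:]] = 1
    return out

# Recall pattern 0 from a partial cue containing half of its active units.
target = patterns[0]
cue = np.zeros(N, dtype=int)
cue[np.flatnonzero(target)[:ACTIVE // 2]] = 1
recalled = recall(cue)
print("fraction of the stored pattern recovered: %.2f" % ((recalled & target).sum() / ACTIVE))
```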

    Oscillations and Episodic Memory: Addressing the Synchronization/Desynchronization Conundrum

    Brain oscillations are one of the core mechanisms underlying episodic memory. However, while some studies highlight the role of synchronized oscillatory activity, others highlight the role of desynchronized activity. Here we describe a framework to resolve this conundrum and integrate these two opposing oscillatory behaviors. Specifically, we argue that synchronization and desynchronization reflect a division of labor between a hippocampal and a neocortical system, respectively. We describe a novel oscillatory framework that integrates synchronization and desynchronization mechanisms to explain how the two systems interact in the service of episodic memory.

    Memory stability and synaptic plasticity

    Numerous experiments have demonstrated that the activity of neurons can alter the strength of excitatory synapses. This synaptic plasticity is bidirectional and synapses can be strengthened (potentiation) or weakened (depression). Synaptic plasticity offers a mechanism that links the ongoing activity of the brain with persistent physical changes to its structure. For this reason it is widely believed that synaptic plasticity mediates learning and memory. The hypothesis that synapses store memories by modifying their strengths raises an important issue. There should be a balance between the necessity that synapses change frequently, allowing new memories to be stored with high fidelity, and the necessity that synapses retain previously stored information. This is the plasticity-stability dilemma. In this thesis the plasticity-stability dilemma is studied in the context of the two dominant paradigms of activity-dependent synaptic plasticity: spike-timing-dependent plasticity (STDP) and long-term potentiation and depression (LTP/D). Models of biological synapses are analysed and processes that might ameliorate the plasticity-stability dilemma are identified. Two popular existing models of STDP are compared. Through this comparison it is demonstrated that the synaptic weight dynamics of STDP have a large impact upon the retention time of correlation between the weights of a single neuron and a memory. In networks it is shown that lateral inhibition stabilises the synaptic weights and receptive fields. To analyse LTP, a novel model of LTP/D is proposed. The model centres on the distinction between early LTP/D, when synaptic modifications are persistent on a short timescale, and late LTP/D, when synaptic modifications are persistent on a long timescale. In the context of the hippocampus it is proposed that early LTP/D allows the rapid and continuous storage of short-lasting memory traces over a long-lasting trace established with late LTP/D. It is shown that this might confer a longer memory retention time than in a system with only one phase of LTP/D. Experimental predictions about the dynamics of amnesia based upon this model are proposed. Synaptic tagging is a phenomenon whereby early LTP can be converted into late LTP by subsequent induction of late LTP at a separate but nearby input. Synaptic tagging is incorporated into the LTP/D framework. Using this model it is demonstrated that synaptic tagging could lead to the conversion of a short-lasting memory trace into a longer-lasting trace. It is proposed that this allows the rescue of memory traces that were initially destined for complete decay. When combined with early and late LTP/D, synaptic tagging might allow the management of hippocampal memory traces, such that not all memories must be stored on the longest, most stable late-phase timescale. This lessens the plasticity-stability dilemma in the hippocampus, where it has been hypothesised that memory traces must be frequently and vividly formed, but that not all traces demand eventual consolidation at the systems level.
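    The argument that early LTP decays quickly, late LTP persists, and synaptic tagging can rescue an early trace by converting it into a late one can be illustrated with a toy two-timescale decay model; the time constants and the conversion time below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

TAU_EARLY, TAU_LATE = 2.0, 200.0   # decay time constants in hours (illustrative)

def trace(t, onset=0.0, tau=TAU_EARLY, amplitude=1.0):
    """Exponentially decaying synaptic trace laid down at time `onset`."""
    t = np.asarray(t, dtype=float)
    return np.where(t >= onset, amplitude * np.exp(-(t - onset) / tau), 0.0)

t = np.linspace(0.0, 24.0, 5)    # one day, sampled every six hours

early_only = trace(t, tau=TAU_EARLY)   # early LTP alone: fades within hours
late_only = trace(t, tau=TAU_LATE)     # late LTP: persists across the day

# Tagging: an early trace set at t = 0 is converted into a late-phase trace when
# a strong, late-LTP-inducing event arrives at a nearby input at t = 1 h.
t_convert = 1.0
tagged = np.where(
    t < t_convert,
    trace(t, tau=TAU_EARLY),
    trace(t, onset=t_convert, tau=TAU_LATE,
          amplitude=float(trace(t_convert, tau=TAU_EARLY))),
)

for label, row in (("early only", early_only), ("late only", late_only), ("tagged", tagged)):
    print("%-11s" % label + "  ".join("%.2f" % v for v in row))
```

    In this toy picture the tagged trace survives at close to its converted amplitude for the rest of the day, while the purely early-phase trace has effectively vanished within a few hours, which is the rescue effect described above.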