
    Contributions of synaptic filters to models of synaptically stored memory

    The question of how neural systems encode memories in one shot without immediately disrupting previously stored information has puzzled theoretical neuroscientists for years, and it is the central topic of this thesis. Previous work on this topic has proposed that synapses update probabilistically in response to plasticity-inducing stimuli, effectively delaying the degradation of old memories in the face of ongoing memory storage. Indeed, experiments have shown that synapses do not immediately respond to plasticity-inducing stimuli: these must be presented many times before synaptic plasticity is expressed. Such a delay could be due to the stochastic nature of synaptic plasticity, or perhaps because induction signals are integrated before overt strength changes occur. The latter approach has previously been applied to control fluctuations in neural development by low-pass filtering induction signals before plasticity is expressed. In this thesis we consider memory dynamics in a mathematical model with synapses that integrate plasticity induction signals to a threshold before expressing plasticity. We report novel recall dynamics and considerable improvements in memory lifetimes over a prominent model of synaptically stored memory. With integrating synapses, the memory trace initially rises before reaching a maximum and then falls. The memory signal dissociates into separate oblivescence and reminiscence components, with reminiscence initially dominating recall. Furthermore, we find that integrating synapses possess natural timescales that can be used to consider the transition to late-phase plasticity under spaced repetition patterns known to lead to optimal storage conditions. We find that threshold-crossing statistics differentiate between massed and spaced memory repetition patterns. However, isolated integrative synapses obtain an insufficient statistical sample to detect the stimulation pattern within a few memory repetitions.
We extend the model to consider the cooperation of well-known intracellular signalling pathways in detecting storage conditions by utilizing the profile of postsynaptic depolarization. We find that neuron-wide signalling and local synaptic signals can be combined to detect optimal storage conditions that lead to stable forms of plasticity in a synapse-specific manner. These models can be further extended to consider heterosynaptic and neuromodulatory interactions for late-phase plasticity.
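The core mechanism described above, integrating plasticity induction signals to a threshold before expressing a strength change, can be sketched in a few lines of Python. This is a minimal illustration under assumed conventions (signed ±1 induction signals, a symmetric threshold `theta`, a binary strength state), not the thesis's actual model:

```python
class FilterSynapse:
    """Binary-strength synapse with an integrate-and-express filter.

    Plasticity induction signals (+1 potentiating, -1 depressing) are
    accumulated in an internal counter; only when the counter reaches
    +theta (or -theta) does the synapse switch to its strong (or weak)
    state, after which the counter resets.
    """

    def __init__(self, theta=4, strong=False):
        self.theta = theta      # filter threshold (assumed symmetric)
        self.counter = 0        # integrated induction signal
        self.strong = strong    # binary strength state

    def induce(self, signal):
        """Process one induction signal, +1 or -1."""
        self.counter += signal
        if self.counter >= self.theta:
            self.strong = True
            self.counter = 0
        elif self.counter <= -self.theta:
            self.strong = False
            self.counter = 0
```

Because opposite-type signals cancel inside the filter, isolated fluctuations rarely reach threshold; overt plasticity is expressed only when a trend in the induction signals accumulates.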

    Rate and Pulse Based Plasticity Governed by Local Synaptic State Variables

    Classically, action-potential-based learning paradigms such as the Bienenstock–Cooper–Munro (BCM) rule for pulse rates or spike-timing-dependent plasticity for pulse pairings have been experimentally demonstrated to evoke long-lasting synaptic weight changes (i.e., plasticity). However, several recent experiments have shown that plasticity also depends on the local dynamics at the synapse, such as membrane voltage, calcium time course and level, or dendritic spikes. In this paper, we introduce a formulation of the BCM rule which is based on the instantaneous postsynaptic membrane potential as well as the transmission profile of the presynaptic spike. While this rule incorporates only simple local voltage and current dynamics and is thus neither directly rate based nor timing based, it can replicate a range of experiments, such as various rate and spike-pairing protocols, combinations of the two, as well as voltage-dependent plasticity. A detailed comparison of current plasticity models with respect to this range of experiments also demonstrates the efficacy of the new plasticity rule. All experiments can be replicated with a limited set of parameters, avoiding the overfitting problem of more involved plasticity rules.
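The classic rate-based BCM rule that this voltage-based formulation generalizes can be sketched as follows. The sliding threshold tracking the square of the postsynaptic rate is the textbook form; all parameter names and values are illustrative, not those of the paper:

```python
import numpy as np

def bcm_step(w, x, theta, dt=0.01, eta=0.1, tau_theta=10.0):
    """One Euler step of the rate-based BCM rule for a linear neuron.

    y = w . x                                  postsynaptic rate
    dw/dt     = eta * y * (y - theta) * x      LTD below theta, LTP above
    dtheta/dt = (y**2 - theta) / tau_theta     sliding modification threshold
    """
    y = float(np.dot(w, x))
    w = w + dt * eta * y * (y - theta) * x
    theta = theta + dt * (y ** 2 - theta) / tau_theta
    return w, theta, y
```

Because the threshold slides with recent activity, sustained high rates raise it and eventually convert potentiation into depression, which stabilizes the weights.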

    Information processing in biological complex systems: a view to bacterial and neural complexity

    This thesis is a study of information processing in biological complex systems seen from the perspective of dynamical complexity (the degree of statistical independence of a system as a whole with respect to its components due to its causal structure). In particular, we investigate the influence of signaling functions in cell-to-cell communication in bacterial and neural systems. For each case, we determine the spatial and causal dependencies in the system dynamics from an information-theoretic point of view and relate them to the systems' physiological capabilities. The main research content is presented in three chapters. First, we study previous theoretical work on synchronization, multi-stability, and clustering of a population of coupled synthetic genetic oscillators via quorum sensing. We provide an extensive numerical analysis of the spatio-temporal interactions and determine conditions in which the causal structure of the system leads to high dynamical complexity in terms of associated metrics. Our results indicate that this complexity is maximally receptive at transitions between dynamical regimes and maximized for transient multi-cluster oscillations associated with chaotic behaviour. Next, we introduce a model of a neuron-astrocyte network with bidirectional coupling using glutamate-induced calcium signaling. This study is focused on the impact of astrocyte-mediated potentiation on synaptic transmission. Our findings suggest that the information generated by the joint activity of the population of neurons is irreducible to its independent contributions due to the role of astrocytes. We relate these results to the shared information modulated by the spike synchronization imposed by the bidirectional feedback between neurons and astrocytes. It is shown that the dynamical complexity is maximized when there is a balance between spike correlation and spontaneous spiking activity.
Finally, the previous observations on neuron-glial signaling are extended to a large-scale system with community structure. Here we use a multi-scale approach to account for spatiotemporal features of astrocytic signaling coupled with clusters of neurons. We investigate the interplay of astrocytes and spike-timing-dependent plasticity at local and global scales in the emergence of complexity and neuronal synchronization. We demonstrate the utility of astrocytes and learning in improving the encoding of external stimuli, as well as their ability to favour the integration of information at synaptic timescales so that the system exhibits a high intrinsic causal structure. Our proposed approach and observations point to potential effects of astrocytes in sustaining more complex information processing in the neural circuitry.
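One common operationalization of statistical (in)dependence between a system and its components in this sense is the multi-information, or total correlation, estimated here from discrete joint samples. The thesis uses its own associated complexity metrics, so this is only an illustrative sketch:

```python
import math
from collections import Counter

def entropy(samples):
    """Empirical Shannon entropy (bits) of a sequence of hashable samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def multi_information(states):
    """Total correlation in bits: the sum of the marginal entropies minus
    the joint entropy. It is zero iff the components are statistically
    independent, and grows with shared structure across components.

    states: list of equal-length tuples, one tuple per time sample.
    """
    k = len(states[0])
    marginals = sum(entropy([s[i] for s in states]) for i in range(k))
    joint = entropy(states)
    return marginals - joint
```

For two independent fair binary components the total correlation is 0 bits; for two perfectly correlated ones it is 1 bit.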

    Variations on the theme of synaptic filtering: a comparison of integrate-and-express models of synaptic plasticity for memory lifetimes

    Integrate-and-express models of synaptic plasticity propose that synapses integrate plasticity induction signals before expressing synaptic plasticity. By discerning trends in their induction signals, synapses can control destabilizing fluctuations in synaptic strength. In a feedforward perceptron framework with binary-strength synapses for associative memory storage, we have previously shown that such a filter-based model outperforms other, nonintegrative, “cascade”-type models of memory storage in most regions of biologically relevant parameter space. Here, we consider some natural extensions of our earlier filter model, including one specifically tailored to binary-strength synapses and one that demands a fixed, consecutive number of same-type induction signals rather than merely an excess before expressing synaptic plasticity. With these extensions, we show that filter-based models outperform nonintegrative models in all regions of biologically relevant parameter space except for a small sliver in which all models encode memories only weakly. In this sliver, which model is superior depends on the metric used to gauge memory lifetimes (whether a signal-to-noise ratio or a mean first passage time). After comparing and contrasting these various filter models, we discuss the multiple mechanisms and timescales that underlie both synaptic plasticity and memory phenomena and suggest that multiple, different filtering mechanisms may operate at single synapses.
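The two filter variants contrasted above (one expressing plasticity on an excess of one induction type over the other, one demanding a fixed consecutive run of same-type signals) can be sketched side by side. This is a toy illustration with assumed ±1 induction signals, not the paper's exact formulation:

```python
def express_excess(signals, theta):
    """Express plasticity when the running excess of one induction type
    over the other reaches +/- theta, then reset the counter.
    Returns the expressed events (+1 potentiation, -1 depression)."""
    events, c = [], 0
    for s in signals:
        c += s
        if abs(c) >= theta:
            events.append(+1 if c > 0 else -1)
            c = 0
    return events

def express_run(signals, theta):
    """Variant demanding theta *consecutive* same-type signals;
    any opposite-type signal resets the run."""
    events, run, last = [], 0, 0
    for s in signals:
        run = run + 1 if s == last else 1
        last = s
        if run >= theta:
            events.append(s)
            run, last = 0, 0    # a new run must build from scratch
    return events
```

The run-based variant is stricter: a signal sequence whose excess drifts to threshold still expresses nothing if it never contains a long enough uninterrupted run.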

    A Novel Learning Rule for Long-Term Plasticity of Short-Term Synaptic Plasticity Enhances Temporal Processing

    It is well established that short-term synaptic plasticity (STP) of neocortical synapses is itself plastic – e.g., the induction of LTP and LTD tends to shift STP towards short-term depression and facilitation, respectively. What has not been addressed theoretically or experimentally is whether STP is “learned”; that is, is STP regulated by specific learning rules that are in place to optimize the computations performed at synapses, or are changes in STP essentially an epiphenomenon of long-term plasticity? Here we propose that STP is governed by specific learning rules that operate independently of, and in parallel with, the associative learning rules governing baseline synaptic strength. We describe a learning rule for STP and, using simulations, demonstrate that it significantly enhances the discrimination of spatiotemporal stimuli. Additionally, we generate a set of experimental predictions aimed at testing our hypothesis.
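A standard starting point for modeling STP itself is the Tsodyks–Markram description of resource depletion and facilitation. The sketch below implements that textbook model, not the paper's learning rule (which would additionally adjust parameters such as the utilization `U` and the time constants); all values are illustrative:

```python
import math

def tm_responses(spike_times, U=0.5, tau_rec=0.8, tau_fac=0.3):
    """Synaptic efficacy of each spike in a train under the
    Tsodyks-Markram short-term plasticity model.

    x : fraction of available resources (depression variable)
    u : utilization of resources       (facilitation variable)
    """
    x, u, t_prev = 1.0, U, None
    out = []
    for t in spike_times:
        if t_prev is not None:
            dt = t - t_prev
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_rec)  # resources recover
            u = U + (u - U) * math.exp(-dt / tau_fac)      # facilitation decays
        u = u + U * (1.0 - u)   # each spike increments utilization
        out.append(u * x)       # effective efficacy of this spike
        x = x * (1.0 - u)       # resources consumed by release
        t_prev = t
    return out
```

With these parameters, closely spaced spikes are depressed (resources cannot recover), while widely spaced spikes recover their full efficacy; shifting `U` and the time constants moves a synapse along the depression-facilitation continuum, which is the kind of change the proposed learning rule would control.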

    The enhanced rise and delayed fall of memory in a model of synaptic integration: extension to discrete state synapses

    Integrate-and-express models of synaptic plasticity propose that synapses may act as low-pass filters, integrating synaptic plasticity induction signals in order to discern trends before expressing synaptic plasticity. We have previously shown that synaptic filtering strongly controls destabilizing fluctuations in developmental models. When applied to palimpsest memory systems that learn new memories by forgetting old ones, we have also shown that with binary-strength synapses, integrative synapses lead to an initial memory signal rise before its fall back to equilibrium. Such an initial rise is in dramatic contrast to nonintegrative synapses, in which the memory signal falls monotonically. We now extend our earlier analysis of palimpsest memories with synaptic filters to consider the more general case of discrete state, multilevel synapses. We derive exact results for the memory signal dynamics and then consider various simplifying approximations. We show that multilevel synapses enhance the initial rise in the memory signal and then delay its subsequent fall by inducing a plateau-like region in the memory signal. Such dynamics significantly increase memory lifetimes, defined by a signal-to-noise ratio (SNR). We derive expressions for optimal choices of synaptic parameters (filter size, number of strength states, number of synapses) that maximize SNR memory lifetimes. However, we find that with memory lifetimes defined via mean first-passage times, such optimality conditions do not exist, suggesting that optimality may be an artifact of SNRs.
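For intuition about SNR-defined memory lifetimes, consider the simplest toy palimpsest in which the memory signal decays exponentially and the initial SNR scales as the square root of the number of (independent, binary) synapses; the lifetime is then the time at which the SNR first crosses 1. This is an illustrative simplification, not the filtered multistate model analyzed in the paper:

```python
import math

def snr_lifetime(n_syn, decay_rate, snr0=None):
    """SNR-defined memory lifetime for an exponentially decaying trace.

    SNR(t) = snr0 * exp(-decay_rate * t); the lifetime is the time at
    which SNR(t) = 1. By default snr0 = sqrt(n_syn), the scaling for
    n_syn independent binary synapses.
    """
    if snr0 is None:
        snr0 = math.sqrt(n_syn)
    if snr0 <= 1.0:
        return 0.0              # trace never rises above the noise floor
    return math.log(snr0) / decay_rate
```

In this toy, lifetimes grow only logarithmically in the number of synapses, which is why mechanisms such as filtering and multilevel states, which slow the effective decay rate, matter far more than simply adding synapses.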

    Functional Implications of Synaptic Spike Timing Dependent Plasticity and Anti-Hebbian Membrane Potential Dependent Plasticity

    A central hypothesis of neuroscience is that changes in the strength of synaptic connections between neurons are the basis for learning in the animal brain. However, the rules underlying activity-dependent change, as well as their functional consequences, are not well understood. This thesis develops and investigates several different quantitative models of synaptic plasticity. In the first part, the Contribution Dynamics model of Spike Timing Dependent Plasticity (STDP) is presented. It is shown to provide a better fit to experimental data than previous models. Additionally, investigation of the response properties of the model synapse to oscillatory neuronal activity shows that synapses are sensitive to theta oscillations (4-10 Hz), which are known to boost learning in behavioral experiments. In the second part, a novel Membrane Potential Dependent Plasticity (MPDP) rule is developed, which can be used to train neurons to fire precisely timed output activity. Previously, this could only be achieved with artificial supervised learning rules, whereas MPDP is a local, activity-dependent mechanism that is supported by experimental results.
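The baseline that models such as Contribution Dynamics refine is the classic pair-based exponential STDP window; a minimal sketch, with illustrative (not fitted) parameters:

```python
import math

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012,
            tau_plus=0.017, tau_minus=0.034):
    """Weight change for one pre/post spike pair under the classic
    pair-based exponential STDP window.

    delta_t = t_post - t_pre (seconds):
    positive -> pre-before-post -> potentiation (LTP),
    negative -> post-before-pre -> depression (LTD).
    """
    if delta_t >= 0:
        return a_plus * math.exp(-delta_t / tau_plus)
    return -a_minus * math.exp(delta_t / tau_minus)
```

Pair-based windows of this form ignore higher-order spike interactions and firing-rate effects, which is exactly the gap that state-variable models like Contribution Dynamics aim to close.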