
    Synaptic Plasticity and Hebbian Cell Assemblies

    Synaptic dynamics are critical to the function of neuronal circuits on multiple timescales. In the first part of this dissertation, I tested the roles of action potential timing and NMDA receptor composition in long-term modifications to synaptic efficacy. In a computational model I showed that the dynamics of the postsynaptic [Ca2+] time course can be used to map the timing of pre- and postsynaptic action potentials onto experimentally observed changes in synaptic strength. Using dual patch-clamp recordings from cultured hippocampal neurons, I found that NMDAR subtypes can map combinations of pre- and postsynaptic action potentials onto either long-term potentiation (LTP) or depression (LTD). LTP and LTD could even be evoked by the same stimuli, and in such cases the plasticity outcome was determined by the availability of NMDAR subtypes. The expression of LTD was increasingly presynaptic as synaptic connections became more developed. Finally, I found that spike-timing-dependent potentiability is history-dependent, with a non-linear relationship to the number of pre- and postsynaptic action potentials. After LTP induction, subsequent potentiability recovered on a timescale of minutes, and was dependent on the duration of the previous induction. While activity-dependent plasticity is putatively involved in circuit development, I found that it was not required to produce small networks capable of exhibiting rhythmic persistent activity patterns called reverberations. However, positive synaptic scaling produced by network inactivity yielded increased quantal synaptic amplitudes, connectivity, and potentiability, all favoring reverberation. These data suggest that chronic inactivity upregulates synaptic efficacy by both quantal amplification and by the addition of silent synapses, the latter of which are rapidly activated by reverberation. 
Reverberation in previously inactivated networks also resulted in activity-dependent outbreaks of spontaneous network activity. Applying a model of short-term synaptic dynamics to the network level, I argue that these experimental observations can be explained by the interaction between presynaptic calcium dynamics and short-term synaptic depression on multiple timescales. Together, the experiments and modeling indicate that ongoing activity, synaptic scaling and metaplasticity are required to endow networks with a level of synaptic connectivity and potentiability that supports stimulus-evoked persistent activity patterns but avoids spontaneous activity.
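The calcium-based mapping described in the abstract can be caricatured with a minimal calcium-control rule: each pre- and postsynaptic spike triggers a decaying postsynaptic [Ca2+] transient, and the peak calcium level selects LTD or LTP. All amplitudes, time constants, and thresholds below are illustrative placeholders, not the dissertation's fitted values.

```python
import numpy as np

def calcium_trace(pre_times, post_times, t, tau_pre=20.0, tau_post=30.0,
                  c_pre=0.4, c_post=1.0):
    """Postsynaptic [Ca2+]: one decaying exponential transient per spike
    (amplitudes and time constants are hypothetical)."""
    ca = np.zeros_like(t)
    for s in pre_times:
        ca += c_pre * np.exp(-(t - s) / tau_pre) * (t >= s)
    for s in post_times:
        ca += c_post * np.exp(-(t - s) / tau_post) * (t >= s)
    return ca

def plasticity_sign(ca, theta_d=0.7, theta_p=1.2):
    """Calcium-control rule: intermediate peak [Ca2+] -> LTD (-1),
    high peak [Ca2+] -> LTP (+1), low peak -> no change (0)."""
    peak = ca.max()
    if peak >= theta_p:
        return +1
    if peak >= theta_d:
        return -1
    return 0

t = np.arange(0.0, 200.0, 0.1)  # time axis in ms
# pre-before-post (+10 ms) peaks higher than post-before-pre (-10 ms),
# because the large post transient rides on the residual pre transient
outcome_post_pre = plasticity_sign(calcium_trace([60.0], [50.0], t))
outcome_pre_post = plasticity_sign(calcium_trace([50.0], [60.0], t))
```

With these toy thresholds, reversing the spike order flips the outcome from LTP to LTD, mirroring the timing dependence measured in the recordings.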

    Neural networks with dynamical synapses: from mixed-mode oscillations and spindles to chaos

    Understanding short-term synaptic depression (STSD) and other forms of synaptic plasticity is a topical problem in neuroscience. Here we study the role of STSD in the formation of complex patterns of brain rhythms. We use a cortical circuit model of neural networks composed of irregular spiking excitatory and inhibitory neurons having type 1 and 2 excitability and stochastic dynamics. In the model, neurons form a sparsely connected network and their spontaneous activity is driven by random spikes representing synaptic noise. Using simulations and analytical calculations, we found that if STSD is absent, the neural network shows either asynchronous behavior or regular network oscillations, depending on the noise level. In networks with STSD, changing the parameters of synaptic plasticity and the noise level, we observed transitions to complex patterns of collective activity: mixed-mode and spindle oscillations, bursts of collective activity, and chaotic behavior. Interestingly, these patterns are stable in a certain range of the parameters and separated by critical boundaries. Thus, the parameters of synaptic plasticity can play the role of control parameters, or switches, between different network states. However, changes of these parameters caused by a disease may lead to dramatic impairment of ongoing neural activity. We analyze the chaotic neural activity by use of the 0-1 test for chaos (Gottwald, G. & Melbourne, I., 2004) and show that it has a collective nature. Comment: 7 pages, Proceedings of 12th Granada Seminar, September 17-21, 201
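As a minimal illustration of the STSD mechanism the study builds on, here is a standard Tsodyks-Markram-style depressing synapse. This is a generic textbook sketch, not the paper's full stochastic network model, and the parameter values are arbitrary.

```python
import math

def stsd_release(spike_times, u=0.5, tau_rec=800.0):
    """Fraction of synaptic resources released at each presynaptic spike.
    Between spikes the resource pool x recovers toward 1 with time constant
    tau_rec (ms); each spike releases a fraction u of the available pool."""
    x, last, released = 1.0, None, []
    for t in spike_times:
        if last is not None:
            x = 1.0 - (1.0 - x) * math.exp(-(t - last) / tau_rec)  # recovery
        r = u * x           # released fraction ~ synaptic efficacy
        x -= r              # depletion
        released.append(r)
        last = t
    return released

# a regular 20 Hz train: successive responses shrink (depression)
rel = stsd_release([0.0, 50.0, 100.0, 150.0, 200.0])
```

During sustained firing the released fraction decays toward a rate-dependent steady state; it is this activity-dependent weakening that, embedded in a recurrent network, gates the transitions between the collective states described above.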

    Short-Term Plasticity at the Schaffer Collateral: A New Model with Implications for Hippocampal Processing

    A new mathematical model of short-term synaptic plasticity (STP) at the Schaffer collateral is introduced. Like other models of STP, the new model relates short-term synaptic plasticity to an interaction between facilitative and depressive dynamic influences. Unlike previous models, the new model successfully simulates facilitative and depressive dynamics within the framework of the synaptic vesicle cycle. The novelty of the model lies in the description of a competitive interaction between calcium-sensitive proteins for binding sites on the vesicle release machinery. By attributing specific molecular causes to observable presynaptic effects, the new model of STP can predict the effects of specific alterations to the presynaptic neurotransmitter release mechanism. This understanding will guide further experiments into presynaptic functionality, and may contribute insights into the development of pharmaceuticals that target illnesses manifesting aberrant synaptic dynamics, such as Fragile-X syndrome and schizophrenia. The new model of STP will also add realism to brain circuit models that simulate cognitive processes such as attention and memory. The hippocampal processing loop is an example of a brain circuit involved in memory formation. The hippocampus filters and organizes large amounts of spatio-temporal data in real time according to contextual significance. The role of synaptic dynamics in the hippocampal system is speculated to help keep the system close to a region of instability that increases encoding capacity and discriminating capability. In particular, synaptic dynamics at the Schaffer collateral are proposed to coordinate the output of the highly dynamic CA3 region of the hippocampus with the phase-code in the CA1 that modulates communication between the hippocampus and the neocortex.
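The interaction between facilitative and depressive influences can be illustrated with the classic Tsodyks-Pawelzik-Markram phenomenological model, in which a utilization variable u facilitates and a resource variable x depresses. Note that this is the standard reference model against which vesicle-cycle models are usually compared, not the competitive-binding model the abstract introduces, and the parameters are invented for illustration.

```python
import math

def stp_response(spike_times, U=0.15, tau_f=600.0, tau_d=150.0):
    """Relative PSC amplitude per spike under combined facilitation and
    depression (Tsodyks-Pawelzik-Markram sketch, illustrative parameters)."""
    u, x, last, out = U, 1.0, None, []
    for t in spike_times:
        if last is not None:
            dt = t - last
            u = U + (u - U) * math.exp(-dt / tau_f)      # facilitation decays
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_d)  # resources recover
        u += U * (1.0 - u)      # facilitation jump at each spike
        out.append(u * x)       # response amplitude
        x -= u * x              # resource depletion
        last = t
    return out

# paired-pulse facilitation at 20 Hz: the second response exceeds the first
amps = stp_response([0.0, 50.0, 100.0])
```

With a low baseline release probability U, facilitation dominates early in a train, as observed at the Schaffer collateral, before depletion of the resource pool takes over.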

    A neuromorphic approach to auditory pattern recognition in cricket phonotaxis

    Rost T, Ramachandran H, Nawrot MP, Chicca E. A neuromorphic approach to auditory pattern recognition in cricket phonotaxis. In: 2013 European Conference on Circuit Theory and Design (ECCTD). IEEE; 2013: 1-4.
    Developing neuromorphic computing paradigms that mimic nervous system function is an emerging field of research with high potential for technical applications. In the present study we take inspiration from the cricket auditory system and propose a biologically plausible neural network architecture that can explain how acoustic pattern recognition is achieved in the cricket central brain. Our circuit model combines two key features of neural processing dynamics: Spike Frequency Adaptation (SFA) and short-term synaptic plasticity. We developed and extensively tested the model function in software simulations. Furthermore, the feasibility of an analogue VLSI implementation is demonstrated using a multi-neuron chip comprising Integrate-and-Fire (IF) neurons and adaptive synapses.
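Spike frequency adaptation, one of the two key mechanisms named above, can be sketched with a leaky integrate-and-fire neuron carrying an adaptation variable that grows with each spike. Units and parameter values are arbitrary illustrative choices, not those of the cricket model.

```python
def adaptive_lif(i_ext=1.5, t_max=500.0, dt=0.1, tau_m=20.0,
                 tau_a=100.0, b=0.2, v_th=1.0):
    """LIF neuron with spike-frequency adaptation: each spike increments an
    adaptation variable a, which is subtracted from the input drive, so
    inter-spike intervals lengthen under constant stimulation."""
    v, a, t, spikes = 0.0, 0.0, 0.0, []
    while t < t_max:
        v += dt * (-v + i_ext - a) / tau_m   # membrane integration
        a += dt * (-a) / tau_a               # adaptation decays slowly
        if v >= v_th:
            spikes.append(t)                 # record spike time (ms)
            v = 0.0                          # reset membrane
            a += b                           # increment adaptation
        t += dt
    return spikes

spikes = adaptive_lif()
isis = [t2 - t1 for t1, t2 in zip(spikes, spikes[1:])]
```

Under constant drive the inter-spike intervals grow toward a steady state; in the phonotaxis circuit this high-pass property helps the network respond preferentially to the onset structure of calling-song syllables.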

    Emulating long-term synaptic dynamics with memristive devices

    The potential of memristive devices is often seen in implementing neuromorphic architectures for achieving brain-like computation. However, unlike CMOS technology, the fabrication procedures do not allow for extended manipulation of the material; instead, the properties of the memristive material should be harnessed in the context of such computation, under the view that biological synapses are memristors. Here we demonstrate that single solid-state TiO2 memristors can exhibit associative plasticity phenomena observed in biological cortical synapses, which are captured by a phenomenological plasticity model called the triplet rule. This rule comprises a spike-timing-dependent plasticity regime and a classical Hebbian associative regime, and is compatible with a large body of electrophysiology data. Via a set of experiments with our artificial, memristive, synapses we show that, contrary to conventional uses of solid-state memory, the co-existence of field- and thermally-driven switching mechanisms that can render bipolar and/or unipolar programming modes is a salient feature for capturing long-term potentiation and depression synaptic dynamics. We further demonstrate that the non-linear accumulating nature of memristors promotes long-term potentiating or depressing memory transitions.
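The triplet rule mentioned above can be written down compactly. The following event-driven sketch uses the standard trace formulation (pre trace r1, fast and slow post traces o1 and o2); the amplitudes and time constants are chosen for illustration and are not the values fitted to the memristor data.

```python
import math

def triplet_stdp(pre, post, A2p=5e-3, A3p=6e-3, A2m=7e-3,
                 tau_p=16.8, tau_m=33.7, tau_y=114.0):
    """Triplet STDP: a pre spike depresses the weight in proportion to the
    fast post trace o1; a post spike potentiates in proportion to the pre
    trace r1, boosted by the slow post trace o2 (the triplet term)."""
    events = sorted([(t, 0) for t in pre] + [(t, 1) for t in post])
    r1 = o1 = o2 = 0.0
    t_last, dw = 0.0, 0.0
    for t, is_post in events:
        dt = t - t_last
        r1 *= math.exp(-dt / tau_p)   # decay all traces to current time
        o1 *= math.exp(-dt / tau_m)
        o2 *= math.exp(-dt / tau_y)
        if is_post:
            dw += r1 * (A2p + A3p * o2)  # o2 taken before its own update
            o1 += 1.0
            o2 += 1.0
        else:
            dw -= A2m * o1
            r1 += 1.0
        t_last = t
    return dw

dw_ltp = triplet_stdp(pre=[0.0], post=[10.0])   # pre-before-post pairing
dw_ltd = triplet_stdp(pre=[10.0], post=[0.0])   # post-before-pre pairing
```

For isolated pairs the rule reduces to classical STDP (timing-dependent sign); the o2-dependent term only engages for spike triplets and higher-frequency trains, which is the regime the memristive devices are shown to reproduce.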

    Learning to Discriminate Through Long-Term Changes of Dynamical Synaptic Transmission

    Short-term synaptic plasticity is modulated by long-term synaptic changes. There is, however, no general agreement on the computational role of this interaction. Here, we derive a learning rule for the release probability and the maximal synaptic conductance in a circuit model with combined recurrent and feedforward connections that allows learning to discriminate among natural inputs. Short-term synaptic plasticity thereby provides a nonlinear expansion of the input space of a linear classifier, whereas the random recurrent network serves to decorrelate the expanded input space. Computer simulations reveal that the twofold increase in the number of input dimensions through short-term synaptic plasticity improves the performance of a standard perceptron by up to 100%. The distributions of release probabilities and maximal synaptic conductances at the capacity limit strongly depend on the balance between excitation and inhibition. The model also suggests a new computational interpretation of spikes evoked by stimuli outside the classical receptive field. These neuronal activities may reflect decorrelation of the expanded stimulus space by intracortical synaptic connections.
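The idea of short-term plasticity as a nonlinear input expansion can be demonstrated with a toy perceptron: each input rate r is augmented with the saturating steady-state drive of a depressing synapse, r / (1 + u*tau*r). The task and parameters below are invented for illustration; the classes sit interleaved on the line x + y = 10 and so are not linearly separable from the raw rates alone, but the concave extra coordinates make them separable.

```python
import numpy as np

def expand(rates, u=0.5, tau=0.3):
    """Append the steady-state drive of a depressing synapse (a concave,
    saturating function of rate), doubling the input dimensionality."""
    r = np.asarray(rates, dtype=float)
    return np.concatenate([r, r / (1.0 + u * tau * r)])

def train_perceptron(X, y, lr=0.1, max_epochs=20000):
    """Classic perceptron; stops after the first mistake-free epoch."""
    Xb = np.array([np.r_[x, 1.0] for x in X])  # append bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:
                w += lr * yi * xi
                mistakes += 1
        if mistakes == 0:
            break
    return w

# unbalanced vs. balanced rate pairs, all with the same total rate
X = [(1.0, 9.0), (9.0, 1.0), (5.0, 5.0), (4.0, 6.0)]
y = [1, 1, -1, -1]
w = train_perceptron([expand(x) for x in X], y)
preds = [1 if w @ np.r_[expand(x), 1.0] > 0 else -1 for x in X]
```

Because the expansion is concave, balanced patterns yield a larger summed dynamic drive than unbalanced ones (Jensen's inequality), giving the linear readout a usable new axis; this is the one-synapse analogue of the twofold dimension increase reported above.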

    Network Plasticity as Bayesian Inference

    General results from statistical learning theory suggest that not only brain computations, but also brain plasticity, can be understood as probabilistic inference; a concrete model for this, however, has been missing. We propose that inherently stochastic features of synaptic plasticity and spine motility enable cortical networks of neurons to carry out probabilistic inference by sampling from a posterior distribution of network configurations. This model provides a viable alternative to existing models that propose convergence of parameters to maximum likelihood values. It explains how priors on weight distributions and connection probabilities can be merged optimally with learned experience, how cortical networks can generalize learned information so well to novel experiences, and how they can compensate continuously for unforeseen disturbances of the network. The resulting new theory of network plasticity explains, from a functional perspective, a number of experimental data on stochastic aspects of synaptic plasticity that previously appeared quite puzzling. Comment: 33 pages, 5 figures, the supplement is available on the author's web page http://www.igi.tugraz.at/kappe
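The sampling view can be illustrated with its simplest continuous-time analogue: parameter dynamics that combine drift along the gradient of the log-posterior with white noise (Langevin dynamics) have the posterior itself as their stationary distribution. The example below samples a standard normal "posterior" over a single weight; it is a one-dimensional caricature of the network-level theory, not the paper's model.

```python
import math
import random

def langevin_samples(grad_log_p, theta0=0.0, dt=0.01, n=200000, seed=0):
    """Euler discretization of d(theta) = grad_log_p(theta) dt + sqrt(2) dW.
    The stationary distribution of the chain is proportional to p(theta)."""
    rng = random.Random(seed)
    theta, out = theta0, []
    for _ in range(n):
        theta += grad_log_p(theta) * dt + math.sqrt(2.0 * dt) * rng.gauss(0.0, 1.0)
        out.append(theta)
    return out

# target posterior: standard normal, so grad log p(theta) = -theta
samples = langevin_samples(lambda th: -th)
mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples) - mean ** 2
```

The empirical mean and variance of the trajectory approach those of the target distribution: the noise is not a nuisance to be averaged away but the mechanism by which the parameter explores the posterior, which is exactly the role the theory assigns to stochastic spine dynamics.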

    Memory and information processing in neuromorphic systems

    A striking difference between brain-inspired neuromorphic processors and current von Neumann processor architectures is the way in which memory and processing are organized. As Information and Communication Technologies continue to address the need for increased computational power through increasing the number of cores within a digital processor, neuromorphic engineers and scientists can complement this approach by building processor architectures in which memory is distributed with the processing. In this paper we present a survey of brain-inspired processor architectures that support models of cortical networks and deep neural networks. These architectures range from serial clocked implementations of multi-neuron systems to massively parallel asynchronous ones, and from purely digital systems to mixed analog/digital systems that implement more biologically realistic models of neurons and synapses, together with a suite of adaptation and learning mechanisms analogous to those found in biological nervous systems. We describe the advantages of the different approaches being pursued and present the challenges that need to be addressed for building artificial neural processing systems that can display the richness of behaviors seen in biological systems. Comment: Submitted to Proceedings of IEEE, review of recently proposed neuromorphic computing platforms and system