
    Nonlinear Hebbian learning as a unifying principle in receptive field formation

    The development of sensory receptive fields has been modeled in the past by a variety of models, including normative models such as sparse coding or independent component analysis, and bottom-up models such as spike-timing-dependent plasticity or the Bienenstock-Cooper-Munro model of synaptic plasticity. Here we show that the above variety of approaches can all be unified into a single common principle, namely Nonlinear Hebbian Learning. When Nonlinear Hebbian Learning is applied to natural images, receptive field shapes are strongly constrained by the input statistics and preprocessing, but exhibit only modest variation across different choices of nonlinearities in neuron models or synaptic plasticity rules. Neither overcompleteness nor sparse network activity is necessary for the development of localized receptive fields. The analysis of alternative sensory modalities, such as auditory models or V2 development, leads to the same conclusions. In all examples, receptive fields can be predicted a priori by reformulating an abstract model as nonlinear Hebbian learning. Thus, nonlinear Hebbian learning and natural statistics can account for many aspects of receptive field formation across models and sensory modalities.
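
    To make the principle concrete, here is a minimal sketch of a nonlinear Hebbian update on whitened toy data; the nonlinearity, data, step size, and normalization below are illustrative assumptions, not the paper's simulations.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: sparse (heavy-tailed) sources linearly mixed, then whitened.
    S = rng.laplace(size=(5000, 8))
    X = S @ rng.normal(size=(8, 8))
    X -= X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
    X = X @ evecs / np.sqrt(evals)          # whitening transform

    def nonlinear_hebbian(X, n_steps=50_000, eta=0.002, f=lambda y: y**3):
        """dw ~ f(w.x) x; the cubic f climbs kurtosis, favoring sparse directions."""
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        for _ in range(n_steps):
            x = X[rng.integers(len(X))]     # draw one input sample
            y = w @ x                       # linear activation
            w += eta * f(y) * x             # nonlinear Hebbian step
            w /= np.linalg.norm(w)          # normalization keeps |w| = 1
        return w

    print("learned projection:", np.round(nonlinear_hebbian(X), 2))

    With a heavy-tailed nonlinearity and whitened inputs, the weight vector converges toward one of the sparse source directions; this is the sense in which the learned "receptive field" is dictated by input statistics and preprocessing rather than by the exact choice of nonlinearity.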

    Simulation of networks of spiking neurons: A review of tools and strategies

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We then overview the different simulators and simulation environments presently available (restricted to those that are freely available, open source, and documented). For each simulation tool, its advantages and pitfalls are reviewed, with the aim of allowing the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin-Huxley-type and integrate-and-fire models, interacting through current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the code is made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks.
    Comment: 49 pages, 24 figures, 1 table; review article, Journal of Computational Neuroscience, in press (2007)
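
    As a point of reference for the clock-driven strategy benchmarked in such reviews, here is a minimal clock-driven simulation of a current-based integrate-and-fire network; the network size, connectivity, weights, and drive are illustrative assumptions, not one of the review's benchmark models.

    import numpy as np

    rng = np.random.default_rng(1)

    N, dt, T = 100, 1e-4, 0.5                    # neurons, time step (s), duration (s)
    tau_m, v_rest, v_th, v_reset = 20e-3, -70e-3, -50e-3, -60e-3
    # Sparse random connectivity delivering 0.5 mV current-based kicks.
    J = 0.5e-3 * (rng.random((N, N)) < 0.1)

    v = np.full(N, v_rest)
    spikes = []
    for step in range(int(T / dt)):
        drive = 21e-3 * (1 + 0.1 * rng.standard_normal(N))  # noisy external drive (V)
        v += (-(v - v_rest) + drive) * dt / tau_m           # forward-Euler membrane update
        fired = v >= v_th
        spikes.extend((step * dt, i) for i in np.flatnonzero(fired))
        v[fired] = v_reset                                  # reset after a spike
        v += J @ fired                                      # deliver synaptic kicks

    print(f"{len(spikes)} spikes across {N} neurons in {T*1e3:.0f} ms")

    An event-driven alternative would instead advance each neuron analytically to its next predicted threshold crossing, which is exact for simple models but harder to combine with plasticity rules that track continuous state variables between spikes.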

    Single Biological Neurons as Temporally Precise Spatio-Temporal Pattern Recognizers

    This PhD thesis is focused on the central idea that single neurons in the brain should be regarded as temporally precise and highly complex spatio-temporal pattern recognizers. This opposes the view prevalent among neuroscientists today of biological neurons as simple, mainly spatial pattern recognizers. In this thesis, I attempt to demonstrate that this is an important distinction, predominantly because the above-mentioned computational properties of single neurons have far-reaching implications for the various brain circuits that neurons compose and for how information is encoded by neuronal activity in the brain; namely, these particular "low-level" details at the single-neuron level have substantial system-wide ramifications. In the introduction we highlight the main components that comprise a neural microcircuit capable of performing useful computations and illustrate the inter-dependence of these components from a system perspective. In chapter 1 we discuss the great complexity of the spatio-temporal input-output relationship of cortical neurons, which results from the morphological structure and biophysical properties of the neuron. In chapter 2 we demonstrate that single neurons can generate temporally precise output patterns in response to specific spatio-temporal input patterns using a very simple, biologically plausible learning rule. In chapter 3, we use the differentiable deep-network analog of a realistic cortical neuron as a tool to approximate the gradient of the neuron's output with respect to its input, and use this capability in an attempt to teach the neuron to perform a nonlinear XOR operation. In chapter 4 we expand on chapter 3 and describe the extension of our ideas to neuronal networks composed of many realistic biological spiking neurons that represent either small microcircuits or entire brain regions.
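
    To give a flavor of what "temporally precise output from a simple learning rule" can mean, here is a sketch of a perceptron-like rule that teaches a leaky integrator to cross threshold at one target time for a fixed spatio-temporal input pattern; the rule and all parameters are illustrative assumptions, not the specific rule developed in the thesis.

    import numpy as np

    rng = np.random.default_rng(2)

    n_in, dt, tau = 50, 1e-3, 10e-3
    steps = 200                                  # 200 ms of input
    # Fixed spatio-temporal pattern: each afferent spikes once at a random time.
    pattern = np.zeros((steps, n_in))
    pattern[rng.integers(steps, size=n_in), np.arange(n_in)] = 1.0

    def run(w):
        """Leaky integration of the weighted input; returns voltage and PSP traces."""
        trace, v, traces = np.zeros(n_in), np.zeros(steps), np.zeros((steps, n_in))
        for t in range(steps):
            trace = trace * np.exp(-dt / tau) + pattern[t]
            traces[t] = trace
            v[t] = w @ trace
        return v, traces

    w = rng.normal(0.0, 0.01, n_in)
    target, theta, eta = 150, 1.0, 0.05          # spike wanted at 150 ms
    for _ in range(200):
        v, traces = run(w)
        # Pull the voltage toward just above threshold at the target time ...
        w += eta * (theta + 0.1 - v[target]) * traces[target]
        # ... and push it back below threshold at all other (non-adjacent) times.
        wrong = (v > theta) & (np.abs(np.arange(steps) - target) > 5)
        for t in np.flatnonzero(wrong):
            w -= eta * (v[t] - theta) * traces[t]

    v, _ = run(w)
    above = np.flatnonzero(v > theta)
    print("first threshold crossing:", f"{above[0]} ms" if len(above) else "none")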

    Synaptic Plasticity and Hebbian Cell Assemblies

    Synaptic dynamics are critical to the function of neuronal circuits on multiple timescales. In the first part of this dissertation, I tested the roles of action potential timing and NMDA receptor composition in long-term modifications to synaptic efficacy. In a computational model I showed that the dynamics of the postsynaptic [Ca2+] time course can be used to map the timing of pre- and postsynaptic action potentials onto experimentally observed changes in synaptic strength. Using dual patch-clamp recordings from cultured hippocampal neurons, I found that NMDAR subtypes can map combinations of pre- and postsynaptic action potentials onto either long-term potentiation (LTP) or depression (LTD). LTP and LTD could even be evoked by the same stimuli, and in such cases the plasticity outcome was determined by the availability of NMDAR subtypes. The expression of LTD was increasingly presynaptic as synaptic connections became more developed. Finally, I found that spike-timing-dependent potentiability is history-dependent, with a nonlinear relationship to the number of pre- and postsynaptic action potentials. After LTP induction, subsequent potentiability recovered on a timescale of minutes and was dependent on the duration of the previous induction. While activity-dependent plasticity is putatively involved in circuit development, I found that it was not required to produce small networks capable of exhibiting rhythmic persistent activity patterns called reverberations. However, positive synaptic scaling produced by network inactivity yielded increased quantal synaptic amplitudes, connectivity, and potentiability, all favoring reverberation. These data suggest that chronic inactivity upregulates synaptic efficacy both by quantal amplification and by the addition of silent synapses, the latter of which are rapidly activated by reverberation. Reverberation in previously inactivated networks also resulted in activity-dependent outbreaks of spontaneous network activity. Applying a model of short-term synaptic dynamics at the network level, I argue that these experimental observations can be explained by the interaction between presynaptic calcium dynamics and short-term synaptic depression on multiple timescales. Together, the experiments and modeling indicate that ongoing activity, synaptic scaling, and metaplasticity are required to endow networks with a level of synaptic connectivity and potentiability that supports stimulus-evoked persistent activity patterns but avoids spontaneous activity.
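
    A toy version of the calcium-to-plasticity mapping described in the first part can make the idea concrete: pre- and postsynaptic spikes each contribute a calcium transient, a coincidence term boosts calcium when the postsynaptic spike follows the presynaptic one, and the time calcium spends in two ranges selects LTD or LTP. Every amplitude, threshold, and time constant below is an illustrative assumption, not a value fitted in the dissertation.

    import numpy as np

    def calcium_trace(delta_t, t_max=0.4, dt=1e-3, tau_ca=50e-3,
                      c_pre=0.6, c_post=0.8, c_nmda=1.0, tau_nmda=30e-3):
        """delta_t = t_post - t_pre (s); returns the calcium time course."""
        t = np.arange(0.0, t_max, dt)
        t_pre, t_post = 0.1, 0.1 + delta_t
        ca = (c_pre * (t >= t_pre) * np.exp(-(t - t_pre) / tau_ca)
              + c_post * (t >= t_post) * np.exp(-(t - t_post) / tau_ca))
        if delta_t > 0:   # pre-before-post: NMDA-like coincidence boost
            ca += (c_nmda * np.exp(-delta_t / tau_nmda)
                   * (t >= t_post) * np.exp(-(t - t_post) / tau_ca))
        return ca

    def weight_change(ca, dt=1e-3, theta_d=0.7, theta_p=1.1,
                      gamma_d=0.1, gamma_p=0.3):
        """Time above the high threshold drives LTP; the mid band drives LTD."""
        ltp = gamma_p * dt * np.sum(ca > theta_p)
        ltd = gamma_d * dt * np.sum((ca > theta_d) & (ca <= theta_p))
        return ltp - ltd

    for delta_t in (-0.05, -0.01, 0.01, 0.05):   # post-minus-pre timing (s)
        dw = weight_change(calcium_trace(delta_t))
        print(f"dt = {delta_t * 1e3:+4.0f} ms  ->  dw = {dw:+.4f}")

    With these toy numbers, short pre-before-post intervals land in the LTP regime while reversed or long intervals land in the LTD regime, mirroring how a single calcium variable can map spike timing onto bidirectional plasticity.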

    Dynamical principles in neuroscience

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience?
    This work was supported by NSF Grant No. NSF/EIA-0130708 and Grant No. PHY-0414174; NIH Grant No. 1 R01 NS50945 and Grant No. NS40110; MEC BFI2003-07276; and FundaciĂłn BBVA.
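
    For readers who want a concrete minimal example of the rhythmic dynamics such reviews discuss, here is the standard FitzHugh-Nagumo model, a two-variable caricature of neural excitability; it is included purely as an illustration and is not a model analyzed in the review.

    import numpy as np

    def fitzhugh_nagumo(i_ext=0.5, a=0.7, b=0.8, tau=12.5, dt=0.01, t_max=200.0):
        steps = int(t_max / dt)
        v, w = -1.0, 1.0
        vs = np.empty(steps)
        for k in range(steps):
            dv = v - v**3 / 3 - w + i_ext        # fast voltage-like variable
            dw = (v + a - b * w) / tau           # slow recovery variable
            v += dt * dv
            w += dt * dw
            vs[k] = v
        return vs

    vs = fitzhugh_nagumo()
    # Count upward threshold crossings as cycles of the limit cycle.
    crossings = np.sum((vs[:-1] < 1.0) & (vs[1:] >= 1.0))
    print(f"{crossings} oscillation cycles in 200 time units")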

    Induction and Maintenance of Synaptic Plasticity

    Synaptic long-term modifications following neuronal activation are believed to be at the origin of learning and long-term memory. Recent experiments suggest that these long-term synaptic changes are all-or-none, switch-like events between discrete states of a single synapse. The biochemical network involving calcium/calmodulin-dependent protein kinase II (CaMKII) and its regulating protein signaling cascade has been hypothesized to durably maintain the synaptic state in the form of a bistable switch. Furthermore, it has been shown experimentally that CaMKII and associated proteins such as protein kinase A and calcineurin are necessary for the induction of long-lasting increases (long-term potentiation, LTP) and/or long-lasting decreases (long-term depression, LTD) of synaptic efficacy. However, the biochemical mechanisms by which experimental LTP/LTD protocols lead to corresponding transitions between the two states in realistic models of such networks are still unknown. We present a detailed biochemical model of the calcium/calmodulin-dependent autophosphorylation of CaMKII and the protein signaling cascade governing the dephosphorylation of CaMKII. As previously shown, two stable states of the CaMKII phosphorylation level exist at resting intracellular calcium concentrations. Repetitive high calcium levels switch the system from a weakly to a highly phosphorylated state (LTP). We show that the reverse transition (LTD) can be mediated by elevated phosphatase activity at intermediate calcium levels. It is shown that the CaMKII kinase-phosphatase system can qualitatively reproduce plasticity results in response to spike-timing-dependent plasticity (STDP) and presynaptic stimulation protocols. A reduced model based on the CaMKII system is used to elucidate which parameters control the synaptic plasticity outcomes in response to STDP protocols, and in particular how the plasticity results depend on the differential activation of phosphatase and kinase pathways and on the level of noise in the calcium transients. Our results show that the protein network including CaMKII can account for (i) induction, through LTP/LTD-like transitions, and (ii) storage, due to its bistability, of synaptic changes. The model makes it possible to link biochemical properties of the synapse with the phenomenological 'learning rules' used by theoreticians in neural network studies.
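
    The flavor of the bistable-switch argument can be seen in a much-reduced caricature: cooperative autophosphorylation competing with a saturating phosphatase yields two stable phosphorylation states at resting calcium, a strong calcium transient flips the switch up (LTP-like), and elevated phosphatase activity at intermediate calcium flips it back down (LTD-like). The rate constants below are illustrative assumptions chosen to produce bistability, not the paper's fitted biochemical parameters.

    import numpy as np

    def simulate(p0, ca=0.0, k_phos=0.12, k_basal=0.002, k_ca=0.2,
                 k_auto=1.0, k_m=0.1, dt=0.01, t_max=200.0):
        """p: fraction of phosphorylated CaMKII-like subunits."""
        p = p0
        for _ in range(int(t_max / dt)):
            # Cooperative (auto)phosphorylation versus a saturating phosphatase.
            phos = (k_basal + k_ca * ca + k_auto * p**2) * (1.0 - p)
            dephos = k_phos * p / (k_m + p)      # Michaelis-Menten phosphatase
            p += dt * (phos - dephos)
        return p

    # Bistability at resting calcium: both states persist.
    print("rest, start low :", round(simulate(0.02), 3))
    print("rest, start high:", round(simulate(0.85), 3))
    # LTP-like induction: a strong calcium transient flips the low state up.
    p_up = simulate(0.02, ca=4.0, t_max=5.0)
    print("after high-Ca pulse, back at rest:", round(simulate(p_up), 3))
    # LTD-like induction: elevated phosphatase at intermediate calcium flips it down.
    p_down = simulate(0.85, ca=0.4, k_phos=0.4, t_max=5.0)
    print("after LTD protocol, back at rest :", round(simulate(p_down), 3))

    Here the up and down transitions play the roles of LTP and LTD induction, while the bistability itself provides the storage; the full model in the paper replaces this caricature with the detailed kinase-phosphatase reaction network.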