    Beyond spike timing: the role of nonlinear plasticity and unreliable synapses

    Spike-timing-dependent plasticity (STDP) strengthens synapses that are activated immediately before a postsynaptic spike, and weakens those that are activated after a spike. To prevent an uncontrolled growth of the synaptic strengths, weakening must dominate strengthening for uncorrelated spike times. However, this weight-normalization property would preclude Hebbian potentiation when the pre- and postsynaptic neurons are strongly active without specific spike-time correlations. We show that nonlinear STDP as inherent in the data of Markram et al. [(1997) Science 275:213–215] can preserve the benefits of both weight normalization and Hebbian plasticity, and hence can account for learning based on spike-time correlations and on mean firing rates. As examples we consider the moving-threshold property of the Bienenstock–Cooper–Munro rule, the development of direction-selective simple cells by changing short-term synaptic depression, and the joint adaptation of axonal and dendritic delays. Without threshold nonlinearity at low frequencies, the development of direction selectivity does not stabilize in a natural stimulation environment. Without synaptic unreliability there is no causal development of axonal and dendritic delays.
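    The weight-normalization argument above can be illustrated with a standard pair-based STDP window (parameters here are hypothetical, not the fitted values from the paper): choosing the depression side so that its integral exceeds the potentiation side's makes weakening dominate for uncorrelated spike times.

    ```python
    import numpy as np

    # Illustrative pair-based STDP window (hypothetical parameters):
    # potentiation when the presynaptic spike precedes the postsynaptic
    # spike (dt > 0), depression otherwise.  Choosing
    # A_minus * tau_minus > A_plus * tau_plus makes the window's integral
    # negative, so depression dominates for uncorrelated spike times and
    # weights cannot grow without bound.
    A_plus, A_minus = 0.005, 0.006
    tau_plus, tau_minus = 20.0, 20.0   # ms

    def stdp_dw(dt_ms):
        """Weight change for one pre/post pair; dt_ms = t_post - t_pre."""
        if dt_ms >= 0:
            return A_plus * np.exp(-dt_ms / tau_plus)    # causal: potentiate
        return -A_minus * np.exp(dt_ms / tau_minus)      # acausal: depress

    # Net drift under uncorrelated spiking ~ integral of the window.
    dts = np.arange(-150.0, 150.0, 0.01)
    net = sum(stdp_dw(dt) for dt in dts) * 0.01
    print(net)  # negative: weakening dominates
    ```

    The integral evaluates to roughly A_plus·tau_plus − A_minus·tau_minus = 0.1 − 0.12 = −0.02, which is the sense in which depression "dominates" here.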

    Neuromorphic Learning towards Nano Second Precision

    Temporal coding is one approach to representing information in spiking neural networks. An example of its application is the location of sounds by barn owls that requires especially precise temporal coding. Dependent upon the azimuthal angle, the arrival times of sound signals are shifted between both ears. In order to determine these interaural time differences, the phase difference of the signals is measured. We implemented this biologically inspired network on a neuromorphic hardware system and demonstrate spike-timing dependent plasticity on an analog, highly accelerated hardware substrate. Our neuromorphic implementation enables the resolution of time differences of less than 50 ns. On-chip Hebbian learning mechanisms select inputs from a pool of neurons which code for the same sound frequency. Hence, noise caused by different synaptic delays across these inputs is reduced. Furthermore, learning compensates for variations on neuronal and synaptic parameters caused by device mismatch intrinsic to the neuromorphic substrate. Comment: 7 pages, 7 figures, presented at IJCNN 2013 in Dallas, TX, USA. Corrected version with updated STDP curves.
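    The underlying computation can be sketched in software (this is an offline signal-processing sketch, not the spiking hardware implementation; signal parameters are illustrative, with only the ~50 ns scale taken from the abstract): estimate the interaural time difference by locating the peak of the cross-correlation between the two ear signals, which is equivalent to measuring their phase difference for a pure tone.

    ```python
    import numpy as np

    # Illustrative ITD estimation by cross-correlation (assumed setup).
    fs = 200e6            # 200 MHz sample rate -> 5 ns lag resolution
    f_sound = 5e3         # 5 kHz tone
    itd = 50e-9           # true interaural delay: 50 ns
    t = np.arange(0, 2e-3, 1 / fs)
    left = np.sin(2 * np.pi * f_sound * t)
    right = np.sin(2 * np.pi * f_sound * (t - itd))  # right ear lags

    # Cross-correlate over a small window of candidate lags.
    max_lag = 100  # samples (+/- 500 ns)
    lags = np.arange(-max_lag, max_lag + 1)
    corr = [np.dot(left[max_lag:-max_lag],
                   np.roll(right, -lag)[max_lag:-max_lag]) for lag in lags]
    est = lags[int(np.argmax(corr))] / fs
    print(est)  # recovers a delay of about 50 ns
    ```

    The trimmed slice `[max_lag:-max_lag]` discards the samples that `np.roll` wraps around, so the correlation only compares validly overlapping portions of the two signals.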

    STDP in Recurrent Neuronal Networks

    Recent results about spike-timing-dependent plasticity (STDP) in recurrently connected neurons are reviewed, with a focus on the relationship between the weight dynamics and the emergence of network structure. In particular, the evolution of synaptic weights in the two cases of incoming connections for a single neuron and recurrent connections are compared and contrasted. A theoretical framework is used that is based upon Poisson neurons with a temporally inhomogeneous firing rate and the asymptotic distribution of weights generated by the learning dynamics. Different network configurations examined in recent studies are discussed and an overview of the current understanding of STDP in recurrently connected neuronal networks is presented.
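    The Poisson neurons with a temporally inhomogeneous firing rate that this framework assumes can be simulated with the standard thinning (rejection) method; the rate function and parameters below are illustrative, not taken from the reviewed studies.

    ```python
    import numpy as np

    # Sample an inhomogeneous Poisson spike train by thinning a
    # homogeneous Poisson process whose rate upper-bounds rate_fn.
    rng = np.random.default_rng(0)

    def inhomogeneous_poisson(rate_fn, t_max, rate_max):
        """Spike times on [0, t_max) for time-varying rate rate_fn (Hz).

        rate_max must upper-bound rate_fn over the whole interval.
        """
        spikes = []
        t = 0.0
        while True:
            t += rng.exponential(1.0 / rate_max)       # candidate spike
            if t >= t_max:
                return np.array(spikes)
            if rng.uniform() < rate_fn(t) / rate_max:  # accept w.p. rate/rate_max
                spikes.append(t)

    # Sinusoidally modulated rate around 20 Hz over 10 s.
    spikes = inhomogeneous_poisson(
        lambda t: 20.0 * (1 + np.sin(2 * np.pi * t)), 10.0, 40.0)
    print(len(spikes))  # expected count ~ 200 (integral of the rate)
    ```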

    Phenomenological models of synaptic plasticity based on spike timing

    Synaptic plasticity is considered to be the biological substrate of learning and memory. In this document we review phenomenological models of short-term and long-term synaptic plasticity, in particular spike-timing dependent plasticity (STDP). The aim of the document is to provide a framework for classifying and evaluating different models of plasticity. We focus on phenomenological synaptic models that are compatible with integrate-and-fire type neuron models where each neuron is described by a small number of variables. This implies that synaptic update rules for short-term or long-term plasticity can only depend on spike timing and, potentially, on membrane potential, as well as on the value of the synaptic weight, or on low-pass filtered (temporally averaged) versions of the above variables. We examine the ability of the models to account for experimental data and to fulfill expectations derived from theoretical considerations. We further discuss their relations to teacher-based rules (supervised learning) and reward-based rules (reinforcement learning). All models discussed in this paper are suitable for large-scale network simulations.
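    A minimal example of the model class described, assuming illustrative parameter values: an online pair-based STDP rule in which each synapse keeps low-pass filtered traces of pre- and postsynaptic spiking and updates the weight only at spike times.

    ```python
    # Trace-based online pair STDP (illustrative parameters, not a
    # specific fitted model): weights depend only on spike timing via
    # low-pass filtered spike trains, as the abstract describes.
    tau_plus, tau_minus = 20.0, 20.0   # trace time constants (ms)
    A_plus, A_minus = 0.01, 0.012
    dt = 1.0                           # simulation step (ms)

    def simulate(pre_spikes, post_spikes, w=0.5, steps=100):
        """pre_spikes/post_spikes: sets of spike times (integer ms)."""
        x_pre = x_post = 0.0           # low-pass filtered spike trains
        for t in range(steps):
            x_pre -= x_pre * dt / tau_plus     # exponential decay
            x_post -= x_post * dt / tau_minus
            if t in pre_spikes:
                x_pre += 1.0
                w -= A_minus * x_post  # post-before-pre -> depression
            if t in post_spikes:
                x_post += 1.0
                w += A_plus * x_pre    # pre-before-post -> potentiation
        return w

    # Causal pairing (pre at 10 ms, post at 15 ms) potentiates...
    w_causal = simulate({10}, {15})
    # ...while the reversed pairing depresses.
    w_acausal = simulate({15}, {10})
    print(w_causal > 0.5 > w_acausal)  # True
    ```

    Because the state per synapse is just two scalar traces, rules of this form remain cheap enough for the large-scale network simulations the abstract mentions.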

    Single Biological Neurons as Temporally Precise Spatio-Temporal Pattern Recognizers

    This PhD thesis is focused on the central idea that single neurons in the brain should be regarded as temporally precise and highly complex spatio-temporal pattern recognizers. This is opposed to the prevalent view of biological neurons as simple and mainly spatial pattern recognizers by most neuroscientists today. In this thesis, I will attempt to demonstrate that this is an important distinction, predominantly because the above-mentioned computational properties of single neurons have far-reaching implications with respect to the various brain circuits that neurons compose, and on how information is encoded by neuronal activity in the brain. Namely, that these particular "low-level" details at the single neuron level have substantial system-wide ramifications. In the introduction we will highlight the main components that comprise a neural microcircuit that can perform useful computations and illustrate the inter-dependence of these components from a system perspective. In chapter 1 we discuss the great complexity of the spatio-temporal input-output relationship of cortical neurons that is the result of the morphological structure and biophysical properties of the neuron. In chapter 2 we demonstrate that single neurons can generate temporally precise output patterns in response to specific spatio-temporal input patterns with a very simple biologically plausible learning rule. In chapter 3, we use the differentiable deep network analog of a realistic cortical neuron as a tool to approximate the gradient of the output of the neuron with respect to its input and use this capability in an attempt to teach the neuron to perform a nonlinear XOR operation. In chapter 4 we expand chapter 3 to describe the extension of our ideas to neuronal networks composed of many realistic biological spiking neurons that represent either small microcircuits or entire brain regions.

    Motion Detection Using Spiking Neural Network Model


    Computing with Synchrony


    Encoding of Spatio-Temporal Input Characteristics by a CA1 Pyramidal Neuron Model

    The in vivo activity of CA1 pyramidal neurons alternates between regular spiking and bursting, but how these changes affect information processing remains unclear. Using a detailed CA1 pyramidal neuron model, we investigate how timing and spatial arrangement variations in synaptic inputs to the distal and proximal dendritic layers influence the information content of model responses. We find that the temporal delay between activation of the two layers acts as a switch between excitability modes: short delays induce bursting while long delays decrease firing. For long delays, the average firing frequency of the model response discriminates spatially clustered from diffused inputs to the distal dendritic tree. For short delays, the onset latency and inter-spike-interval succession of model responses can accurately classify input signals as temporally close or distant and spatially clustered or diffused across different stimulation protocols. These findings suggest that a CA1 pyramidal neuron may be capable of encoding and transmitting presynaptic spatiotemporal information about the activity of the entorhinal cortex-hippocampal network to higher brain regions via the selective use of either a temporal or a rate code.
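    The two candidate codes the study distinguishes can be read out of a spike train as simple features (an illustrative readout, not the authors' analysis pipeline): the rate code via the mean firing frequency, and the temporal code via the onset latency and the inter-spike-interval (ISI) succession.

    ```python
    import numpy as np

    # Extract rate-code and temporal-code features from a spike train
    # (example data are made up for illustration).
    def response_features(spike_times_ms, window_ms):
        spikes = np.asarray(spike_times_ms, dtype=float)
        rate_hz = 1000.0 * len(spikes) / window_ms          # rate code
        onset = spikes[0] if len(spikes) else None          # temporal code
        isis = np.diff(spikes)                              # temporal code
        return rate_hz, onset, isis

    rate, latency, isis = response_features([12.0, 20.0, 26.0, 60.0], 200.0)
    print(rate, latency, list(isis))  # 20.0 Hz, 12 ms onset, ISIs [8, 6, 34]
    ```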