Tuning a binary ferromagnet into a multi-state synapse with spin-orbit torque induced plasticity
Inspired by ion-dominated synaptic plasticity in the human brain, artificial
synapses for neuromorphic computing adopt charge-related quantities as their
weights. Alongside these charge-derived synaptic emulations, schemes that
control electron spins in ferromagnetic devices have also attracted
considerable interest owing to their low energy consumption, unlimited
endurance, and favorable CMOS compatibility. However, a generally
applicable method of tuning a binary ferromagnet into a multi-state memory with
pure spin-dominated synaptic plasticity in the absence of an external magnetic
field is still missing. Here, we show how synaptic plasticity of a
perpendicularly magnetized ferromagnetic layer (FM1) can be obtained when it is
interlayer-exchange-coupled to an in-plane ferromagnetic layer (FM2): in the
Pt/FM1/Ta/FM2 structure, spin-orbit torque induces magnetic-field-free,
current-driven multi-state magnetization switching of FM1. We use current
pulses to set the perpendicular magnetization state, which acts as the
synaptic weight, and demonstrate a spintronic implementation of
excitatory/inhibitory postsynaptic potentials and spike-timing-dependent
plasticity. This
functionality is made possible by the action of the in-plane interlayer
exchange coupling field which leads to broadened, multi-state magnetic reversal
characteristics. Numerical simulations, combined with investigations of a
reference sample with a single, perpendicularly magnetized Pt/FM1/Ta
structure, reveal that the broadening is due to the in-plane field component
tuning the efficiency of the spin-orbit torque to drive domain walls across a
landscape of
varying pinning potentials. The conventionally binary FM1 inside our
Pt/FM1/Ta/FM2 structure with inherent in-plane coupling field is therefore
tuned into a multi-state perpendicular ferromagnet and represents a synaptic
emulator for neuromorphic computing.Comment: 37 pages with 11 figures, including 20 pages for manuscript and 17
pages for supplementary informatio
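The pulse-programmed multi-state weight described above can be caricatured in a few lines. This is an illustrative abstraction only (the class name, state count, and pulse rule are invented for the sketch), not the device physics of the Pt/FM1/Ta/FM2 stack:

```python
# Illustrative abstraction of a multi-state spintronic synapse: the weight is
# a discretized magnetization level in [-1, 1], stepped up or down by current
# pulses and saturating at the fully switched states.

class MultiStateSynapse:
    """Weight takes one of `n_states` evenly spaced values in [-1, 1]."""

    def __init__(self, n_states=16):
        self.n_states = n_states
        self.level = n_states // 2  # start near zero net magnetization

    @property
    def weight(self):
        return -1.0 + 2.0 * self.level / (self.n_states - 1)

    def pulse(self, polarity):
        """Apply one current pulse; +1 potentiates, -1 depresses."""
        self.level = min(self.n_states - 1, max(0, self.level + polarity))


syn = MultiStateSynapse(n_states=5)   # levels 0..4, weights -1..+1
for _ in range(3):
    syn.pulse(+1)                      # three potentiating pulses
print(syn.weight)                      # -> 1.0 (saturated)
```

Potentiating and depressing pulses then play the roles of excitatory and inhibitory postsynaptic-potential updates, respectively.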
Balancing Feed-Forward Excitation and Inhibition via Hebbian Inhibitory Synaptic Plasticity
It has been suggested that excitatory and inhibitory inputs to cortical cells are balanced, and that this balance is important for the highly irregular firing observed in the cortex. There are two hypotheses as to the origin of this balance. One assumes that it results from a stable solution of the recurrent neuronal dynamics. This model can account for a balance of steady-state excitation and inhibition without fine-tuning of parameters, but not for transient inputs. The second hypothesis suggests that the feed-forward excitatory and inhibitory inputs to a postsynaptic cell are already balanced, and thus does account for the balance of transient inputs. However, it remains unclear what mechanism underlies the fine-tuning required to balance feed-forward excitatory and inhibitory inputs. Here we investigated whether inhibitory synaptic plasticity is responsible for the balance of transient feed-forward excitation and inhibition. We address this issue in the framework of a model characterizing the stochastic dynamics of temporally anti-symmetric Hebbian spike-timing-dependent plasticity of feed-forward excitatory and inhibitory synaptic inputs to a single postsynaptic cell. Our analysis shows that inhibitory Hebbian plasticity generates ‘negative feedback’ that balances excitation and inhibition, in contrast with the ‘positive feedback’ of excitatory Hebbian synaptic plasticity. As a result, this balance may increase the sensitivity of the learning dynamics to the correlation structure of the excitatory inputs.
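The temporally anti-symmetric STDP window at the heart of this model can be sketched with the standard pair-based exponential form; the function name and parameters (`a_plus`, `a_minus`, `tau`) are generic textbook choices, not the paper's exact stochastic model:

```python
import math

# Pair-based, temporally anti-symmetric Hebbian STDP kernel.
# dt = t_post - t_pre (ms); positive dt means pre fired before post.

def stdp_dw(dt, a_plus=1.0, a_minus=1.0, tau=20.0):
    """Weight change for one pre/post spike pair."""
    if dt >= 0:
        return a_plus * math.exp(-dt / tau)    # pre before post: potentiate
    return -a_minus * math.exp(dt / tau)       # post before pre: depress

# For an excitatory synapse, potentiation makes the synapse better at driving
# the cell, which drives further potentiation: positive feedback. For an
# inhibitory synapse under the same Hebbian rule, potentiation strengthens
# inhibition and so reduces the cell's firing: the negative feedback that
# balances excitation and inhibition.
```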
Regulation of Irregular Neuronal Firing by Autaptic Transmission
The importance of self-feedback autaptic transmission in modulating
spike-time irregularity is still poorly understood. By using a biophysical
model that incorporates autaptic coupling, we here show that self-innervation
of neurons participates in the modulation of irregular neuronal firing,
primarily by regulating the occurrence frequency of burst firing. In
particular, we find that both excitatory and electrical autapses increase the
occurrence of burst firing, thus reducing neuronal firing regularity. In
contrast, inhibitory autapses suppress burst firing and therefore tend to
improve the regularity of neuronal firing. Importantly, we show that these
findings are independent of the firing properties of individual neurons, and as
such can be observed for neurons operating in different modes. Our results
provide an insightful mechanistic understanding of how different types of
autapses shape irregular firing at the single-neuron level, and they highlight
the functional importance of autaptic self-innervation in taming and modulating
neurodynamics.
Comment: 27 pages, 8 figures.
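A minimal discrete-time integrate-and-fire toy with a delayed self-synapse illustrates the sign of the autaptic feedback. All names and parameters here are invented for illustration; the paper's model is biophysical and also covers electrical autapses, which this sketch omits:

```python
# Leaky integrate-and-fire neuron with an autapse: each spike feeds a current
# of weight `w_autapse` back onto the same neuron after `delay` steps.

def simulate(n_steps, w_autapse, delay=5, v_th=1.0, leak=0.95, drive=0.06):
    v, spikes = 0.0, []
    for t in range(n_steps):
        v = leak * v + drive
        # autaptic current: the neuron's own past spike arriving after `delay`
        if any(t - ts == delay for ts in spikes):
            v += w_autapse
        if v >= v_th:
            spikes.append(t)
            v = 0.0
    return spikes
```

In this deterministic caricature an excitatory autapse shortens the inter-spike intervals and an inhibitory one lengthens them; it shows only the direction of the self-feedback, not the burst statistics analyzed in the paper.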
Homeostatic plasticity and external input shape neural network dynamics
In vitro and in vivo spiking activity clearly differ. Whereas networks in
vitro develop strong bursts separated by periods of very little spiking
activity, in vivo cortical networks show continuous activity. This is puzzling
considering that both networks presumably share similar single-neuron dynamics
and plasticity rules. We propose that the defining difference between in vitro
and in vivo dynamics is the strength of external input. In vitro, networks are
virtually isolated, whereas in vivo every brain area receives continuous input.
We analyze a model of spiking neurons in which the input strength, mediated by
spike rate homeostasis, determines the characteristics of the dynamical state.
In more detail, our analytical and numerical results on various network
topologies show consistently that under increasing input, homeostatic
plasticity generates distinct dynamic states, from bursting, to
close-to-critical, reverberating and irregular states. This implies that the
dynamic state of a neural network is not fixed but can readily adapt to the
input strengths. Indeed, our results match experimental spike recordings in
vitro and in vivo: the in vitro bursting behavior is consistent with a state
generated by very low network input (< 0.1%), whereas in vivo activity suggests
that on the order of 1% of recorded spikes are input-driven, resulting in
reverberating dynamics. Importantly, this predicts that one can abolish the
ubiquitous bursts of in vitro preparations, and instead impose dynamics
comparable to in vivo activity by exposing the system to weak long-term
stimulation, thereby opening new paths to establish an in vivo-like assay in
vitro for basic as well as neurological studies.
Comment: 14 pages, 8 figures, accepted at Phys. Rev.
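The core homeostatic mechanism, a firing rate regulated toward a target by slow weight changes, can be sketched with a linear rate caricature. Function and parameter names are hypothetical; the paper analyzes full spiking networks:

```python
# Homeostatic plasticity toy: the recurrent weight w is slowly adjusted so
# that the firing rate tracks a target, with rate = rate_external + gain * w
# as a linear caricature of the network's response.

def homeostatic_weight(rate_external, target_rate=1.0, gain=1.0,
                       eta=0.1, steps=2000):
    """Return the weight the homeostatic rule converges to."""
    w = 0.0
    for _ in range(steps):
        rate = rate_external + gain * w
        w += eta * (target_rate - rate)   # dw ~ (target - rate)
        w = max(w, 0.0)                   # weights stay non-negative
    return w
```

Weak external input forces homeostasis to build strong recurrence (the bursty, in vitro-like regime), while strong input leaves only weak recurrence (the irregular, input-driven, in vivo-like regime).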
Branch-specific plasticity enables self-organization of nonlinear computation in single neurons
It has been conjectured that nonlinear processing in dendritic branches endows individual neurons with the capability to perform complex computational operations needed to solve, for example, the binding problem. However, it is not clear how single neurons could acquire such functionality in a self-organized manner, since most theoretical studies of synaptic plasticity and learning concentrate on neuron models without nonlinear dendritic properties. Meanwhile, a complex picture of information processing with dendritic spikes and a variety of plasticity mechanisms in single neurons has emerged from experiments. In particular, new experimental data on dendritic branch strength potentiation in rat hippocampus have not yet been incorporated into such models. In this article, we investigate how experimentally observed plasticity mechanisms, such as depolarization-dependent STDP and branch-strength potentiation, could be integrated to self-organize nonlinear neural computations with dendritic spikes. We provide a mathematical proof that, in a simplified setup, these plasticity mechanisms induce a competition between dendritic branches, a novel concept in the analysis of single-neuron adaptivity. We show via computer simulations that such dendritic competition enables a single neuron to become a member of several neuronal ensembles and to acquire nonlinear computational capabilities, such as the capability to bind multiple input features. Hence our results suggest that nonlinear neural computation may self-organize in single neurons through the interaction of local synaptic and dendritic plasticity mechanisms.
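The branch-competition idea can be illustrated with a toy winner-take-all iteration, assuming potentiation of the most strongly driven branch plus divisive normalization of total branch strength; this is a drastic simplification of the setting of the paper's proof, with all names and parameters invented here:

```python
# Branch competition toy: the branch with the largest strength receives the
# dendritic spike and is potentiated; divisive normalization keeps the total
# strength fixed, so the winning branch grows at the expense of the others.

def compete(strengths, steps=100, eta=0.1):
    s = list(strengths)
    for _ in range(steps):
        w = s.index(max(s))                   # most strongly driven branch
        s[w] += eta                           # branch-strength potentiation
        total = sum(s)
        s = [x * len(s) / total for x in s]   # normalization -> competition
    return s
```

Starting from nearly equal strengths, a small initial advantage is amplified until one branch dominates, the rich-get-richer dynamic underlying branch competition.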
Neutral theory and scale-free neural dynamics
Avalanches of electrochemical activity in brain networks have been
empirically reported to obey scale-invariant behavior --characterized by
power-law distributions up to some upper cut-off-- both in vitro and in vivo.
Elucidating whether such scaling laws stem from the underlying neural dynamics
operating at the edge of a phase transition is a fascinating possibility, as
systems poised at criticality have been argued to exhibit a number of important
functional advantages. Here we employ a well-known model for neural dynamics
with synaptic plasticity, to elucidate an alternative scenario in which
neuronal avalanches can coexist, overlapping in time, but still remaining
scale-free. Remarkably, their scale invariance stems neither from underlying
criticality nor from self-organization at the edge of a continuous phase transition.
Instead, it emerges from the fact that perturbations to the system exhibit a
neutral drift --guided by demographic fluctuations-- with respect to endogenous
spontaneous activity. Such a neutral dynamics --similar to the one in neutral
theories of population genetics-- implies marginal propagation of activity,
characterized by power-law distributed causal avalanches. Importantly, our
results underline the importance of considering causal information --on which
neuron triggers the firing of which-- to properly estimate the statistics of
avalanches of neural activity. We discuss the implications of these findings
both for modeling and for elucidating experimental observations, as well as
their possible consequences for information processing in actual neural
networks.
Comment: Main text: 8 pages, 3 figures. Supplementary information: 5 pages, 4 figures.
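The causal bookkeeping the authors advocate can be sketched with a plain branching process: each externally driven spike opens a new avalanche, and every subsequent spike is attributed to the avalanche of the spike that caused it, so avalanches that overlap in time remain separable. The parameters are illustrative, and this caricature omits the paper's synaptic plasticity:

```python
import random

# Causal avalanches in a (slightly subcritical) branching process: each
# external drive seeds one avalanche, whose size is counted by following
# causal descendants only, regardless of what other avalanches are doing
# at the same time.

def causal_avalanche_sizes(n_drive=200, p_child=0.45, max_children=2, seed=0):
    rng = random.Random(seed)
    sizes = []
    for _ in range(n_drive):           # each external drive = a new avalanche
        size, frontier = 1, 1
        while frontier:
            # each active spike causes up to `max_children` spikes, each
            # with probability `p_child` (branching ratio = 0.9 here)
            children = sum(1 for _ in range(frontier * max_children)
                           if rng.random() < p_child)
            size += children
            frontier = children
        sizes.append(size)
    return sizes
```

Binning the same activity in time would merge concurrent avalanches; the causal labels keep them distinct, which is the measurement point the abstract emphasizes.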