Computational implications of biophysical diversity and multiple timescales in neurons and synapses for circuit performance
Despite advances in experimental and theoretical neuroscience, we are still trying to identify key biophysical details that are important for characterizing the operation of brain circuits. Biological mechanisms at the level of single neurons and synapses can be combined as ‘building blocks’ to generate circuit function. We focus on the importance of capturing multiple timescales when describing these intrinsic and synaptic components. Whether inherent in the ionic currents, the neuron’s complex morphology, or the neurotransmitter composition of synapses, these multiple timescales prove crucial for capturing the variability and richness of circuit output and enhancing the information-carrying capacity observed across nervous systems.
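As a concrete illustration of one such "building block", a synaptic response that mixes two decay timescales (for example a fast AMPA-like and a slow NMDA-like component) can be sketched in a few lines. The function name, units, and parameter values below are illustrative assumptions, not taken from the text.

```python
import math

def dual_timescale_conductance(t, g_fast=1.0, g_slow=0.3,
                               tau_fast=2.0, tau_slow=50.0):
    """Synaptic conductance (arbitrary units) after a spike at t = 0,
    modeled as the sum of a fast and a slow exponentially decaying
    component. Times are in ms; parameter values are illustrative."""
    return g_fast * math.exp(-t / tau_fast) + g_slow * math.exp(-t / tau_slow)
```

Early in the response the fast component dominates; tens of milliseconds later only the slow component remains, so the same synapse shapes circuit output on two very different timescales.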
Ion channel degeneracy enables robust and tunable neuronal firing rates.
Firing rate is an important means of encoding information in the nervous system. To reliably encode a wide range of signals, neurons need to achieve a broad range of firing frequencies and to move smoothly between low and high firing rates. This can be achieved with specific ionic currents, such as A-type potassium currents, which can linearize the frequency-input current curve. By applying recently developed mathematical tools to a number of biophysical neuron models, we show how currents that are classically thought to permit low firing rates can paradoxically cause a jump to a high minimum firing rate when expressed at higher levels. Consequently, achieving and maintaining a low firing rate is surprisingly difficult and fragile in a biological context. This difficulty can be overcome via interactions between multiple currents, implying a need for ion channel degeneracy in the tuning of neuronal properties. This is the author accepted manuscript; the final version is available from the National Academy of Sciences via http://dx.doi.org/10.1073/pnas.1516400112
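The frequency-input (f-I) relationship discussed above can be explored numerically with even the simplest spiking model. The sketch below uses a generic quadratic integrate-and-fire neuron to compute firing rate as a function of input current; it is an illustration of the f-I curve concept, not one of the biophysical models used in the paper, and all names and parameter values are assumptions.

```python
def qif_firing_rate(I, dt=1e-4, T=10.0, v_reset=-1.0, v_spike=10.0):
    """Mean firing rate of a quadratic integrate-and-fire neuron,
    dv/dt = v**2 + I, integrated with forward Euler; the voltage is
    reset to v_reset each time it reaches v_spike."""
    v, spikes = v_reset, 0
    for _ in range(int(T / dt)):
        v += dt * (v * v + I)
        if v >= v_spike:
            v = v_reset
            spikes += 1
    return spikes / T
```

For subthreshold currents the neuron settles at a stable resting potential and the rate is zero; past threshold the rate grows continuously with I. Added currents such as the A-type potassium current reshape exactly this kind of curve.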
Cellular switches orchestrate rhythmic circuits.
Small inhibitory neuronal circuits have long been identified as key neuronal motifs to generate and modulate the coexisting rhythms of various motor functions. Our paper highlights the role of a cellular switching mechanism to orchestrate such circuits. The cellular switch makes the circuits reconfigurable, robust, adaptable, and externally controllable. Without this cellular mechanism, the circuit rhythms entirely rely on specific tunings of the synaptic connectivity, which makes them rigid, fragile, and difficult to control externally. We illustrate those properties on the much studied architecture of a small network controlling both the pyloric and gastric rhythms of crabs. The cellular switch is provided by a slow negative conductance often neglected in mathematical modeling of central pattern generators. We propose that this conductance is simple to model and key to computational studies of rhythmic circuit neuromodulation
A bio-inspired bistable recurrent cell allows for long-lasting memory
Recurrent neural networks (RNNs) provide state-of-the-art performance in a wide variety of tasks that require memory. This performance is often achieved thanks to gated recurrent cells such as gated recurrent units (GRUs) and long short-term memory (LSTM) cells. Standard gated cells share a layer internal state to store information at the network level, and long-term memory is shaped by network-wide recurrent connection weights. Biological neurons, on the other hand, are capable of holding information at the cellular level for arbitrarily long times through a process called bistability. Through bistability, cells can stabilize to different stable states depending on their own past state and inputs, which permits durable storage of past information in the neuron state. In this work, we take inspiration from biological neuron bistability to endow RNNs with long-lasting memory at the cellular level. This leads to the introduction of a new bistable, biologically inspired recurrent cell that is shown to strongly improve RNN performance on time series that require very long memory, despite using only cellular connections (all recurrent connections are from neurons to themselves, i.e., a neuron's state is not influenced by the state of other neurons). Furthermore, equipping this cell with recurrent neuromodulation permits linking it to standard GRU cells, taking a step towards the biological plausibility of GRUs.
Robust and tunable bursting requires slow positive feedback.
We highlight that the robustness and tunability of a bursting model critically rely on currents that provide slow positive feedback to the membrane potential. Such currents have the ability to make the total conductance of the circuit negative on a timescale that is termed "slow" because it is intermediate between the fast timescale of the spike upstroke and the ultraslow timescale of even slower adaptation currents. We discuss how such currents can be assessed either in voltage-clamp experiments or in computational models. We show that, while frequent in the literature, mathematical and computational models of bursting that lack the slow negative conductance are fragile and rigid. Our results suggest that modeling the slow negative conductance of cellular models is important when studying the neuromodulation of rhythmic circuits at broader scales. NEW & NOTEWORTHY Nervous system functions rely on the modulation of neuronal activity between different rhythmic patterns. The mechanisms of this modulation are still poorly understood. Using computational modeling, we show the critical role of currents that provide slow negative conductance, distinct from the fast negative conductance necessary for spike generation. The significance of the slow negative conductance for neuromodulation is often overlooked, leading to computational models that are rigid and fragile.
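The notion of a negative conductance can be made concrete with a static I-V curve check: a regenerative inward current produces a voltage range where the slope dI/dV is negative. The sketch below uses a generic leak plus persistent sodium-like current; whether such a region is fast or slow depends on the activation kinetics, which this static sketch deliberately ignores, and all parameter values and function names are illustrative assumptions.

```python
import math

def iv_current(v, g_p, g_l=1.0, e_l=-50.0, e_na=50.0,
               v_half=-40.0, k=5.0):
    """Steady-state current (arbitrary units) of a leak current plus a
    persistent inward (sodium-like) current with sigmoidal activation.
    Parameter values are illustrative, not fitted to any cell."""
    m_inf = 1.0 / (1.0 + math.exp(-(v - v_half) / k))
    return g_l * (v - e_l) + g_p * m_inf * (v - e_na)

def has_negative_conductance(g_p, vmin=-80.0, vmax=0.0, dv=0.1):
    """Scan the I-V curve for a region of negative slope (dI/dV < 0),
    the signature of a regenerative, negative-conductance current."""
    v = vmin
    while v < vmax:
        if iv_current(v + dv, g_p) < iv_current(v, g_p):
            return True
        v += dv
    return False
```

With the inward current absent the I-V curve is monotone (no negative conductance); with enough of it, a negative-slope region appears. In a voltage-clamp setting the same test would be applied separately to the fast and slow components of the clamp current.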
Robust Modulation of Integrate-and-Fire Models.
By controlling the state of neuronal populations, neuromodulators ultimately affect behavior. A key neuromodulation mechanism is the alteration of neuronal excitability via the modulation of ion channel expression. This type of neuromodulation is normally studied with conductance-based models, but those models are computationally challenging for the large-scale network simulations needed in population studies. This article studies the modulation properties of the multiquadratic integrate-and-fire model, a generalization of the classical quadratic integrate-and-fire model. The model is shown to combine the computational economy of integrate-and-fire modeling and the physiological interpretability of conductance-based modeling. It is therefore a good candidate for affordable computational studies of neuromodulation in large networks.
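A minimal sketch of a two-variable integrate-and-fire model in this spirit is shown below: the membrane potential V carries fast positive feedback through its quadratic term, while a slow variable Vs tracks V and contributes a second quadratic term of opposite sign. The equations are patterned after the multiquadratic integrate-and-fire idea, but the exact form, parameter values, and reset rule here are illustrative assumptions, not the published model.

```python
def mqif_spike_count(I, T=50.0, dt=1e-3, tau_s=10.0,
                     v_max=5.0, v_reset=-2.0, vs_reset=0.0):
    """Forward-Euler simulation of a two-timescale quadratic
    integrate-and-fire neuron:
        dV/dt  = V**2 - Vs**2 + I    (fast positive feedback via V**2)
        dVs/dt = (V - Vs) / tau_s    (slow variable tracking V)
    Both variables are reset when V reaches v_max; returns the number
    of spikes emitted over the simulated time T."""
    v, vs, spikes = v_reset, vs_reset, 0
    for _ in range(int(T / dt)):
        v += dt * (v * v - vs * vs + I)
        vs += dt * (v - vs) / tau_s
        if v >= v_max:
            v, vs = v_reset, vs_reset
            spikes += 1
    return spikes
```

Because Vs is slow, its quadratic term changes the neuron's effective conductance on a timescale intermediate between spiking and adaptation; exposing such terms as explicit knobs is what makes reduced models of this kind attractive for affordable neuromodulation studies.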
Spike-based computation using classical recurrent neural networks
Spiking neural networks are a type of artificial neural network in which communication between neurons consists only of events, also called spikes. This property allows such networks to perform asynchronous, sparse computations and therefore to drastically decrease energy consumption when run on specialized hardware. However, training such networks is known to be difficult, mainly due to the non-differentiability of the spike activation, which prevents the use of classical backpropagation. This is because state-of-the-art spiking neural networks are usually derived from biologically inspired neuron models, to which machine learning methods are then applied for training. Nowadays, research on spiking neural networks focuses on the design of training algorithms whose goal is to obtain networks that compete with their non-spiking versions on specific tasks. In this paper, we attempt the symmetrical approach: we modify the dynamics of a well-known, easily trainable type of recurrent neural network to make it event-based. This new RNN cell, called the Spiking Recurrent Cell, therefore communicates using events, i.e., spikes, while remaining completely differentiable. Vanilla backpropagation can thus be used to train any network made of such RNN cells. We show that this new network can achieve performance comparable to that of other types of spiking networks on the MNIST benchmark and its variants, Fashion-MNIST and Neuromorphic-MNIST. Moreover, we show that this new cell makes the training of deep spiking networks achievable.