A bifurcation analysis of a modified neural field model: conductance-based synapses act as an anti-epileptic regulatory mechanism
Eigenvalue spectral properties of sparse random matrices obeying Dale's law
Understanding the dynamics of large networks of neurons with heterogeneous connectivity architectures is a complex physics problem that demands novel mathematical techniques. Biological neural networks are inherently spatially heterogeneous, which makes them difficult to model mathematically. Random recurrent neural networks capture complex network connectivity structures while remaining mathematically tractable. Our paper generalises previous classical results to sparse connectivity matrices with distinct excitatory (E) and inhibitory (I) neural populations. By constructing our analysis around sparse networks, we examine the impact of all levels of network sparseness and discover a novel nonlinear interaction between the connectivity matrix and the resulting network dynamics, in both the balanced and unbalanced cases. Specifically, we derive new mathematical dependencies describing the influence of sparsity and distinct E/I distributions on the distribution of eigenvalues (the eigenspectrum) of the network Jacobian. Furthermore, we show that the previous classical results are special cases of the more general results described here. Understanding the impact of sparse connectivity on network dynamics is of particular importance for both theoretical neuroscience and mathematical physics, as it pertains to the structure-function relationship of networked systems and their dynamics. Our results are an important step towards developing analysis techniques that are essential for studying the impact of larger-scale network connectivity on network function, and towards furthering our understanding of brain function and dysfunction.
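As a rough illustration of the kind of object this abstract studies, the sketch below (Python/NumPy) builds a sparse random connectivity matrix obeying Dale's law, i.e. every presynaptic neuron is purely excitatory or purely inhibitory, and inspects its eigenvalue spectrum numerically. The network size, sparsity, E/I fraction, relative inhibitory strength and 1/sqrt(pN) scaling are assumed illustrative choices, not the paper's exact construction.

```python
# Minimal sketch: sparse E/I connectivity obeying Dale's law and its eigenspectrum.
import numpy as np

rng = np.random.default_rng(0)

N = 1000        # number of neurons (assumed)
f_exc = 0.8     # fraction of excitatory neurons (assumed)
p = 0.1         # connection probability, i.e. sparsity level (assumed)
g = 4.0         # relative inhibitory strength, chosen here so E and I balance (assumed)

n_exc = int(f_exc * N)
# Synaptic strengths: excitatory presynaptic neurons positive, inhibitory negative.
strengths = np.concatenate([
    np.full(n_exc, 1.0 / np.sqrt(p * N)),
    np.full(N - n_exc, -g / np.sqrt(p * N)),
])

# Sparse mask: each possible connection is present independently with probability p.
mask = rng.random((N, N)) < p
# Dale's law: the sign of column j (presynaptic neuron j) is fixed by its E/I identity.
J = mask * strengths[np.newaxis, :]
np.fill_diagonal(J, 0.0)

eigvals = np.linalg.eigvals(J)
print("spectral radius      ≈", np.abs(eigvals).max())
print("largest real part    ≈", eigvals.real.max())
```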
Soft-bound synaptic plasticity increases storage capacity
Accurate models of synaptic plasticity are essential to understand the adaptive properties of the nervous system and for realistic models of learning and memory. Experiments have shown that synaptic plasticity depends not only on pre- and post-synaptic activity patterns, but also on the strength of the connection itself. Namely, weaker synapses are more easily strengthened than already strong ones. This so-called soft-bound plasticity automatically constrains the synaptic strengths. It is known that this has important consequences for the dynamics of plasticity and the synaptic weight distribution, but its impact on information storage is unknown. In this modeling study we introduce an information-theoretic framework to analyse memory storage in an online learning setting. We show that soft-bound plasticity increases a variety of performance criteria by about 18% over hard-bound plasticity, and likely maximizes the storage capacity of synapses.
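The contrast between hard-bound and soft-bound updates can be made concrete with a small sketch. The weight dependence below is a common textbook form (additive steps clipped at the bounds versus steps that scale with the distance to the bounds); the learning rate and bounds are assumed values, not the exact rule analysed in the paper.

```python
# Minimal sketch: hard-bound vs soft-bound synaptic weight updates.
import numpy as np

w_max = 1.0   # upper weight bound (assumed)
eta = 0.05    # learning rate (assumed)

def hard_bound_update(w, potentiate):
    """Weight-independent step, simply clipped at the bounds."""
    dw = eta if potentiate else -eta
    return np.clip(w + dw, 0.0, w_max)

def soft_bound_update(w, potentiate):
    """Step scales with the distance to the bound: weak synapses are easily
    strengthened, strong synapses easily weakened, so weights stay inside
    (0, w_max) without explicit clipping."""
    dw = eta * (w_max - w) if potentiate else -eta * w
    return w + dw

w = 0.9
print(hard_bound_update(w, potentiate=True))  # jumps to the bound: 0.95 -> clipped at 1.0 eventually
print(soft_bound_update(w, potentiate=True))  # small step near the bound: 0.905
```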
The representation of input correlation structure from multiple pools in the synaptic weights by STDP
Computational modeling with spiking neural networks
This chapter reviews recent developments in the area of spiking neural networks (SNN) and summarizes the main contributions to this research field. We give background information about the functioning of biological neurons, discuss the most important mathematical neural models along with neural encoding techniques, learning algorithms, and applications of spiking neurons. As a specific application, the functioning of the evolving spiking neural network (eSNN) classification method is presented in detail, and the principles of numerous eSNN-based applications are highlighted and discussed.
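As a concrete example of the class of neuron models such a review typically covers, here is a minimal leaky integrate-and-fire (LIF) simulation; all parameter values are illustrative and not taken from the chapter.

```python
# Minimal sketch: a leaky integrate-and-fire neuron driven by a constant current.
import numpy as np

dt, T = 1e-4, 0.5                     # time step and duration (s)
tau_m, R = 0.02, 1e7                  # membrane time constant (s) and resistance (ohm), assumed
v_rest, v_thresh, v_reset = -0.070, -0.054, -0.070  # volts, assumed
I_ext = 2.0e-9                        # constant input current (A), assumed

v = v_rest
spike_times = []
for step in range(int(T / dt)):
    # Euler integration of tau_m * dv/dt = -(v - v_rest) + R * I_ext
    v += dt * (-(v - v_rest) + R * I_ext) / tau_m
    if v >= v_thresh:                 # threshold crossing emits a spike, then reset
        spike_times.append(step * dt)
        v = v_reset

print(f"{len(spike_times)} spikes in {T} s")
```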
Homeostatic Scaling of Excitability in Recurrent Neural Networks
Neurons adjust their intrinsic excitability when experiencing a persistent change in synaptic drive. This process can prevent neural activity from moving into either a quiescent state or a saturated state in the face of ongoing plasticity, and is thought to promote stability of the network in which neurons reside. However, most neurons are embedded in recurrent networks, which require a delicate balance between excitation and inhibition to maintain network stability. This balance could be disrupted when neurons independently adjust their intrinsic excitability. Here, we study the functioning of activity-dependent homeostatic scaling of intrinsic excitability (HSE) in a recurrent neural network. Using both simulations of a recurrent network consisting of excitatory and inhibitory neurons that implement HSE, and a mean-field description of adapting excitatory and inhibitory populations, we show that the stability of such adapting networks critically depends on the relationship between the adaptation time scales of both neuron populations. In a stable adapting network, HSE can keep all neurons functioning within their dynamic range, while the network is undergoing several (patho)physiologically relevant types of plasticity, such as persistent changes in external drive, changes in connection strengths, or the loss of inhibitory cells from the network. However, HSE cannot prevent the unstable network dynamics that result when, due to such plasticity, recurrent excitation in the network becomes too strong compared to feedback inhibition. This suggests that keeping a neural network in a stable and functional state requires the coordination of distinct homeostatic mechanisms that operate not only by adjusting neural excitability, but also by controlling network connectivity.
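A toy mean-field version of the mechanism described above might look like the sketch below: two rate populations (E and I) whose intrinsic excitability (gain) is slowly scaled toward a target rate, with different homeostatic time scales for the two populations. Connection strengths, time constants, gain function and the target rate are assumed values, not the paper's parameters.

```python
# Minimal sketch: E/I rate model with slow homeostatic scaling of excitability.
import numpy as np

def f(x):
    return np.tanh(np.maximum(x, 0.0))            # simple saturating gain function (assumed)

dt, T = 1e-3, 200.0
w_EE, w_EI, w_IE, w_II = 1.5, 1.2, 1.8, 1.0       # connection strengths (assumed)
r_target = 0.2                                    # homeostatic set point (assumed)
tau_r = 0.01                                      # fast rate dynamics (s)
tau_hE, tau_hI = 20.0, 10.0                       # homeostatic time scales (s); their ratio matters
ext = 0.5                                         # external drive (assumed)

r_E = r_I = 0.1
g_E = g_I = 1.0                                   # intrinsic excitability (gain) factors
for _ in range(int(T / dt)):
    r_E += dt / tau_r * (-r_E + f(g_E * (w_EE * r_E - w_EI * r_I + ext)))
    r_I += dt / tau_r * (-r_I + f(g_I * (w_IE * r_E - w_II * r_I + ext)))
    # Homeostatic scaling: raise excitability when activity is below target, lower it when above.
    g_E += dt / tau_hE * (r_target - r_E)
    g_I += dt / tau_hI * (r_target - r_I)

print(f"steady rates: r_E≈{r_E:.3f}, r_I≈{r_I:.3f}; gains: g_E≈{g_E:.2f}, g_I≈{g_I:.2f}")
```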
STDP Allows Fast Rate-Modulated Coding with Poisson-Like Spike Trains
Spike timing-dependent plasticity (STDP) has been shown to enable single neurons to detect repeatedly presented spatiotemporal spike patterns. This holds even when such patterns are embedded in equally dense random spiking activity, that is, in the absence of external reference times such as a stimulus onset. Here we demonstrate, both analytically and numerically, that STDP can also learn repeating rate-modulated patterns, which have received more experimental evidence, for example, through post-stimulus time histograms (PSTHs). Each input spike train is generated from a rate function using a stochastic sampling mechanism, chosen to be an inhomogeneous Poisson process here. Learning is feasible provided significant covarying rate modulations occur within the typical timescale of STDP (∼10–20 ms) for sufficiently many inputs (∼100 among 1000 in our simulations), a condition that is met by many experimental PSTHs. Repeated pattern presentations induce spike-time correlations that are captured by STDP. Despite imprecise input spike times and even variable spike counts, a single trained neuron robustly detects the pattern just a few milliseconds after its presentation. Therefore, temporal imprecision and Poisson-like firing variability are not an obstacle to fast temporal coding. STDP provides an appealing mechanism to learn such rate patterns, which, beyond sensory processing, may also be involved in many cognitive tasks.
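The setting can be sketched as follows: inhomogeneous Poisson spike trains are sampled from a shared, PSTH-like rate modulation, and a pair-based STDP rule with roughly 20 ms time constants accumulates weight changes from spike pairs. The rate profile, the STDP amplitudes and the use of a second Poisson train as a stand-in for the postsynaptic response are illustrative assumptions, not the paper's model.

```python
# Minimal sketch: inhomogeneous Poisson spike trains plus a pair-based STDP rule.
import numpy as np

rng = np.random.default_rng(1)
dt, T = 1e-3, 1.0
t = np.arange(0, T, dt)

# Shared rate modulation (Hz): a baseline plus a brief covarying bump,
# i.e. the kind of PSTH-like pattern the abstract refers to (assumed shape).
rate = 20 + 80 * np.exp(-((t - 0.5) ** 2) / (2 * 0.01 ** 2))

def poisson_train(rate_hz):
    """Bernoulli approximation: at most one spike per 1 ms bin, prob = rate * dt."""
    return t[rng.random(t.size) < rate_hz * dt]

def stdp_dw(pre_spikes, post_spikes, a_plus=0.01, a_minus=0.012, tau=0.02):
    """Sum pair-based STDP contributions: potentiation when pre precedes post,
    depression when post precedes pre, with an exponential ~20 ms window."""
    dw = 0.0
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            delta = t_post - t_pre
            if delta > 0:
                dw += a_plus * np.exp(-delta / tau)
            elif delta < 0:
                dw -= a_minus * np.exp(delta / tau)
    return dw

pre = poisson_train(rate)
post = poisson_train(rate)   # correlated train standing in for the postsynaptic neuron (assumption)
print(f"{pre.size} pre spikes, {post.size} post spikes, total Δw ≈ {stdp_dw(pre, post):.4f}")
```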
