Response variability in balanced cortical networks
We study the spike statistics of neurons in a network with dynamically
balanced excitation and inhibition. Our model, intended to represent a generic
cortical column, comprises randomly connected excitatory and inhibitory leaky
integrate-and-fire neurons, driven by excitatory input from an external
population. The high connectivity permits a mean-field description in which
synaptic currents can be treated as Gaussian noise, the mean and
autocorrelation function of which are calculated self-consistently from the
firing statistics of single model neurons. Within this description, we find
that the irregularity of spike trains is controlled mainly by the strength of
the synapses relative to the difference between the firing threshold and the
post-firing reset level of the membrane potential. For moderately strong
synapses we find spike statistics very similar to those observed in primary
visual cortex.
Comment: 22 pages, 7 figures, submitted to Neural Computation
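The mean-field picture described above (synaptic input treated as Gaussian noise driving a leaky integrate-and-fire neuron, with irregularity set by the input statistics relative to the threshold-reset gap) can be sketched in a few lines. The parameter values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def simulate_lif(mu, sigma, theta=1.0, v_reset=0.0, tau=0.02,
                 dt=1e-4, t_max=20.0, seed=0):
    """LIF neuron driven by Gaussian current noise (Euler-Maruyama).

    mu, sigma: mean and fluctuation strength of the synaptic input,
    in units of the threshold-reset gap (theta - v_reset = 1 here).
    """
    rng = np.random.default_rng(seed)
    v = v_reset
    spikes = []
    for step in range(int(t_max / dt)):
        v += (mu - v) * dt / tau + sigma * np.sqrt(dt / tau) * rng.standard_normal()
        if v >= theta:
            spikes.append(step * dt)
            v = v_reset  # post-firing reset
    return np.array(spikes)

def cv_isi(spikes):
    """Coefficient of variation of inter-spike intervals (~1 for Poisson-like trains)."""
    isi = np.diff(spikes)
    return isi.std() / isi.mean()

# Fluctuation-driven regime: mean input below threshold, spikes caused by noise
spikes = simulate_lif(mu=0.8, sigma=0.5)
```

Sweeping `sigma` (the synaptic strength in threshold-reset units) and recomputing `cv_isi` traces out how irregularity depends on that ratio; the self-consistent mean-field step of the paper, where the input statistics are matched to the output statistics, is not reproduced here.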
Inhibitory synchrony as a mechanism for attentional gain modulation
Recordings from area V4 of monkeys have revealed that when the focus of
attention is on a visual stimulus within the receptive field of a cortical
neuron, two distinct changes can occur: The firing rate of the neuron can
change and there can be an increase in the coherence between spikes and the
local field potential in the gamma-frequency range (30-50 Hz). The hypothesis
explored here is that these observed effects of attention could be a
consequence of changes in the synchrony of local interneuron networks. We
performed computer simulations of a Hodgkin-Huxley type neuron driven by a
constant depolarizing current, I, representing visual stimulation and a
modulatory inhibitory input representing the effects of attention via local
interneuron networks. We observed that the neuron's firing rate and the
coherence of its output spike train with the synaptic inputs were modulated by
the degree of synchrony of the inhibitory inputs. The model suggests that the
observed changes in firing rate and coherence of neurons in the visual cortex
could be controlled by top-down inputs that regulate the coherence in the
activity of a local inhibitory network discharging at gamma frequencies.
Comment: J. Physiology (Paris) in press, 11 figures
Stochasticity from function -- why the Bayesian brain may need no noise
An increasing body of evidence suggests that the trial-to-trial variability
of spiking activity in the brain is not mere noise, but rather the reflection
of a sampling-based encoding scheme for probabilistic computing. Since the
precise statistical properties of neural activity are important in this
context, many models assume an ad-hoc source of well-behaved, explicit noise,
either on the input or on the output side of single neuron dynamics, most often
assuming an independent Poisson process in either case. However, these
assumptions are somewhat problematic: neighboring neurons tend to share
receptive fields, rendering both their input and their output correlated; at
the same time, neurons are known to behave largely deterministically, as a
function of their membrane potential and conductance. We suggest that spiking
neural networks may, in fact, have no need for noise to perform sampling-based
Bayesian inference. We study analytically the effect of auto- and
cross-correlations in functionally Bayesian spiking networks and demonstrate
how their effect translates to synaptic interaction strengths, rendering them
controllable through synaptic plasticity. This allows even small ensembles of
interconnected deterministic spiking networks to simultaneously and
co-dependently shape their output activity through learning, enabling them to
perform complex Bayesian computation without any need for noise, which we
demonstrate in silico, both in classical simulation and in neuromorphic
emulation. These results close a gap between the abstract models and the
biology of functionally Bayesian spiking networks, effectively reducing the
architectural constraints imposed on physical neural substrates required to
perform probabilistic computing, be they biological or artificial.
Stochastic resonance and finite resolution in a network of leaky integrate-and-fire neurons.
This thesis is a study of stochastic resonance (SR) in a discrete implementation of a leaky integrate-and-fire (LIF) neuron network. The aim was to determine if SR can be realised in limited precision discrete systems implemented on digital hardware.
How neuronal modelling connects with SR is discussed. Analysis techniques for noisy spike trains are described, ranging from rate coding and statistical measures to signal-processing measures such as the power spectrum and signal-to-noise ratio (SNR). The main problem in computing spike-train power spectra is obtaining equi-spaced sample amplitudes, given the short duration of spikes relative to the intervals between them. Three methods of computing the SNR of a spike train from its power spectrum are described; the chief difficulty is separating the power at the frequencies of interest from the noise power, since the spike train encodes both the signal of interest and noise.
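One common way to handle both problems, sketched here as an assumption rather than as any of the thesis's three specific methods, is to bin the spike train onto an equi-spaced grid, take a periodogram, and estimate the noise floor from the bins neighbouring the signal frequency:

```python
import numpy as np

def spike_train_snr(spike_times, t_max, f_signal, fs=1000.0):
    """SNR of a spike train at f_signal, from a binned periodogram.

    Binning onto an equi-spaced grid addresses the sampling problem
    discussed above; the noise floor is estimated from neighbouring bins.
    """
    n = int(t_max * fs)
    binned = np.zeros(n)
    idx = (np.asarray(spike_times) * fs).astype(int)
    idx = idx[(idx >= 0) & (idx < n)]
    np.add.at(binned, idx, 1.0)
    power = np.abs(np.fft.rfft(binned - binned.mean())) ** 2 / n
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    k = int(np.argmin(np.abs(freqs - f_signal)))
    # Noise floor: mean power in the bins flanking the signal bin
    floor = np.r_[power[max(k - 20, 1):k], power[k + 1:k + 21]].mean()
    return power[k] / floor

# A 40 Hz spike train with 2 ms timing jitter shows a strong peak at 40 Hz
rng = np.random.default_rng(0)
times = np.arange(0.0, 10.0, 0.025) + rng.normal(0.0, 0.002, 400)
snr = spike_train_snr(times, 10.0, 40.0)
```

At a frequency where no signal is present the same estimator returns a ratio near one, which is what makes the flanking-bin floor a usable reference.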
Two models of the LIF neuron were developed, one continuous and one discrete, and the results compared. The discrete model allowed the precision of the simulation values to be varied, enabling investigation of how limited precision affects SR. The main difference between the two models lies in the evolution of the membrane potential: when both are allowed to decay from a high start value in the absence of input, the discrete model does not completely discharge, while the continuous model discharges to almost zero.
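The incomplete discharge of the discrete model is the signature of truncating fixed-point arithmetic; the toy comparison below shows the effect. The bit-width and decay constant are illustrative assumptions, not the thesis's actual values:

```python
def decay_float(v0, coeff, steps):
    """Floating-point exponential decay: v <- v * (1 - coeff) each step."""
    v = v0
    for _ in range(steps):
        v -= v * coeff
    return v

def decay_fixed(v0, coeff, steps, bits):
    """The same update in truncating fixed-point with `bits` fractional bits.

    Once v * d < 2**bits, the right-shift truncates the decrement to zero,
    so the value stalls at a nonzero floor instead of discharging fully.
    """
    scale = 1 << bits
    v = int(round(v0 * scale))
    d = int(round(coeff * scale))
    for _ in range(steps):
        v -= (v * d) >> bits
    return v / scale

# Decay from a high start value with no input, as in the comparison above
v_float = decay_float(1.0, 0.05, 1000)          # essentially zero
v_fixed = decay_fixed(1.0, 0.05, 1000, bits=8)  # stalls at a nonzero floor
```

This residual floor acts as a built-in bias of the discrete membrane potential, which is one plausible reason precision limits alter how noise interacts with the threshold.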
The results of simulating the discrete model on an FPGA and the continuous model on a PC showed that SR can be realised in discrete, low-resolution digital systems. SR was found to be sensitive to the precision of the simulation values. For a single neuron, SR increases between 10-bit and 12-bit resolution, after which it saturates. For a feed-forward network with multiple input neurons and one output neuron, SR is stronger with more than 6 input neurons and saturates at a higher resolution. We conclude that stochastic resonance can manifest in discrete systems, though to a lesser extent than in continuous systems.
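The threshold-crossing mechanism behind SR, a subthreshold periodic signal that produces no spikes on its own but is carried across threshold by noise, can be sketched as follows. This is a generic illustration with assumed parameters, not the thesis's FPGA model:

```python
import numpy as np

def lif_with_signal(sigma, f_sig=5.0, a_sig=0.15, mu=0.8, theta=1.0,
                    tau=0.01, dt=1e-4, t_max=20.0, seed=0):
    """LIF driven by a subthreshold sinusoid plus Gaussian noise of strength sigma."""
    rng = np.random.default_rng(seed)
    v, spikes = 0.0, []
    for step in range(int(t_max / dt)):
        t = step * dt
        drive = mu + a_sig * np.sin(2 * np.pi * f_sig * t)  # peaks at 0.95 < theta
        v += (drive - v) * dt / tau + sigma * np.sqrt(dt / tau) * rng.standard_normal()
        if v >= theta:
            spikes.append(t)
            v = 0.0
    return np.array(spikes)

n_quiet = len(lif_with_signal(sigma=0.0))  # subthreshold signal alone: no spikes
n_noisy = len(lif_with_signal(sigma=0.2))  # noise lifts V across threshold
```

Sweeping `sigma` and measuring how well the spike train encodes the 5 Hz signal (e.g. via the SNR at `f_sig`) produces the characteristic SR curve: near-zero at low noise, peaking at intermediate noise, and degrading again as noise dominates.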