249 research outputs found
Metastable states and quasicycles in a stochastic Wilson-Cowan model of neuronal population dynamics
We analyze a stochastic model of neuronal population dynamics with intrinsic noise. In the thermodynamic limit N → ∞, where N determines the size of each population, the dynamics is described by deterministic Wilson–Cowan equations. On the other hand, for finite N the dynamics is described by a master equation that determines the probability of spiking activity within each population. We first consider a single excitatory population that exhibits bistability in the deterministic limit. The steady-state probability distribution of the stochastic network has maxima at points corresponding to the stable fixed points of the deterministic network; the relative weighting of the two maxima depends on the system size. For large but finite N, we calculate the exponentially small rate of noise-induced transitions between the resulting metastable states using a Wentzel–Kramers–Brillouin (WKB) approximation and matched asymptotic expansions. We then consider a two-population excitatory/inhibitory network that supports limit cycle oscillations. Using a diffusion approximation, we reduce the dynamics to a neural Langevin equation, and show how the intrinsic noise amplifies subthreshold oscillations (quasicycles).
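For orientation, the deterministic rate equation to which such a one-population model reduces in the thermodynamic limit can be written in the generic activity-based form (notation chosen here for illustration, not taken from the abstract)
\[
\tau \frac{da}{dt} = -a + f(w a + h),
\]
where a is the mean population activity, w the recurrent excitatory weight, h an external input, and f a sigmoidal firing-rate function; bistability corresponds to this equation having two stable fixed points separated by an unstable one, and these stable fixed points become the metastable states of the finite-N stochastic model.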
Stochastic neural field theory and the system-size expansion
We analyze a master equation formulation of stochastic neurodynamics for a network of synaptically coupled homogeneous neuronal populations each consisting of N identical neurons. The state of the network is specified by the fraction of active or spiking neurons in each population, and transition rates are chosen so that in the thermodynamic or deterministic limit (N → ∞) we recover standard activity-based or voltage-based rate models. We derive the lowest order corrections to these rate equations for large but finite N using two different approximation schemes, one based on the Van Kampen system-size expansion and the other based on path integral methods. Both methods yield the same series expansion of the moment equations, which at O(1/N) can be truncated to form a closed system of equations for the first and second order moments. Taking a continuum limit of the moment equations whilst keeping the system size N fixed generates a system of integrodifferential equations for the mean and covariance of the corresponding stochastic neural field model. We also show how the path integral approach can be used to study large deviation or rare event statistics underlying escape from the basin of attraction of a stable fixed point of the mean-field dynamics; such an analysis is not possible using the system-size expansion, since the latter cannot accurately determine exponentially small transitions.
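As a concrete sketch of this construction (with rates and symbols chosen here, not necessarily those used in the paper), let n_k denote the number of active neurons in population k and take the birth-death transition rates
\[
n_k \to n_k + 1 \ \text{at rate}\ N f\!\Big(\sum_{l} w_{kl}\, n_l/N + h_k\Big),
\qquad
n_k \to n_k - 1 \ \text{at rate}\ \alpha\, n_k .
\]
In the thermodynamic limit the mean activities \nu_k = \langle n_k \rangle / N then obey the standard rate equations d\nu_k/dt = -\alpha \nu_k + f(\sum_l w_{kl}\nu_l + h_k), while the O(1/N) corrections generate the coupled equations for the first and second order moments described above.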
Front propagation in stochastic neural fields
We analyse the effects of extrinsic multiplicative noise on front propagation in a scalar neural field with excitatory connections. Using a separation of time scales, we represent the fluctuating front in terms of a diffusive-like displacement (wandering) of the front from its uniformly translating position at long time scales, and fluctuations in the front profile around its instantaneous position at short time scales. One major result of our analysis is a comparison between freely propagating fronts and fronts locked to an externally moving stimulus. We show that the latter are much more robust to noise, since the stochastic wandering of the mean front profile is described by an Ornstein–Uhlenbeck process rather than a Wiener process, so that the variance in front position saturates in the long time limit rather than increasing linearly with time. Finally, we consider a stochastic neural field that supports a pulled front in the deterministic limit, and show that the wandering of such a front is now subdiffusive.
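Schematically, the separation of time scales corresponds to the decomposition (notation introduced here for illustration)
\[
U(x,t) = U_{0}\big(\xi - \Delta(t)\big) + \varepsilon^{1/2}\, \Phi\big(\xi - \Delta(t), t\big), \qquad \xi = x - c t,
\]
where U_0 is the deterministic front profile travelling at speed c, \Delta(t) is the slow stochastic displacement of the front, \Phi describes fast fluctuations of the profile about its instantaneous position, and \varepsilon is the noise strength; the variance of \Delta(t) grows linearly in time for a freely propagating front (Wiener-like wandering) but saturates for a stimulus-locked front (Ornstein–Uhlenbeck-like wandering).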
Stationary bumps in a piecewise smooth neural field model with synaptic depression
We analyze the existence and stability of stationary pulses or bumps in a one-dimensional piecewise smooth neural field model with synaptic depression. The continuum dynamics is described in terms of a nonlocal integrodifferential equation, in which the integral kernel represents the spatial distribution of synaptic weights between populations of neurons whose mean firing rate is taken to be a Heaviside function of local activity. Synaptic depression dynamically reduces the strength of synaptic weights in response to increases in activity. We show that in the case of a Mexican hat weight distribution, there exists a stable bump for sufficiently weak synaptic depression. However, as synaptic depression becomes stronger, the bump becomes unstable with respect to perturbations that shift the boundary of the bump, leading to the formation of a traveling pulse. The local stability of a bump is determined by the spectrum of a piecewise linear operator that keeps track of the sign of perturbations of the bump boundary. This results in a number of differences from previous studies of neural field models with Heaviside firing rate functions, where any discontinuities appear inside convolutions so that the resulting dynamical system is smooth. We also extend our results to the case of radially symmetric bumps in two-dimensional neural field models.
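A representative form of such a model (with symbols chosen here; the parametrization in the paper may differ) couples the activity u(x,t) to a presynaptic depression variable q(x,t):
\[
\tau \frac{\partial u}{\partial t} = -u + \int_{-\infty}^{\infty} w(x-y)\, q(y,t)\, H\big(u(y,t) - \kappa\big)\, dy,
\qquad
\frac{\partial q}{\partial t} = \frac{1 - q}{\tau_q} - \beta\, q\, H\big(u - \kappa\big),
\]
where w is a Mexican hat weight kernel, H is the Heaviside firing-rate function with threshold \kappa, and \beta sets the strength of depression. The piecewise smooth structure arises because H(u - \kappa) appears outside any convolution in the q equation, so perturbations of the bump boundary enter the linearization discontinuously.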
Spatially structured oscillations in a two-dimensional excitatory neuronal network with synaptic depression
We study the spatiotemporal dynamics of a two-dimensional excitatory neuronal network with synaptic depression. Coupling between populations of neurons is taken to be nonlocal, while depression is taken to be local and presynaptic. We show that the network supports a wide range of spatially structured oscillations, which are suggestive of phenomena seen in cortical slice experiments and in vivo. The particular form of the oscillations depends on initial conditions and the level of background noise. Given an initial, spatially localized stimulus, activity evolves to a spatially localized oscillating core that periodically emits target waves. Low levels of noise can spontaneously generate several pockets of oscillatory activity that interact via their target patterns. Periodic activity in space can also organize into spiral waves, provided that there is some source of rotational symmetry breaking due to external stimuli or noise. In the high gain limit, no oscillatory behavior exists, but a transient stimulus can lead to a single, outward propagating target wave.
Neural field model of binocular rivalry waves
We present a neural field model of binocular rivalry waves in visual cortex. For each eye we consider a one-dimensional network of neurons that respond maximally to a particular feature of the corresponding image such as the orientation of a grating stimulus. Recurrent connections within each one-dimensional network are assumed to be excitatory, whereas connections between the two networks are inhibitory (cross-inhibition). Slow adaptation is incorporated into the model by taking the network connections to exhibit synaptic depression. We derive an analytical expression for the speed of a binocular rivalry wave as a function of various neurophysiological parameters, and show how properties of the wave are consistent with the wave-like propagation of perceptual dominance observed in recent psychophysical experiments. In addition to providing an analytical framework for studying binocular rivalry waves, we show how neural field methods provide insights into the mechanisms underlying the generation of the waves. In particular, we highlight the important role of slow adaptation in providing a "symmetry breaking mechanism" that allows waves to propagate.
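A schematic version of such a competitive model (notation chosen here for illustration) consists of left- and right-eye activity variables u_L and u_R with depressing synapses,
\[
\tau \frac{\partial u_{L}}{\partial t} = -u_{L} + \int w_{e}(x-y)\, q_{L}(y,t)\, f\big(u_{L}(y,t)\big)\, dy - \int w_{i}(x-y)\, q_{R}(y,t)\, f\big(u_{R}(y,t)\big)\, dy + I_{L}(x),
\]
together with the corresponding equation for u_R (L and R interchanged) and slow depression variables q_{L,R} driven by the activity of their own network; w_e mediates recurrent excitation, w_i cross-inhibition, and I_{L,R} are the stimuli presented to the two eyes. The rivalry wave is then a front of dominance in one network invading the region currently dominated by the other.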
The effects of noise on binocular rivalry waves: a stochastic neural field model
We analyse the effects of extrinsic noise on traveling waves of visual perception in a competitive neural field model of binocular rivalry. The model consists of two one-dimensional excitatory neural fields, whose activity variables represent the responses to left-eye and right-eye stimuli, respectively. The two networks mutually inhibit each other, and slow adaptation is incorporated into the model by taking the network connections to exhibit synaptic depression. We first show how, in the absence of any noise, the system supports a propagating composite wave consisting of an invading activity front in one network co-moving with a retreating front in the other network. Using a separation of time scales and perturbation methods previously developed for stochastic reaction-diffusion equations, we then show how multiplicative noise in the activity variables leads to a diffusive-like displacement (wandering) of the composite wave from its uniformly translating position at long time scales, and fluctuations in the wave profile around its instantaneous position at short time scales. The multiplicative noise also renormalizes the mean speed of the wave. We use our analysis to calculate the first passage time distribution for a stochastic rivalry wave to travel a fixed distance, which we find to be given by an inverse Gaussian. Finally, we investigate the effects of noise in the depression variables, which under an adiabatic approximation leads to quenched disorder in the neural fields during propagation of a wave.
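For reference, the inverse Gaussian first passage time density for a wave with renormalized mean speed c and effective diffusivity D to travel a distance L takes the standard form (parameter conventions chosen here)
\[
f(T) = \frac{L}{\sqrt{4\pi D T^{3}}}\, \exp\!\left[-\frac{(L - c T)^{2}}{4 D T}\right], \qquad T > 0,
\]
with mean L/c and variance 2DL/c^3.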
Random intermittent search and the tug-of-war model of motor-driven transport
We formulate the tug-of-war model of microtubule cargo transport by multiple molecular motors as an intermittent random search for a hidden target. A motor-complex consisting of multiple molecular motors with opposing directional preference is modeled using a discrete Markov process. The motors randomly pull each other off of the microtubule so that the state of the motor-complex is determined by the number of bound motors. The tug-of-war model prescribes the state transition rates and corresponding cargo velocities in terms of experimentally measured physical parameters. We add space to the resulting Chapman-Kolmogorov (CK) equation so that we can consider delivery of the cargo to a hidden target somewhere on the microtubule track. Using a quasi-steady state (QSS) reduction technique we calculate analytical approximations of the mean first passage time (MFPT) to find the target. We show that there exists an optimal adenosine triphosphate (ATP) concentration that minimizes the MFPT for two different cases: (i) the motor complex is composed of equal numbers of kinesin motors bound to two different microtubules (symmetric tug-of-war model), and (ii) the motor complex is composed of different numbers of kinesin and dynein motors bound to a single microtubule (asymmetric tug-of-war model).
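In schematic form (indices and symbols introduced here), the spatially extended CK equation is the velocity-jump system
\[
\frac{\partial p_{i}(x,t)}{\partial t} = -v_{i}\, \frac{\partial p_{i}(x,t)}{\partial x} + \sum_{j} A_{ij}\, p_{j}(x,t),
\]
where p_i(x,t) is the probability density that the cargo is at position x on the track with the motor-complex in internal state i (labelled by the numbers of bound motors), v_i is the cargo velocity in state i prescribed by the tug-of-war model, and A is the matrix of state transition rates, which depend on parameters such as the ATP concentration. The QSS reduction exploits the fact that transitions between internal states are fast relative to the spatial motion of the cargo.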
A theory for the alignment of cortical feature maps during development
We present a developmental model of ocular dominance column formation that takes into account the existence of an array of intrinsically specified cytochrome oxidase blobs. We assume that there is some molecular substrate for the blobs early in development, which generates a spatially periodic modulation of experience-dependent plasticity. We determine the effects of such a modulation on a competitive Hebbian mechanism for the modification of the feedforward afferents from the left and right eyes. We show how alternating left and right eye dominated columns can develop, in which the blobs are aligned with the centers of the ocular dominance columns and receive a greater density of feedforward connections, thus becoming defined extrinsically. More generally, our results suggest that the presence of periodically distributed anatomical markers early in development could provide a mechanism for the alignment of cortical feature maps.
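One purely schematic way to express such a modulated competitive rule (all notation introduced here for illustration) is
\[
\frac{\partial w_{L}(x,t)}{\partial t} = \epsilon\, \Lambda(x)\, c_{L}(x,t), \qquad
\frac{\partial w_{R}(x,t)}{\partial t} = \epsilon\, \Lambda(x)\, c_{R}(x,t),
\]
where w_{L,R} are the feedforward weights from the left and right eyes, c_{L,R} are correlation-based growth terms, competition is enforced by a normalization constraint on the total weight w_L + w_R, and \Lambda(x) is a spatially periodic plasticity modulation peaked at the blob locations; the periodic factor \Lambda biases where left- or right-eye dominance emerges, providing the alignment mechanism described above.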
Quasi-steady state reduction of molecular motor-based models of directed intermittent search
We present a quasi-steady state reduction of a linear reaction-hyperbolic master equation describing the directed intermittent search for a hidden target by a motor-driven particle moving on a one-dimensional filament track. The particle is injected at one end of the track and randomly switches between stationary search phases and mobile, non-search phases that are biased in the anterograde direction. There is a finite possibility that the particle fails to find the target due to an absorbing boundary at the other end of the track. Such a scenario is exemplified by the motor-driven transport of vesicular cargo to synaptic targets located on the axon or dendrites of a neuron. The reduced model is described by a scalar Fokker–Planck (FP) equation, which has an additional inhomogeneous decay term that takes into account absorption by the target. The FP equation is used to compute the probability of finding the hidden target (hitting probability) and the corresponding conditional mean first passage time (MFPT) in terms of the effective drift velocity V, diffusivity D and target absorption rate λ of the random search. The quasi-steady state reduction determines V, D and λ in terms of the various biophysical parameters of the underlying motor transport model. We first apply our analysis to a simple 3-state model and show that our quasi-steady state reduction yields results that are in excellent agreement with Monte Carlo simulations of the full system under physiologically reasonable conditions. We then consider a more complex multiple motor model of bidirectional transport, in which opposing motors compete in a "tug-of-war," and use this to explore how ATP concentration might regulate the delivery of cargo to synaptic targets.
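A schematic form of the reduced equation consistent with this description (the target indicator \chi is notation introduced here) is
\[
\frac{\partial p(x,t)}{\partial t} = -V\, \frac{\partial p}{\partial x} + D\, \frac{\partial^{2} p}{\partial x^{2}} - \lambda\, \chi(x)\, p(x,t),
\]
where \chi(x) = 1 inside the target region and 0 outside. The hitting probability is then obtained by integrating the absorbed flux \lambda \int \chi(x)\, p(x,t)\, dx over all time, and the conditional MFPT is the corresponding normalized first moment with respect to t.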
- …