
    Nonlocal Ginzburg-Landau equation for cortical pattern formation

    We show how a nonlocal version of the real Ginzburg-Landau (GL) equation arises in a large-scale recurrent network model of primary visual cortex. We treat cortex as a continuous two-dimensional sheet of cells that signal both the position and orientation of a local visual stimulus. The recurrent circuitry is decomposed into a local part, which contributes primarily to the orientation tuning properties of the cells, and a long-range part that introduces spatial correlations. We assume that (a) the local network exists in a balanced state such that it operates close to a point of instability and (b) the long-range connections are weak and scale with the bifurcation parameter of the dynamical instability generated by the local circuitry. Carrying out a perturbation expansion with respect to the long-range coupling strength then generates a nonlocal coupling term in the GL amplitude equation. We use the nonlocal GL equation to analyze how axonal propagation delays arising from the slow conduction velocities of the long-range connections affect spontaneous pattern formation
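    For orientation, a generic real Ginzburg-Landau amplitude equation with weak nonlocal coupling and an axonal delay can be written as follows; the notation (amplitude A, bifurcation parameter \mu, long-range kernel J, conduction velocity v, small coupling parameter \epsilon) is assumed here for illustration and need not match the paper's conventions:

        \partial_t A(x,t) = \mu A + \partial_x^2 A - |A|^2 A + \epsilon \int J(x-x')\, A\big(x',\, t - |x-x'|/v\big)\, dx'

    The delayed argument t - |x-x'|/v is the schematic counterpart of the slow conduction along long-range connections discussed above.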

    Two-dimensional bumps in piecewise smooth neural fields with synaptic depression

    We analyze radially symmetric bumps in a two-dimensional piecewise-smooth neural field model with synaptic depression. The continuum dynamics is described in terms of a nonlocal integrodifferential equation, in which the integral kernel represents the spatial distribution of synaptic weights between populations of neurons whose mean firing rate is taken to be a Heaviside function of local activity. Synaptic depression dynamically reduces the strength of synaptic weights in response to increases in activity. We show that in the case of a Mexican hat weight distribution, sufficiently strong synaptic depression can destabilize a stationary bump solution that would be stable in the absence of depression. Numerically it is found that the resulting instability leads to the formation of a traveling spot. The local stability of a bump is determined by solutions to a system of pseudolinear equations that take into account the sign of perturbations around the circular bump boundary
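    A standard way to write a planar neural field with presynaptic depression and a Heaviside firing rate, of the kind analysed here, is sketched below; the symbols (activity u, depression variable q, weight kernel w, threshold \kappa, depression recovery time \alpha and depletion rate \beta) are generic choices rather than the paper's exact notation:

        \tau\, \partial_t u(\mathbf{r},t) = -u + \int_{\mathbb{R}^2} w(|\mathbf{r}-\mathbf{r}'|)\, q(\mathbf{r}',t)\, H\big(u(\mathbf{r}',t)-\kappa\big)\, d\mathbf{r}'
        \partial_t q(\mathbf{r},t) = \frac{1-q}{\alpha} - \beta\, q\, H(u-\kappa)

    A radially symmetric bump is then a stationary solution whose activity exceeds \kappa only on a disc, with the Heaviside nonlinearity acting on perturbations of the bump boundary.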

    Effects of synaptic depression and adaptation on spatiotemporal dynamics of an excitatory neuronal network

    We analyze the spatiotemporal dynamics of a system of integro-differential equations that describes a one-dimensional excitatory neuronal network with synaptic depression and spike frequency adaptation. Physiologically suggestive forms are used for both types of negative feedback. We also consider the effects of employing two different types of firing rate function, a Heaviside step function and a piecewise linear function. We first derive conditions for the existence of traveling fronts and pulses in the case of a Heaviside step firing rate, and show that adaptation plays a relatively minor role in determining the characteristics of traveling waves. We then derive conditions for the existence and stability of stationary pulses or bumps, and show that a purely excitatory network with synaptic depression cannot support stable bumps. Moreover, bumps do not exist in the presence of adaptation. Finally, in the case of a piecewise linear firing rate function, we show numerically that the network also supports self-sustained oscillations between an Up state and a Down state, in which a spatially localized oscillating core periodically emits pulses at each cycle.
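    As a rough illustration of the type of model analysed here, the following minimal sketch integrates a generic one-dimensional excitatory neural field with synaptic depression, spike frequency adaptation and a Heaviside firing rate by forward Euler; the exponential kernel, the subtractive form of adaptation and all parameter values are assumptions made for illustration, not the paper's.

        import numpy as np

        # Generic 1D excitatory neural field with synaptic depression (q),
        # spike frequency adaptation (a) and a Heaviside firing rate.
        # All parameter values below are illustrative assumptions.
        N, L = 512, 50.0                      # grid points, domain length
        dx = L / N
        x = np.linspace(-L / 2, L / 2, N, endpoint=False)
        dt, T = 0.01, 30.0                    # time step, total simulation time
        tau, kappa = 1.0, 0.1                 # membrane time constant, firing threshold
        alpha, rho = 20.0, 5.0                # depression recovery time, depletion rate
        eps, gamma, beta = 5.0, 0.5, 1.0      # adaptation time constant, gain, strength

        # Purely excitatory exponential weight kernel with unit integral.
        w = 0.5 * np.exp(-np.abs(x))
        w_hat = np.fft.fft(np.fft.ifftshift(w))   # kernel aligned for FFT convolution

        u = np.exp(-x**2)                     # localised initial activity
        q = np.ones(N)                        # fully recovered synapses
        a = np.zeros(N)                       # no initial adaptation

        for _ in range(int(T / dt)):
            f = (u > kappa).astype(float)     # Heaviside firing rate
            drive = dx * np.real(np.fft.ifft(w_hat * np.fft.fft(q * f)))
            u += dt / tau * (-u - beta * a + drive)
            q += dt * ((1.0 - q) / alpha - rho * q * f)
            a += dt / eps * (-a + gamma * f)

        print("width of suprathreshold region:", dx * np.count_nonzero(u > kappa))

    With adaptation switched off (beta = 0) the same loop can be used to probe whether a localised bump persists or destabilises, echoing the stability results stated above.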

    Spatially structured oscillations in a two-dimensional excitatory neuronal network with synaptic depression

    We study the spatiotemporal dynamics of a two-dimensional excitatory neuronal network with synaptic depression. Coupling between populations of neurons is taken to be nonlocal, while depression is taken to be local and presynaptic. We show that the network supports a wide range of spatially structured oscillations, which are suggestive of phenomena seen in cortical slice experiments and in vivo. The particular form of the oscillations depends on initial conditions and the level of background noise. Given an initial, spatially localized stimulus, activity evolves to a spatially localized oscillating core that periodically emits target waves. Low levels of noise can spontaneously generate several pockets of oscillatory activity that interact via their target patterns. Periodic activity in space can also organize into spiral waves, provided that there is some source of rotational symmetry breaking due to external stimuli or noise. In the high gain limit, no oscillatory behavior exists, but a transient stimulus can lead to a single, outward propagating target wave
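    The high gain limit referred to above can be made concrete with a sigmoidal firing rate; in one generic parameterisation (gain \eta and threshold \kappa, both assumed here),

        f(u) = \frac{1}{1 + e^{-\eta (u-\kappa)}}, \qquad f(u) \to H(u-\kappa) \ \text{as} \ \eta \to \infty,

    so the oscillating cores and target waves described here occur at finite gain, while the Heaviside limit supports only the single transient target wave.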

    Traveling pulses and wave propagation failure in inhomogeneous neural media

    We use averaging and homogenization theory to study the propagation of traveling pulses in an inhomogeneous excitable neural network. The network is modeled in terms of a nonlocal integro-differential equation, in which the integral kernel represents the spatial distribution of synaptic weights. We show how a spatially periodic modulation of homogeneous synaptic connections leads to an effective reduction in the speed of a traveling pulse. In the case of large amplitude modulations, the traveling pulse represents the envelope of a multibump solution, in which individual bumps are nonpropagating and transient. The appearance (disappearance) of bumps at the leading (trailing) edge of the pulse generates the coherent propagation of the pulse. Wave propagation failure occurs when activity is insufficient to maintain bumps at the leading edge
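    One common way to encode the periodic modulation of synaptic connections used in such homogenization arguments, in assumed notation (homogeneous kernel w, modulation amplitude \rho, periodic function g and small spatial period \varepsilon), is

        w(x,x') = w(x-x')\,\big[1 + \rho\, g(x'/\varepsilon)\big],

    after which averaging over the fast variable x'/\varepsilon yields an effective homogeneous kernel together with a perturbative correction that lowers the pulse speed, consistent with the reduction in speed described above.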

    Neural field models with threshold noise

    The original neural field model of Wilson and Cowan is often interpreted as the averaged behaviour of a network of switch-like neural elements with a distribution of switch thresholds, giving rise to the classic sigmoidal population firing-rate function so prevalent in large-scale neuronal modelling. In this paper we explore the effects of such threshold noise without recourse to averaging and show that spatial correlations can have a strong effect on the behaviour of waves and patterns in continuum models. Moreover, for a prescribed spatial covariance function we explore the differences in behaviour that can emerge when the underlying stationary distribution is changed from Gaussian to non-Gaussian. For travelling front solutions, in a system with exponentially decaying spatial interactions, we make use of an interface approach to calculate the instantaneous wave speed analytically as a series expansion in the noise strength. From this we find that, for weak noise, the spatially averaged speed depends only on the choice of covariance function and not on the shape of the stationary distribution. For a system with a Mexican-hat spatial connectivity we further find that noise can induce localised bump solutions, and using an interface stability argument show that there can be multiple stable solution branches.
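    A schematic form of the threshold-noise model discussed here, in assumed notation, is

        \partial_t u(x,t) = -u(x,t) + \int w(x-y)\, H\big(u(y,t) - h(y)\big)\, dy, \qquad h(y) = h_0 + \sigma\, \xi(y),

    where \xi is a stationary random field with the prescribed spatial covariance and \sigma is the noise strength; the interface calculation mentioned above then expands the instantaneous front speed as a series c = c_0 + \sigma c_1 + \sigma^2 c_2 + \cdots in the noise strength.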

    Macroscopic coherent structures in a stochastic neural network: from interface dynamics to coarse-grained bifurcation analysis

    We study coarse pattern formation in a cellular automaton modelling a spatially extended stochastic neural network. The model, originally proposed by Gong and Robinson (Phys Rev E 85(5):055101(R), 2012), is known to support stationary and travelling bumps of localised activity. We pose the model on a ring and study the existence and stability of these patterns in various limits using a combination of analytical and numerical techniques. In a purely deterministic version of the model, posed on a continuum, we construct bumps and travelling waves analytically using standard interface methods from neural field theory. In a stochastic version with Heaviside firing rate, we construct approximate analytical probability mass functions associated with bumps and travelling waves. In the full stochastic model posed on a discrete lattice, where a coarse analytic description is unavailable, we compute patterns and their linear stability using equation-free methods. The lifting procedure used in the coarse time-stepper is informed by the analysis in the deterministic and stochastic limits. In all settings, we identify the synaptic profile as a mesoscopic variable, and the width of the corresponding activity set as a macroscopic variable. Stationary and travelling bumps have similar meso- and macroscopic profiles, but different microscopic structure; hence we propose lifting operators which use microscopic motifs to disambiguate them. We provide numerical evidence that waves are supported by a combination of high synaptic gain and long refractory times, while meandering bumps are elicited by short refractory times.
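    The coarse time-stepper referred to here follows the standard equation-free template: with a macroscopic variable U (for example the width of the activity set), a lifting operator \mathcal{L}, the microscopic evolution \mathcal{P}_T over a time T, and a restriction operator \mathcal{R} (the specific operators being those constructed in the paper, not reproduced here),

        \Phi_T(U) = (\mathcal{R} \circ \mathcal{P}_T \circ \mathcal{L})(U), \qquad F(U) := U - \Phi_T(U) = 0,

    so, schematically, coarse patterns are computed as roots of F and their coarse linear stability follows from the spectrum of the Jacobian of \Phi_T at those roots.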

    Dynamics of temporally interleaved percept-choice sequences: interaction via adaptation in shared neural populations

    At the onset of visually ambiguous or conflicting stimuli, our visual system quickly ‘chooses’ one of the possible percepts. Interrupted presentation of the same stimuli has revealed that each percept-choice depends strongly on the history of previous choices and the duration of the interruptions. Recent psychophysics and modeling has discovered increasingly rich dynamical structure in such percept-choice sequences, and explained or predicted these patterns in terms of simple neural mechanisms: fast cross-inhibition and slow shunting adaptation that also causes a near-threshold facilitatory effect. However, we still lack a clear understanding of the dynamical interactions between two distinct, temporally interleaved, percept-choice sequences—a type of experiment that probes which feature-level neural network connectivity and dynamics allow the visual system to resolve the vast ambiguity of everyday vision. Here, we fill this gap. We first show that a simple column-structured neural network captures the known phenomenology, and then identify and analyze the crucial underlying mechanism via two stages of model-reduction: A 6-population reduction shows how temporally well-separated sequences become coupled via adaptation in neurons that are shared between the populations driven by either of the two sequences. The essential dynamics can then be reduced further, to a set of iterated adaptation-maps. This enables detailed analysis, resulting in the prediction of phase-diagrams of possible sequence-pair patterns and their response to perturbations. These predictions invite a variety of future experiments
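    A minimal caricature of the competition-plus-adaptation mechanism described here, for two percept populations i \ne j with inputs I_i, cross-inhibition strength \beta and adaptation variables a_i (written with subtractive rather than shunting adaptation purely for brevity, and with notation assumed), is

        \tau\, \dot{u}_i = -u_i + f\big(I_i - \beta\, u_j - \varphi\, a_i\big), \qquad \tau_a\, \dot{a}_i = -a_i + u_i,

    and because the adaptation variables decay only partially during an interruption, the percept chosen at the next onset is a function of the adaptation state left by the previous one, which is what the iterated adaptation-maps in the final reduction capture.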