
    Geometric Analysis of Synchronization in Neuronal Networks with Global Inhibition and Coupling Delays

    We study synaptically coupled neuronal networks to identify the role of coupling delays in the network's synchronized behavior. We consider a network of excitable, relaxation oscillator neurons in which two distinct populations, one excitatory and one inhibitory, are coupled and interact with each other. The excitatory population is uncoupled, while the inhibitory population is tightly coupled. A geometric singular perturbation analysis yields existence and stability conditions for synchronization states under different firing patterns between the two populations, along with formulas for the periods of such synchronous solutions. Our results demonstrate that the presence of coupling delays in the network promotes synchronization. Numerical simulations are conducted to supplement and validate the analytical results. We show that the results carry over to a model for spindle sleep rhythms in thalamocortical networks, one of the biological systems that motivated our study. The analysis helps to explain how coupling delays in either excitatory or inhibitory synapses contribute to producing synchronized rhythms.
    Comment: 43 pages, 12 figures
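    The kind of delayed-coupling setup the abstract describes can be sketched with a pair of relaxation oscillators and a history buffer for the delayed signal. The model below is a generic FitzHugh-Nagumo caricature with illustrative parameters, not the paper's biophysical model; the point is only the mechanics of simulating coupling delays.

```python
from collections import deque

def simulate_pair(delay_steps=200, dt=0.01, steps=20000):
    """Two FitzHugh-Nagumo relaxation oscillators, each driven by the
    DELAYED voltage of the other (illustrative diffusive coupling;
    parameters eps, a, g are assumptions, not from the paper)."""
    eps, a, g = 0.08, 0.7, 0.5   # time-scale separation, recovery offset, coupling gain
    v = [0.1, 0.1]
    w = [0.0, 0.0]
    # each history buffer holds the last delay_steps voltages; its leftmost
    # entry is the value from delay_steps iterations ago
    hist = [deque([v[i]] * delay_steps, maxlen=delay_steps) for i in range(2)]
    for _ in range(steps):
        v_new, w_new = [0.0, 0.0], [0.0, 0.0]
        for i in range(2):
            v_del = hist[1 - i][0]                      # partner's delayed voltage
            dv = v[i] - v[i] ** 3 / 3 - w[i] + g * (v_del - v[i])
            dw = eps * (v[i] + a - 0.8 * w[i])
            v_new[i] = v[i] + dt * dv
            w_new[i] = w[i] + dt * dw
        for i in range(2):
            hist[i].append(v[i])                        # push current value into history
            v[i], w[i] = v_new[i], w_new[i]
    return v, w
```

    Identical cells started from identical states remain exactly synchronized under this symmetric delayed coupling; testing whether that synchronous state is stable to perturbations is where the paper's geometric singular perturbation analysis comes in.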

    Topological exploration of artificial neuronal network dynamics

    One of the paramount challenges in neuroscience is to understand the dynamics of individual neurons and how they give rise to network dynamics when interconnected. Historically, researchers have resorted to graph theory, statistics, and statistical mechanics to describe the spatiotemporal structure of such network dynamics. Our novel approach employs tools from algebraic topology to characterize the global properties of network structure and dynamics. We propose a method based on persistent homology to automatically classify network dynamics using topological features of spaces built from various spike-train distances. We investigate the efficacy of our method by simulating activity in three small artificial neural networks with different sets of parameters, giving rise to dynamics that can be classified into four regimes. We then compute three measures of spike train similarity and use persistent homology to extract topological features that are fundamentally different from those used in traditional methods. Our results show that a machine learning classifier trained on these features can accurately predict the regime of the network it was trained on and also generalize to other networks that were not presented during training. Moreover, we demonstrate that using features extracted from multiple spike-train distances systematically improves the performance of our method.
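    The pipeline starts from pairwise spike-train distances, which then feed a persistent-homology computation (e.g. a Vietoris-Rips filtration on the distance matrix). The abstract does not name the three distances used, so as an illustration here is one standard choice, the Victor-Purpura metric, as a plain dynamic program:

```python
def victor_purpura(s, t, q=1.0):
    """Victor-Purpura spike-train distance: the minimal cost of editing
    spike train s into t, where inserting or deleting a spike costs 1
    and moving a spike by dt costs q*|dt| (edit-distance DP)."""
    n, m = len(s), len(t)
    G = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        G[i][0] = float(i)                 # delete all spikes of s
    for j in range(1, m + 1):
        G[0][j] = float(j)                 # insert all spikes of t
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            G[i][j] = min(
                G[i - 1][j] + 1.0,                                   # delete s[i-1]
                G[i][j - 1] + 1.0,                                   # insert t[j-1]
                G[i - 1][j - 1] + q * abs(s[i - 1] - t[j - 1]),      # shift spike
            )
    return G[n][m]
```

    Moving a spike further than 2/q is never cheaper than a delete plus an insert, so the metric saturates at 2 per unmatched spike; the cost parameter q sets the timescale on which spike timing (rather than spike count) matters.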

    Frequency control in synchronized networks of inhibitory neurons

    We analyze the control of frequency for a synchronized inhibitory neuronal network. The analysis is done for a reduced membrane model with a biophysically based synaptic influence. We argue that such a reduced model can quantitatively capture the frequency behavior of a larger class of neuronal models. We show that in different parameter regimes, the network frequency depends in different ways on the intrinsic and synaptic time constants. Only in one portion of the parameter space, called 'phasic', is the network period proportional to the synaptic decay time. These results are discussed in connection with previous work of the authors, which showed that for mildly heterogeneous networks, synchrony breaks down but coherence is preserved much more for systems in the phasic regime than in the other regimes. These results imply that for mildly heterogeneous networks, the existence of a coherent rhythm implies a linear dependence of the network period on the synaptic decay time, and a much weaker dependence on the drive to the cells. We give experimental evidence for this conclusion.
    Comment: 18 pages, 3 figures, Kluwer.sty. J. Comp. Neurosci. (in press). Originally submitted to the neuro-sys archive, which was never publicly announced (was 9803001)
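    The phasic-regime dependence of the period on the synaptic decay time can be illustrated with a minimal caricature: a single leaky integrate-and-fire cell that inhibits itself stands in for a synchronized inhibitory network. This toy model and its parameters are assumptions for illustration, not the paper's reduced membrane model.

```python
def network_period(tau_s, g=4.0, I=2.0, dt=0.01, t_max=200.0):
    """Self-inhibited leaky integrate-and-fire cell, a one-cell caricature
    of a synchronized inhibitory network.  Dynamics (illustrative units):
        dv/dt = I - v - g*s,   ds/dt = -s/tau_s
    with a spike and reset (v -> 0, s -> s + 1) when v >= 1.
    Returns the inter-spike period after transients."""
    v, s, t = 0.0, 0.0, 0.0
    spikes = []
    while t < t_max:
        v += dt * (I - v - g * s)      # membrane charges against inhibition
        s += dt * (-s / tau_s)         # synaptic gate decays with tau_s
        t += dt
        if v >= 1.0:
            spikes.append(t)
            v, s = 0.0, s + 1.0        # reset and re-trigger inhibition
    return spikes[-1] - spikes[-2]
```

    With strong inhibition (g well above threshold), the cell can only fire once the synaptic variable has decayed, so the period scales roughly linearly with tau_s, echoing the phasic-regime result described above.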

    Rhythms of the nervous system: mathematical themes and variations

    The nervous system displays a variety of rhythms in both waking and sleep. These rhythms have been closely associated with different behavioral and cognitive states, but it is still unknown how the nervous system makes use of these rhythms to perform functionally important tasks. To address such questions, it is first useful to understand in a mechanistic way the origin of the rhythms, their interactions, the signals that create the transitions among rhythms, and the ways in which rhythms filter the signals to a network of neurons. This talk discusses how dynamical systems have been used to investigate the origin, properties, and interactions of rhythms in the nervous system. It focuses on how the underlying physiology of the cells and synapses of the networks shapes the dynamics of the network in different contexts, allowing a variety of dynamical behaviors to be displayed by the same network. The work is presented using a series of related case studies on different rhythms, chosen to highlight mathematical issues and to suggest further mathematical work to be done. The topics include: the different roles of excitation and inhibition in creating synchronous assemblies of cells, different kinds of building blocks for neural oscillations, and transitions among rhythms. The mathematical issues include the reduction of large networks to low-dimensional maps, the role of noise, global bifurcations, and the use of probabilistic formulations.
    Published version

    Evolution of network structure by temporal learning

    We study the effect of learning dynamics on network topology. A network of discrete dynamical systems is considered for this purpose, and the coupling strengths are made to evolve according to a temporal learning rule based on the paradigm of spike-timing-dependent plasticity. This incorporates the necessary competition between different edges. The final network we obtain is robust and has a broad degree distribution.
    Comment: revised manuscript in communication
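    The standard pair-based STDP rule that such temporal learning rules are modeled on can be written in a few lines. The window shape is canonical; the specific amplitudes and time constants below are illustrative assumptions (a slight depression bias, a_minus > a_plus, is one common way to get the competition between edges the abstract mentions):

```python
import math

def stdp_dw(dt_spike, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window: weight change for one pre/post spike pair.
    dt_spike = t_post - t_pre (ms).  Pre-before-post (dt > 0) potentiates,
    post-before-pre (dt < 0) depresses, with exponentially decaying
    influence on either side of coincidence."""
    if dt_spike > 0:
        return a_plus * math.exp(-dt_spike / tau_plus)    # potentiation branch
    elif dt_spike < 0:
        return -a_minus * math.exp(dt_spike / tau_minus)  # depression branch
    return 0.0                                            # exact coincidence
```

    Applied over many spike pairs, edges whose presynaptic cell reliably fires just before the postsynaptic cell strengthen while the reverse-ordered edges weaken, which is the mechanism that reshapes the network topology.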

    STDP-driven networks and the \emph{C. elegans} neuronal network

    We study the dynamics of the structure of a formal neural network wherein the strengths of the synapses are governed by spike-timing-dependent plasticity (STDP). For properly chosen input signals, there exists a steady state with a residual network. We compare the motif profile of such a network with that of a real neural network of \emph{C. elegans} and identify robust qualitative similarities. In particular, our extensive numerical simulations show that the resulting STDP-driven network is robust under variations of the model parameters.
    Comment: 16 pages, 14 figures
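    A motif profile is a count of small directed subgraph patterns. As one representative entry of such a profile, here is a brute-force count of feed-forward loops (the abstract does not specify which motifs were compared, so this particular motif is chosen for illustration):

```python
from itertools import permutations

def feedforward_motifs(edges):
    """Count feed-forward-loop motifs (a->b, b->c, a->c with no reverse
    edges among the three nodes) in a directed graph given as a set of
    (src, dst) pairs.  Brute force over node triples; fine for small
    graphs like the ~300-neuron C. elegans wiring diagram."""
    E = set(edges)
    nodes = {x for e in E for x in e}
    count = 0
    for a, b, c in permutations(nodes, 3):
        if ((a, b) in E and (b, c) in E and (a, c) in E
                and (b, a) not in E and (c, b) not in E and (c, a) not in E):
            count += 1
    return count
```

    A full motif profile would tally all sixteen connected 3-node directed patterns the same way and compare the resulting vectors between the STDP-grown network and the biological one.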

    The effects of noise on binocular rivalry waves: a stochastic neural field model

    We analyse the effects of extrinsic noise on traveling waves of visual perception in a competitive neural field model of binocular rivalry. The model consists of two one-dimensional excitatory neural fields, whose activity variables represent the responses to left-eye and right-eye stimuli, respectively. The two networks mutually inhibit each other, and slow adaptation is incorporated into the model by taking the network connections to exhibit synaptic depression. We first show how, in the absence of any noise, the system supports a propagating composite wave consisting of an invading activity front in one network co-moving with a retreating front in the other network. Using a separation of time scales and perturbation methods previously developed for stochastic reaction-diffusion equations, we then show how multiplicative noise in the activity variables leads to a diffusive-like displacement (wandering) of the composite wave from its uniformly translating position at long time scales, and to fluctuations in the wave profile around its instantaneous position at short time scales. The multiplicative noise also renormalizes the mean speed of the wave. We use our analysis to calculate the first passage time distribution for a stochastic rivalry wave to travel a fixed distance, which we find to be given by an inverse Gaussian. Finally, we investigate the effects of noise in the depression variables, which under an adiabatic approximation leads to quenched disorder in the neural fields during propagation of a wave.
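    The inverse Gaussian first passage time result has a simple closed form. As a sketch: if the wave position behaves like a drift-diffusion process with mean speed c and effective diffusivity D (the paper derives its own c and D from the perturbation analysis), the standard first-passage result gives the density below for the time to cover a distance L.

```python
import math

def inverse_gaussian_pdf(t, mu, lam):
    """Inverse Gaussian density with mean mu and shape parameter lam:
    f(t) = sqrt(lam / (2*pi*t^3)) * exp(-lam*(t - mu)^2 / (2*mu^2*t))."""
    return math.sqrt(lam / (2.0 * math.pi * t ** 3)) * \
        math.exp(-lam * (t - mu) ** 2 / (2.0 * mu ** 2 * t))

def fpt_density_for_wave(L, c, D):
    """First-passage-time density for a wave with mean speed c and
    effective diffusion D to travel distance L: mu = L/c, lam = L^2/(2D)
    (textbook drift-diffusion first-passage parameterization)."""
    mu, lam = L / c, L ** 2 / (2.0 * D)
    return lambda t: inverse_gaussian_pdf(t, mu, lam)
```

    The mean first passage time is mu = L/c and the variance is mu**3/lam, so weaker noise (smaller D, larger lam) gives a sharper, more deterministic arrival time.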

    The Utility of Phase Models in Studying Neural Synchronization

    Synchronized neural spiking is associated with many cognitive functions and thus merits study for its own sake. The analysis of neural synchronization naturally leads to the study of repetitive spiking and consequently to the analysis of coupled neural oscillators. Coupled oscillator theory thus informs the synchronization of spiking neuronal networks. A crucial aspect of coupled oscillator theory is the phase response curve (PRC), which describes the impact of a perturbation on the phase of an oscillator. In neural terms, the perturbation represents an incoming synaptic potential, which may either advance or retard the timing of the next spike. The phase response curves and the form of coupling between reciprocally coupled oscillators define the phase interaction function, which in turn predicts the synchronization outcome (in-phase versus anti-phase) and the rate of convergence. We review the two classes of PRC and demonstrate the utility of the phase model in predicting synchronization in reciprocally coupled neural models. In addition, we compare the rate of convergence for all combinations of reciprocally coupled Class I and Class II oscillators. These findings predict the general synchronization outcomes of broad classes of neurons under both inhibitory and excitatory reciprocal coupling.
    Comment: 18 pages, 5 figures
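    For two reciprocally coupled phase oscillators with interaction function H, the phase difference phi obeys d(phi)/dt = H(-phi) - H(phi); its stable zeros are the predicted locking states. A minimal sketch, assuming the illustrative choice H(phi) = sin(phi) (for which the odd part makes in-phase locking stable; the review's actual H is derived from the PRC and synapse type):

```python
import math

def phase_difference_evolution(phi0=1.0, dt=0.01, steps=4000):
    """Evolve the phase difference of two reciprocally coupled phase
    oscillators with interaction function H(phi) = sin(phi):
        d(phi)/dt = H(-phi) - H(phi) = -2*sin(phi).
    phi = 0 (in-phase) is a stable fixed point; phi = pi (anti-phase)
    is unstable for this H.  Forward-Euler integration."""
    phi = phi0
    for _ in range(steps):
        phi += dt * (math.sin(-phi) - math.sin(phi))
    return phi
```

    Flipping the sign of the odd part of H exchanges the stability of the in-phase and anti-phase states, which is exactly the distinction the phase model draws between broad classes of excitatory versus inhibitory coupling.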

    Attentional modulation of firing rate and synchrony in a model cortical network

    When attention is directed into the receptive field of a V4 neuron, its contrast response curve is shifted to lower contrast values (Reynolds et al., 2000, Neuron 26:703). Attention also increases the coherence between neurons responding to the same stimulus (Fries et al., 2001, Science 291:1560). We studied how the firing rate and synchrony of a densely interconnected cortical network varied with contrast and how they were modulated by attention. We found that an increased driving current to the excitatory neurons increased the overall firing rate of the network, whereas variation of the driving current to inhibitory neurons modulated the synchrony of the network. We explain the synchrony modulation in terms of a locking phenomenon during which the ratio of excitatory to inhibitory firing rates is approximately constant for a range of driving current values. We explored the hypothesis that contrast is represented primarily as a drive to the excitatory neurons, whereas attention corresponds to a reduction in driving current to the inhibitory neurons. Using this hypothesis, the model reproduces the following experimental observations: (1) the firing rate of the excitatory neurons increases with contrast; (2) for high contrast stimuli, the firing rate saturates and the network synchronizes; (3) attention shifts the contrast response curve to lower contrast values; (4) attention leads to stronger synchronization that starts at a lower value of the contrast compared with the attend-away condition. In addition, it predicts that attention increases the delay between the inhibitory and excitatory synchronous volleys produced by the network, allowing the stimulus to recruit more downstream neurons.
    Comment: 36 pages, submitted to Journal of Computational Neuroscience
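    The drive hypothesis (contrast as drive to E cells, attention as reduced drive to I cells) can be sketched at the firing-rate level with a Wilson-Cowan-style pair of equations. This rate caricature cannot show synchrony, only the rate effects; all weights and gains below are illustrative assumptions, not the paper's spiking-network parameters.

```python
import math

def steady_rates(drive_E, drive_I, steps=5000, dt=0.1):
    """Wilson-Cowan-style E/I rate model integrated to steady state:
        dE/dt = -E + f(w_ee*E - w_ei*I + drive_E)
        dI/dt = -I + f(w_ie*E - w_ii*I + drive_I)
    with a sigmoid gain f.  Weights are illustrative placeholders."""
    f = lambda x: 1.0 / (1.0 + math.exp(-x))   # sigmoid rate function
    E, I = 0.0, 0.0
    for _ in range(steps):
        E += dt * (-E + f(1.5 * E - 1.0 * I + drive_E))
        I += dt * (-I + f(1.0 * E - 0.5 * I + drive_I))
    return E, I
```

    Raising drive_E (higher contrast) raises the excitatory rate, and lowering drive_I (attention, under the abstract's hypothesis) disinhibits the network and raises it further, matching the direction of observations (1) and (3).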