
    Average activity of excitatory and inhibitory neural populations

    We develop an extension of the Ott-Antonsen method [E. Ott and T. M. Antonsen, Chaos 18(3), 037113 (2008)] that allows obtaining the mean activity (spiking rate) of a population of excitable units. By means of the Ott-Antonsen method, equations for the dynamics of the order parameters of coupled excitatory and inhibitory populations of excitable units are obtained, and their mean activities are computed. Two different excitable systems are studied: Adler units and theta neurons. The resulting bifurcation diagrams are compared with those obtained from the phenomenological Wilson-Cowan model in some regions of parameter space. Compatible behaviors, as well as higher-dimensional chaotic solutions, are observed. Numerical simulations further validate the equations.
    Fil: Roulet, Javier. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Ciudad Universitaria. Instituto de Física de Buenos Aires. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Instituto de Física de Buenos Aires; Argentina
    Fil: Mindlin, Bernardo Gabriel. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Ciudad Universitaria. Instituto de Física de Buenos Aires. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Instituto de Física de Buenos Aires; Argentina
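The mean spiking rate of a theta-neuron population can be probed directly in simulation. The sketch below is an illustration, not the authors' code; the parameter values and the Lorentzian distribution of excitabilities are assumptions. It Euler-integrates the standard theta-neuron equation, counts spikes as forward crossings of the phase through pi, and also reports the Kuramoto order parameter of the kind the Ott-Antonsen reduction tracks:

```python
import numpy as np

def simulate_theta_neurons(n=500, eta_mean=0.2, eta_width=0.05,
                           input_current=0.0, dt=1e-3, t_max=20.0, seed=0):
    """Simulate n uncoupled theta neurons, d(theta)/dt =
    (1 - cos theta) + (1 + cos theta) * (eta + I), with Lorentzian
    (Cauchy) excitabilities eta. Returns (mean_rate, order_parameter)."""
    rng = np.random.default_rng(seed)
    # sample a Lorentzian via the inverse CDF; clip heavy tails for stability
    eta = eta_mean + eta_width * np.tan(np.pi * (rng.random(n) - 0.5))
    eta = np.clip(eta, -5.0, 5.0)
    theta = rng.uniform(-np.pi, np.pi, n)
    spikes = 0
    for _ in range(int(t_max / dt)):
        dtheta = (1.0 - np.cos(theta)) + (1.0 + np.cos(theta)) * (eta + input_current)
        new_theta = theta + dt * dtheta
        # a neuron "spikes" when its phase crosses pi; the flow there is
        # always forward (dtheta/dt = 2 at theta = pi), so count upward
        # crossings of pi + 2*pi*k via the floor of the winding number
        spikes += np.sum(np.floor((new_theta - np.pi) / (2 * np.pi))
                         > np.floor((theta - np.pi) / (2 * np.pi)))
        theta = new_theta
    rate = spikes / (n * t_max)            # mean spikes per neuron per unit time
    r = abs(np.mean(np.exp(1j * theta)))   # Kuramoto order parameter at t_max
    return rate, r
```

With `eta_mean > 0` most neurons sit in the oscillatory regime, so the mean rate is positive; the order parameter measures how coherent their phases are.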

    Dynamically-Coupled Oscillators -- Cooperative Behavior via Dynamical Interaction --

    We propose a theoretical framework to study the cooperative behavior of dynamically coupled oscillators (DCOs), which possess dynamical interactions. Then, to understand synchronization phenomena in networks of interneurons, which possess inhibitory interactions, we propose a DCO model with interaction dynamics that tend to cause 180-degree phase lags. Employing the approach developed here, we demonstrate that although our model displays synchronization at high frequencies, it does not exhibit synchronization at low frequencies, because the dynamical interaction does not produce a phase lag large enough to cancel the effect of the inhibition. We interpret the disappearance of synchronization in our model with decreasing frequency as describing the breakdown of synchronization in the interneuron network of the CA1 area below the critical frequency of 20 Hz.
    Comment: 10 pages, 3 figures
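The "dynamical interaction" idea can be illustrated with a minimal skeleton (not the authors' model; the first-order filter form, phase-model coupling, and all parameters are assumptions): each oscillator acts on the other not directly but through an interaction variable with its own relaxation dynamics, so the effective phase lag of the coupling grows with frequency:

```python
import numpy as np

def simulate_dco_pair(omega=2 * np.pi * 40.0, g=-1.0, tau=0.005,
                      dt=1e-5, t_max=0.5):
    """Two phase oscillators coupled through dynamical interaction
    variables s_j: first-order filters (time constant tau) of a periodic
    drive sin(phi_j). g < 0 makes the interaction inhibitory; the filter
    adds a frequency-dependent phase lag, the key ingredient of a DCO.
    Returns the phase difference after transients."""
    phi = np.array([0.0, 1.0])   # oscillator phases
    s = np.zeros(2)              # interaction (synapse-like) variables
    for _ in range(int(t_max / dt)):
        # each interaction variable relaxes toward its oscillator's output
        s = s + dt * (np.sin(phi) - s) / tau
        # each oscillator is driven by the OTHER oscillator's interaction variable
        phi = phi + dt * (omega + g * s[::-1])
    return (phi[0] - phi[1]) % (2 * np.pi)
```

Sweeping `omega` in this skeleton is one way to see qualitatively how the filter's phase lag, and hence the locking behavior, changes with frequency.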

    Training Echo State Networks with Regularization through Dimensionality Reduction

    In this paper we introduce a new framework to train an Echo State Network to predict real-valued time series. The method consists of projecting the output of the network's internal layer onto a lower-dimensional space before training the output layer to learn the target task. Notably, we enforce a regularization constraint that leads to better generalization capabilities. We evaluate the performance of our approach on several benchmark tests, using different techniques to train the readout of the network, and achieve superior predictive performance with the proposed framework. Finally, we provide insight into the effectiveness of the implemented mechanism through a visualization of the trajectory in phase space, relying on the methodologies of nonlinear time-series analysis. By applying our method to well-known chaotic systems, we provide evidence that the lower-dimensional embedding retains the dynamical properties of the underlying system better than the full-dimensional internal states of the network.
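The training scheme described above can be sketched in plain NumPy (a minimal illustration under assumed hyperparameters, not the authors' implementation): drive a random reservoir with the input series, project the collected states onto their leading principal components, and fit a ridge-regression readout in the reduced space:

```python
import numpy as np

def esn_pca_readout(series, n_reservoir=200, n_components=20,
                    spectral_radius=0.9, ridge=1e-6, washout=100, seed=0):
    """Echo State Network for one-step-ahead prediction in which the
    reservoir states are reduced by PCA before the linear readout is
    trained (regularization through dimensionality reduction, in sketch
    form). Returns (predictions, targets) on the training series."""
    rng = np.random.default_rng(seed)
    w_in = rng.uniform(-0.5, 0.5, n_reservoir)
    w = rng.normal(0.0, 1.0, (n_reservoir, n_reservoir))
    # rescale the recurrent weights to the desired spectral radius
    w *= spectral_radius / np.max(np.abs(np.linalg.eigvals(w)))
    # collect reservoir states driven by the input series
    x = np.zeros(n_reservoir)
    states = []
    for u in series[:-1]:
        x = np.tanh(w @ x + w_in * u)
        states.append(x.copy())
    states = np.array(states)[washout:]          # discard the transient
    targets = series[1 + washout:]               # one-step-ahead targets
    # PCA: project centered states onto their top principal components
    mean = states.mean(axis=0)
    _, _, vt = np.linalg.svd(states - mean, full_matrices=False)
    z = (states - mean) @ vt[:n_components].T    # reduced states
    # ridge-regression readout trained in the reduced space
    a = z.T @ z + ridge * np.eye(n_components)
    w_out = np.linalg.solve(a, z.T @ targets)
    return z @ w_out, targets
```

Shrinking `n_components` is the regularization knob: the readout can only use the directions of reservoir state space that carry the most variance.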

    Models wagging the dog: are circuits constructed with disparate parameters?

    In a recent article, Prinz, Bucher, and Marder (2004) used a database modeling approach to address the fundamental question of whether neural systems are built from a fixed blueprint of tightly controlled parameters or in a way that allows properties to vary widely from one individual to another. Here, we examine their main conclusion, that neural circuits indeed are built with widely varying parameters, in the light of our own experimental and modeling observations. We critically discuss the experimental and theoretical evidence, including the general adequacy of database approaches for questions of this kind, and conclude that the last word on this fundamental question has not yet been spoken.