
    Noise-induced synchronization and anti-resonance in excitable systems; Implications for information processing in Parkinson's Disease and Deep Brain Stimulation

    We study the statistical physics of a surprising phenomenon arising in large networks of excitable elements in response to noise: while low-noise solutions remain in the vicinity of the resting state and large-noise solutions show asynchronous activity, the network displays orderly, perfectly synchronized periodic responses at intermediate levels of noise. We show that this phenomenon is fundamentally stochastic and collective in nature. Indeed, for noise and coupling within specific ranges, an asymmetry in the transition rates between a resting and an excited regime progressively builds up, increasing the fraction of excited neurons until a chain reaction is triggered, associated with a macroscopic synchronized excursion and a collective return to rest where the process starts afresh, thus yielding the observed periodic synchronized oscillations. We further uncover a novel anti-resonance phenomenon: noise-induced synchronized oscillations disappear when the system is driven by periodic stimulation with frequency within a specific range. In that anti-resonance regime, the system is optimal for measures of information capacity. This observation provides a new hypothesis accounting for the efficiency of Deep Brain Stimulation therapies in Parkinson's disease, a neurodegenerative disease characterized by increased synchronization of brain motor circuits. We further discuss the universality of these phenomena in the class of stochastic networks of excitable elements with confining coupling, and illustrate this universality by analyzing various classical models of neuronal networks. Altogether, these results uncover universal mechanisms supporting a regularizing impact of noise in excitable systems, reveal a novel anti-resonance phenomenon in these systems, and propose a new hypothesis for the efficiency of high-frequency stimulation in Parkinson's disease.
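
    The chain-reaction mechanism is easy to probe numerically in one of the classical model classes the abstract alludes to. The sketch below simulates mean-field coupled FitzHugh-Nagumo units with independent noise and compares the variability of the population mean at low, intermediate, and high noise; a large standard deviation of the population mean signals collective oscillations. All parameter values (eps, a, k, the noise levels) are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def population_mean_trace(sigma, N=500, T=200.0, dt=0.01,
                          eps=0.08, a=1.05, k=0.5, seed=0):
    """Mean-field coupled FitzHugh-Nagumo units with independent noise on v."""
    rng = np.random.default_rng(seed)
    v = -a * np.ones(N)                  # start every unit at the resting state
    w = v - v**3 / 3.0
    steps = int(T / dt)
    mean_v = np.empty(steps)
    for t in range(steps):
        vbar = v.mean()
        dv = v - v**3 / 3.0 - w + k * (vbar - v)   # confining mean-field coupling
        dw = eps * (v + a)
        v += dt * dv + sigma * np.sqrt(dt) * rng.standard_normal(N)
        w += dt * dw
        mean_v[t] = vbar
    return mean_v

for sigma in (0.02, 0.2, 2.0):           # low / intermediate / high noise
    x = population_mean_trace(sigma)
    print(f"sigma = {sigma}: std of population mean v = {x[5000:].std():.3f}")
```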

    Noise induced processes in neural systems

    Real neurons, and their networks, are far too complex to be described exactly by simple deterministic equations. Any description of their dynamics must therefore incorporate noise to some degree. It is my thesis that the nervous system is organized in such a way that its performance is optimal, subject to this constraint. I further contend that neuronal dynamics may even be enhanced by noise, when compared with their deterministic counterparts. To support my thesis I present and analyze three case studies, showing how noise might (i) extend the dynamic range of mammalian cold-receptors and other cells that exhibit a temperature-dependent discharge; (ii) feature in the perception of ambiguous figures such as the Necker cube; and (iii) alter the discharge pattern of single cells.
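
    The kind of noise-benefit claimed in (i)-(iii) can be illustrated with the textbook stochastic-resonance effect: a hard-threshold unit transmits a subthreshold signal best at an intermediate noise level. The sketch below is a generic illustration, not a model from the thesis; the threshold, signal amplitude, and noise levels are arbitrary assumptions.

```python
import numpy as np

def detection_score(noise_sd, n_trials=200, n=1000, seed=0):
    """Correlation between a subthreshold sinusoid and the spike train of a
    hard-threshold unit that reads it through additive noise."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 20.0 * np.pi, n)
    signal = 0.8 * np.sin(t)                      # peak stays below threshold 1.0
    total, counted = 0.0, 0
    for _ in range(n_trials):
        spikes = signal + noise_sd * rng.standard_normal(n) > 1.0
        if spikes.any() and not spikes.all():     # correlation needs variance
            total += np.corrcoef(spikes, signal)[0, 1]
            counted += 1
    return total / counted if counted else 0.0

for sd in (0.05, 0.3, 3.0):                       # weak / moderate / strong noise
    print(f"noise sd {sd}: mean spike-signal correlation {detection_score(sd):.3f}")
```

    With these toy numbers the correlation is near zero for weak noise (the signal never crosses threshold), peaks at the intermediate level, and decays again as noise swamps the signal.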

    Boolean Delay Equations: A simple way of looking at complex systems

    Boolean Delay Equations (BDEs) are semi-discrete dynamical models with Boolean-valued variables that evolve in continuous time. Systems of BDEs can be classified into conservative or dissipative, in a manner that parallels the classification of ordinary or partial differential equations. Solutions to certain conservative BDEs exhibit growth of complexity in time; they thereby provide metaphors for biological evolution or human history. Dissipative BDEs are structurally stable and exhibit multiple equilibria and limit cycles, as well as more complex, fractal solution sets, such as Devil's staircases and "fractal sunbursts". All known solutions of dissipative BDEs have stationary variance. BDE systems of this type, both free and forced, have been used as highly idealized models of climate change on interannual, interdecadal, and paleoclimatic time scales. BDEs are also being used as flexible, highly efficient models of colliding cascades in earthquake modeling and prediction, as well as in genetics. In this paper we review the theory of systems of BDEs and illustrate their applications to climatic and solid-earth problems; the former have used small systems of BDEs, while the latter have used large networks of BDEs. We moreover introduce BDEs with an infinite number of variables distributed in space ("partial BDEs") and discuss connections with other types of dynamical systems, including cellular automata and Boolean networks. This research-and-review paper concludes with a set of open questions. Comment: LaTeX, 67 pages with 15 EPS figures. Revised version; in particular, the discussion on partial BDEs is updated and enlarged.
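
    The semi-discrete character of BDEs (Boolean values, continuous time) can be sketched in a few lines: a single variable with two delays and an XOR connective, solved on a fine uniform grid. The connective, delays, and initial history below are illustrative assumptions, not a system from the paper; counting switches per time window gives a crude view of the complexity growth described for conservative BDEs.

```python
import numpy as np

# A single-variable BDE with two delays and an XOR connective:
#   x(t) = x(t - 1) XOR x(t - theta),   delays incommensurate
# solved on a fine uniform grid that approximates continuous time.
dt, T, theta = 0.001, 40.0, 0.977
n = int(T / dt)
d1, d2 = int(round(1.0 / dt)), int(round(theta / dt))
x = np.zeros(n, dtype=bool)
x[:d1] = True                      # initial history: x = 1 on [0, 1)
for t in range(d1, n):
    x[t] = x[t - d1] ^ x[t - d2]
# switches per 8-unit window as a crude measure of growing complexity
for k in range(0, int(T), 8):
    a, b = int(k / dt), int((k + 8) / dt)
    jumps = int(np.count_nonzero(x[a + 1:b] != x[a:b - 1]))
    print(f"t in [{k:2d}, {k + 8:2d}): {jumps} switches")
```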

    Incremental embodied chaotic exploration of self-organized motor behaviors with proprioceptor adaptation

    This paper presents a general and fully dynamic embodied artificial neural system, which incrementally explores and learns motor behaviors through an integrated combination of chaotic search and reflex learning. The former uses adaptive bifurcation to exploit the intrinsic chaotic dynamics arising from neuro-body-environment interactions, while the latter is based on proprioceptor adaptation. The overall iterative search process formed from this combination is shown to have a close relationship to evolutionary methods. The architecture developed here allows real-time, goal-directed exploration and learning of the possible motor patterns (e.g., for locomotion) of embodied systems of arbitrary morphology. Examples of its successful application to a simple biomechanical model, a simulated swimming robot, and a simulated quadruped robot are given. The tractability of the biomechanical systems allows detailed analysis of the overall dynamics of the search process; this analysis sheds light on the strong parallels with evolutionary search.
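
    The adaptive-bifurcation idea can be caricatured in a few lines: a chaotic unit explores a motor-command space while good performance damps its bifurcation parameter, freezing the search onto useful patterns. Everything below (the logistic search unit, the reward function, the parameter schedule) is a hypothetical toy standing in for the neuro-body-environment loop, not the paper's architecture; the freeze/restore schedule plays a role loosely analogous to the selection pressure in the evolutionary-search parallel the paper draws.

```python
import numpy as np

def reward(u):
    """Hypothetical performance signal; the optimum sits at u = 0.62."""
    return 1.0 - abs(u - 0.62)

rng = np.random.default_rng(1)
x = rng.random()          # state of the chaotic search unit
a = 4.0                   # logistic parameter: fully chaotic to start
best = 0.0
for step in range(2000):
    x = a * x * (1.0 - x)             # itinerant exploration of [0, 1]
    best = max(best, reward(x))
    # adaptive bifurcation: good performance damps chaos (a -> 2.9, where the
    # map has a stable fixed point); poor performance keeps the search chaotic
    a = 2.9 + 1.1 * max(0.0, 1.0 - best)
print(f"best reward {best:.3f}, final bifurcation parameter a = {a:.2f}")
```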

    Representation of Dynamical Stimuli in Populations of Threshold Neurons

    Many sensory or cognitive events are associated with dynamic current modulations in cortical neurons. This raises an urgent demand for tractable model approaches that address the merits and limits of potential encoding strategies. Yet current theoretical approaches addressing the response to mean- and variance-encoded stimuli rarely provide complete response functions for both modes of encoding in the presence of correlated noise. Here, we investigate the neuronal population response to dynamical modifications of the mean or variance of the synaptic bombardment using an alternative threshold-model framework. For both the mean and variance channels, we provide explicit expressions for the linear and nonlinear frequency-response functions in the presence of correlated noise and use them to derive the population rate response to step-like stimuli. For mean-encoded signals, we find that the complete response function depends only on the temporal width of the input correlation function, and not on other functional specifics. Furthermore, we show that both mean- and variance-encoded signals can relay high-frequency inputs, and that in both schemes step-like changes can be detected instantaneously. Finally, we obtain the pairwise spike correlation function and the spike-triggered average from the linear mean-evoked response function. These results provide a maximally tractable limiting case that complements and extends previous results obtained in the integrate-and-fire framework.
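
    The flavor of the threshold framework can be sketched numerically: a large population of units whose input is mu(t) + sigma(t) times a correlated (Ornstein-Uhlenbeck) noise process, where a unit "spikes" whenever its input crosses the threshold from below, probed with step changes in the mean or in the variance. The OU substitute for correlated synaptic bombardment and all parameter values are assumptions made for this illustration; the printout shows the instantaneous rate response at the step bin for both channels.

```python
import numpy as np

# Threshold-unit population: input is mu(t) + sigma(t) * x(t), with x an
# Ornstein-Uhlenbeck process (correlated noise); a "spike" is an upward
# crossing of the threshold theta.
N, dt, tau, theta = 20000, 0.1, 5.0, 1.0
steps, half = 400, 200
rng = np.random.default_rng(0)
for mode in ("mean", "variance"):
    x = rng.standard_normal(N)         # OU noise started in its stationary state
    prev = x.copy()                    # input during the first bin (mu=0, sigma=1)
    rates = []
    for t in range(1, steps):
        x += -dt * x / tau + np.sqrt(2.0 * dt / tau) * rng.standard_normal(N)
        mu = 0.3 if (mode == "mean" and t >= half) else 0.0
        sigma = 1.3 if (mode == "variance" and t >= half) else 1.0
        inp = mu + sigma * x
        rates.append(np.mean((prev < theta) & (inp >= theta)) / dt)
        prev = inp
    print(f"{mode} step: rate before {np.mean(rates[half - 101:half - 1]):.4f}, "
          f"step bin {rates[half - 1]:.4f}, late {np.mean(rates[-50:]):.4f}")
```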

    Dynamical principles in neuroscience

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics, including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience? This work was supported by NSF Grant No. NSF/EIA-0130708 and Grant No. PHY-0414174; NIH Grant No. 1 R01 NS50945 and Grant No. NS40110; MEC BFI2003-07276; and Fundación BBVA.

    The Dynamic Brain: From Spiking Neurons to Neural Masses and Cortical Fields

    The cortex is a complex system, characterized by its dynamics and architecture, which underlie many functions such as action, perception, learning, language, and cognition. Its structural architecture has been studied for more than a hundred years; however, its dynamics have been addressed much less thoroughly. In this paper, we review and integrate, in a unifying framework, a variety of computational approaches that have been used to characterize the dynamics of the cortex, as evidenced at different levels of measurement. Computational models at different space-time scales help us understand the fundamental mechanisms that underpin neural processes and relate these processes to neuroscience data. Modeling at the single-neuron level is necessary because this is the level at which information is exchanged between the computing elements of the brain: the neurons. Mesoscopic models tell us how neural elements interact to yield emergent behavior at the level of microcolumns and cortical columns. Macroscopic models can inform us about whole-brain dynamics and interactions between large-scale neural systems such as cortical regions, the thalamus, and brain stem. Each level of description relates uniquely to neuroscience data, from single-unit recordings, through local field potentials, to functional magnetic resonance imaging (fMRI), electroencephalogram (EEG), and magnetoencephalogram (MEG). Models of the cortex can establish which types of large-scale neuronal networks can perform computations and characterize their emergent properties. Mean-field and related formulations of dynamics also play an essential and complementary role as forward models that can be inverted given empirical data. This makes dynamic models critical in integrating theory and experiments. We argue that elaborating principled and informed models is a prerequisite for grounding empirical neuroscience in a cogent theoretical framework, commensurate with the achievements in the physical sciences.
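
    At the mesoscopic level, the prototypical neural-mass model is the Wilson-Cowan pair of coupled excitatory and inhibitory populations. The sketch below is a generic textbook version rather than one of the specific models reviewed here; the coupling weights, sigmoid parameters, time constant, and drive P are illustrative choices, and whether the system settles to a fixed point or a limit cycle depends on them.

```python
import numpy as np

def S(x, a, th):
    """Sigmoidal population response function."""
    return 1.0 / (1.0 + np.exp(-a * (x - th)))

# Wilson-Cowan neural mass: excitatory (E) and inhibitory (I) populations.
wEE, wEI, wIE, wII = 16.0, 12.0, 15.0, 3.0   # coupling weights
aE, thE, aI, thI = 1.3, 4.0, 2.0, 3.7        # sigmoid gains and thresholds
tau, P = 8.0, 1.25                           # time constant (ms), external drive
dt, T = 0.05, 500.0
E, I = 0.1, 0.05
trace = []
for _ in range(int(T / dt)):
    E += dt * (-E + S(wEE * E - wEI * I + P, aE, thE)) / tau
    I += dt * (-I + S(wIE * E - wII * I, aI, thI)) / tau
    trace.append(E)
tail = np.asarray(trace[-4000:])             # last 200 ms
print(f"E over the last 200 ms: min {tail.min():.3f}, max {tail.max():.3f}")
```

    A wide min-to-max range in the printout indicates sustained population oscillations; a collapsed range indicates a steady state. Inverting such mean-field models against fMRI, EEG, or MEG data is the forward-model role the abstract highlights.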

    Inhibitory synchrony as a mechanism for attentional gain modulation

    Recordings from area V4 of monkeys have revealed that when the focus of attention is on a visual stimulus within the receptive field of a cortical neuron, two distinct changes can occur: the firing rate of the neuron can change, and there can be an increase in the coherence between spikes and the local field potential in the gamma-frequency range (30-50 Hz). The hypothesis explored here is that these observed effects of attention could be a consequence of changes in the synchrony of local interneuron networks. We performed computer simulations of a Hodgkin-Huxley-type neuron driven by a constant depolarizing current, I, representing visual stimulation, and a modulatory inhibitory input representing the effects of attention via local interneuron networks. We observed that the neuron's firing rate and the coherence of its output spike train with the synaptic inputs were modulated by the degree of synchrony of the inhibitory inputs. The model suggests that the observed changes in firing rate and coherence of neurons in the visual cortex could be controlled by top-down inputs that regulate the coherence in the activity of a local inhibitory network discharging at gamma frequencies. Comment: J. Physiology (Paris), in press; 11 figures.
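
    A reduced version of this simulation is easy to sketch: replace the Hodgkin-Huxley cell with a leaky integrate-and-fire unit receiving a constant drive plus 40 Hz volleys of shunting inhibition whose synchrony is set by a jitter parameter. All parameter values below are invented for illustration; only the qualitative effect, synchronous inhibition leaving windows for firing while desynchronized inhibition acts as tonic suppression, mirrors the mechanism the paper proposes.

```python
import numpy as np

def firing_rate(jitter_ms, seed=0):
    """LIF neuron under constant drive plus 40 Hz inhibitory volleys whose
    temporal precision is controlled by jitter_ms."""
    rng = np.random.default_rng(seed)
    dt, T = 0.1, 2000.0                    # ms
    tau_m, v_th, v_reset = 20.0, 1.0, 0.0
    drive, g_one, tau_s = 1.3, 0.04, 5.0   # drive, per-input conductance, decay
    n_inputs, period = 50, 25.0            # 50 interneurons firing at 40 Hz
    steps = int(T / dt)
    g = np.zeros(steps)                    # summed inhibitory conductance
    decay = np.exp(-np.arange(steps) * dt / tau_s)
    for k in range(int(T / period)):
        for t0 in k * period + jitter_ms * rng.standard_normal(n_inputs):
            i0 = int(t0 / dt)
            if 0 <= i0 < steps:
                g[i0:] += g_one * decay[: steps - i0]
    v, spikes = 0.0, 0
    for t in range(steps):
        v += dt * (drive - v - g[t] * v) / tau_m   # shunting inhibition (E_inh = 0)
        if v >= v_th:
            spikes += 1
            v = v_reset
    return 1000.0 * spikes / T             # Hz

for jitter in (0.0, 2.0, 8.0):             # synchronous -> desynchronized volleys
    print(f"jitter {jitter:3.1f} ms: {firing_rate(jitter):5.1f} Hz")
```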

    Mechanisms of information coding and processing in networks based on neural signatures

    Unpublished doctoral thesis, defended at the Universidad Autónoma de Madrid, Escuela Politécnica Superior, Departamento de Tecnología Electrónica y de las Comunicaciones. Defense date: 21-02-202