
    Optogenetic perturbations reveal the dynamics of an oculomotor integrator

    Many neural systems can store short-term information in persistently firing neurons. Such persistent activity is believed to be maintained by recurrent feedback among neurons. This hypothesis has been fleshed out in detail for the oculomotor integrator (OI), for which the so-called “line attractor” network model can explain a large set of observations. Here we show that there is a plethora of such models, distinguished by the relative strength of recurrent excitation and inhibition. In each model, the firing rates of the neurons relax toward the persistent activity states. The dynamics of relaxation can be quite different, however, and depend on the levels of recurrent excitation and inhibition. To identify the correct model, we directly measure these relaxation dynamics by performing optogenetic perturbations in the OI of zebrafish expressing halorhodopsin or channelrhodopsin. We show that instantaneous, inhibitory stimulations of the OI lead to persistent, centripetal eye position changes ipsilateral to the stimulation. Excitatory stimulations similarly cause centripetal eye position changes, yet only contralateral to the stimulation. These results show that the dynamics of the OI are organized around a central attractor state—the null position of the eyes—which stabilizes the system against random perturbations. Our results pose new constraints on the circuit connectivity of the system and provide new insights into the mechanisms underlying persistent activity.
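    As a toy illustration of the line attractor idea discussed above (not the authors' model), the sketch below simulates a two-unit linear rate network whose recurrent weights are tuned so that the feedback matrix has a unit eigenvalue: perturbations off the attractor relax away, while the component along the attractor persists. All parameter values are illustrative assumptions.

```python
import numpy as np

# Two rate units whose recurrent weights give the feedback matrix a unit eigenvalue;
# the corresponding eigenvector spans a line attractor (persistent-activity mode).
tau = 0.1                        # relaxation time constant (s), illustrative
W = np.array([[0.5, 0.5],
              [0.5, 0.5]])       # eigenvalues 1 (along (1, 1)) and 0

def relax(r0, T=2.0, dt=1e-3):
    """Integrate dr/dt = (-r + W r) / tau from the initial rates r0."""
    r = np.array(r0, dtype=float)
    for _ in range(int(T / dt)):
        r += dt / tau * (-r + W @ r)
    return r

# A perturbation off the attractor decays, while its projection onto the
# attractor direction persists, mimicking persistent eye-position activity.
r_final = relax([1.0, 0.2])
print("final rates (approach the line r1 = r2):", r_final)
print("persistent mode amplitude:", r_final.sum() / 2)
```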

    Mutual information in changing environments: non-linear interactions, out-of-equilibrium systems, and continuously-varying diffusivities

    Biochemistry, ecology, and neuroscience are examples of prominent fields aiming at describing interacting systems that exhibit non-trivial couplings to complex, ever-changing environments. We have recently shown that linear interactions and a switching environment are encoded separately in the mutual information of the overall system. Here, we first generalize these findings to a broad class of non-linear interacting models. We find that a new term in the mutual information appears, quantifying the interplay between non-linear interactions and environmental changes, and leading to either constructive or destructive information interference. Furthermore, we show that a higher mutual information emerges in out-of-equilibrium environments with respect to an equilibrium scenario. Finally, we generalize our framework to the case of continuously varying environments. We find that environmental changes can be mapped exactly into an effective spatially-varying diffusion coefficient, shedding light on the modeling and information structure of biophysical systems in inhomogeneous media.
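    A hedged sketch of the general setting: two linearly coupled Ornstein-Uhlenbeck variables share a diffusivity that switches between two environmental states, and the mutual information between them is estimated with a simple plug-in (histogram) estimator. The coupling, diffusivities, and switching rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(T=200_000, dt=1e-2, k=1.0, c=0.5, D=(0.1, 1.0), switch_rate=0.05):
    """Two linearly coupled OU variables whose diffusivity switches between
    D[0] and D[1] at the given rate (telegraph-like environment)."""
    x = y = 0.0
    env = 0
    xs, ys = np.empty(T), np.empty(T)
    for t in range(T):
        if rng.random() < switch_rate * dt:                 # environmental switch
            env = 1 - env
        amp = np.sqrt(2 * D[env] * dt)
        x += (-k * x + c * y) * dt + amp * rng.standard_normal()
        y += (-k * y + c * x) * dt + amp * rng.standard_normal()
        xs[t], ys[t] = x, y
    return xs, ys

def mutual_information(x, y, bins=64):
    """Plug-in estimate of I(X;Y) in nats from a 2D histogram of the samples."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

x, y = simulate()
print("estimated I(X;Y) [nats]:", mutual_information(x, y))
```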

    Capacitance fluctuations causing channel noise reduction in stochastic Hodgkin-Huxley systems

    Voltage-dependent ion channels determine the electric properties of axonal cell membranes. They not only allow the passage of ions through the cell membrane but also contribute to an additional charging of the cell membrane, resulting in so-called capacitance loading. The switching of the channel gates between an open and a closed configuration is intrinsically related to the movement of gating charge within the cell membrane. At the beginning of an action potential, the transient gating current is opposite in direction to the sodium current through the membrane. The excitability is therefore expected to be reduced by the influence of the gating current. Our stochastic Hodgkin-Huxley-like model takes into account both the channel noise (i.e., the fluctuations of the number of open ion channels) and the capacitance fluctuations that result from the dynamics of the gating charge. We investigate the spiking dynamics of membrane patches of variable size and analyze the statistics of the spontaneous spiking. As a main result, we find that the gating currents yield a drastic reduction of the spontaneous spiking rate for sufficiently large ion channel clusters. Consequently, this demonstrates a prominent mechanism for channel noise reduction.
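    The sketch below is a minimal stochastic Hodgkin-Huxley model with channel noise in a Langevin (Fox-Lu style) diffusion approximation, showing how the spontaneous spiking rate depends on patch size. It deliberately omits the gating-current and capacitance-fluctuation terms that are the focus of the paper; channel densities and other parameters are standard textbook values used here only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Standard HH parameters (per cm^2); channel densities set the channel numbers.
C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3
ENa, EK, EL = 50.0, -77.0, -54.4
rho_Na, rho_K = 60.0, 18.0                      # channels per um^2

def rates(V):
    am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    bm = 4.0 * np.exp(-(V + 65) / 18)
    ah = 0.07 * np.exp(-(V + 65) / 20)
    bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
    an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    bn = 0.125 * np.exp(-(V + 65) / 80)
    return am, bm, ah, bh, an, bn

def spontaneous_rate(area_um2, T=500.0, dt=0.01):
    """Membrane patch of the given area; fewer channels -> stronger channel noise."""
    N_Na, N_K = rho_Na * area_um2, rho_K * area_um2
    V, m, h, n = -65.0, 0.05, 0.6, 0.32
    spikes, above = 0, False
    for _ in range(int(T / dt)):
        am, bm, ah, bh, an, bn = rates(V)
        # Langevin approximation: gating-variable noise shrinks as 1/sqrt(channel number)
        m += dt * (am * (1 - m) - bm * m) + np.sqrt(dt * (am * (1 - m) + bm * m) / N_Na) * rng.standard_normal()
        h += dt * (ah * (1 - h) - bh * h) + np.sqrt(dt * (ah * (1 - h) + bh * h) / N_Na) * rng.standard_normal()
        n += dt * (an * (1 - n) - bn * n) + np.sqrt(dt * (an * (1 - n) + bn * n) / N_K) * rng.standard_normal()
        m, h, n = np.clip([m, h, n], 0.0, 1.0)
        I_ion = gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL)
        V += -dt * I_ion / C                    # no injected current: spikes are spontaneous
        if V > 0 and not above:
            spikes += 1
        above = V > 0
    return spikes / (T / 1000.0)                # rate in Hz

for area in (1.0, 10.0, 100.0):
    print(f"patch area {area:6.1f} um^2 -> spontaneous rate {spontaneous_rate(area):6.1f} Hz")
```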

    Short-term synaptic facilitation improves information retrieval in noisy neural networks

    Short-term synaptic depression and facilitation have been found to greatly influence the performance of autoassociative neural networks. However, only partial results, focused for instance on the computation of the maximum storage capacity at zero temperature, have been obtained to date. In this work, we extended the study of the effect of these synaptic mechanisms on autoassociative neural networks to more realistic and general conditions, including the presence of noise in the system. In particular, we characterized the behavior of the system by means of its phase diagrams, and we concluded that synaptic facilitation significantly enlarges the region of good retrieval performance of the network. We also found that networks with facilitating synapses may have critical temperatures substantially higher than those of standard autoassociative networks, thus allowing neural networks to perform better under high-noise conditions.
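    A minimal sketch of an autoassociative (Hopfield-type) network with a Tsodyks-Markram-like facilitation variable scaling presynaptic efficacy, run with stochastic Glauber dynamics at a finite noise level. The comparison against static synapses fixed at the baseline release probability, and all parameter values, are illustrative assumptions rather than the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(2)

N, P = 500, 5                                  # neurons and stored patterns (illustrative)
patterns = rng.choice([-1, 1], size=(P, N))
W = patterns.T @ patterns / N                  # Hebbian couplings
np.fill_diagonal(W, 0)

def retrieve(T_noise=0.5, U=0.2, tau_f=10.0, steps=300, facilitation=True):
    """Glauber dynamics at noise level T_noise; the presynaptic efficacy u_j either
    facilitates (grows with activity, decays back to U) or stays fixed at U."""
    s = np.where(rng.random(N) < 0.85, patterns[0], -patterns[0])   # degraded cue
    u = np.full(N, U)
    for _ in range(steps):
        h = W @ (u * s)                                  # local fields with scaled efficacies
        p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T_noise))
        s = np.where(rng.random(N) < p_up, 1, -1)
        if facilitation:
            active = (s == 1).astype(float)
            u += (U - u) / tau_f + U * (1.0 - u) * active   # Tsodyks-Markram-like update
    return float(patterns[0] @ s) / N                    # retrieval overlap in [-1, 1]

print("overlap with facilitating synapses:", retrieve(facilitation=True))
print("overlap with static synapses      :", retrieve(facilitation=False))
```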

    Firing statistics and correlations in spiking neurons: a level-crossing approach

    We present a time-dependent level-crossing theory for linear dynamical systems perturbed by colored Gaussian noise. We apply these results to approximate the firing statistics of conductance-based integrate-and-fire neurons receiving excitatory and inhibitory Poissonian inputs. Analytical expressions are obtained for three key quantities characterizing the neuronal response to time-varying inputs: the mean firing rate, the linear response to sinusoidally modulated inputs, and the pairwise spike correlation for neurons receiving correlated inputs. The theory yields tractable results that are shown to accurately match numerical simulations, and provides useful tools for the analysis of interconnected neuronal populations.
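    A small sketch of the level-crossing idea: for a membrane potential driven by colored (Ornstein-Uhlenbeck) noise, the stationary Rice formula gives the threshold upcrossing rate, which is compared against crossings counted in a direct simulation. This is only the stationary special case, with illustrative parameters; the paper develops the full time-dependent theory for conductance-based neurons.

```python
import numpy as np

rng = np.random.default_rng(3)

# Membrane potential driven by colored (OU) noise; all values are illustrative.
tau_m, tau_s = 0.02, 0.005                 # membrane and noise time constants (s)
sigma, mu, theta = 5.0, -60.0, -55.0       # noise amplitude, mean, threshold (mV)

def simulated_rate(T=200.0, dt=1e-4):
    """Count threshold upcrossings of the freely evolving (non-resetting) potential."""
    V, eta = mu, 0.0
    crossings, prev_above = 0, False
    for _ in range(int(T / dt)):
        eta += -eta / tau_s * dt + np.sqrt(2 * sigma**2 / tau_s * dt) * rng.standard_normal()
        V += (-(V - mu) + eta) / tau_m * dt
        above = V > theta
        crossings += int(above and not prev_above)
        prev_above = above
    return crossings / T

# Stationary Rice formula for a Gaussian process:
# nu = sqrt(var(V') / var(V)) / (2 pi) * exp(-(theta - mu)^2 / (2 var(V))),
# with the variances computed analytically for the OU-filtered input.
var_V = sigma**2 * tau_s / (tau_m + tau_s)
var_dV = sigma**2 / (tau_m * (tau_m + tau_s))
rice_rate = np.sqrt(var_dV / var_V) / (2 * np.pi) * np.exp(-(theta - mu)**2 / (2 * var_V))

print("Rice-formula upcrossing rate:", rice_rate, "Hz")
print("simulated upcrossing rate   :", simulated_rate(), "Hz")
```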

    Stochastic synchronization of neuronal populations with intrinsic and extrinsic noise

    We extend the theory of noise-induced phase synchronization to the case of a neural master equation describing the stochastic dynamics of an ensemble of uncoupled neuronal population oscillators with intrinsic and extrinsic noise. The master equation formulation of stochastic neurodynamics represents the state of each population by the number of currently active neurons, and the state transitions are chosen so that deterministic Wilson-Cowan rate equations are recovered in the mean-field limit. We apply phase reduction and averaging methods to a corresponding Langevin approximation of the master equation in order to determine how intrinsic noise disrupts synchronization of the population oscillators driven by a common extrinsic noise source. We illustrate our analysis by considering one of the simplest networks known to generate limit cycle oscillations at the population level, namely, a pair of mutually coupled excitatory (E) and inhibitory (I) subpopulations. We show how the combination of intrinsic independent noise and extrinsic common noise can lead to clustering of the population oscillators due to the multiplicative nature of both noise sources under the Langevin approximation. Finally, we show how a similar analysis can be carried out for another simple population model that exhibits limit cycle oscillations in the deterministic limit, namely, a recurrent excitatory network with synaptic depression; inclusion of synaptic depression into the neural master equation now generates a stochastic hybrid system.
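    The paper works at the level of a neural master equation for E-I populations; the sketch below illustrates only the generic phase-reduced picture that such an analysis arrives at: an ensemble of identical phase oscillators driven by a common extrinsic noise source plus independent intrinsic noise, with synchronization quantified by the Kuramoto order parameter. The sinusoidal phase sensitivity and all noise intensities are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def order_parameter(D_common, D_intrinsic, n=50, omega=2*np.pi, T=200.0, dt=1e-3):
    """Identical phase oscillators with phase sensitivity Z(theta) = sin(theta),
    driven by one shared (extrinsic) noise source and independent intrinsic noise."""
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    for _ in range(int(T / dt)):
        xi_common = rng.standard_normal()            # same realization for all oscillators
        xi_intrinsic = rng.standard_normal(n)        # one independent realization each
        Z = np.sin(theta)
        theta += omega * dt + Z * (np.sqrt(2.0 * D_common * dt) * xi_common
                                   + np.sqrt(2.0 * D_intrinsic * dt) * xi_intrinsic)
    # Kuramoto order parameter: 1 = fully synchronized, near 0 = desynchronized
    return np.abs(np.exp(1j * theta).mean())

print("common noise only          :", order_parameter(D_common=0.5, D_intrinsic=0.0))
print("common plus intrinsic noise:", order_parameter(D_common=0.5, D_intrinsic=0.5))
```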

    Reconstructing Dynamical Systems: From Stochastic Differential Equations to Machine Learning

    Modeling complex systems with large numbers of degrees of freedom has become a grand challenge over the past decades. Typically, only a few variables of a complex system are observed in terms of measured time series, while the majority of them, which potentially interact with the observed ones, remain hidden. Throughout this thesis, we tackle the problem of reconstructing and predicting the underlying dynamics of complex systems using different data-driven approaches. In the first part, we address the inverse problem of inferring an unknown network structure of complex systems, reflecting spreading phenomena, from observed event series. We study the pairwise statistical similarity between the sequences of event timings at all nodes through event synchronization (ES) and event coincidence analysis (ECA), relying on the idea that functional connectivity can serve as a proxy for structural connectivity. In the second part, we focus on reconstructing the underlying dynamics of complex systems from their dominant macroscopic variables using different Stochastic Differential Equations (SDEs).
    We investigate the performance of three different SDE approaches: the Langevin Equation (LE), the Generalized Langevin Equation (GLE), and the Empirical Model Reduction (EMR) approach. Our results reveal that the LE performs well for systems with weak memory, while it fails to reconstruct the underlying dynamics of systems with memory effects and colored-noise forcing. In these situations, the GLE and EMR are more suitable candidates, since the interactions between observed and unobserved variables are accounted for in terms of memory effects. In the last part of this thesis, we develop a model based on the Echo State Network (ESN), combined with the past noise forecasting (PNF) method, to predict real-world complex systems. Our results show that the proposed model captures the crucial features of the underlying dynamics of climate variability.
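    A hedged sketch of the Langevin-equation reconstruction step described in the second part: drift and diffusion functions are estimated from a time series via conditional Kramers-Moyal moments, here on synthetic data generated from a known double-well Langevin equation so the recovered coefficients can be checked. The model, bin count, and sample size are illustrative choices, not those used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data from a known Langevin equation with double-well drift:
# dx = -x (x^2 - 1) dt + sqrt(2 D) dW, then try to recover drift and diffusion.
D, dt, n = 0.5, 1e-3, 1_000_000
x = np.empty(n)
x[0] = 0.0
kicks = rng.standard_normal(n - 1) * np.sqrt(2 * D * dt)
for i in range(n - 1):
    x[i + 1] = x[i] - x[i] * (x[i]**2 - 1) * dt + kicks[i]

def kramers_moyal(x, dt, bins=40):
    """Estimate drift D1(x) = <dx|x>/dt and diffusion D2(x) = <dx^2|x>/(2 dt)."""
    dx = np.diff(x)
    edges = np.linspace(np.quantile(x, 0.01), np.quantile(x, 0.99), bins + 1)
    mids = 0.5 * (edges[:-1] + edges[1:])
    idx = np.digitize(x[:-1], edges) - 1          # bin index of each starting point
    drift, diff = np.full(bins, np.nan), np.full(bins, np.nan)
    for k in range(bins):
        sel = idx == k
        if sel.any():
            drift[k] = dx[sel].mean() / dt
            diff[k] = (dx[sel] ** 2).mean() / (2 * dt)
    return mids, drift, diff

mids, drift, diff = kramers_moyal(x, dt)
print("estimated diffusion (target 0.5):", np.nanmean(diff))
print("estimated drift near x = 1 (target 0):", drift[np.argmin(np.abs(mids - 1.0))])
```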