
    Pacemaker Heterogeneity in the Suprachiasmatic Nucleus: Origins and Network Implications

    In mammals, the suprachiasmatic nuclei (SCN) in the ventral hypothalamus function as a circadian pacemaker, controlling daily rhythms in behavior and physiology. Together the SCN contain approximately 20,000 neurons that maintain rhythms in firing rate and gene expression. Previous studies led to the assumption that single SCN neurons are capable of self-sustained circadian rhythms. Whether and which SCN neurons can maintain cell-autonomous daily oscillations has not been extensively tested. We measured PERIOD2::LUCIFERASE expression in isolated SCN neurons over multiple days to determine if all SCN neurons were circadian. We then examined the neuropeptide content of the recorded neurons. We found that when isolated physically or with a blocker of cell-cell communication, SCN neurons expressed a range of circadian periods, amplitudes, and abilities to sustain cycling. Surprisingly, most cells were sloppy oscillators, switching from rhythmic to arrhythmic or vice versa throughout their lifetime. We also found no evidence for a class of circadian-pacemaker neurons in the SCN based on neuropeptide expression. We conclude that while all SCN neurons are capable of cell-autonomous rhythms, they are intrinsically sloppy, with network interactions dramatically increasing the number of circadian neurons. We next used a mathematical model of the mammalian circadian clock to determine whether rates of gene transcription, protein translation, degradation, or phosphorylation might explain the ability of SCN neurons to switch between circadian and arrhythmic behaviors. We found that rhythmicity was more sensitive to the rates of protein translation and degradation than to those of transcription or phosphorylation. We next tested what effect having neurons with different intrinsic circadian behaviors would have on population synchrony. We simulated cells of known circadian phenotypes (e.g., arrhythmic, damped, or self-sustained) in a pattern defined by small-world network properties and varied the positions and proportions of each oscillator type. We found that increasing the number of damped oscillators or placing them in highly connected locations within the network both augmented the rate at which the network synchronized. We conclude that the SCN likely benefit from a heterogeneous population of oscillators, especially when recovering from an environmental perturbation that causes desynchrony. Finally, we generated and characterized two independent lines of transgenic mice to test the role of vasoactive intestinal polypeptide (VIP) neurons in circadian rhythmicity. These mice express Yellow Fluorescent Protein (YFP) under the control of a fragment of the VIP promoter in VIP neurons of the SCN, neocortex, olfactory bulbs, and enteric nervous system. We crossed these mice to generate a line in which VIP neurons are targeted for deletion using Cre-mediated recombination upon addition of tamoxifen. We observed successful deletion of VIP neurons in cultured SCN explants, but have no evidence to date for deletion of SCN neurons in vivo using a variety of protocols. We conclude that our construct is faithfully expressed in VIP neurons and that in vitro experiments show promising results for further study.
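
    The network simulation described above can be illustrated with a short sketch. The snippet below is not the thesis code; it simply mixes damped and self-sustained Stuart-Landau oscillators on a Watts-Strogatz small-world graph and tracks how coherent the population becomes, with all parameters (coupling strength, fraction of damped cells, periods) chosen as illustrative assumptions.

```python
# Minimal sketch, not the thesis code: damped and self-sustained Stuart-Landau
# oscillators coupled diffusively on a Watts-Strogatz small-world graph.
# All parameters (fraction of damped cells, coupling, periods) are assumptions.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n, k, p = 100, 6, 0.1                                   # nodes, neighbours, rewiring prob.
G = nx.watts_strogatz_graph(n, k, p, seed=0)
A = nx.to_numpy_array(G)

frac_damped = 0.3                                       # proportion of damped oscillators
mu = np.where(rng.random(n) < frac_damped, -0.1, 0.1)   # mu < 0: damped, mu > 0: self-sustained
omega = 2 * np.pi / 24 + 0.01 * rng.standard_normal(n)  # ~24 h intrinsic periods (rad/h)
K = 0.05                                                # coupling strength

z = rng.standard_normal(n) + 1j * rng.standard_normal(n)  # desynchronized start
dt, steps = 0.1, 5000                                     # ~500 h of simulated time
R = np.empty(steps)
for t in range(steps):
    coupling = K * (A @ z - A.sum(axis=1) * z)            # diffusive coupling over the graph
    dz = (mu + 1j * omega) * z - np.abs(z) ** 2 * z + coupling
    z = z + dt * dz
    R[t] = np.abs(np.mean(z / (np.abs(z) + 1e-12)))       # phase coherence of the population

print(f"final phase coherence R = {R[-1]:.2f}")
```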

    One-Dimensional Population Density Approaches to Recurrently Coupled Networks of Neurons with Noise

    Mean-field systems have been previously derived for networks of coupled, two-dimensional, integrate-and-fire neurons such as the Izhikevich, adapting exponential (AdEx), and quartic integrate-and-fire (QIF) models, among others. Unfortunately, the mean-field systems have a degree of frequency error, and the networks analyzed often do not include noise when there is adaptation. Here, we derive a one-dimensional partial differential equation (PDE) approximation for the marginal voltage density under a first-order moment closure for coupled networks of integrate-and-fire neurons with white noise inputs. The PDE has substantially less frequency error than the mean-field system, and provides a great deal more information, at the cost of analytical tractability. The convergence properties of the mean-field system in the low-noise limit are elucidated. A novel method for the analysis of the stability of the asynchronous tonic firing solution is also presented and implemented. Unlike previous attempts at stability analysis with these network types, information about the marginal densities of the adaptation variables is used. This method can in principle be applied to other systems with nonlinear partial differential equations. Comment: 26 pages, 6 figures.
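
    For readers unfamiliar with the population density approach, the sketch below shows the general idea in its simplest setting: an explicit finite-difference integration of the Fokker-Planck equation for the voltage density of leaky integrate-and-fire neurons driven by white noise, with an absorbing threshold and reinjection of fired probability mass at the reset potential. It is not the paper's scheme (which treats two-dimensional adapting neurons and recurrent coupling), and all parameters are illustrative assumptions.

```python
# Minimal sketch, assumed parameters: Fokker-Planck equation for the voltage
# density p(v, t) of uncoupled leaky integrate-and-fire neurons with white
# noise input; the probability flux through threshold gives the firing rate.
import numpy as np

v_th, v_reset, tau = 1.0, 0.0, 1.0        # threshold, reset, membrane time constant
mu, sigma = 1.2, 0.3                      # mean drive and noise amplitude (assumptions)
v = np.linspace(-1.0, v_th, 401)
dv = v[1] - v[0]
dt = 0.2 * dv**2 / sigma**2               # small step for explicit stability
reset_idx = int(np.argmin(np.abs(v - v_reset)))

p = np.exp(-(v - 0.2) ** 2 / 0.02)        # arbitrary initial density
p /= p.sum() * dv

def step(p):
    drift = (mu - v) / tau * p                        # advective part of the flux
    J = drift - 0.5 * sigma**2 * np.gradient(p, dv)   # total probability flux J(v, t)
    p_new = p - dt * np.gradient(J, dv)               # dp/dt = -dJ/dv
    rate = max(J[-1], 0.0)                            # flux through v_th = population rate
    p_new[-1] = 0.0                                   # absorbing boundary at threshold
    p_new[0] = 0.0                                    # crude closed lower boundary
    p_new[reset_idx] += rate * dt / dv                # reinject fired mass at v_reset
    return p_new, rate

rates = []
for _ in range(50000):
    p, r = step(p)
    rates.append(r)
print(f"quasi-steady population rate ~ {np.mean(rates[-5000:]):.3f} spikes per tau")
```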

    Leaders do not look back, or do they?

    We study the effect of adding to a directed chain of interconnected systems a directed feedback from the last element in the chain to the first. The problem is closely related to the fundamental question of how a change in network topology may influence the behavior of coupled systems. We begin the analysis by investigating a simple linear system. The matrix that specifies the system dynamics is the transpose of the network Laplacian matrix, which codes the connectivity of the network. Our analysis shows that for any nonzero complex eigenvalue $\lambda$ of this matrix, the following inequality holds: $\frac{|\Im \lambda|}{|\Re \lambda|} \leq \cot\frac{\pi}{n}$. This bound is sharp, as it becomes an equality for an eigenvalue of a simple directed cycle with uniform interaction weights. The latter has the slowest decay of oscillations among all network configurations with the same number of states. The result is generalized to directed rings and chains of identical nonlinear oscillators. For directed rings, a lower bound $\sigma_c$ for the connection strengths that guarantees asymptotic synchronization is found to follow a similar pattern: $\sigma_c = \frac{1}{1-\cos(2\pi/n)}$. Numerical analysis revealed that, depending on the network size $n$, multiple dynamic regimes co-exist in the state space of the system. In addition to the fully synchronous state, a rotating wave solution occurs. The effect is observed in networks exceeding a certain critical size. The emergence of a rotating wave highlights the importance of long chains and loops in networks of oscillators: the larger the size of chains and loops, the more sensitive the network dynamics becomes to removal or addition of a single connection.
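
    The eigenvalue bound is easy to check numerically for the extremal case named in the abstract. The snippet below (not from the paper) builds the Laplacian of a directed cycle with uniform unit weights and confirms that its nonzero eigenvalues attain $|\Im \lambda|/|\Re \lambda| = \cot(\pi/n)$.

```python
# Quick numerical check (not from the paper) of the bound
# |Im λ| / |Re λ| <= cot(π/n) for the nonzero eigenvalues of the (transposed)
# Laplacian of a simple directed cycle with uniform unit weights.
import numpy as np

n = 12
P = np.roll(np.eye(n), 1, axis=1)        # adjacency matrix of the directed cycle
L = np.eye(n) - P                        # graph Laplacian (row sums are zero)
eigs = np.linalg.eigvals(L.T)
nonzero = eigs[np.abs(eigs) > 1e-9]
ratios = np.abs(nonzero.imag) / np.abs(nonzero.real)
print(f"max |Im λ|/|Re λ| = {ratios.max():.6f}, cot(π/n) = {1/np.tan(np.pi/n):.6f}")
# the two numbers agree: the directed cycle attains the bound with equality
```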

    Dynamical Systems on Networks: A Tutorial

    We give a tutorial for the study of dynamical systems on networks. We focus especially on "simple" situations that are tractable analytically, because they can be very insightful and provide useful springboards for the study of more complicated scenarios. We briefly motivate why examining dynamical systems on networks is interesting and important, and we then give several fascinating examples and discuss some theoretical results. We also briefly discuss dynamical systems on dynamical (i.e., time-dependent) networks, overview software implementations, and give an outlook on the field. Comment: 39 pages, 1 figure, submitted; more examples and discussion than the original version, some reorganization, and also more pointers to interesting directions.
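
    As a concrete illustration of the kind of analytically tractable situation such a tutorial covers, the sketch below (not taken from the tutorial) simulates linear diffusion/consensus dynamics dx/dt = -Lx on a small graph, where the decay of disagreement is governed by the smallest nonzero Laplacian eigenvalue. The graph choice and parameters are assumptions for illustration.

```python
# Minimal sketch: perhaps the simplest dynamical system on a network,
# linear consensus dynamics dx/dt = -L x, whose convergence toward the
# uniform state is governed by the algebraic connectivity lambda_2.
import numpy as np
import networkx as nx

G = nx.karate_club_graph()
L = nx.laplacian_matrix(G).toarray().astype(float)
lam2 = np.sort(np.linalg.eigvalsh(L))[1]          # algebraic connectivity

rng = np.random.default_rng(0)
x = rng.standard_normal(G.number_of_nodes())      # random initial opinions/states
dt = 0.01
spread = []
for _ in range(2000):
    x = x - dt * (L @ x)                          # Euler step of dx/dt = -L x
    spread.append(np.ptp(x))                      # deviation from consensus

# disagreement decays roughly like exp(-lambda_2 * t)
print(f"lambda_2 = {lam2:.3f}, spread after t = 20: {spread[-1]:.2e}")
```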

    A Model of Stimulus-Specific Neural Assemblies in the Insect Antennal Lobe

    It has been proposed that synchronized neural assemblies in the antennal lobe of insects encode the identity of olfactory stimuli. In response to an odor, some projection neurons exhibit synchronous firing, phase-locked to the oscillations of the field potential, whereas others do not. Experimental data indicate that neural synchronization and field oscillations are induced by fast GABAA-type inhibition, but it remains unclear how desynchronization occurs. We hypothesize that slow inhibition plays a key role in desynchronizing projection neurons. Because synaptic noise is believed to be the dominant factor that limits neuronal reliability, we consider a computational model of the antennal lobe in which a population of oscillatory neurons interact through unreliable GABAA and GABAB inhibitory synapses. From theoretical analysis and extensive computer simulations, we show that transmission failures at slow GABAB synapses make the neural response unpredictable. Depending on the balance between GABAA and GABAB inputs, particular neurons may either synchronize or desynchronize. These findings suggest a wiring scheme that triggers stimulus-specific synchronized assemblies. Inhibitory connections are set by Hebbian learning and selectively activated by stimulus patterns to form a spiking associative memory whose storage capacity is comparable to that of classical binary-coded models. We conclude that fast inhibition acts in concert with slow inhibition to reformat the glomerular input into odor-specific synchronized neural assemblies.
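
    The interplay of fast and slow inhibition described above can be caricatured in a few lines. The sketch below is not the paper's model; it drives simple integrate-and-fire projection neurons with a periodic fast (GABA_A-like) inhibitory volley plus a slow (GABA_B-like) conductance that fails stochastically, and then asks which neurons remain phase-locked to the fast rhythm. All parameters are illustrative assumptions.

```python
# Minimal sketch, assumed parameters: leaky integrate-and-fire projection
# neurons receiving periodic fast (GABA_A-like) inhibition plus a slow
# (GABA_B-like) inhibition whose transmission fails stochastically. Neurons
# whose slow inhibition often fails tend to stay locked to the fast rhythm.
import numpy as np

rng = np.random.default_rng(1)
n, T, dt = 50, 2000.0, 0.1                # neurons, simulated time (ms), step (ms)
tau_m, v_th, v_reset = 20.0, 1.0, 0.0
drive = 1.3                               # suprathreshold input
period_lfp = 50.0                         # ~20 Hz field oscillation
tau_A, tau_B = 5.0, 150.0                 # fast / slow inhibition time constants
g_A, g_B = 0.6, 0.4                       # inhibition strengths
p_fail = 0.5                              # GABA_B transmission failure probability

v = rng.random(n)
sA = 0.0                                  # shared fast inhibitory conductance
sB = np.zeros(n)                          # per-neuron slow inhibitory conductance
phases = [[] for _ in range(n)]

for step in range(int(T / dt)):
    t = step * dt
    if (t % period_lfp) < dt:             # one fast inhibitory volley per cycle
        sA += 1.0
        sB += (rng.random(n) > p_fail)    # slow component fails on some neurons
    sA -= dt / tau_A * sA
    sB -= dt / tau_B * sB
    v += dt / tau_m * (drive - v - g_A * sA * v - g_B * sB * v)
    fired = v >= v_th
    v[fired] = v_reset
    for i in np.flatnonzero(fired):
        phases[i].append((t % period_lfp) / period_lfp * 2 * np.pi)

# phase-locking index per neuron: length of the mean resultant vector
locking = np.array([np.abs(np.mean(np.exp(1j * np.array(ph)))) if ph else 0.0
                    for ph in phases])
print(f"{np.sum(locking > 0.7)} of {n} neurons phase-locked to the fast rhythm")
```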

    Recovery time after localized perturbations in complex dynamical networks

    Maintaining the synchronous motion of dynamical systems interacting on complex networks is often critical to their functionality. However, real-world networked dynamical systems operating synchronously are prone to random perturbations driving the system to arbitrary states within the corresponding basin of attraction, thereby leading to epochs of desynchronized dynamics with a priori unknown durations. Thus, it is highly relevant to have an estimate of the duration of such transient phases before the system returns to synchrony, following a random perturbation to the dynamical state of any particular node of the network. We address this issue here by proposing the framework of single-node recovery time (SNRT) which provides an estimate of the relative time scales underlying the transient dynamics of the nodes of a network during its restoration to synchrony. We utilize this in differentiating the particularly slow nodes of the network from the relatively fast nodes, thus identifying the critical nodes which when perturbed lead to significantly enlarged recovery time of the system before resuming synchronized operation. Further, we reveal explicit relationships between the SNRT values of a network, and its global relaxation time when starting all the nodes from random initial conditions. Earlier work on relaxation time generally focused on investigating its dependence on macroscopic topological properties of the respective network. However, we employ the proposed concept for deducing microscopic relationships between topological features of nodes and their respective SNRT values. The framework of SNRT is further extended to a measure of resilience of the different nodes of a networked dynamical system. We demonstrate the potential of SNRT in networks of Rössler oscillators on paradigmatic topologies and a model of the power grid of the United Kingdom with second-order Kuramoto-type nodal dynamics, illustrating the conceivable practical applicability of the proposed concept. Funding: Bundesministerium für Bildung und Forschung (https://doi.org/10.13039/501100002347); Deutsche Forschungsgemeinschaft (https://doi.org/10.13039/501100001659). Peer Reviewed.
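
    A minimal sketch of the single-node recovery time idea (not the paper's implementation) is given below: relax a Kuramoto network to its synchronized state, kick one node's phase, and time how long the order parameter takes to return within a tolerance; repeating this for every node ranks slow against fast nodes. The graph, coupling, and tolerance values are assumptions.

```python
# Minimal sketch, assumed parameters: single-node recovery time (SNRT)
# for first-order Kuramoto oscillators on a small connected graph.
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
G = nx.connected_watts_strogatz_graph(30, 4, 0.2, seed=2)
A = nx.to_numpy_array(G)
n = A.shape[0]
omega = 0.1 * rng.standard_normal(n)      # heterogeneous natural frequencies
K, dt = 0.5, 0.01                         # coupling strength and time step

def step(theta):
    diff = theta[None, :] - theta[:, None]            # diff[i, j] = theta_j - theta_i
    return theta + dt * (omega + K * (A * np.sin(diff)).sum(axis=1))

def order_param(theta):
    return np.abs(np.mean(np.exp(1j * theta)))

theta_sync = rng.uniform(0, 2 * np.pi, n)             # relax to the synchronized state
for _ in range(20000):
    theta_sync = step(theta_sync)
R_sync = order_param(theta_sync)

def snrt(node, kick=2.0, tol=0.02, t_max=200.0):
    theta = theta_sync.copy()
    theta[node] += kick                               # localized phase perturbation
    t = 0.0
    while t < t_max:
        theta = step(theta)
        t += dt
        if abs(order_param(theta) - R_sync) < tol:    # back within tolerance of synchrony
            return t
    return t_max

times = np.array([snrt(i) for i in range(n)])
slowest = np.argsort(times)[-3:]
print("slowest-recovering nodes:", slowest, "their degrees:", A.sum(axis=1)[slowest])
```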

    The Kuramoto model in complex networks

    181 pages, 48 figures. In Press, Accepted Manuscript, Physics Reports, 2015. Acknowledgments: We are indebted to B. Sonnenschein, E. R. dos Santos, P. Schultz, C. Grabow, M. Ha and C. Choi for insightful and helpful discussions. T.P. acknowledges FAPESP (No. 2012/22160-7 and No. 2015/02486-3) and IRTG 1740. P.J. thanks funding from the China Scholarship Council (CSC). F.A.R. acknowledges CNPq (Grant No. 305940/2010-4) and FAPESP (Grants No. 2011/50761-2 and No. 2013/26416-9) for financial support. J.K. would like to acknowledge IRTG 1740 (DFG and FAPESP). Peer reviewed. Preprint.

    Information processing in biological complex systems: a view to bacterial and neural complexity

    This thesis is a study of information processing in biological complex systems seen from the perspective of dynamical complexity (the degree of statistical independence of a system as a whole with respect to its components due to its causal structure). In particular, we investigate the influence of signaling functions in cell-to-cell communication in bacterial and neural systems. For each case, we determine the spatial and causal dependencies in the system dynamics from an information-theoretic point of view and relate them to the system's physiological capabilities. The main research content is presented in three main chapters. First, we revisit previous theoretical work on synchronization, multi-stability, and clustering of a population of coupled synthetic genetic oscillators via quorum sensing. We provide an extensive numerical analysis of the spatio-temporal interactions, and determine conditions in which the causal structure of the system leads to high dynamical complexity in terms of associated metrics. Our results indicate that this complexity is maximally receptive at transitions between dynamical regimes, and maximized for transient multi-cluster oscillations associated with chaotic behaviour. Next, we introduce a model of a neuron-astrocyte network with bidirectional coupling using glutamate-induced calcium signaling. This study is focused on the impact of astrocyte-mediated potentiation on synaptic transmission. Our findings suggest that the information generated by the joint activity of the population of neurons is irreducible to its independent contribution due to the role of astrocytes. We relate these results with the shared information modulated by the spike synchronization imposed by the bidirectional feedback between neurons and astrocytes. It is shown that the dynamical complexity is maximized when there is a balance between spike correlation and spontaneous spiking activity. Finally, the previous observations on neuron-glial signaling are extended to a large-scale system with community structure. Here we use a multi-scale approach to account for spatiotemporal features of astrocytic signaling coupled with clusters of neurons. We investigate the interplay of astrocytes and spike-timing-dependent plasticity at local and global scales in the emergence of complexity and neuronal synchronization. We demonstrate the utility of astrocytes and learning in improving the encoding of external stimuli, as well as their ability to favour the integration of information at synaptic timescales to exhibit a high intrinsic causal structure at the system level. Our proposed approach and observations point to potential effects of the astrocytes for sustaining more complex information processing in the neural circuitry.
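
    The notion of dynamical complexity used above can be made concrete with a toy calculation. The sketch below is not the thesis' metric; it simulates a small coupled linear stochastic system and computes the Gaussian total correlation (multi-information), a simple measure of how statistically irreducible the joint activity is to its individual components. The coupling matrix and all parameters are illustrative assumptions.

```python
# Minimal sketch, not the thesis' metric: Gaussian total correlation
# (multi-information) of a small coupled linear stochastic system, used as a
# crude proxy for how irreducible the joint dynamics are to the components.
import numpy as np

rng = np.random.default_rng(3)
n, T = 5, 50000
W = rng.standard_normal((n, n))                    # random coupling between components
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))    # rescale so the process is stationary

x = np.zeros(n)
X = np.empty((T, n))
for t in range(T):                                 # first-order autoregressive dynamics
    x = W @ x + rng.standard_normal(n)
    X[t] = x

cov = np.cov(X.T)
# multi-information for a Gaussian: sum of marginal entropies minus joint entropy
total_corr = 0.5 * (np.sum(np.log(np.diag(cov))) - np.linalg.slogdet(cov)[1])
print(f"total correlation of the coupled system: {total_corr:.3f} nats")
```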