
    Connectivity Influences on Nonlinear Dynamics in Weakly-Synchronized Networks: Insights from Rössler Systems, Electronic Chaotic Oscillators, Model and Biological Neurons

    Natural and engineered networks, such as interconnected neurons, ecological and social networks, coupled oscillators, wireless terminals and power loads, are characterized by an appreciable heterogeneity in the local connectivity around each node. For instance, in both elementary structures such as stars and complex graphs having scale-free topology, a minority of elements are linked to the rest of the network disproportionately strongly. While the effect of the arrangement of structural connections on the emergent synchronization pattern has been studied extensively, considerably less is known about its influence on the temporal dynamics unfolding within each node. Here, we present a comprehensive investigation across diverse simulated and experimental systems, encompassing star and complex networks of Rössler systems, coupled hysteresis-based electronic oscillators, microcircuits of leaky integrate-and-fire model neurons, and recordings from in-vitro cultures of spontaneously-growing neuronal networks. We systematically consider a range of dynamical measures, including the correlation dimension, nonlinear prediction error, permutation entropy, and other information-theoretical indices. The empirical evidence gathered reveals that, under conditions of weak synchronization, wherein one observes significantly differentiated dynamics rather than a collective behavior, denser connectivity tends to locally promote the emergence of stronger signatures of nonlinear dynamics. In deterministic systems, transitions to chaos and the generation of higher-dimensional signals were observed; when the coupling is stronger, however, this relationship may be lost or even inverted. In systems with a strong stochastic component, the generation of more temporally-organized activity could be induced. These observations have many potential implications across diverse fields of basic and applied science, for example, in the design of distributed sensing systems based on wireless coupled oscillators, in network identification and control, as well as in the interpretation of neuroscientific and other dynamical data.
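    As an illustration of one of the dynamical measures listed in this abstract, the sketch below computes a normalized permutation entropy for a scalar time series. It is a generic textbook-style implementation, not the authors' analysis code; the embedding order and delay are arbitrary illustrative choices.

```python
import math
from itertools import permutations

import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D time series.

    Counts the relative frequencies of the ordinal patterns of length
    `order` (with embedding delay `delay`) and returns their Shannon
    entropy, normalized by log(order!) so the result lies in [0, 1].
    """
    x = np.asarray(x, dtype=float)
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - (order - 1) * delay):
        window = x[i:i + order * delay:delay]
        counts[tuple(int(k) for k in np.argsort(window))] += 1
    freqs = np.array([c for c in counts.values() if c > 0], dtype=float)
    probs = freqs / freqs.sum()
    return float(-np.sum(probs * np.log(probs)) / math.log(math.factorial(order)))

# A regular signal scores low, a noisy one scores close to 1.
rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 2000)
print(permutation_entropy(np.sin(t)))               # low
print(permutation_entropy(rng.normal(size=2000)))   # close to 1
```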

    Eigenvector Synchronization, Graph Rigidity and the Molecule Problem

    The graph realization problem has received a great deal of attention in recent years, due to its importance in applications such as wireless sensor networks and structural biology. In this paper, we extend previous work and propose the 3D-ASAP algorithm for the graph realization problem in \mathbb{R}^3, given a sparse and noisy set of distance measurements. 3D-ASAP is a divide-and-conquer, non-incremental and non-iterative algorithm, which integrates local distance information into a global structure determination. Our approach starts with identifying, for every node, a subgraph of its 1-hop neighborhood graph which can be accurately embedded in its own coordinate system. In the noise-free case, the computed coordinates of the sensors in each patch must agree with their global positioning up to some unknown rigid motion, that is, up to translation, rotation and possibly reflection. In other words, to every patch there corresponds an element of the Euclidean group Euc(3) of rigid transformations in \mathbb{R}^3, and the goal is to estimate the group elements that will properly align all the patches in a globally consistent way. Furthermore, 3D-ASAP successfully incorporates information specific to the molecule problem in structural biology, in particular information on known substructures and their orientation. In addition, we also propose 3D-SP-ASAP, a faster version of 3D-ASAP, which uses a spectral partitioning algorithm as a preprocessing step for dividing the initial graph into smaller subgraphs. Our extensive numerical simulations show that 3D-ASAP and 3D-SP-ASAP are very robust to high levels of noise in the measured distances and to sparse connectivity in the measurement graph, and compare favorably to similar state-of-the-art localization algorithms. Comment: 49 pages, 8 figures.
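    The per-patch alignment step described above, estimating an element of Euc(3) between a locally embedded patch and its globally positioned copy, can be illustrated with a least-squares (Kabsch/Procrustes) fit. This is only a sketch of that single step on assumed synthetic data, not the 3D-ASAP eigenvector-synchronization pipeline itself; reflections are excluded here for simplicity, although the abstract notes they may occur.

```python
import numpy as np

def estimate_rigid_transform(P, Q):
    """Least-squares proper rigid motion (R, t) such that Q ≈ R @ P + t.

    P and Q are (3, n) arrays of corresponding points (one point per column).
    Uses the SVD-based Kabsch method; the determinant correction keeps R a
    rotation, so a mirrored patch would have to be handled separately.
    """
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)
    H = (Q - q_mean) @ (P - p_mean).T            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt
    t = q_mean - R @ p_mean
    return R, t

# Synthetic patch: local coordinates P, global coordinates Q (rotated, shifted, noisy).
rng = np.random.default_rng(1)
P = rng.normal(size=(3, 10))
R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))
R_true *= np.sign(np.linalg.det(R_true))         # force a proper rotation
t_true = rng.normal(size=(3, 1))
Q = R_true @ P + t_true + 0.01 * rng.normal(size=(3, 10))

R_est, t_est = estimate_rigid_transform(P, Q)
print(np.allclose(R_est, R_true, atol=0.05), np.allclose(t_est, t_true, atol=0.05))
```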

    Experimental analysis and computational modeling of interburst intervals in spontaneous activity of cortical neuronal culture

    Rhythmic bursting is the most striking behavior of cultured cortical networks and may start in the second week after plating. In this study, we focus on the intervals between spontaneously occurring bursts, and compare experimentally recorded values with model simulations. In the models, we use standard neurons and synapses, with physiologically plausible parameters taken from the literature. All networks had a random recurrent architecture with sparsely connected neurons. The number of neurons varied between 500 and 5,000. We find that network models with homogeneous synaptic strengths produce either asynchronous spiking or stable regular bursts. The latter, however, have interburst intervals in a range not seen in recordings. By increasing the synaptic strength in a (randomly chosen) subset of neurons, our simulations show interburst intervals (IBIs) that agree better with in vitro experiments. In this regime, called weakly synchronized, the models produce irregular network bursts, which are initiated by neurons with relatively stronger synapses. In some noise-driven networks, a subthreshold, deterministic input is applied to the neurons with strong synapses to mimic pacemaker network drive. We show that models with such “intrinsically active neurons” (pacemaker-driven models) tend to generate IBIs that are determined by the frequency of the fastest pacemaker and do not resemble experimental data. Alternatively, noise-driven models yield realistic IBIs. Generally, we found that large-scale noise-driven neuronal network models required synaptic strengths with a bimodal distribution to reproduce the experimentally observed IBI range. Our results imply that findings obtained from small network models cannot simply be extrapolated to models of more realistic size. Synaptic strengths in large-scale neuronal network simulations need readjustment to a bimodal distribution, whereas small networks do not require such a change.
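    A minimal sketch of the kind of noise-driven model discussed above: sparsely connected leaky integrate-and-fire neurons, a bimodal distribution of synaptic strengths (a small subset of neurons with stronger outgoing synapses), and interburst intervals read off from the smoothed population spike count. All parameter values are illustrative assumptions, not the paper's fitted values, and would need tuning to land in the weakly synchronized, realistically bursting regime the authors report.

```python
import numpy as np

rng = np.random.default_rng(2)

# Network size and coupling (illustrative values only).
N, p_conn = 500, 0.05                      # neurons, connection probability
tau_m, v_th, v_reset = 20.0, 20.0, 0.0     # membrane time constant (ms), threshold/reset (mV)
dt, T = 0.5, 30_000.0                      # time step and total duration (ms)
w_weak, w_strong, frac_strong = 0.8, 4.0, 0.1   # bimodal synaptic strengths (mV)

conn = rng.random((N, N)) < p_conn
np.fill_diagonal(conn, False)
strong = rng.random(N) < frac_strong
w_out = np.where(strong, w_strong, w_weak)      # strength of each neuron's outgoing synapses
W = conn * w_out[None, :]                       # W[i, j]: weight from neuron j onto neuron i

v = rng.uniform(v_reset, v_th, N)
pop_count = []
for _ in range(int(T / dt)):
    spikes = v >= v_th
    v[spikes] = v_reset
    noise = 2.0 * np.sqrt(dt) * rng.normal(size=N)          # diffusive noise drive
    v += dt * (-v / tau_m) + W @ spikes.astype(float) + noise
    pop_count.append(int(spikes.sum()))

# Interburst intervals from the smoothed population rate.
rate = np.convolve(pop_count, np.ones(20) / 20, mode="same")    # ~10 ms window
bursting = rate > 0.05 * N                                      # burst: >5% of neurons active
onsets = np.flatnonzero(np.diff(bursting.astype(int)) == 1) * dt
if len(onsets) > 1:
    print(f"{len(onsets)} bursts, mean IBI = {np.diff(onsets).mean():.0f} ms")
else:
    print("no bursts in this parameter regime; adjust noise or synaptic strengths")
```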

    The geometry of spontaneous spiking in neuronal networks

    The mathematical theory of pattern formation in electrically coupled networks of excitable neurons forced by small noise is presented in this work. Using the Freidlin-Wentzell large deviation theory for randomly perturbed dynamical systems and elements of algebraic graph theory, we identify and analyze the main regimes in the network dynamics in terms of the key control parameters: excitability, coupling strength, and network topology. The analysis reveals the geometry of spontaneous dynamics in electrically coupled networks. Specifically, we show that the location of the minima of a certain continuous function on the surface of the unit n-cube encodes the most likely activity patterns generated by the network. By studying how the minima of this function evolve under variation of the coupling strength, we describe the principal transformations in the network dynamics. The minimization problem is also used for the quantitative description of the main dynamical regimes and transitions between them. In particular, for the weak and strong coupling regimes, we present asymptotic formulas for the network activity rate as a function of the coupling strength and the degree of the network. The variational analysis is complemented by a stability analysis of the synchronous state in the strong coupling regime. The stability estimates reveal the contribution of the network connectivity and the properties of the cycle subspace associated with the graph of the network to its synchronization properties. This work is motivated by experimental and modeling studies of the ensemble of neurons in the Locus Coeruleus, a nucleus in the brainstem involved in the regulation of cognitive performance and behavior.
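    The central quantity above, a spontaneous activity rate that depends on coupling strength and network degree, can be illustrated with a generic toy experiment: diffusively coupled overdamped units driven by small noise, where an escape over a potential barrier counts as one spontaneous excitation. The potential, the ring topology, the firing threshold and all numerical values below are assumptions for illustration; this is neither the Locus Coeruleus model nor the paper's variational formulas.

```python
import numpy as np

def activity_rate(coupling, n=10, sigma=0.3, T=1000.0, dt=0.01, seed=0):
    """Spontaneous escape rate per unit in a ring of n diffusively coupled,
    noise-driven overdamped units.

    Each unit rests at the stable state x = 0 of U(x) = x**2 / 2 - x**3 / 3
    (barrier at x = 1, height 1/6); noise occasionally drives it over the
    barrier, which is counted as one spontaneous excitation before reset.
    """
    rng = np.random.default_rng(seed)
    # Graph Laplacian of a ring (every node has degree 2); other topologies
    # would simply change L.
    L = 2 * np.eye(n) - np.roll(np.eye(n), 1, 0) - np.roll(np.eye(n), -1, 0)
    x = np.zeros(n)
    escapes = 0
    for _ in range(int(T / dt)):
        drift = -(x - x**2) - coupling * (L @ x)       # -U'(x) plus diffusive coupling
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=n)
        fired = x > 1.5                                # safely past the barrier at x = 1
        escapes += int(fired.sum())
        x[fired] = 0.0                                 # reset after an excitation
    return escapes / (n * T)

# Stronger coupling pulls units back toward their quiescent neighbors,
# lowering the spontaneous activity rate.
for g in (0.0, 0.05, 0.2):
    print(f"coupling {g}: rate ~ {activity_rate(g):.4f}")
```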

    Distributed Decision Through Self-Synchronizing Sensor Networks in the Presence of Propagation Delays and Asymmetric Channels

    In this paper we propose and analyze a distributed algorithm for achieving globally optimal decisions, either estimation or detection, through a self-synchronization mechanism among linearly coupled integrators initialized with local measurements. We model the interaction among the nodes as a directed graph with weights (possibly) dependent on the radio channels, and we pay special attention to the effect of the propagation delay occurring in the exchange of data among sensors, as a function of the network geometry. We derive necessary and sufficient conditions for the proposed system to reach a consensus on globally optimal decision statistics. One of the major results proved in this work is that a consensus is reached with exponential convergence speed for any bounded delay condition if and only if the directed graph is quasi-strongly connected. We provide a closed-form expression for the global consensus, showing that the effect of delays is, in general, the introduction of a bias in the final decision. Finally, we exploit our closed-form expression to devise a double-step consensus mechanism able to provide an unbiased estimate with minimum extra complexity, without the need to know or estimate the channel parameters. Comment: To be published in IEEE Transactions on Signal Processing.
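    The flavor of this result, coupled nodes reaching a common value whose offset depends on the propagation delays, can be reproduced with a small discrete-time toy. This is not the authors' continuous-time system or their closed-form bias expression; the graph, weights, delays and step size below are arbitrary assumptions, chosen only so that the iteration converges and a delay-induced bias can be seen by comparison with a delay-free run.

```python
import numpy as np

def run_consensus(A, D, x0, eps=0.05, steps=4000):
    """Discrete-time linearly coupled integrators with per-link delays.

    x_i(k+1) = x_i(k) + eps * sum_j A[i, j] * (x_j(k - D[i, j]) - x_i(k)).
    Returns the final state vector.
    """
    n, d_max = len(x0), int(D.max())
    hist = np.tile(np.asarray(x0, float), (d_max + 1, 1))   # constant prehistory
    for _ in range(steps):
        x = hist[-1]
        delayed = np.array([[hist[-1 - D[i, j], j] for j in range(n)] for i in range(n)])
        x_next = x + eps * np.sum(A * (delayed - x[:, None]), axis=1)
        hist = np.vstack([hist, x_next])[-(d_max + 1):]     # keep only the needed history
    return hist[-1]

# Strongly connected directed graph: A[i, j] > 0 means node i listens to node j.
A = np.array([[0.0, 1.0, 0.0, 0.5],
              [0.8, 0.0, 1.2, 0.0],
              [0.0, 0.6, 0.0, 1.0],
              [1.1, 0.0, 0.7, 0.0]])
D = np.array([[0, 2, 0, 3],                 # propagation delays (in time steps)
              [1, 0, 4, 0],
              [0, 2, 0, 1],
              [3, 0, 2, 0]])
x0 = np.array([4.1, 5.3, 4.8, 5.9])         # local measurements

print("with delays   :", np.round(run_consensus(A, D, x0), 4))
print("without delays:", np.round(run_consensus(A, np.zeros_like(D), x0), 4))
# Both runs agree on a common value at every node, but the delayed run generally
# settles on a slightly shifted (biased) value, which is the effect the paper
# quantifies and then removes with its double-step mechanism.
```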