1,364 research outputs found

    Data based identification and prediction of nonlinear and complex dynamical systems

    We thank Dr. R. Yang (formerly at ASU), Dr. R.-Q. Su (formerly at ASU), and Mr. Zhesi Shen for their contributions to a number of original papers on which this Review is partly based. This work was supported by ARO under Grant No. W911NF-14-1-0504. W.-X. Wang was also supported by NSFC under Grants No. 61573064 and No. 61074116, as well as by the Fundamental Research Funds for the Central Universities, Beijing Nova Programme. Peer reviewed. Postprint.

    Dynamical principles in neuroscience

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience? This work was supported by NSF Grant No. NSF/EIA-0130708, and Grant No. PHY 0414174; NIH Grant No. 1 R01 NS50945 and Grant No. NS40110; MEC BFI2003-07276, and Fundación BBVA

    Invariant template matching in systems with spatiotemporal coding: a vote for instability

    We consider the design of a pattern recognition system that matches templates to images, both of which are spatially sampled and encoded as temporal sequences. The image is subject to a combination of various perturbations. These include ones that can be modeled as parameterized uncertainties such as image blur, luminance, translation, and rotation as well as unmodeled ones. Biological and neural systems require that these perturbations be processed through a minimal number of channels by simple adaptation mechanisms. We found that the most suitable mathematical framework to meet this requirement is that of weakly attracting sets. This framework provides us with a normative and unifying solution to the pattern recognition problem. We analyze the consequences of its explicit implementation in neural systems. Several properties inherent to the systems designed in accordance with our normative mathematical argument coincide with known empirical facts. This is illustrated in mental rotation, visual search and blur/intensity adaptation. We demonstrate how our results can be applied to a range of practical problems in template matching and pattern recognition. Comment: 52 pages, 12 figures

    Revealing networks from dynamics: an introduction

    What can we learn from the collective dynamics of a complex network about its interaction topology? Taking the perspective from nonlinear dynamics, we briefly review recent progress on how to infer structural connectivity (direct interactions) from accessing the dynamics of the units. Potential applications range from interaction networks in physics, to chemical and metabolic reactions, protein and gene regulatory networks as well as neural circuits in biology and electric power grids or wireless sensor networks in engineering. Moreover, we briefly mention some standard ways of inferring effective or functional connectivity. Comment: Topical review, 48 pages, 7 figures
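The inference problem this abstract describes can be illustrated with a toy example. The model, parameters, and regression approach below are illustrative choices, not taken from the review: a Kuramoto-type network is simulated, and the adjacency matrix is reconstructed by regressing measured phase velocities on the pairwise coupling terms.

```python
import numpy as np

# Illustrative sketch: recover the links of a Kuramoto-type network,
#   dtheta_i/dt = omega_i + sum_j A_ij * sin(theta_j - theta_i) + noise,
# from observed trajectories alone, via least squares.
rng = np.random.default_rng(0)
n, T, dt, sigma = 5, 4000, 0.01, 0.3
A_true = (rng.random((n, n)) < 0.4).astype(float)
np.fill_diagonal(A_true, 0.0)
omega = rng.uniform(0.5, 3.0, n)

theta = rng.uniform(0, 2 * np.pi, n)
thetas = np.empty((T, n))
for t in range(T):
    thetas[t] = theta
    pairwise = np.sin(theta[None, :] - theta[:, None])  # [i, j] = sin(theta_j - theta_i)
    drift = omega + (A_true * pairwise).sum(axis=1)
    theta = theta + dt * drift + np.sqrt(dt) * sigma * rng.standard_normal(n)

# Finite-difference estimate of the phase velocities.
vel = (thetas[1:] - thetas[:-1]) / dt

# Per node: one constant column (its natural frequency) plus one sine column
# per potential incoming link; nonzero fitted coefficients reveal the links.
A_est = np.zeros_like(A_true)
for i in range(n):
    others = [j for j in range(n) if j != i]
    X = np.column_stack(
        [np.ones(T - 1)] + [np.sin(thetas[:-1, j] - thetas[:-1, i]) for j in others]
    )
    coef, *_ = np.linalg.lstsq(X, vel[:, i], rcond=None)
    A_est[i, others] = coef[1:]

recovered = (A_est > 0.5).astype(float)  # threshold coefficients to a link pattern
```

This only works because the model class (sine coupling) is assumed known; the review discusses what can be done when it is not.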

    Optimal Subharmonic Entrainment

    For many natural and engineered systems, a central function or design goal is the synchronization of one or more rhythmic or oscillating processes to an external forcing signal, which may be periodic on a different time-scale from the actuated process. Such subharmonic synchrony, which is dynamically established when N control cycles occur for every M cycles of a forced oscillator, is referred to as N:M entrainment. In many applications, entrainment must be established in an optimal manner, for example by minimizing control energy or the transient time to phase locking. We present a theory for deriving inputs that establish subharmonic N:M entrainment of general nonlinear oscillators, or of collections of rhythmic dynamical units, while optimizing such objectives. Ordinary differential equation models of oscillating systems are reduced to phase variable representations, each of which consists of a natural frequency and phase response curve. Formal averaging and the calculus of variations are then applied to such reduced models in order to derive optimal subharmonic entrainment waveforms. The optimal entrainment of a canonical model for a spiking neuron is used to illustrate this approach, which is readily extended to arbitrary oscillating systems.
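The N:M locking mechanism can be demonstrated on a minimal phase model. The toy forcing below is chosen for simplicity, not the paper's optimal waveform derived via averaging and the calculus of variations: a single phase oscillator with natural frequency 0.9 is driven at frequency 2.0 into 2:1 entrainment, so its mean frequency is pulled to 1.0.

```python
import numpy as np

# Minimal 2:1 subharmonic entrainment of a phase-reduced oscillator.
# The coupling sin(Omega*t - (N/M)*phi) is an illustrative choice; once the
# slow phase difference psi = Omega*t - (N/M)*phi locks to a fixed point,
# the oscillator runs at (M/N)*Omega instead of its natural frequency.
omega = 0.9          # natural frequency of the oscillator (rad/s)
Omega = 2.0          # frequency of the periodic control input
eps = 0.5            # forcing strength
N, M = 2, 1          # target: N control cycles per M oscillator cycles

dt, T = 1e-3, 200_000
phi, t = 0.0, 0.0
phis = np.empty(T)
for k in range(T):
    phis[k] = phi
    phi += dt * (omega + eps * np.sin(Omega * t - (N / M) * phi))
    t += dt

# Mean frequency over the second half of the run, after locking.
rate = (phis[-1] - phis[T // 2]) / (dt * (T - T // 2))
print(rate)  # ≈ Omega * M / N = 1.0, not omega = 0.9
```

Locking requires the detuning to be small relative to the forcing: here the slow variable obeys dpsi/dt = 0.2 - sin(psi), which has a stable fixed point; for larger detuning the same input fails to entrain.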

    Control and Synchronization of Neuron Ensembles

    Synchronization of oscillations is a phenomenon prevalent in natural, social, and engineering systems. Controlling synchronization of oscillating systems is motivated by a wide range of applications from neurological treatment of Parkinson's disease to the design of neurocomputers. In this article, we study the control of an ensemble of uncoupled neuron oscillators described by phase models. We examine controllability of such a neuron ensemble for various phase models and, furthermore, study the related optimal control problems. In particular, by employing Pontryagin's maximum principle, we analytically derive optimal controls for spiking single- and two-neuron systems, and analyze the applicability of the latter to an ensemble system. Finally, we present a robust computational method for optimal control of spiking neurons based on pseudospectral approximations. The methodology developed here is universal to the control of general nonlinear phase oscillators. Comment: 29 pages, 6 figures

    Nonlinear brain dynamics as macroscopic manifestation of underlying many-body field dynamics

    Neural activity patterns related to behavior occur at many scales in time and space from the atomic and molecular to the whole brain. Here we explore the feasibility of interpreting neurophysiological data in the context of many-body physics by using tools that physicists have devised to analyze comparable hierarchies in other fields of science. We focus on a mesoscopic level that offers a multi-step pathway between the microscopic functions of neurons and the macroscopic functions of brain systems revealed by hemodynamic imaging. We use electroencephalographic (EEG) records collected from high-density electrode arrays fixed on the epidural surfaces of primary sensory and limbic areas in rabbits and cats trained to discriminate conditioned stimuli (CS) in the various modalities. High temporal resolution of EEG signals with the Hilbert transform gives evidence for diverse intermittent spatial patterns of amplitude (AM) and phase modulations (PM) of carrier waves that repeatedly re-synchronize in the beta and gamma ranges at near zero time lags over long distances. The dominant mechanism for neural interactions by axodendritic synaptic transmission should impose distance-dependent delays on the EEG oscillations owing to finite propagation velocities. It does not. EEGs instead show evidence for anomalous dispersion: the existence in neural populations of a low velocity range of information and energy transfers, and a high velocity range of the spread of phase transitions. This distinction labels the phenomenon but does not explain it. In this report we explore the analysis of these phenomena using concepts of energy dissipation, the maintenance by cortex of multiple ground states corresponding to AM patterns, and the exclusive selection by spontaneous breakdown of symmetry (SBS) of single states in sequences. Comment: 31 pages

    Synchronization in complex networks

    Synchronization processes in populations of locally interacting elements are the focus of intense research in physical, biological, chemical, technological and social systems. The many efforts devoted to understanding synchronization phenomena in natural systems now take advantage of the recent theory of complex networks. In this review, we report the advances in the comprehension of synchronization phenomena when oscillating elements are constrained to interact in a complex network topology. We also overview the new emergent features coming out from the interplay between the structure and the function of the underlying pattern of connections. Extensive numerical work as well as analytical approaches to the problem are presented. Finally, we review several applications of synchronization in complex networks to different disciplines: biological systems and neuroscience, engineering and computer science, and economy and social sciences. Comment: Final version published in Physics Reports. More information available at http://synchronets.googlepages.com
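The canonical setting in this literature is the Kuramoto model on a graph, where coherence is measured by the order parameter r = |⟨exp(iθ)⟩|. The sketch below uses illustrative parameters (a random graph, Gaussian frequencies) to show the transition from incoherence to synchrony as the coupling strength grows.

```python
import numpy as np

# Kuramoto oscillators on an Erdos-Renyi graph: each node obeys
#   dtheta_i/dt = omega_i + (K/n) * sum_j A_ij * sin(theta_j - theta_i).
# The order parameter r is near 0 for weak coupling and near 1 above threshold.
rng = np.random.default_rng(1)
n = 50
A = np.triu((rng.random((n, n)) < 0.2).astype(float), 1)
A = A + A.T                        # symmetric adjacency matrix, no self-loops
omega = rng.normal(0.0, 0.5, n)    # heterogeneous natural frequencies

def order_parameter(K, steps=20_000, dt=0.01):
    """Integrate the network and return r = |mean(exp(i*theta))| at the end."""
    theta = rng.uniform(0, 2 * np.pi, n)
    for _ in range(steps):
        pairwise = np.sin(theta[None, :] - theta[:, None])  # sin(theta_j - theta_i)
        theta = theta + dt * (omega + (K / n) * (A * pairwise).sum(axis=1))
    return np.abs(np.exp(1j * theta).mean())

r_weak, r_strong = order_parameter(0.5), order_parameter(50.0)
print(r_weak, r_strong)  # weak coupling stays incoherent; strong coupling synchronizes
```

On a network the effective coupling per node scales with its degree, which is one entry point into the structure-function interplay this review surveys.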