
    Global computation of phase-amplitude reduction for limit-cycle dynamics

    Recent years have witnessed increasing interest in the phase-amplitude reduction of limit-cycle dynamics. Adding an amplitude coordinate to the phase coordinate makes it possible to account for the dynamics transversal to the limit cycle, thereby overcoming the main limitations of classic phase reduction (the assumptions of strong convergence to the limit cycle and of weak inputs). While previous studies mostly focus on local quantities such as infinitesimal responses, a major and limiting challenge of phase-amplitude reduction is to compute amplitude coordinates globally, throughout the basin of attraction of the limit cycle. In this paper, we propose a method to compute the full set of phase-amplitude coordinates in the large. Our method is based on the so-called Koopman (composition) operator and computes the eigenfunctions of the operator through Laplace averages (in combination with the harmonic balance method). This yields a forward-integration method that is not limited to two-dimensional systems. We illustrate the method by computing the so-called isostables of limit cycles in two-, three-, and four-dimensional state spaces, as well as their responses to strong external inputs. (Comment: 26 pages.)
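
    The flavour of such a forward-integration computation can be sketched numerically. The toy below is not the authors' implementation: it uses an assumed Stuart-Landau-type planar oscillator, whose limit cycle is the unit circle and whose transversal Floquet exponent is kappa = -2, and estimates an amplitude (isostable) coordinate by integrating forward and rescaling the decaying deviation from the cycle with the Floquet factor, a simplified Laplace-average-style construction.

```python
# Minimal sketch (illustrative system and observable, not the paper's code) of
# estimating an isostable/amplitude coordinate by forward integration.
import numpy as np
from scipy.integrate import solve_ivp

# Planar oscillator: dr/dt = r(1 - r^2), dtheta/dt = 1.
# Limit cycle: the unit circle; nontrivial Floquet exponent kappa = -2.
def vector_field(t, z):
    x, y = z
    r2 = x * x + y * y
    return [x * (1.0 - r2) - y, y * (1.0 - r2) + x]

KAPPA = -2.0   # Floquet exponent of the transversal direction (known analytically here)
T_END = 6.0    # integration horizon; exp(-kappa*T) amplifies the decaying deviation

def isostable_coordinate(x0, y0):
    """Estimate the amplitude (isostable) coordinate of an initial condition by
    integrating forward and rescaling the deviation from the cycle with exp(-kappa*t)."""
    sol = solve_ivp(vector_field, (0.0, T_END), [x0, y0], rtol=1e-10, atol=1e-12)
    xT, yT = sol.y[:, -1]
    deviation = np.hypot(xT, yT) - 1.0          # distance of r(T) from the cycle r = 1
    return np.exp(-KAPPA * T_END) * deviation   # converges as T grows (Laplace-type limit)

# Level sets of this quantity over the basin of attraction approximate the isostables.
for r0 in (0.5, 0.9, 1.1, 1.5):
    print(f"r0 = {r0:>4}: psi ~ {isostable_coordinate(r0, 0.0):+.4f}")
```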

    Structure Learning in Coupled Dynamical Systems and Dynamic Causal Modelling

    Identifying a coupled dynamical system out of many plausible candidates, each of which could serve as the underlying generator of some observed measurements, is a profoundly ill-posed problem that commonly arises when modelling real-world phenomena. In this review, we detail a set of statistical procedures for inferring the structure of nonlinear coupled dynamical systems (structure learning), which has proved useful in neuroscience research. A key focus here is the comparison of competing models of (i.e., hypotheses about) network architectures and implicit coupling functions in terms of their Bayesian model evidence. These methods are collectively referred to as dynamic causal modelling (DCM). We focus on a relatively new approach that is proving remarkably useful; namely, Bayesian model reduction (BMR), which enables rapid evaluation and comparison of models that differ in their network architecture. We illustrate the usefulness of these techniques through modelling neurovascular coupling (the cellular pathways linking neuronal and vascular systems), whose function is an active focus of research in neurobiology and in the imaging of coupled neuronal systems.
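
    Under Gaussian (Laplace) assumptions, Bayesian model reduction scores a reduced model directly from the full model's prior and posterior, without refitting. The sketch below is a schematic numpy rendering of that Gaussian identity, not SPM's routines; the means, covariances, and the "switched-off" connection are synthetic placeholders standing in for a fitted DCM.

```python
# Schematic sketch of Gaussian Bayesian model reduction (synthetic numbers, not SPM code).
import numpy as np

def bmr_delta_log_evidence(mu, C, eta, S, eta_r, S_r):
    """Change in log evidence (reduced minus full) for Gaussian densities.
    mu, C      : posterior mean/covariance of the full model
    eta, S     : prior mean/covariance of the full model
    eta_r, S_r : prior mean/covariance of the reduced model
    Returns (delta_F, reduced posterior mean, reduced posterior covariance)."""
    L, P, P_r = np.linalg.inv(C), np.linalg.inv(S), np.linalg.inv(S_r)
    L_r = L + P_r - P                                  # reduced posterior precision
    C_r = np.linalg.inv(L_r)
    mu_r = C_r @ (L @ mu + P_r @ eta_r - P @ eta)      # reduced posterior mean
    logdet = lambda M: np.linalg.slogdet(M)[1]
    dF = 0.5 * (logdet(P_r) - logdet(P) + logdet(L) - logdet(L_r)) \
       - 0.5 * (mu @ L @ mu + eta_r @ P_r @ eta_r - eta @ P @ eta - mu_r @ L_r @ mu_r)
    return dF, mu_r, C_r

# Full model: two coupling parameters with broad zero-mean priors (synthetic numbers).
eta = np.zeros(2); S = np.eye(2)
mu  = np.array([0.9, 0.05]); C = 0.05 * np.eye(2)      # pretend posterior after fitting

# Reduced model: "switch off" the second connection by shrinking its prior variance.
eta_r = np.zeros(2)
S_r   = np.diag([1.0, 1e-8])

dF, mu_r, _ = bmr_delta_log_evidence(mu, C, eta, S, eta_r, S_r)
print(f"delta log evidence (reduced - full): {dF:.2f}")  # > 0 favours pruning the connection
print("reduced posterior mean:", mu_r)
```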

    An isostable coordinate based amelioration strategy to mitigate the effects of Jet lag

    Commercial air travel has become extremely commonplace in the last 20 to 30 years, especially as the world has moved towards new heights of globalization. Though air travel has greatly reduced transit times, allowing people to cover thousands of miles within hours, it comes with its fair share of issues. Jet lag can be regarded as being at the top of that list of problems; it typically results from rapid travel through multiple time zones, which causes a significant misalignment between a person's internal circadian clock and the external time. A person's circadian clock is governed by a population of coupled neurons entrained to a 24-hour light-dark cycle, so after rapid air travel the neuron population needs time to adjust to the new time zone. This misalignment can result in a variety of health problems including, but not limited to, lethargy, insomnia, and other disruptions of the sleep cycle. Various techniques have been proposed and are currently in use for treating jet lag, such as melatonin ingestion or making drastic changes to one's routine prior to air travel. However, these strategies typically involve long re-entrainment times or require following a strict schedule to correct the sleep cycle. The presented work explores an alternative strategy for treating jet lag: the notions of operational phase and isostable coordinates are used for model reduction, and optimal control is then applied to derive inputs which can be applied directly to the model. To demonstrate the framework's efficacy, the strategy is applied to a two-dimensional model; preliminary results show that the proposed approach greatly reduces the re-entrainment time required to acclimatize to the new time zone.
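
    As a rough illustration of the kind of comparison described (not the thesis's model, reduction, or optimal controller), the toy below tracks a single circadian phase entrained by light through an assumed sinusoidal response curve, ignores the isostable/amplitude coordinate entirely, and compares re-entrainment after an 8-hour eastward shift under ordinary daylight versus a schedule that only admits light when it pushes the clock toward local time.

```python
# Phase-only toy with made-up parameters; the response curve, light sensitivity,
# and greedy light-scheduling rule are illustrative assumptions.
import numpy as np

EPS   = 0.1                         # light sensitivity (rad/hour), illustrative
OMEGA = 2 * np.pi / 24.0            # clock and environment both run at 24 h here
SHIFT = 8.0 * OMEGA                 # 8 time zones of initial misalignment

def prc(theta):
    """Assumed phase response to light (chosen so the entrained state is phi = 0)."""
    return EPS * np.sin(theta - np.pi / 6.0)

def days_to_reentrain(scheduled, dt=0.05, tol=0.05, max_days=21):
    phi = -SHIFT                        # phi = internal phase minus local solar phase
    for day in range(1, max_days + 1):
        for step in range(int(24.0 / dt)):
            t = step * dt               # local time of day in hours
            daytime = 8.0 <= t < 20.0
            z = prc(phi + OMEGA * t)    # response evaluated at the current internal phase
            if scheduled:               # seek light only when it pushes phi toward zero
                light = 1.0 if (daytime and z * np.sign(-phi) > 0) else 0.0
            else:
                light = 1.0 if daytime else 0.0
            phi += z * light * dt
        if abs(phi) < tol:              # misalignment checked once per day, at midnight
            return day
    return max_days

print(f"ordinary daylight exposure: {days_to_reentrain(False)} days")
print(f"scheduled light exposure  : {days_to_reentrain(True)} days")
```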

    Synchronization and Redundancy: Implications for Robustness of Neural Learning and Decision Making

    Learning and decision making in the brain are key processes critical to survival, yet they are implemented by non-ideal biological building blocks that can impose significant error. We explore quantitatively how the brain might cope with this inherent source of error by taking advantage of two ubiquitous mechanisms, redundancy and synchronization. In particular, we consider a neural process whose goal is to learn a decision function by implementing a nonlinear gradient dynamics. The dynamics, however, are assumed to be corrupted by perturbations modeling the error that might be incurred due to limitations of the biology, intrinsic neuronal noise, and imperfect measurements. We show that the error, and the associated uncertainty surrounding a learned solution, can be controlled in large part by trading off synchronization strength among multiple redundant neural systems against the noise amplitude. The impact of the coupling between such redundant systems is quantified by the spectrum of the network Laplacian, and we discuss the role of network topology in synchronization and in reducing the effect of noise. A range of situations in which the mechanisms we model arise in brain science are discussed, and we draw attention to experimental evidence suggesting that cortical circuits capable of implementing the computations of interest here can be found on several scales. Finally, simulations comparing theoretical bounds to the relevant empirical quantities show that the theoretical estimates we derive can be tight. (Comment: Preprint, accepted for publication in Neural Computation.)
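
    The trade-off between coupling strength and noise amplitude can be seen in a small simulation (my own toy, not the paper's setup): several redundant units run the same noisy gradient dynamics on a shared quadratic objective and are coupled diffusively through the Laplacian of a ring graph; stronger coupling suppresses the disagreement that the noise keeps injecting between the copies.

```python
# Toy Euler-Maruyama simulation of redundant, Laplacian-coupled noisy gradient dynamics.
import numpy as np

rng = np.random.default_rng(0)

def ring_laplacian(n):
    """Laplacian of an undirected ring of n nodes."""
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

def run(coupling, n=10, sigma=0.5, dt=1e-3, steps=20000):
    """Simulate dx_i = -grad f(x_i) dt - coupling*(L x)_i dt + sigma dW_i
    with f(x) = x^2 / 2, so the noiseless solution of every unit is x = 0."""
    L = ring_laplacian(n)
    x = rng.normal(size=n)
    disagreement = []
    for _ in range(steps):
        drift = -x - coupling * (L @ x)                # gradient step plus diffusive coupling
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=n)
        disagreement.append(np.var(x))                 # spread of the redundant copies
    return np.mean(disagreement[steps // 2:])          # steady-state average spread

for k in (0.0, 1.0, 10.0):
    print(f"coupling = {k:>4}: mean disagreement across units = {run(k):.4f}")
```

    In this linear toy the steady-state disagreement is set by the nonzero eigenvalues of the coupled system's Laplacian, which is the spectral dependence the abstract points to.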