88 research outputs found

    Dynamical Systems with Applications using Python

    This textbook provides a broad introduction to continuous and discrete dynamical systems. With its hands-on approach, the text leads the reader from basic theory to recently published research material in nonlinear ordinary differential equations, nonlinear optics, multifractals, neural networks, and binary oscillator computing. Dynamical Systems with Applications Using Python takes advantage of Python’s extensive visualization, simulation, and algorithmic tools to study these topics through numerical algorithms and generated diagrams. After a tutorial introduction to Python, the first part of the book deals with continuous systems using differential equations, including both ordinary and delay differential equations. The second part deals with discrete dynamical systems and progresses to the study of both continuous and discrete systems in contexts such as chaos control and synchronization, neural networks, and binary oscillator computing. These later sections are useful reference material for undergraduate student projects. The book is rounded off with example coursework to challenge students’ programming abilities and Python-based exam questions. The book will appeal to advanced undergraduate and graduate students, applied mathematicians, engineers, and researchers in a range of disciplines such as biology, chemistry, computing, economics, and physics. Since it provides a survey of dynamical systems, familiarity with linear algebra, real and complex analysis, calculus, and ordinary differential equations is necessary, and knowledge of a programming language such as C or Java is beneficial but not essential.
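    To give a flavour of the hands-on approach the description mentions (a minimal sketch of our own, not an excerpt from the book), a discrete dynamical system such as the logistic map can be explored in a few lines of Python:

```python
# Illustrative sketch (not from the book): iterating the logistic map,
# a classic discrete dynamical system x_{k+1} = r * x_k * (1 - x_k).

def logistic_orbit(r, x0, n):
    """Return the orbit of the logistic map from x0 for n iterations."""
    orbit = [x0]
    for _ in range(n):
        orbit.append(r * orbit[-1] * (1 - orbit[-1]))
    return orbit

# For r = 2.5 the orbit settles onto the stable fixed point 1 - 1/r = 0.6;
# increasing r past 3 triggers the period-doubling route to chaos that
# textbooks of this kind typically visualise with bifurcation diagrams.
print(logistic_orbit(2.5, 0.2, 50)[-1])
```

    Plotting the tail of such orbits against r is the standard way to generate the bifurcation diagrams the book's description alludes to.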

    Interpreting multi-stable behaviour in input-driven recurrent neural networks

    Recurrent neural networks (RNNs) are computational models inspired by the brain. Although RNNs stand out as state-of-the-art machine learning models for challenging tasks such as speech recognition, handwriting recognition, and language translation, they are plagued by the so-called vanishing/exploding gradient issue, which prevents us from training RNNs to learn long-term dependencies in sequential data. Moreover, these models suffer from a problem of interpretability, known as the "black-box issue" of RNNs. We attempt to open the black box by developing a mechanistic interpretation of errors occurring during computation. We do this from a dynamical systems theory perspective, specifically building on the notion of Excitable Network Attractors. Our methodology is effective at least for those tasks where a number of attractors, and a switching pattern between them, must be learned. RNNs can be seen as massively large nonlinear dynamical systems driven by external inputs. When it comes to analytically investigating RNNs, the literature often neglects the input-driven property, or drops it in favour of tight constraints on the input driving the dynamics which do not match the reality of RNN applications. Trying to bridge this gap, we framed RNN dynamics driven by generic input sequences in the context of nonautonomous dynamical systems theory. This led us to enquire deeply into a fundamental principle established for RNNs known as the echo state property (ESP). In particular, we argue that input-driven RNNs can be reliable computational models even without satisfying the classical ESP formulation. We prove a sort of input-driven fixed point theorem and exploit it to (i) demonstrate the existence and uniqueness of a globally attracting solution for strongly (in amplitude) input-driven RNNs, (ii) deduce the existence of multiple responses to certain input signals which can be reliably exploited for computational purposes, and (iii) study the stability of attracting solutions with respect to input sequences. Finally, we highlight the active role of the input in determining qualitative changes in the RNN dynamics, e.g. in the number of stable responses, in contrast to the commonly studied qualitative changes due to variations of model parameters.
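    The contraction intuition behind such echo-state-style results can be illustrated with a one-neuron toy model (our own sketch, not the construction used in this work): when the recurrent gain makes the state update a uniform contraction, trajectories started from different initial states converge onto a single input-driven solution, so the initial condition is forgotten.

```python
import math
import random

# Toy "RNN" with one neuron: x_{t+1} = tanh(a * x_t + u_t).
# For |a| < 1 the update is a contraction in x uniformly over inputs
# (|d/dx tanh(a*x + u)| <= a), so any two trajectories driven by the same
# input sequence converge -- the intuition behind the echo state property.

def drive(x0, inputs, a=0.8):
    x = x0
    for u in inputs:
        x = math.tanh(a * x + u)
    return x

random.seed(0)
u_seq = [random.uniform(-1.0, 1.0) for _ in range(200)]

# Two very different initial states end up at (numerically) the same state.
print(abs(drive(1.0, u_seq) - drive(-1.0, u_seq)))  # ~0: initial state forgotten
```

    The thesis's point is subtler than this sketch: it argues that reliable computation is possible even without such a global contraction, where several attracting responses to an input can coexist.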

    Dynamical principles in neuroscience

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics, including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail, and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience? This work was supported by NSF Grant No. NSF/EIA-0130708 and Grant No. PHY 0414174; NIH Grant No. 1 R01 NS50945 and Grant No. NS40110; MEC BFI2003-07276; and Fundación BBVA.

    Research on Distributed, Environment-Tolerant Nanoelectronic Devices


    18th IEEE Workshop on Nonlinear Dynamics of Electronic Systems: Proceedings

    Proceedings of the 18th IEEE Workshop on Nonlinear Dynamics of Electronic Systems, which took place in Dresden, Germany, 26–28 May 2010. Contents:
    - Welcome Address (p. I)
    - Table of Contents (p. III)
    - Symposium Committees (p. IV)
    - Special Thanks (p. V)
    - Conference program, incl. page numbers of papers (p. VI)
    - Conference papers: Invited talks (p. 1); Regular papers (p. 14); Wednesday, May 26th, 2010 (p. 15); Thursday, May 27th, 2010 (p. 110); Friday, May 28th, 2010 (p. 210)
    - Author index (p. XII)

    On the interaction of gamma-rhythmic neuronal populations

    Local gamma-band (~30–100 Hz) oscillations in the brain, produced by feedback inhibition on a characteristic timescale, appear in multiple areas of the brain and are associated with a wide range of cognitive functions. Some regions producing gamma also receive gamma-rhythmic input, and the interaction and coordination of these rhythms have been hypothesized to serve various functional roles. This thesis consists of three stand-alone chapters, each of which considers the response of a gamma-rhythmic neuronal circuit to input in an analytical framework. In the first, we demonstrate that several related models of a gamma-generating circuit under periodic forcing are asymptotically drawn onto an attracting invariant torus, due to the convergence of inhibition trajectories at spikes and the convergence of voltage trajectories during sustained inhibition, and therefore display a restricted range of dynamics. In the second, we show that a model of a gamma-generating circuit under forcing by square pulses cannot maintain multiple stably phase-locked solutions. In the third, we show that a separation of time scales between membrane potential dynamics and synaptic decay causes the gamma model to phase-align its spiking so that periodic forcing pulses arrive under minimal inhibition. When two of these models are mutually coupled, the same effect causes excitatory pulses from the faster oscillator to arrive at the slower one under minimal inhibition, while pulses from the slower to the faster arrive under maximal inhibition. We also show that such a time scale separation allows the model to respond sensitively to input pulse coherence to an extent that is not possible for a simple one-dimensional oscillator. We draw on a wide range of mathematical tools and structures, including return maps, saltation matrices, contraction methods, phase response formalism, and singular perturbation theory, in order to show that the neuronal mechanism of gamma oscillations is uniquely suited to reliably phase-lock across brain regions and to facilitate the selective transmission of information.
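    The return-map viewpoint mentioned in the abstract can be caricatured with the simplest possible example (our own toy sketch, not one of the thesis's circuit models): a sine circle map for the phase of a periodically forced oscillator. For the parameters below the map has exactly one stable phase-locked point, so forcing phases started almost anywhere settle onto the same locked phase, loosely analogous to the uniqueness-of-phase-locking result described above.

```python
import math

# Sine circle map: theta -> theta + omega - (k / 2*pi) * sin(2*pi*theta) mod 1.
# With omega = 0 and 0 < k < 1 it has one stable fixed point (theta = 0)
# and one unstable fixed point (theta = 0.5).

def locked_phase(theta0, omega=0.0, k=0.5, steps=300):
    theta = theta0
    for _ in range(steps):
        theta = (theta + omega
                 - (k / (2 * math.pi)) * math.sin(2 * math.pi * theta)) % 1.0
    return theta

def circle_dist(a, b):
    """Distance between two phases on the unit circle."""
    d = abs(a - b) % 1.0
    return min(d, 1.0 - d)

# Two very different starting phases converge to the same locked phase.
print(circle_dist(locked_phase(0.2), locked_phase(0.8)))  # ~0
```

    The thesis's actual arguments use saltation matrices and contraction methods on the full spiking models; this map only conveys the flavour of a unique stable phase-locked state.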

    The Dynamics of Coupled Oscillators

    The subject is introduced by considering the treatment of oscillators in mathematics, from the simple Poincaré oscillator, a single-variable dynamical process defined on a circle, to the oscillatory dynamics of systems of differential equations. Some models of real oscillator systems are considered. Noise processes are included in the dynamics of the system. Coupling between oscillators is investigated both in terms of analytical systems and as coupled oscillator models. It is seen that a driven oscillator can be used as a model of two coupled oscillators in two and three dimensions, because the dynamics depend on the phase difference of the oscillators. This means that the dynamics can readily be modelled by a 1D or 2D map. The analysis of systems of N coupled oscillators is also described. The human cardiovascular system is studied as an example of a coupled oscillator system. The heart oscillator system is described by a system of delay differential equations and its dynamics characterised. The mechanics of the coupling with respiration is described. In particular, the model of the heart oscillator includes the baroreceptor reflex with time delay, whereby the aortic fluid pressure influences the heart rate and the peripheral resistance. Respiration is modelled as forcing the heart oscillator system. Locking zones caused by respiratory sinus arrhythmia (RSA), the synchronisation of the heart with respiration, are found by plotting the rotation number against respiration frequency. These are seen to be relatively narrow for typical physiological parameters and occur only for low ratios of heart rate to respiration frequency. Plots of the diastolic pressure and heart interval as functions of respiration phase, parameterised by respiration frequency, illustrate the dynamics of synchronisation in the human cardiovascular system.
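    The rotation-number diagnostic described above can be sketched with a generic forced phase oscillator (a sine circle map; our own illustration, not the thesis's cardiovascular model): inside a locking zone the rotation number is pinned at a rational value over an interval of forcing frequencies, which is what produces the flat steps one sees when plotting it against respiration frequency.

```python
import math

# Estimate the rotation number (average phase advance per forcing period)
# of the circle map theta -> theta + omega - (k / 2*pi) * sin(2*pi*theta).
# Inside a locking zone the estimate is pinned at a rational value for a
# whole interval of omega; outside, it varies with omega.

def rotation_number(omega, k=0.9, steps=2000):
    theta = 0.0
    for _ in range(steps):
        theta = theta + omega - (k / (2 * math.pi)) * math.sin(2 * math.pi * theta)
    return theta / steps

# Both frequencies below lie in the 1:1 locking zone for k = 0.9, so the
# rotation number is (numerically) pinned at 1 for both.
print(rotation_number(1.0))
print(rotation_number(1.02))
```

    Sweeping omega and plotting the result reproduces the characteristic devil's-staircase structure of locking zones; the thesis performs the analogous sweep over respiration frequency in the delay-differential heart model.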