95 research outputs found

    Synchrony and bifurcations in coupled dynamical systems and effects of time delay

    Over the past few decades, dynamics on networks has become a rapidly growing branch of mathematics, with applications in disciplines such as physics, biology, and sociology. The functioning of many networks relies heavily on the ability to synchronize the network's nodes; more precisely, the existence and the transverse stability of the synchronous manifold are essential properties. Only in recent years have researchers begun to untangle the relation between the coupling structure of a network, given by a (di-)graph, and the stability properties of synchronous states. This is the central theme of this dissertation. I first present results towards a classification of the links in a directed, diffusive network according to their impact on the stability of synchronization. Then I investigate a complex bifurcation scenario observed in a directed ring of Stuart-Landau oscillators and show that this scenario persists under the addition of a single weak link. Subsequently, I investigate synchronous patterns in a directed ring of phase oscillators coupled with time delay. I discuss the coexistence of multiple synchronous solutions and investigate their stability and bifurcations. I apply these results by showing that a certain time-shift transformation can be used to employ the ring as a pattern-recognition device. Next, I investigate the same time-shift transformation for arbitrary coupling structures in a very general setting. I show that invariant manifolds of the flow, together with their stability properties, are preserved under the time-shift transformation. Furthermore, I determine the minimal number of delays needed to describe the system's dynamics equivalently. Finally, I investigate a peculiar phenomenon of non-continuous transition to synchrony observed in certain classes of large random networks, generalizing a recently introduced approach for the description of large random networks to the case of delayed couplings.
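
    To make the delayed-ring setting concrete, the following minimal Python sketch integrates a unidirectional ring of Kuramoto-type phase oscillators with a common transmission delay, using a fixed-step Euler scheme and a circular history buffer. All parameters (ring size, frequency, coupling strength, delay) are illustrative choices for this sketch and are not taken from the dissertation.

    import numpy as np

    # Illustrative parameters; none of these values are taken from the thesis.
    N = 10            # oscillators in the directed ring
    omega = 1.0       # common natural frequency
    kappa = 0.5       # coupling strength
    tau = 2.0         # transmission delay
    dt = 0.001        # Euler step size
    T = 50.0          # total integration time

    size = int(round(tau / dt)) + 1   # history buffer length (delay steps + 1)
    n_steps = int(round(T / dt))

    # Constant initial history with random phases
    rng = np.random.default_rng(0)
    buffer = np.tile(rng.uniform(0.0, 2.0 * np.pi, N), (size, 1))
    head = 0                          # row holding the current state theta(t)

    for _ in range(n_steps):
        theta_now = buffer[head]
        theta_delayed = buffer[(head + 1) % size]   # state at time t - tau
        # directed ring: oscillator k listens to its delayed neighbour k + 1
        coupling = kappa * np.sin(np.roll(theta_delayed, -1) - theta_now)
        theta_next = theta_now + dt * (omega + coupling)
        head = (head + 1) % size                    # overwrite the oldest row
        buffer[head] = theta_next

    # Kuramoto order parameter: r close to 1 indicates phase synchrony
    r = np.abs(np.mean(np.exp(1j * buffer[head])))
    print(f"order parameter after T = {T}: r = {r:.3f}")

    Varying the delay and the coupling strength changes which of the coexisting synchronous solutions, if any, the ring settles into, which is the kind of coexistence and stability question the dissertation addresses.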

    Towards a continuous dynamic model of the Hopfield theory on neuronal interaction and memory storage

    The purpose of this work is to study the Hopfield model of neuronal interaction and memory storage, in particular the convergence of the dynamics to the stored patterns. Since the hypothesis of symmetric synapses does not hold for the brain, we study how the model can be extended to the case of asymmetric synapses using a probabilistic approach. We then turn to another feature of the memory process and of the brain: oscillations. Using the Kuramoto model we are able to describe them completely and to obtain synchronization between neurons. Our aim is therefore to understand how and why neurons can be seen as oscillators, and to establish a strong link between this model and the Hopfield approach.
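
    As a concrete reference point for the convergence property studied here, the sketch below implements the classical Hopfield setup in Python: a few random binary patterns are stored with the standard Hebbian rule, and asynchronous threshold updates recover one of them from a corrupted cue. The network size, number of patterns, and noise level are arbitrary demonstration values, not quantities from this work.

    import numpy as np

    rng = np.random.default_rng(1)
    n_units = 200      # network size (illustrative)
    n_patterns = 5     # stored patterns, well below the ~0.14 * n_units capacity

    # Random +/-1 patterns and the Hebbian (symmetric) weight matrix
    patterns = rng.choice([-1, 1], size=(n_patterns, n_units))
    W = (patterns.T @ patterns) / n_units
    np.fill_diagonal(W, 0.0)           # no self-coupling

    def recall(cue, sweeps=20):
        """Asynchronous dynamics: update units one at a time in random order."""
        state = cue.copy()
        for _ in range(sweeps):
            for i in rng.permutation(n_units):
                state[i] = 1 if W[i] @ state >= 0 else -1
        return state

    # Corrupt a stored pattern by flipping 20% of its units, then recall it
    target = patterns[0]
    flip = rng.random(n_units) < 0.2
    cue = np.where(flip, -target, target)
    recovered = recall(cue)
    print("overlap with stored pattern:", recovered @ target / n_units)

    With few patterns relative to the network size, the corrupted cue is pulled back to the stored pattern; the work above asks how much of this behaviour survives once the symmetry assumption on the synapses is dropped.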

    Dynamical Systems in Spiking Neuromorphic Hardware

    Dynamical systems are universal computers. They can perceive stimuli, remember, learn from feedback, plan sequences of actions, and coordinate complex behavioural responses. The Neural Engineering Framework (NEF) provides a general recipe to formulate models of such systems as coupled sets of nonlinear differential equations and compile them onto recurrently connected spiking neural networks, akin to a programming language for spiking models of computation. The Nengo software ecosystem supports the NEF and compiles such models onto neuromorphic hardware. In this thesis, we analyze the theory driving the success of the NEF and expose several core principles underpinning its correctness, scalability, completeness, robustness, and extensibility. We also derive novel theoretical extensions to the framework that enable it to far more effectively leverage a wide variety of dynamics in digital hardware, and to exploit the device-level physics in analog hardware. At the same time, we propose a novel set of spiking algorithms that recruit an optimal nonlinear encoding of time, which we call the Delay Network (DN). Backpropagation across stacked layers of DNs dramatically outperforms stacked Long Short-Term Memory (LSTM) networks, a state-of-the-art deep recurrent architecture, in accuracy and training time on a continuous-time memory task and a chaotic time-series prediction benchmark. The basic component of this network is shown to function on state-of-the-art spiking neuromorphic hardware, including Braindrop and Loihi. This implementation approaches the energy efficiency of the human brain in the former case and the precision of conventional computation in the latter case.
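
    As a rough, neuron-free illustration of the NEF recipe for dynamics, the Python sketch below uses the standard mapping for linear systems: with a first-order synaptic filter of time constant tau, the target dynamics dx/dt = A x + B u are realized by driving the filter with the recurrent transform tau*A + I and the input transform tau*B. The matrices, time constant, and input chosen here are illustrative; the full framework replaces the filter state with the decoded activity of a population of spiking neurons, which this sketch omits.

    import numpy as np

    # Desired linear dynamics dx/dt = A x + B u: a lightly damped 2-D oscillator
    A = np.array([[0.0, 1.0],
                  [-1.0, -0.1]])
    B = np.array([[0.0],
                  [1.0]])
    tau = 0.1            # synaptic time constant (illustrative)
    dt = 0.001
    steps = 10000
    u = np.array([1.0])  # constant input for the demonstration

    # Mapping: recurrent transform tau*A + I, input transform tau*B
    A_rec = tau * A + np.eye(2)
    B_in = tau * B

    x_direct = np.zeros(2)   # reference: Euler integration of dx/dt = A x + B u
    x_filter = np.zeros(2)   # state of the first-order synaptic filter

    for _ in range(steps):
        x_direct = x_direct + dt * (A @ x_direct + B @ u)
        # first-order filter: tau * dx/dt = -x + w, with w = A_rec x + B_in u
        w = A_rec @ x_filter + B_in @ u
        x_filter = x_filter + (dt / tau) * (w - x_filter)

    print("direct integration  :", x_direct)
    print("filter with mapping :", x_filter)   # agree up to floating-point error

    The algebra behind the mapping is one line: the filter obeys tau * dx/dt = -x + w, so feeding it w = (tau*A + I) x + tau*B u gives tau * dx/dt = tau * (A x + B u).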

    29th Annual Computational Neuroscience Meeting: CNS*2020

    Meeting abstracts. This publication was funded by OCNS. The Supplement Editors declare that they have no competing interests. Virtual | 18-22 July 2020