
    Synchronization of coupled neural oscillators with heterogeneous delays

    We investigate the effects of heterogeneous delays in the coupling of two excitable neural systems. Depending on the coupling strengths and the time delays in the mutual and self-coupling, the compound system exhibits different types of synchronized oscillations of variable period. We analyze this synchronization based on the interplay of the different time delays and support the numerical results with analytical findings. In addition, we elaborate on bursting-like dynamics with two competing timescales on the basis of the autocorrelation function.
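
    Below is a minimal sketch of the kind of model the abstract describes: two excitable FitzHugh-Nagumo units with delayed self- and mutual coupling, integrated by forward Euler with a history buffer. All parameter values and names (eps, a, c_s, c_m, tau_s, tau_c) are illustrative assumptions, not the paper's settings.

        import numpy as np

        eps, a = 0.1, 1.05            # timescale separation; a > 1 puts units in the excitable regime
        c_s, c_m = 0.2, 0.3           # self- and mutual-coupling strengths
        tau_s, tau_c = 3.0, 5.0       # self- and mutual-coupling delays
        dt, T = 0.01, 200.0
        n = int(T / dt)
        d_s, d_c = int(tau_s / dt), int(tau_c / dt)

        u = np.zeros((n, 2))          # activator variables of the two units
        v = np.zeros((n, 2))          # inhibitor variables
        u[0] = [1.0, -1.0]            # asymmetric initial condition to trigger excursions

        for k in range(1, n):
            for i in range(2):
                j = 1 - i
                u_self = u[max(k - 1 - d_s, 0), i]    # delayed self-feedback
                u_other = u[max(k - 1 - d_c, 0), j]   # delayed input from the partner
                coup = c_s * (u_self - u[k - 1, i]) + c_m * (u_other - u[k - 1, i])
                u[k, i] = u[k - 1, i] + dt * (u[k - 1, i] - u[k - 1, i]**3 / 3
                                              - v[k - 1, i] + coup) / eps
                v[k, i] = v[k - 1, i] + dt * (u[k - 1, i] + a)

    Varying c_s, c_m, tau_s, and tau_c in such a sketch is how one would explore the different synchronized-oscillation regimes the abstract refers to.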

    Heterogeneous Delays in Neural Networks

    We investigate heterogeneous coupling delays in complex networks of excitable elements described by the FitzHugh-Nagumo model. The effects of discrete as well as of uni- and bimodal continuous distributions are studied with a focus on different topologies, i.e., regular, small-world, and random networks. In the case of two discrete delay times, resonance effects play a major role: depending on the ratio of the delay times, various characteristic spiking scenarios, such as coherent or asynchronous spiking, arise. For continuous delay distributions, different dynamical patterns emerge depending on the width of the distribution. For small distribution widths, we find highly synchronized spiking, while for intermediate widths only spiking with a low degree of synchrony persists, which is associated with traveling disruptions, partial amplitude death, or subnetwork synchronization, depending sensitively on the network topology. If the inhomogeneity of the coupling delays becomes too large, global amplitude death is induced.
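
    For illustration, here is a hedged sketch of such a setup: FitzHugh-Nagumo units on a Watts-Strogatz small-world graph, with each link's delay drawn from a two-valued (bimodal) distribution. The topology, parameter values, and delay distribution are illustrative choices, not the paper's configuration.

        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(0)
        N, C = 50, 0.3                       # network size, coupling strength
        eps, a = 0.1, 1.05                   # FitzHugh-Nagumo parameters (excitable regime)
        dt, T = 0.01, 100.0
        steps = int(T / dt)

        G = nx.watts_strogatz_graph(N, k=4, p=0.1, seed=0)
        A = nx.to_numpy_array(G)
        deg = np.maximum(A.sum(axis=1), 1)

        # Bimodal delay distribution: each link carries tau1 or tau2 with equal probability.
        tau1, tau2 = 1.0, 3.0
        d_idx = (np.where(rng.random((N, N)) < 0.5, tau1, tau2) / dt).astype(int)

        u = np.zeros((steps, N))             # activator variables
        v = np.zeros((steps, N))             # inhibitor variables
        u[0] = rng.uniform(-0.1, 0.1, N)

        for t in range(1, steps):
            coup = np.zeros(N)
            for i in range(N):
                for j in np.nonzero(A[i])[0]:
                    t_del = max(t - 1 - d_idx[i, j], 0)     # delayed neighbor state
                    coup[i] += u[t_del, j] - u[t - 1, i]    # diffusive delayed coupling
            u[t] = u[t - 1] + dt * (u[t - 1] - u[t - 1]**3 / 3 - v[t - 1]
                                    + C * coup / deg) / eps
            v[t] = v[t - 1] + dt * (u[t - 1] + a)

    Widening the gap between tau1 and tau2, or replacing the two-valued draw with a continuous distribution, is the knob that moves such a network between the synchrony, partial-amplitude-death, and global-amplitude-death regimes described above.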

    Event-Driven Contrastive Divergence for Spiking Neuromorphic Systems

    Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissipation, and real-time interfacing with the environment. However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetic, which do not directly map onto a dynamical neural substrate. Here, we present an event-driven variation of CD to train an RBM constructed with Integrate-and-Fire (I&F) neurons that is constrained by the limitations of existing and near-future neuromorphic hardware platforms. Our strategy is based on neural sampling, which allows us to synthesize a spiking neural network that samples from a target Boltzmann distribution. The recurrent activity of the network replaces the discrete steps of the CD algorithm, while Spike-Timing-Dependent Plasticity (STDP) carries out the weight updates in an online, asynchronous fashion. We demonstrate our approach by training an RBM composed of leaky I&F neurons with STDP synapses to learn a generative model of the MNIST hand-written digit dataset, and by testing it on recognition, generation, and cue-integration tasks. Our results contribute to a machine-learning-driven approach for synthesizing networks of spiking neurons capable of carrying out practical, high-level functionality.
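
    The core mechanism can be caricatured in a few lines: a pairwise STDP-like rule whose sign is gated by a global modulation signal, +1 while the data is clamped (the positive CD phase) and -1 while the network runs freely (the negative phase). This is a hedged sketch of that idea only; the function name, constants, and the simple coincidence-window rule are assumptions, not the authors' implementation.

        import numpy as np

        rng = np.random.default_rng(1)
        n_v, n_h = 784, 100                       # visible (MNIST pixels) and hidden units
        W = 0.01 * rng.standard_normal((n_v, n_h))
        lr, window = 1e-3, 10.0                   # learning rate, coincidence window (ms)

        def gated_stdp(W, vis_spikes, hid_spikes, g):
            """Accumulate pairwise STDP updates gated by the global signal g.

            vis_spikes, hid_spikes: lists of (time, unit_index) events.
            g = +1 during the clamped (data) phase, -1 during free running.
            """
            dW = np.zeros_like(W)
            for t_v, i in vis_spikes:
                for t_h, j in hid_spikes:
                    if abs(t_v - t_h) < window:   # near-coincident pre/post pair
                        dW[i, j] += g
            return W + lr * dW

        # One training step: a clamped episode followed by a free-running one.
        W = gated_stdp(W, [(2.0, 5)], [(4.0, 7)], g=+1)   # data phase strengthens
        W = gated_stdp(W, [(3.0, 5)], [(6.0, 7)], g=-1)   # model phase weakens

    In a full training loop one would alternate clamped and free-running episodes per training example, so that the two gated updates together approximate the positive and negative statistics of CD.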

    Six networks on a universal neuromorphic computing substrate

    In this study, we present a highly configurable neuromorphic computing substrate and use it to emulate several types of neural networks. At the heart of this system lies a mixed-signal chip with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks that cover a broad spectrum of both structure and functionality.
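
    As a rough illustration of how such a device is driven, the sketch below describes a small excitatory-inhibitory network in PyNN, the simulator-agnostic interface that systems of this kind typically expose to neuroscientists; a NEST software backend stands in for the hardware here, and all population sizes and parameters are illustrative assumptions.

        import pyNN.nest as sim   # a software backend standing in for the hardware

        sim.setup(timestep=0.1)   # ms

        exc = sim.Population(80, sim.IF_cond_exp(tau_m=20.0), label="exc")
        inh = sim.Population(20, sim.IF_cond_exp(tau_m=20.0), label="inh")
        noise = sim.Population(80, sim.SpikeSourcePoisson(rate=10.0))

        sim.Projection(noise, exc, sim.OneToOneConnector(),
                       sim.StaticSynapse(weight=0.01, delay=1.0))
        sim.Projection(exc, inh, sim.FixedProbabilityConnector(0.1),
                       sim.StaticSynapse(weight=0.005, delay=1.0))
        sim.Projection(inh, exc, sim.FixedProbabilityConnector(0.1),
                       sim.StaticSynapse(weight=0.02, delay=1.0),
                       receptor_type="inhibitory")

        exc.record("spikes")
        sim.run(1000.0)           # one biological second of network activity
        spikes = exc.get_data().segments[0].spiketrains
        sim.end()

    Because the same network description runs on either a software simulator or the accelerated substrate, a hardware emulation of this kind can complete the run far faster than real time.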