
    Principles of Neuromorphic Photonics

    In an age overrun with information, the ability to process reams of data has become crucial. The demand for data will continue to grow as smart gadgets multiply and become increasingly integrated into our daily lives. Next-generation industries in artificial intelligence services and high-performance computing are so far supported by microelectronic platforms. These data-intensive enterprises rely on continual improvements in hardware. Their prospects are running up against a stark reality: conventional one-size-fits-all solutions offered by digital electronics can no longer satisfy this need, as Moore's law (exponential hardware scaling), interconnection density, and the von Neumann architecture reach their limits. With its superior speed and reconfigurability, analog photonics can provide some relief to these problems; however, complex applications of analog photonics have remained largely unexplored due to the absence of a robust photonic integration industry. Recently, the landscape for commercially manufacturable photonic chips has been changing rapidly and now promises to achieve economies of scale previously enjoyed solely by microelectronics. The scientific community has set out to build bridges between the domains of photonic device physics and neural networks, giving rise to the field of neuromorphic photonics. This article reviews recent progress in integrated neuromorphic photonics. We provide an overview of neuromorphic computing, discuss the associated technology (microelectronic and photonic) platforms, and compare their performance metrics. We discuss photonic neural network approaches and challenges for integrated neuromorphic photonic processors while providing an in-depth description of photonic neurons and a candidate interconnection architecture. We conclude with a future outlook of neuro-inspired photonic processing.
    Comment: 28 pages, 19 figures

    Synchronization in model networks of class I neurons

    We study a modification of the Hoppensteadt-Izhikevich canonical model for networks of class I neurons, in which the 'pulse' emitted by a neuron is smooth rather than a delta function. We prove two types of results about synchronization and desynchronization of such networks: the first pertains to symmetric 'pulse' functions, and the second to the regime in which each neuron is connected to many other neurons.
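    A minimal sketch of the kind of model described above, under assumptions not taken from the paper: a globally coupled network of canonical class I ('theta') neurons in which the delta-function spike is replaced by a smooth pulse P(θ). The pulse exponent n_pulse, coupling strength g, and excitability eta are illustrative choices.

```python
# Sketch of a smooth-pulse-coupled network of canonical class I (theta) neurons.
# The pulse P(theta) = a_n (1 - cos theta)^n is a smooth bump peaked at the
# spike phase theta = pi, normalized to integrate to 2*pi over one cycle.
# All parameter values are illustrative assumptions.
from math import factorial
import numpy as np

N = 100                    # number of neurons
g = 0.5 / N                # all-to-all coupling strength (assumed)
eta = 0.1                  # common excitability / drive (assumed)
n_pulse = 5                # sharpness of the smooth pulse (assumed)
dt, T = 1e-3, 50.0

def pulse(theta, n=n_pulse):
    """Smooth symmetric pulse peaked at theta = pi."""
    a_n = 2.0**n * factorial(n)**2 / factorial(2 * n)
    return a_n * (1.0 - np.cos(theta))**n

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, N)          # random initial phases

for _ in range(int(T / dt)):
    drive = eta + g * pulse(theta).sum()      # shared synaptic input
    dtheta = (1 - np.cos(theta)) + (1 + np.cos(theta)) * drive
    theta = (theta + dt * dtheta) % (2 * np.pi)

# Kuramoto-style order parameter as a crude synchrony measure
R = np.abs(np.exp(1j * theta).mean())
print(f"order parameter R = {R:.3f}")
```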

    Weakly pulse-coupled oscillators, FM interactions, synchronization, and oscillatory associative memory


    Pulse shape and voltage-dependent synchronization in spiking neuron networks

    Pulse-coupled spiking neural networks are a powerful tool to gain mechanistic insights into how neurons self-organize to produce coherent collective behavior. These networks use simple spiking neuron models, such as the θ-neuron or the quadratic integrate-and-fire (QIF) neuron, that replicate the essential features of real neural dynamics. Interactions between neurons are modeled with infinitely narrow pulses, or spikes, rather than the more complex dynamics of real synapses. To make these networks biologically more plausible, it has been proposed that they must also account for the finite width of the pulses, which can have a significant impact on the network dynamics. However, the derivation and interpretation of these pulses are contradictory, and the impact of the pulse shape on the network dynamics is largely unexplored. Here, I take a comprehensive approach to pulse-coupling in networks of QIF and θ-neurons. I argue that narrow pulses activate voltage-dependent synaptic conductances and show how to implement them in QIF neurons such that their effect can last through the phase after the spike. Using an exact low-dimensional description for networks of globally coupled spiking neurons, I prove for instantaneous interactions that collective oscillations emerge due to an effective coupling through the mean voltage. I analyze the impact of the pulse shape by means of a family of smooth pulse functions with arbitrary finite width and symmetric or asymmetric shapes. For symmetric pulses, the resulting voltage-coupling is not very effective at synchronizing neurons, but pulses that are slightly skewed toward the phase after the spike readily generate collective oscillations. The results unveil a voltage-dependent spike synchronization mechanism in neural networks, which is facilitated by pulses of finite width and complementary to traditional synaptic transmission.
    Comment: 38 pages, 11 figures
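    The following sketch illustrates the basic setup, not the author's exact formulation: globally coupled QIF neurons in which each spike contributes a finite-width rectangular pulse to the mean synaptic activity instead of an infinitely narrow spike. All parameters (J, tau_s, eta_bar, Delta, V_peak) are assumed values for illustration.

```python
# Sketch: globally coupled QIF neurons with finite-width (rectangular) pulses.
# Each spike turns on a pulse of width tau_s and height 1/tau_s (unit area),
# and all neurons feel the population-averaged pulse activity s(t).
# Parameters are illustrative assumptions, not taken from the paper.
import numpy as np

N = 1000
dt, T = 1e-4, 5.0
J, tau_s = 15.0, 0.01            # coupling strength, pulse width (assumed)
eta_bar, Delta = 1.0, 0.3        # center/width of Lorentzian heterogeneity
V_peak = 100.0                   # spike cutoff; reset to -V_peak

rng = np.random.default_rng(1)
eta = eta_bar + Delta * np.tan(np.pi * (rng.random(N) - 0.5))   # Lorentzian draws
V = rng.uniform(-2.0, 2.0, N)
pulse_timer = np.zeros(N)        # remaining duration of each neuron's pulse

for _ in range(int(T / dt)):
    s = (pulse_timer > 0).mean() / tau_s      # mean finite-width pulse activity
    V += dt * (V**2 + eta + J * s)            # QIF membrane dynamics
    spiking = V >= V_peak
    V[spiking] = -V_peak                      # reset after the spike
    pulse_timer[spiking] = tau_s              # start a pulse of width tau_s
    pulse_timer = np.maximum(pulse_timer - dt, 0.0)

print(f"late-time mean pulse activity s = {(pulse_timer > 0).mean() / tau_s:.2f}")
```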

    Pattern Formation in Coupled Networks with Inhibition and Gap Junctions

    In this dissertation we analyze networks of coupled phase oscillators. We consider systems where long-range chemical coupling and short-range electrical coupling have opposite effects on the synchronization process. We look at the existence and stability of three patterns of activity: synchrony, the clustered state, and asynchrony. In Chapter 1, we develop a minimal phase model using experimental results for the olfactory system of Limax. We study the synchronous solution as the strength of synaptic coupling increases. We explain the emergence of traveling waves in the system without a frequency gradient. We construct the normal form for the pitchfork bifurcation and compare our analytical results with numerical simulations. In Chapter 2, we study a mean-field coupled network of phase oscillators for which a stable two-cluster solution exists. The addition of nearest-neighbor gap junction coupling destroys the stability of the cluster solution. When the gap junction coupling is strong there is a series of traveling wave solutions depending on the size of the network. We see bistability in the system between the clustered state, periodic solutions, and traveling waves. The bistability properties also change with the network size. We analyze the system numerically and analytically. In Chapter 3, we turn our attention to a very popular model of network synchronization. We present the Kuramoto model in its original form and calculate the main results using a different technique. We also look at a modified version and study how this affects synchronization. We consider a collection of oscillators organized into m groups. The addition of gap junctions creates wave-like behavior.
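    As a rough illustration of the competition described above (assumed form, not the dissertation's exact equations), the sketch below couples a ring of identical phase oscillators through a desynchronizing global ('chemical') term and a synchronizing nearest-neighbor ('gap junction') term; the coupling strengths K_chem and K_gap are arbitrary choices.

```python
# Sketch: ring of identical phase oscillators with competing couplings.
# K_chem < 0 acts through the global Kuramoto mean field (desynchronizing here),
# K_gap > 0 acts between nearest neighbors (synchronizing), mimicking the
# chemical-vs-gap-junction competition.  Values are arbitrary illustrative choices.
import numpy as np

N = 50
K_chem, K_gap = -0.5, 1.0
dt, T = 1e-2, 200.0

rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi, N)
omega = np.ones(N)                         # identical natural frequencies

for _ in range(int(T / dt)):
    Z = np.exp(1j * theta).mean()          # Kuramoto order parameter R*exp(i*psi)
    global_term = K_chem * np.abs(Z) * np.sin(np.angle(Z) - theta)
    local_term = K_gap * (np.sin(np.roll(theta, 1) - theta)
                          + np.sin(np.roll(theta, -1) - theta))
    theta = (theta + dt * (omega + global_term + local_term)) % (2 * np.pi)

R = np.abs(np.exp(1j * theta).mean())
print(f"final order parameter R = {R:.3f}")   # near 1: sync; small: waves/clusters
```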

    Integrate and Fire Neural Networks, Piecewise Contractive Maps and Limit Cycles

    We study the global dynamics of integrate-and-fire neural networks composed of an arbitrary number of identical neurons interacting by inhibition and excitation. We prove that if the interactions are strong enough, then the support of the stable asymptotic dynamics consists of limit cycles. We also find sufficient conditions for the synchronization of networks containing excitatory neurons. The proofs are based on the analysis of the equivalent dynamics of a piecewise continuous Poincaré map associated to the system. We show that for strong interactions the Poincaré map is piecewise contractive. Using this contraction property, we prove that there exists a countable number of limit cycles attracting all the orbits dropping into the stable subset of the phase space. This result applies not only to the Poincaré map under study, but also to a wide class of general n-dimensional piecewise contractive maps.
    Comment: 46 pages. In this version we added many comments suggested by the referees throughout the paper, and we changed the introduction and the section containing the conclusions. The final version will appear in the Journal of Mathematical Biology (Springer) and will be available at http://www.springerlink.com/content/0303-681
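    A crude illustrative sketch (assumed parameters and interaction rule, not the paper's construction): a small network of identical leaky integrate-and-fire neurons with strong instantaneous excitatory/inhibitory jumps at spike times. The spike-triggered jumps and the reset rule are what generate the piecewise continuous return map analyzed in the paper; here we simply record the firing pattern, which for strong interactions is expected to settle onto a cycle.

```python
# Sketch: small network of identical leaky integrate-and-fire neurons with
# strong instantaneous signed interactions (matrix H) and reset at threshold.
# The drive mu, leak time tau, and weights H are illustrative assumptions.
import numpy as np

N, dt, T = 4, 1e-4, 20.0
theta_th, V_reset = 1.0, 0.0
mu, tau = 1.5, 1.0                          # suprathreshold drive, membrane time
H = np.array([[ 0.0, -0.4,  0.3, -0.4],     # strong excitation/inhibition (assumed)
              [-0.4,  0.0, -0.4,  0.3],
              [ 0.3, -0.4,  0.0, -0.4],
              [-0.4,  0.3, -0.4,  0.0]])

V = np.array([0.10, 0.35, 0.60, 0.85])      # staggered initial potentials
spike_log = []

for step in range(int(T / dt)):
    V += dt * (mu - V) / tau                # leaky integration toward mu
    fired = np.where(V >= theta_th)[0]
    if fired.size:
        t = step * dt
        spike_log.extend((round(t, 4), int(i)) for i in fired)
        V += H[:, fired].sum(axis=1)        # instantaneous coupling jumps
        V[fired] = V_reset                  # reset the neurons that fired
        V = np.maximum(V, 0.0)              # crude floor for strong inhibition

print(spike_log[-12:])                      # late firing pattern (expected periodic)
```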

    Stability of synchronization under stochastic perturbations in leaky integrate and fire neural networks of finite size.

    In the present paper, we study synchronization in a neural network model that can be considered as a noisy version of the model of Mirollo and Strogatz (1990), namely a fully connected, totally excitatory integrate-and-fire neural network with Gaussian white noise. Using a large deviation principle, we prove the stability of the synchronized state under stochastic perturbations. Then, we give a lower bound on the probability of synchronization for networks which are not initially synchronized. This bound shows the robustness of the emergence of synchronization in the presence of small stochastic perturbations.
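    A minimal sketch of the kind of setup described above (assumed parameters, not the paper's precise model): a fully connected, purely excitatory leaky integrate-and-fire network driven by independent Gaussian white noise, started from random initial conditions. For small noise amplitude sigma, late-time firing cascades involving nearly all neurons indicate (near-)synchronization.

```python
# Sketch: noisy, fully connected, purely excitatory leaky integrate-and-fire
# network.  Each spike gives every other neuron an excitatory kick eps, and
# neurons pushed over threshold within the same step fire in a cascade.
# All parameter values are illustrative assumptions.
import numpy as np

N, dt, T = 200, 1e-4, 30.0
mu, theta_th = 1.2, 1.0          # suprathreshold drive and firing threshold
eps, sigma = 0.02, 0.05          # excitatory kick per spike, noise amplitude

rng = np.random.default_rng(3)
V = rng.uniform(0, theta_th, N)
n_steps = int(T / dt)
largest_late_cascade = 0

for step in range(n_steps):
    noise = sigma * np.sqrt(dt) * rng.standard_normal(N)
    V += dt * (mu - V) + noise                   # leaky integration + white noise
    just_fired = V >= theta_th
    fired = np.zeros(N, dtype=bool)
    while just_fired.any():                      # resolve same-step firing cascades
        fired |= just_fired
        V[~fired] += eps * just_fired.sum()      # all-to-all excitatory kicks
        V[just_fired] = -np.inf                  # mark as already fired this step
        just_fired = V >= theta_th
    V[fired] = 0.0                               # reset everyone who fired
    if step > 0.9 * n_steps:
        largest_late_cascade = max(largest_late_cascade, int(fired.sum()))

print(f"largest late-time cascade: {largest_late_cascade} of {N} neurons")
```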

    Improving ‘Objective’ Digital Images with Neuronal Processing: A Computational Approach
