Principles of Neuromorphic Photonics
In an age overrun with information, the ability to process reams of data has
become crucial. The demand for data will continue to grow as smart gadgets
multiply and become increasingly integrated into our daily lives.
Next-generation industries in artificial intelligence services and
high-performance computing are so far supported by microelectronic platforms.
These data-intensive enterprises rely on continual improvements in hardware.
Their prospects are running up against a stark reality: conventional
one-size-fits-all solutions offered by digital electronics can no longer
satisfy this need, as Moore's law (exponential hardware scaling),
interconnection density, and the von Neumann architecture reach their limits.
With its superior speed and reconfigurability, analog photonics can provide
some relief to these problems; however, complex applications of analog
photonics have remained largely unexplored due to the absence of a robust
photonic integration industry. Recently, the landscape for
commercially-manufacturable photonic chips has been changing rapidly and now
promises to achieve economies of scale previously enjoyed solely by
microelectronics.
The scientific community has set out to build bridges between the domains of
photonic device physics and neural networks, giving rise to the field of
\emph{neuromorphic photonics}. This article reviews the recent progress in
integrated neuromorphic photonics. We provide an overview of neuromorphic
computing, discuss the associated technology (microelectronic and photonic)
platforms and compare their performance metrics. We discuss photonic neural
network approaches and challenges for integrated neuromorphic photonic
processors while providing an in-depth description of photonic neurons and a
candidate interconnection architecture. We conclude with a future outlook of
neuro-inspired photonic processing.
Comment: 28 pages, 19 figures
StdpC: a modern dynamic clamp
With the advancement of computer technology many novel uses of dynamic clamp have become possible. We have added new features to our dynamic clamp software StdpC ("Spike timing-dependent plasticity Clamp") allowing such new applications while conserving the ease of use and installation of the popular earlier Dynclamp 2/4 package. Here, we introduce the new features of a waveform generator, freely programmable Hodgkin–Huxley conductances, learning synapses, graphic data displays, and a powerful scripting mechanism, and discuss examples of experiments using these features. In the first example we built and "voltage clamped" a conductance-based model cell from a passive resistor–capacitor (RC) circuit using the dynamic clamp software to generate the voltage-dependent currents. In the second example we coupled our new spike generator through a burst detection/burst generation mechanism in a phase-dependent way to a neuron in a central pattern generator and dissected the subtle interaction between neurons, which seems to implement an information transfer through intraburst spike patterns. In the third example, making use of the new plasticity mechanism for simulated synapses, we analyzed the effect of spike timing-dependent plasticity (STDP) on synchronization, revealing considerable enhancement of the entrainment of a post-synaptic neuron by a periodic spike train. These examples illustrate that with modern dynamic clamp software like StdpC, the dynamic clamp has developed beyond the mere introduction of artificial synapses or ionic conductances into neurons to a universal research tool, which might well become a standard instrument of modern electrophysiology.
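The two core mechanisms the abstract describes — injecting a software-defined conductance current back into a cell, and a pair-based STDP rule for the simulated synapses — can be sketched as follows. This is a minimal illustration, not StdpC's actual implementation; all function names, parameters, and constants are assumptions.

```python
import numpy as np

def dynamic_clamp_current(v_rec_mV, g_nS, e_rev_mV):
    """One dynamic-clamp cycle: read the membrane voltage, return the
    current to inject. I = g * (E_rev - V) emulates an artificial
    conductance computed in software and fed back through the amplifier.
    Returns current in pA (nS * mV)."""
    return g_nS * (e_rev_mV - v_rec_mV)

def stdp_update(w, dt_ms, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
    """Pair-based STDP with dt = t_post - t_pre.
    Pre-before-post (dt > 0) potentiates the weight; post-before-pre
    (dt < 0) depresses it, each with an exponential timing window."""
    if dt_ms > 0:
        w += a_plus * np.exp(-dt_ms / tau_ms)
    else:
        w -= a_minus * np.exp(dt_ms / tau_ms)
    return max(0.0, w)  # keep the synaptic weight non-negative
```

In a real setup the clamp loop runs at tens of kilohertz, so the conductance calculation must complete within each sampling interval; the STDP rule here is the standard exponential pair rule, which is one common choice for "learning synapses" of this kind.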
A caloritronics-based Mott neuristor
Machine learning imitates the basic features of biological neural networks to
efficiently perform tasks such as pattern recognition. This has been mostly
achieved at a software level, and a strong effort is currently being made to
mimic neurons and synapses with hardware components, an approach known as
neuromorphic computing. CMOS-based circuits have been used for this purpose,
but they are non-scalable, limiting the device density and motivating the
search for neuromorphic materials. While recent advances in resistive switching
have provided a path to emulate synapses at the 10 nm scale, a scalable neuron
analogue is yet to be found. Here, we show how heat transfer can be utilized to
mimic neuron functionalities in Mott nanodevices. We use the Joule heating
created by current spikes to trigger the insulator-to-metal transition in a
biased VO2 nanogap. We show that thermal dynamics allow the implementation of
the basic neuron functionalities: activity, leaky integrate-and-fire,
volatility and rate coding. By using local temperature as the internal
variable, we avoid the need for external capacitors, which reduces neuristor
size by several orders of magnitude. This approach could enable neuromorphic
hardware to take full advantage of the rapid advances in memristive synapses,
allowing for much denser and complex neural networks. More generally, we show
that heat dissipation is not always an undesirable effect: it can perform
computing tasks if properly engineered.
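The thermal neuron dynamics described above — Joule heating from input spikes integrated against leakage to the environment, with firing when the local temperature crosses the insulator-to-metal transition — amount to a leaky integrate-and-fire model whose state variable is temperature. A minimal sketch (all parameter names and values are illustrative, not measured VO2 device values):

```python
def thermal_lif(spikes_in, t_amb=300.0, t_crit=340.0, c_th=1.0,
                g_th=0.1, q_spike=5.0, dt=1.0):
    """Leaky integrate-and-fire with temperature as the internal variable.

    Each input spike deposits Joule heat q_spike; heat leaks to the
    environment through thermal conductance g_th, giving the 'leaky'
    and volatile behavior. Crossing t_crit models the insulator-to-metal
    transition (the neuron fires), after which the device cools back
    to ambient (reset).
    """
    temp = t_amb
    out = []
    for s in spikes_in:
        # Thermal integration: C dT/dt = -g*(T - T_amb) + q*s
        temp += dt * (-g_th * (temp - t_amb) + q_spike * s) / c_th
        if temp >= t_crit:   # insulator-to-metal transition: fire
            out.append(1)
            temp = t_amb     # volatile reset after the output spike
        else:
            out.append(0)
    return out
```

With these illustrative numbers, a dense input train heats the device past threshold and fires, while the same number of spikes spread out in time leaks away without firing — the rate-coding behavior the abstract attributes to the thermal dynamics.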