Emergence of global synchronization in directed excitatory networks of type I neurons
The collective behaviour of neural networks depends on the cellular and
synaptic properties of the neurons. The phase-response curve (PRC) is an
experimentally obtainable measure of cellular properties that quantifies the
shift in the next spike time of a neuron as a function of the phase at which
a stimulus is delivered to that neuron. Neuronal PRCs can be classified as
having either purely positive values (type I) or distinct positive and negative
regions (type II). Networks of neurons with type I PRCs tend not to
synchronize via mutual excitatory synaptic connections. We study the
excitatory synaptic connections. We study the synchronization properties of
identical type I and type II neurons, assuming unidirectional synapses.
Using linear stability analysis and numerical simulations of the extended
Kuramoto model, we show that feedforward loop motifs favour
synchronization of type I excitatory and inhibitory neurons, while feedback
loop motifs destroy their synchronization tendency. Moreover, we construct
large directed networks, either without feedback motifs or with many of them,
from the same undirected backbones, and observe a high synchronization level
for directed acyclic graphs with type I neurons. The synchronizability of
type I neurons thus depends on both the directionality of the network
connectivity and the topology of its undirected backbone: the abundance of
feedforward motifs enhances the synchronizability of directed acyclic graphs.
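The feedforward mechanism can be illustrated with a minimal phase-model sketch. This is an assumption-laden stand-in for the paper's extended Kuramoto model, not its exact equations: identical oscillators sit on a directed chain (a simple feedforward DAG), a sinusoidal coupling function replaces the PRC-derived interaction, and the function names (`simulate_chain`, `order_parameter`) are hypothetical. The Kuramoto order parameter r = |⟨exp(iθ)⟩| approaches 1 as the chain synchronizes.

```python
import numpy as np

def simulate_chain(n=5, omega=1.0, k=1.0, dt=0.01, steps=5000):
    """Euler-integrate n identical phase oscillators on a directed chain
    (a feedforward DAG): each oscillator is driven only by its predecessor.
    The sinusoidal coupling is an illustrative stand-in for the interaction
    derived from the neuronal PRC, not the paper's exact model."""
    theta = np.linspace(0.0, 2.0, n)  # spread-out initial phases
    for _ in range(steps):
        dtheta = np.full(n, omega)
        # feedforward drive: oscillator i is pulled toward oscillator i-1
        dtheta[1:] += k * np.sin(theta[:-1] - theta[1:])
        theta = theta + dt * dtheta
    return theta

def order_parameter(theta):
    """Kuramoto order parameter r = |<exp(i*theta)>|; r = 1 at full sync."""
    return abs(np.exp(1j * theta).mean())

r = order_parameter(simulate_chain())
# phase differences along the chain decay toward zero, so r approaches 1
```

Adding a feedback term (coupling an early oscillator to a later one) would turn the chain into a cyclic graph; per the abstract, such feedback motifs work against the synchronization tendency of type I neurons.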
Principles of Neuromorphic Photonics
In an age overrun with information, the ability to process reams of data has
become crucial. The demand for data will continue to grow as smart gadgets
multiply and become increasingly integrated into our daily lives.
Next-generation industries in artificial intelligence services and
high-performance computing are so far supported by microelectronic platforms.
These data-intensive enterprises rely on continual improvements in hardware.
Their prospects are running up against a stark reality: conventional
one-size-fits-all solutions offered by digital electronics can no longer
satisfy this need, as Moore's law (exponential hardware scaling),
interconnection density, and the von Neumann architecture reach their limits.
With its superior speed and reconfigurability, analog photonics can provide
some relief to these problems; however, complex applications of analog
photonics have remained largely unexplored due to the absence of a robust
photonic integration industry. Recently, the landscape for
commercially manufacturable photonic chips has been changing rapidly and now
promises to achieve economies of scale previously enjoyed solely by
microelectronics.
The scientific community has set out to build bridges between the domains of
photonic device physics and neural networks, giving rise to the field of
\emph{neuromorphic photonics}. This article reviews the recent progress in
integrated neuromorphic photonics. We provide an overview of neuromorphic
computing, discuss the associated technology (microelectronic and photonic)
platforms and compare their performance metrics. We discuss photonic neural
network approaches and challenges for integrated neuromorphic photonic
processors while providing an in-depth description of photonic neurons and a
candidate interconnection architecture. We conclude with a future outlook of
neuro-inspired photonic processing.
Comment: 28 pages, 19 figures
Storing cycles in Hopfield-type networks with pseudoinverse learning rule: admissibility and network topology
Cyclic patterns of neuronal activity are ubiquitous in animal nervous
systems, and partially responsible for generating and controlling rhythmic
movements such as locomotion, respiration, swallowing and so on. Clarifying the
role of the network connectivities for generating cyclic patterns is
fundamental for understanding the generation of rhythmic movements. In this
paper, the storage of binary cycles in neural networks is investigated. We
call a cycle Σ admissible if a connectivity matrix satisfying the cycle's
transition conditions exists, and construct it using the pseudoinverse
learning rule. Our main focus is on the structural features of admissible
cycles and the corresponding network topology. We show that Σ is admissible
if and only if its discrete Fourier transform contains exactly rank(Σ)
nonzero columns. Based on the decomposition of the rows of Σ into loops,
where a loop is the set of all cyclic permutations of a row, cycles are
classified as simple cycles, separable composite cycles, or inseparable
composite cycles. Simple cycles contain rows from one loop only, and the
network topology is a feedforward chain with feedback to one neuron if the
loop-vectors in Σ are cyclic permutations of each other. Composite cycles
contain rows from at least two disjoint loops, and the neurons corresponding
to the rows in Σ from the same loop are identified with a cluster.
identified with a cluster. Networks constructed from separable composite cycles
decompose into completely isolated clusters. For inseparable composite cycles
at least two clusters are connected, and the cluster-connectivity is related to
the intersections of the spaces spanned by the loop-vectors of the clusters.
Simulations showing successfully retrieved cycles in continuous-time
Hopfield-type networks and in networks of spiking neurons are presented.
Comment: 48 pages, 3 figures
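The pseudoinverse rule named in the abstract can be sketched concretely. With the cycle's patterns stacked as columns of a matrix X and their successors as columns of X_next, the standard pseudoinverse learning rule sets W = X_next · pinv(X); when the patterns are linearly independent, W maps each pattern exactly onto its successor, so the discrete dynamics x ← sgn(W x) retrieves the stored cycle. The specific patterns and network size below are illustrative assumptions, not the paper's examples.

```python
import numpy as np

# Four orthogonal +/-1 patterns (rows of a Hadamard matrix) forming the
# binary cycle xi0 -> xi1 -> xi2 -> xi3 -> xi0. The pattern choice is
# illustrative; any linearly independent set of patterns works.
patterns = np.array([
    [ 1,  1,  1,  1,  1,  1,  1,  1],
    [ 1,  1,  1,  1, -1, -1, -1, -1],
    [ 1,  1, -1, -1,  1,  1, -1, -1],
    [ 1, -1,  1, -1,  1, -1,  1, -1],
], dtype=float)

X = patterns.T                            # columns = cycle states
X_next = np.roll(patterns, -1, axis=0).T  # each column's successor

# Pseudoinverse learning rule: W X = X_next holds exactly here because the
# patterns are linearly independent (pinv(X) is then a left inverse of X).
W = X_next @ np.linalg.pinv(X)

# Synchronous retrieval dynamics: x <- sgn(W x)
x = patterns[0].copy()
retrieved = []
for _ in range(4):
    x = np.sign(W @ x)
    retrieved.append(x.copy())
```

Starting from the first pattern, the network steps through the remaining states in order and returns to the start after one full period, i.e. the stored cycle is retrieved.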