Delay Learning Architectures for Memory and Classification
We present a neuromorphic spiking neural network, the DELTRON, that can
remember and store patterns by changing the delays of every connection rather
than by modifying the weights. The advantage of this architecture over
traditional weight-based ones is a simpler hardware implementation, without
multipliers or digital-to-analog converters (DACs), as well as its suitability
for time-based computing. The name reflects the similarity of its learning
rule to that of an earlier architecture, the Tempotron. The DELTRON can remember
more patterns than other delay-based networks by modifying a few delays to
remember the most 'salient' or synchronous part of every spike pattern. We
present simulations of memory capacity and classification ability of the
DELTRON for different random spatio-temporal spike patterns. The memory
capacity for noisy spike patterns and missing spikes is also shown. Finally,
we present SPICE simulation results for the core circuits involved in a
reconfigurable mixed-signal implementation of this architecture.
Comment: 27 pages, 20 figures
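The idea of storing a pattern in connection delays, rather than weights, can be illustrated with a toy sketch: each input line gets a delay that aligns its spike to a common target time, so the stored pattern produces a burst of coincident arrivals while unrelated patterns do not. This is a minimal illustration of the principle, not the DELTRON's actual learning rule; the Gaussian coincidence kernel, `sigma`, and `t_target` are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def store(pattern, t_target):
    # One delay per input line: chosen so every spike of the stored
    # pattern arrives exactly at t_target (delays replace weights).
    return t_target - pattern

def recall_score(pattern, delays, t_target, sigma=1.0):
    # Arrival times after applying the learned delays; a Gaussian
    # kernel (an assumption of this sketch) rewards coincidence near
    # t_target, so synchronous arrivals score highly.
    arrivals = pattern + delays
    return np.exp(-((arrivals - t_target) ** 2) / (2 * sigma**2)).sum()

stored = rng.uniform(0, 50, size=20)        # spike times of the stored pattern
delays = store(stored, t_target=60.0)

noisy = stored + rng.normal(0, 0.5, size=20)  # jittered copy of the pattern
other = rng.uniform(0, 50, size=20)           # unrelated spike pattern

score_noisy = recall_score(noisy, delays, t_target=60.0)
score_other = recall_score(other, delays, t_target=60.0)
```

A jittered copy of the stored pattern still arrives nearly in synchrony and scores far higher than an unrelated pattern, which is the sense in which the delays "remember" it.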
Integer Echo State Networks: Hyperdimensional Reservoir Computing
We propose an approximation of Echo State Networks (ESN) that can be
efficiently implemented on digital hardware based on the mathematics of
hyperdimensional computing. The reservoir of the proposed Integer Echo State
Network (intESN) is a vector containing only n-bit integers (where n<8 is
normally sufficient for satisfactory performance). The recurrent matrix
multiplication is replaced with an efficient cyclic shift operation. The intESN
architecture is verified with typical tasks in reservoir computing: memorizing
a sequence of inputs; classifying time series; learning dynamic processes.
Such an architecture results in dramatic improvements in memory footprint and
computational efficiency, with minimal performance loss.
Comment: 10 pages, 10 figures, 1 table
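The core update described above can be sketched as follows: the state is a vector of small integers, the recurrence is a cyclic shift instead of a matrix multiplication, inputs enter as random bipolar hypervectors, and clipping keeps every component within the n-bit range. The alphabet, the clipping limit `KAPPA`, and the dimensionality are illustrative choices, not the paper's exact parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 1000    # reservoir (hypervector) dimensionality
KAPPA = 3   # clipping limit: states stay in [-3, 3], fitting in 3 bits

# One random bipolar (+1/-1) code vector per input symbol
# (a hypothetical 4-symbol alphabet for this sketch).
alphabet = "abcd"
codes = {s: rng.choice([-1, 1], size=N) for s in alphabet}

def update(x, symbol):
    # Recurrence: cyclic shift of the state replaces the usual
    # reservoir matrix multiplication; the input's code vector is
    # added, and clipping bounds each component to the n-bit range.
    x = np.roll(x, 1) + codes[symbol]
    return np.clip(x, -KAPPA, KAPPA)

x = np.zeros(N, dtype=np.int64)
for s in "abcab":
    x = update(x, s)

# Readout idea: the most recent symbol's code is still unshifted in
# the state, so its dot product with the state dominates the others.
sims = {s: int(np.dot(x, codes[s])) for s in alphabet}
```

Because the cyclic shift decorrelates older inputs from the unshifted code vectors, the state behaves like a fading memory of recent symbols, which is what makes shift-plus-clip a workable stand-in for the dense recurrent matrix.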