Integer Echo State Networks: Hyperdimensional Reservoir Computing
We propose an approximation of Echo State Networks (ESN) that can be
efficiently implemented on digital hardware based on the mathematics of
hyperdimensional computing. The reservoir of the proposed Integer Echo State
Network (intESN) is a vector containing only n-bit integers (where n<8 is
normally sufficient for satisfactory performance). The recurrent matrix
multiplication is replaced with an efficient cyclic shift operation. The intESN
architecture is verified on typical reservoir computing tasks: memorizing a
sequence of inputs, classifying time series, and learning dynamic processes.
The architecture yields dramatic improvements in memory footprint and
computational efficiency with minimal performance loss.
Comment: 10 pages, 10 figures, 1 table
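The update described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function and parameter names (`intesn_update`, `n_bits`, the clipping threshold `kappa`) are assumptions, and the bipolar input encoding is one common hyperdimensional-computing choice.

```python
import numpy as np

def intesn_update(x, input_vec, n_bits=4):
    """One reservoir update in the spirit of intESN: a cyclic shift
    replaces the recurrent matrix multiplication, and the state is
    clipped so it stays within n-bit integer range."""
    kappa = 2 ** (n_bits - 1) - 1     # saturation threshold for n-bit integers
    x = np.roll(x, 1)                 # cyclic shift = recurrent connectivity
    x = x + input_vec                 # add bipolar (+1/-1) encoded input
    return np.clip(x, -kappa, kappa)  # keep the state n-bit

rng = np.random.default_rng(0)
N = 1000
state = np.zeros(N, dtype=np.int8)
for _ in range(20):
    # each input symbol is encoded as a random bipolar vector
    u = rng.choice([-1, 1], size=N).astype(np.int8)
    state = intesn_update(state, u)
```

Because the state is a vector of small integers and the recurrence is a shift, both memory and compute per step are far cheaper than a dense float matrix-vector product.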
Connectionist-Symbolic Machine Intelligence using Cellular Automata based Reservoir-Hyperdimensional Computing
We introduce a novel reservoir computing framework that is capable of both
connectionist machine intelligence and symbolic computation. A cellular
automaton is used as the reservoir of dynamical systems. The input is randomly
projected onto the initial conditions of the automaton cells, and nonlinear
computation is performed on the input by applying a rule in the automaton for a
period of time. The evolution of the automaton creates a space-time volume of
the automaton state space, which is used as the reservoir. The proposed
framework is capable of long short-term memory, and it requires orders of
magnitude less computation than Echo State Networks. We prove that the cellular
automaton reservoir holds a distributed representation of attribute statistics,
which provides more effective computation than a local representation. It is
possible to estimate the kernel for linear cellular automata via metric
learning, which enables much more efficient distance computation in the support
vector machine framework. Binary reservoir feature vectors can also be combined
using Boolean operations, as in hyperdimensional computing, paving a direct way
to concept building and symbolic processing.
Comment: Corrected typos. Responded to some comments on Section 8. Added
appendix for details. Recurrent architecture emphasized
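The pipeline described above (random projection onto initial cells, evolution under a rule, space-time volume as features) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the elementary rule 90, the grid width, and the number of steps are all assumed parameters.

```python
import numpy as np

def ca_reservoir_features(bits, width=64, steps=32, rule=90, seed=0):
    """Project a binary input onto random cell positions, evolve an
    elementary cellular automaton, and return the flattened space-time
    volume as the reservoir feature vector."""
    rng = np.random.default_rng(seed)
    positions = rng.choice(width, size=len(bits), replace=False)
    state = np.zeros(width, dtype=np.uint8)
    state[positions] = bits                      # random projection of the input
    # Wolfram rule table: bit i of `rule` is the output for neighborhood i
    table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    volume = [state.copy()]
    for _ in range(steps - 1):
        l, r = np.roll(state, 1), np.roll(state, -1)
        state = table[(l << 2) | (state << 1) | r]  # apply the CA rule
        volume.append(state.copy())
    return np.concatenate(volume)                # space-time volume as features

feats = ca_reservoir_features(np.array([1, 0, 1, 1], dtype=np.uint8))
```

Because the resulting feature vector is binary, two such vectors can be combined with bitwise Boolean operations, which is the hyperdimensional-computing connection the abstract points to.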
Computational Capacity and Energy Consumption of Complex Resistive Switch Networks
Resistive switches are a class of emerging nanoelectronics devices that
exhibit a wide variety of switching characteristics closely resembling
behaviors of biological synapses. Assembled into random networks, such
resistive switches produce emergent behaviors far more complex than those of
individual devices. This was previously demonstrated in simulations that
exploit information processing within these random networks to solve tasks that
require nonlinear computation as well as memory. Physical assemblies of such
networks manifest complex spatial structures and basic processing capabilities
often related to biologically-inspired computing. We model and simulate random
resistive switch networks and analyze their computational capacities. We
provide a detailed discussion of the relevant design parameters and establish
the link to the physical assemblies by relating the modeling parameters to
physical parameters. More globally connected networks and an increased network
switching activity are means to increase the computational capacity linearly at
the expense of exponentially growing energy consumption. We discuss a new
modular approach that exhibits higher computational capacities and energy
consumption growing linearly with the number of networks used. The results show
how to optimize the trade-off between computational capacity and energy
efficiency and are relevant for the design and fabrication of novel computing
architectures that harness random assemblies of emerging nanodevices.
Nano-scale reservoir computing
This work describes preliminary steps towards nano-scale reservoir computing
using quantum dots. Our research has focused on the development of an
accumulator-based sensing system that reacts to changes in the environment, as
well as the development of a software simulation. The investigated systems
generate nonlinear responses to inputs that make them suitable for a physical
implementation of a neural network. This development will enable
miniaturisation of the neurons to the molecular level, leading to a range of
applications including monitoring of changes in materials or structures. The
system is based around the optical properties of quantum dots. The paper will
report on experimental work on systems using Cadmium Selenide (CdSe) quantum
dots and on the various methods to render the systems sensitive to pH, redox
potential or specific ion concentration. Once the quantum dot-based systems are
rendered sensitive to these triggers they can provide a distributed array that
can monitor and transmit information on changes within the material.
Comment: 8 pages, 9 figures, accepted for publication in Nano Communication
Networks, http://www.journals.elsevier.com/nano-communication-networks/. An
earlier version was presented at the 3rd IEEE International Workshop on
Molecular and Nanoscale Communications (IEEE MoNaCom 2013)
Potential implementation of Reservoir Computing models based on magnetic skyrmions
Reservoir Computing is a type of recurrent neural network commonly used for
recognizing and predicting spatio-temporal events, relying on a complex
hierarchy of nested feedback loops to generate a memory functionality. The
Reservoir Computing paradigm does not require any knowledge of the reservoir
topology or node weights for training purposes and can therefore utilize
naturally existing networks formed by a wide variety of physical processes.
Most prior efforts have focused on utilizing memristor techniques to implement
recurrent neural networks. This paper examines the potential of skyrmion
fabrics formed in magnets with broken inversion symmetry, which may provide an
attractive physical instantiation for Reservoir Computing.
Comment: 11 pages, 3 figures
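The key property mentioned above, that the reservoir itself is never trained, means only a linear readout needs fitting regardless of the physical substrate. A standard way to do this is ridge regression over recorded reservoir states; the sketch below is a generic illustration (the function name and the ridge parameter are assumptions, not from the paper).

```python
import numpy as np

def train_readout(states, targets, ridge=1e-6):
    """Fit only the linear readout of a reservoir computer.
    states:  (T, N) matrix of recorded reservoir states
    targets: (T, k) matrix of desired outputs
    Returns the (N, k) readout weight matrix; the reservoir
    (any fixed dynamical system) is left untouched."""
    A = states.T @ states + ridge * np.eye(states.shape[1])
    return np.linalg.solve(A, states.T @ targets)

# toy usage: recover a known linear readout from noiseless states
rng = np.random.default_rng(1)
states = rng.normal(size=(50, 10))
W_true = rng.normal(size=(10, 2))
W = train_readout(states, states @ W_true)
```

The small ridge term keeps the normal-equations solve well conditioned, which matters in practice because reservoir states from physical substrates are often strongly correlated.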
A general representation of dynamical systems for reservoir computing
Dynamical systems are capable of performing computation in a reservoir
computing paradigm. This paper presents a general representation of these
systems as an artificial neural network (ANN). Initially, we implement the
simplest dynamical system, a cellular automaton. The mathematical fundamentals
behind an ANN are maintained, but the weights of the connections and the
activation function are adjusted to work as an update rule in the context of
cellular automata. The advantages of such an implementation are its usability
with specialized and optimized deep learning libraries, the capability to
generalize it to other types of networks, and the possibility of evolving
cellular automata and other dynamical systems in terms of connectivity, update
rules, and learning rules. Our implementation of cellular automata constitutes
an initial step towards a general framework for dynamical systems. It aims to
evolve such systems to optimize their usage in reservoir computing and to model
physical computing substrates.
Comment: 5 pages, 3 figures, accepted workshop paper at the Workshop on Novel
Substrates and Models for the Emergence of Developmental, Learning and
Cognitive Capabilities at IEEE ICDL-EPIROB 201
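The mapping the abstract describes, with connection weights encoding the neighborhood and the activation function acting as the update rule, can be sketched for an elementary cellular automaton as follows. This is an illustrative reading of the idea, not the authors' implementation; the rule number and weight values are assumptions.

```python
import numpy as np

def ca_as_ann_step(state, rule=110):
    """One CA update expressed in ANN terms: fixed 'weights' [4, 2, 1]
    act like a 1-D convolution that encodes each 3-cell neighborhood as
    an integer 0-7, and the 'activation function' is a lookup in the
    rule table (circular boundary conditions)."""
    idx = (np.roll(state, 1) * 4      # left neighbor weight
           + state * 2                # center weight
           + np.roll(state, -1) * 1)  # right neighbor weight
    # rule table as activation: bit i of `rule` answers neighborhood i
    activation = np.array([(rule >> i) & 1 for i in range(8)],
                          dtype=state.dtype)
    return activation[idx]

state = np.zeros(11, dtype=np.uint8)
state[5] = 1                          # single live cell
state = ca_as_ann_step(state)
```

Framing the update this way is what lets the system run on deep learning libraries: the neighborhood encoding is an ordinary fixed-weight convolution, and only the "activation" table changes when the rule changes.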