Accelerated neuromorphic cybernetics
Accelerated mixed-signal neuromorphic hardware refers to electronic systems that emulate electrophysiological aspects of biological nervous systems in analog voltages and currents in an accelerated manner. While the functional spectrum of these systems already includes many observed neuronal capabilities, such as learning or classification, some areas remain largely unexplored. In particular, this concerns cybernetic scenarios in which nervous systems engage in closed interaction with their bodies and environments. Yet such processes are of essential importance in nature, since the control of behavior and movement in animals is both the purpose and the cause of the development of nervous systems. Besides the design of neuromorphic circuit and system components, the main focus of this work is therefore the construction and analysis of accelerated neuromorphic agents that are integrated into cybernetic chains of action. These agents are an accelerated mechanical robot on the one hand and an accelerated virtual insect on the other. In both cases, the sensory organs and actuators of their artificial bodies are derived from the neurophysiology of the biological prototypes and are reproduced as faithfully as possible. In addition, each of the two biomimetic organisms is subjected to evolutionary optimization, which illustrates the advantages of accelerated neuromorphic nervous systems through significant time savings.
Structural plasticity on an accelerated analog neuromorphic hardware system
In computational neuroscience, as well as in machine learning, neuromorphic
devices promise an accelerated and scalable alternative to neural network
simulations. Their neural connectivity and synaptic capacity depend on their
specific design choices but are always intrinsically limited. Here, we present
a strategy to achieve structural plasticity that optimizes resource allocation
under these constraints by constantly rewiring the pre- and postsynaptic
partners while keeping the neuronal fan-in constant and the connectome sparse.
In particular, we implemented this algorithm on the analog neuromorphic system
BrainScaleS-2. It was executed on a custom embedded digital processor located
on chip, accompanying the mixed-signal substrate of spiking neurons and synapse
circuits. We evaluated our implementation in a simple supervised learning
scenario, showing its ability to optimize the network topology with respect to
the nature of its training data, as well as its overall computational
efficiency.
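The rewiring strategy described above can be illustrated with a minimal Python sketch. This is not the on-chip BrainScaleS-2 implementation; the pruning criterion (weakest synapse), the network size, and the weight initialization are illustrative assumptions. It only shows the structural invariant: constant fan-in and a sparse, constantly rewired connectome.

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical network dimensions; each postsynaptic neuron keeps exactly FAN_IN inputs
N_PRE, N_POST, FAN_IN = 32, 8, 4

# connectivity: for each postsynaptic neuron, its presynaptic partners and weights
partners = np.stack([rng.choice(N_PRE, size=FAN_IN, replace=False) for _ in range(N_POST)])
weights = rng.uniform(0.0, 1.0, size=(N_POST, FAN_IN))

def rewire_step(partners, weights):
    """Prune each neuron's weakest synapse and draw a fresh presynaptic partner,
    keeping the fan-in constant and the connectome sparse."""
    for post in range(N_POST):
        weakest = np.argmin(weights[post])
        # candidate partners not already connected to this neuron
        candidates = np.setdiff1d(np.arange(N_PRE), partners[post])
        partners[post, weakest] = rng.choice(candidates)
        weights[post, weakest] = rng.uniform(0.0, 0.5)  # new synapses start weak
    return partners, weights

for _ in range(10):
    partners, weights = rewire_step(partners, weights)

# invariant: fan-in stays constant and each neuron's partners remain unique
assert all(len(set(row)) == FAN_IN for row in partners)
```

In the actual system, a step like this would run on the embedded processor, with the weight criterion replaced by whatever plasticity signal the learning rule provides.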
Demonstrating Advantages of Neuromorphic Computation: A Pilot Study
Neuromorphic devices represent an attempt to mimic aspects of the brain's
architecture and dynamics with the aim of replicating its hallmark functional
capabilities in terms of computational power, robust learning and energy
efficiency. We employ a single-chip prototype of the BrainScaleS 2 neuromorphic
system to implement a proof-of-concept demonstration of reward-modulated
spike-timing-dependent plasticity in a spiking network that learns to play the
Pong video game by smooth pursuit. This system combines an electronic
mixed-signal substrate for emulating neuron and synapse dynamics with an
embedded digital processor for on-chip learning, which in this work also serves
to simulate the virtual environment and learning agent. The analog emulation of
neuronal membrane dynamics enables a 1000-fold acceleration with respect to
biological real-time, with the entire chip operating on a power budget of 57 mW.
Compared to an equivalent simulation using state-of-the-art software, the
on-chip emulation is at least one order of magnitude faster and three orders of
magnitude more energy-efficient. We demonstrate how on-chip learning can
mitigate the effects of fixed-pattern noise, which is unavoidable in analog
substrates, while making use of temporal variability for action exploration.
Learning compensates for imperfections of the physical substrate, as manifested
in neuronal parameter variability, by adapting synaptic weights to match the
respective excitability of individual neurons.
Published in Frontiers in Neuromorphic Engineering (2019).
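The core of reward-modulated spike-timing-dependent plasticity can be sketched in a few lines of Python. This is a generic textbook form, not the on-chip rule used in the study; the time constant, learning rate, and spike-timing data are illustrative assumptions. Each pre/post spike pairing leaves an eligibility trace, which is later converted into a weight change by a reward signal.

```python
import numpy as np

rng = np.random.default_rng(0)

n_syn = 16
w = rng.uniform(0.0, 1.0, n_syn)   # synaptic weights
eligibility = np.zeros(n_syn)      # per-synapse STDP eligibility traces

TAU_E, LR = 20.0, 0.05             # trace time constant (ms) and learning rate, illustrative

def stdp_event(eligibility, dt):
    """Accumulate an eligibility trace from a pre/post spike pairing:
    positive for causal timing (pre before post), negative for anti-causal."""
    return eligibility + np.sign(dt) * np.exp(-np.abs(dt) / TAU_E)

def reward_update(w, eligibility, reward, baseline):
    """Convert traces into weight changes, gated by the reward signal:
    dw = LR * (reward - baseline) * eligibility, clipped to the weight range."""
    return np.clip(w + LR * (reward - baseline) * eligibility, 0.0, 1.0)

# hypothetical spike-timing differences (post minus pre, in ms) for one trial
dts = rng.uniform(-50.0, 50.0, n_syn)
eligibility = stdp_event(eligibility, dts)
w_new = reward_update(w, eligibility, reward=1.0, baseline=0.5)
```

The baseline subtraction makes the rule sensitive to reward relative to expectation, so synapses active during better-than-average trials are strengthened and others weakened.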
Inference with Artificial Neural Networks on Analog Neuromorphic Hardware
The neuromorphic BrainScaleS-2 ASIC comprises mixed-signal neurons and
synapse circuits as well as two versatile digital microprocessors. Primarily
designed to emulate spiking neural networks, the system can also operate in a
vector-matrix multiplication and accumulation mode for artificial neural
networks. Analog multiplication is carried out in the synapse circuits, while
the results are accumulated on the neurons' membrane capacitors. Designed as an
analog, in-memory computing device, it promises high energy efficiency.
Fixed-pattern noise and trial-to-trial variations, however, require the
implemented networks to cope with a certain level of perturbations. Further
limitations are imposed by the digital resolution of the input values (5 bit),
matrix weights (6 bit) and resulting neuron activations (8 bit). In this paper,
we discuss BrainScaleS-2 as an analog inference accelerator and present
calibration as well as optimization strategies, highlighting the advantages of
training with hardware in the loop. Among other benchmarks, we classify the
MNIST handwritten digits dataset using a two-dimensional convolution and two
dense layers. We reach 98.0% test accuracy, closely matching the performance of
the same network evaluated in software.
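The resolution constraints mentioned above (5-bit inputs, 6-bit weights, 8-bit activations) can be mimicked in software with a simple uniform quantizer. This is a hedged sketch of the arithmetic, not a model of the analog circuits themselves; the layer sizes and the normalization of the accumulated values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def quantize(x, bits, signed):
    """Uniformly quantize x in [0, 1] (unsigned) or [-1, 1] (signed) to the given bit depth."""
    if signed:
        levels = 2 ** (bits - 1) - 1
        return np.clip(np.round(x * levels), -levels - 1, levels) / levels
    levels = 2 ** bits - 1
    return np.clip(np.round(x * levels), 0, levels) / levels

x = quantize(rng.uniform(0.0, 1.0, 64), bits=5, signed=False)        # 5-bit inputs
W = quantize(rng.uniform(-1.0, 1.0, (10, 64)), bits=6, signed=True)  # 6-bit weights

# analog multiply-accumulate: products form on the synapses,
# sums accumulate on the neurons' membrane capacitors
a = W @ x

# read-out digitizes the membrane voltages to 8-bit activations
a = quantize(np.clip(a / np.abs(a).max(), -1.0, 1.0), bits=8, signed=True)
```

Training with hardware in the loop lets the network absorb both this quantization and the analog fixed-pattern noise, which is why the hardware accuracy can closely track the software baseline.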
Versatile emulation of spiking neural networks on an accelerated neuromorphic substrate
We present first experimental results on the novel BrainScaleS-2 neuromorphic
architecture based on an analog neuro-synaptic core and augmented by embedded
microprocessors for complex plasticity and experiment control. The high
acceleration factor of 1000 compared to biological dynamics enables the
execution of computationally expensive tasks, by allowing the fast emulation of
long-duration experiments or rapid iteration over many consecutive trials. The
flexibility of our architecture is demonstrated in a suite of five distinct
experiments, which emphasize different aspects of the BrainScaleS-2 system.
Phase Correlations in Turbulent and Unstable Particle Trajectories
Phase walk analysis offers a novel way to trace and quantify nonlinearities and/or turbulent influence in dusty-plasma particle trajectories. In the present work, this is demonstrated on simulated as well as experimental data.
Phase correlations in leptokurtic time series
Many time series of nonlinear origin, like turbulence or financial markets, not only
exhibit leptokurtic (also called heavy-tailed or fat-tailed) probability distribution
functions (PDFs) but are also classified or characterized by them. In the Fourier
representation of these time series, these nonlinearities must be contained in the
phase information. Here, we demonstrate how to exploit this information to both
synthesize and analyze leptokurtic time series.
We first show that empirical data of market indices exhibit linear phase correlations.
Based on these findings, we impose a set of linear phase correlations on white
Gaussian noise and demonstrate that the time series so obtained can reproduce the
scaling properties of the PDF and volatility remarkably well [1].
Next, we introduce a novel nonlinear statistic called phase walk statistics (PWS),
which measures, in close analogy to random walk analyses, the deviation of the
phases from randomness. Applying this statistic to a number of empirical and
synthetic leptokurtic time series reveals that nonlinearities can be detected with
unprecedented significance. A surrogate-assisted analysis further shows that PWS
is able to disentangle different classes of nonlinearities which have
indistinguishable PDFs [2].
In summary, the study of Fourier phases offers new and refined insights into
nonlinearities in time series. This approach thus opens new ways for better
understanding and modelling the underlying nonlinear processes.
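Both steps of this program can be sketched in Python: synthesizing a series by imposing a linear phase correlation on white Gaussian noise, and a phase-walk-style statistic that accumulates phase increments. This is a hedged illustration, not the analysis from [1, 2]; the particular correlation form (a linear phase ramp plus jitter) and all constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

n = 1024
noise = rng.standard_normal(n)

# Fourier representation: keep the amplitude spectrum of white noise,
# replace the phases with a correlated set
spectrum = np.fft.rfft(noise)
amplitudes = np.abs(spectrum)

# impose a linear phase correlation: phase_k = alpha * k plus random jitter
k = np.arange(len(spectrum))
alpha, jitter = 0.3, 0.5
phases = alpha * k + jitter * rng.uniform(-np.pi, np.pi, len(spectrum))
series = np.fft.irfft(amplitudes * np.exp(1j * phases), n=n)

def phase_walk(phases):
    """Cumulative sum of phase increments wrapped to (-pi, pi]: for random
    phases this behaves like a random walk, while correlated phases produce
    a systematic drift away from the random-walk expectation."""
    d = np.diff(phases)
    d = (d + np.pi) % (2 * np.pi) - np.pi  # wrap increments to (-pi, pi]
    return np.cumsum(d)

walk = phase_walk(np.angle(np.fft.rfft(series)))
```

Comparing the drift of such a walk against surrogates with randomized phases (but identical amplitude spectrum, hence identical linear statistics) is what isolates the nonlinear content of the series.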