
    Accelerated neuromorphic cybernetics

    Accelerated mixed-signal neuromorphic hardware refers to electronic systems that emulate electrophysiological aspects of biological nervous systems in analog voltages and currents, at speeds faster than biological real time. While the functional spectrum of these systems already covers many observed neuronal capabilities, such as learning and classification, some areas remain largely unexplored. This concerns in particular cybernetic scenarios in which nervous systems engage in closed-loop interaction with their bodies and environments. Such processes are of essential importance in nature, since the control of behavior and movement in animals is both the purpose and the cause of the development of nervous systems. Besides the design of neuromorphic circuit and system components, the main focus of this work is therefore the construction and analysis of accelerated neuromorphic agents that are integrated into cybernetic chains of action. These agents are, on the one hand, an accelerated mechanical robot and, on the other hand, an accelerated virtual insect. In both cases, the sensory organs and actuators of the artificial bodies are derived from the neurophysiology of their biological prototypes and reproduced as faithfully as possible. In addition, each of the two biomimetic organisms is subjected to evolutionary optimization, which illustrates the advantages of accelerated neuromorphic nervous systems through significant time savings.
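    The time savings from evolutionary optimization can be illustrated with a minimal sketch, assuming a simple truncation-selection loop and made-up trial durations; none of the names or numbers below come from the original work.

```python
import random

# Hypothetical sketch: evolutionary optimization of a neuromorphic agent.
# A hardware acceleration factor shortens each fitness evaluation, so
# wall-clock time per generation drops accordingly. All names and
# numbers here are illustrative assumptions, not the authors' code.

SPEEDUP = 1000          # assumed acceleration factor vs. biological time
TRIAL_BIO_SECONDS = 10  # assumed emulated (biological) duration per trial

def evaluate(genome):
    """Stand-in fitness: negative squared distance to an arbitrary target."""
    target = [0.5] * len(genome)
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def evolve(pop_size=20, genome_len=8, generations=50, sigma=0.1, seed=0):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        # truncation selection: keep the better half as parents (elitism)
        parents = sorted(pop, key=evaluate, reverse=True)[: pop_size // 2]
        # refill the population with Gaussian-mutated copies of parents
        pop = parents + [
            [g + rng.gauss(0, sigma) for g in rng.choice(parents)]
            for _ in range(pop_size - len(parents))
        ]
    best = max(pop, key=evaluate)
    wallclock = generations * pop_size * TRIAL_BIO_SECONDS / SPEEDUP
    return best, wallclock

best, seconds = evolve()
print(f"best fitness: {evaluate(best):.4f}, wall-clock trial time: {seconds:.1f} s")
```

    Under these assumed numbers, 1000 emulated trials of 10 biological seconds each fit into 10 wall-clock seconds, which is the point of running evolution on accelerated hardware.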

    Structural plasticity on an accelerated analog neuromorphic hardware system

    In computational neuroscience, as well as in machine learning, neuromorphic devices promise an accelerated and scalable alternative to neural network simulations. Their neural connectivity and synaptic capacity depend on their specific design choices, but are always intrinsically limited. Here, we present a strategy to achieve structural plasticity that optimizes resource allocation under these constraints by constantly rewiring the pre- and postsynaptic partners while keeping the neuronal fan-in constant and the connectome sparse. In particular, we implemented this algorithm on the analog neuromorphic system BrainScaleS-2. It was executed on a custom embedded digital processor located on chip, accompanying the mixed-signal substrate of spiking neuron and synapse circuits. We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology with respect to the nature of its training data, as well as its overall computational efficiency.
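    The rewiring idea described above, pruning weak synapses and growing replacements while keeping the fan-in constant, might look roughly like the following minimal sketch; the function names and the pruning threshold are illustrative assumptions, not the BrainScaleS-2 implementation.

```python
import random

# Illustrative sketch (not the BrainScaleS-2 code): structural plasticity
# that keeps each neuron's fan-in constant by reusing synapse slots.
# Every synapse weaker than an assumed threshold is pruned and replaced
# by a new random presynaptic partner in the same slot.

def rewire(fan_in_table, weights, n_pre, threshold=0.1, rng=None):
    """fan_in_table[post][slot] holds the presynaptic index of each slot;
    weights[post][slot] holds the corresponding synaptic weight."""
    rng = rng or random.Random(0)
    for post, pre_list in enumerate(fan_in_table):
        for slot in range(len(pre_list)):
            if weights[post][slot] < threshold:
                # prune the weak synapse and grow a new one in its slot,
                # so the number of synapses per neuron never changes
                pre_list[slot] = rng.randrange(n_pre)
                weights[post][slot] = threshold  # assumed re-init value
    return fan_in_table, weights
```

    Because pruning and growth always reuse the same slot, the fan-in stays constant and the connectome stays exactly as sparse as before, which is the resource constraint the abstract describes.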

    Demonstrating Advantages of Neuromorphic Computation: A Pilot Study

    Neuromorphic devices represent an attempt to mimic aspects of the brain's architecture and dynamics with the aim of replicating its hallmark functional capabilities in terms of computational power, robust learning and energy efficiency. We employ a single-chip prototype of the BrainScaleS 2 neuromorphic system to implement a proof-of-concept demonstration of reward-modulated spike-timing-dependent plasticity in a spiking network that learns to play the Pong video game by smooth pursuit. This system combines an electronic mixed-signal substrate for emulating neuron and synapse dynamics with an embedded digital processor for on-chip learning, which in this work also serves to simulate the virtual environment and learning agent. The analog emulation of neuronal membrane dynamics enables a 1000-fold acceleration with respect to biological real-time, with the entire chip operating on a power budget of 57 mW. Compared to an equivalent simulation using state-of-the-art software, the on-chip emulation is at least one order of magnitude faster and three orders of magnitude more energy-efficient. We demonstrate how on-chip learning can mitigate the effects of fixed-pattern noise, which is unavoidable in analog substrates, while making use of temporal variability for action exploration. Learning compensates for imperfections of the physical substrate, as manifested in neuronal parameter variability, by adapting synaptic weights to match the respective excitability of individual neurons. Published in Frontiers in Neuromorphic Engineering (2019).
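    Reward-modulated STDP can be sketched abstractly as an eligibility term derived from pairwise spike timing, gated by a scalar reward; the kernel shape, constants and function names below are illustrative assumptions, not the on-chip rule.

```python
import math

# Minimal sketch of reward-modulated STDP: a timing-dependent eligibility
# term gated by a scalar reward. TAU, ETA and the kernel shape are
# illustrative assumptions, not the hardware's plasticity rule.

TAU = 20.0   # assumed STDP time constant (ms)
ETA = 0.01   # assumed learning rate

def stdp_eligibility(t_pre, t_post):
    """Exponential STDP kernel: positive (potentiating) if the
    presynaptic spike precedes the postsynaptic one, negative otherwise."""
    dt = t_post - t_pre
    return math.copysign(math.exp(-abs(dt) / TAU), dt)

def rstdp_update(weight, t_pre, t_post, reward, baseline=0.0):
    """Weight change = learning rate * (reward - baseline) * eligibility."""
    return weight + ETA * (reward - baseline) * stdp_eligibility(t_pre, t_post)

# A causal pairing (pre before post) combined with positive reward
# strengthens the synapse; with zero reward the weight is unchanged.
w_new = rstdp_update(0.5, t_pre=10.0, t_post=15.0, reward=1.0)
```

    The reward gating is what turns correlational STDP into a reinforcement-learning signal: timing determines the sign and magnitude of the eligibility, but only rewarded episodes actually move the weight.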

    Inference with Artificial Neural Networks on Analog Neuromorphic Hardware

    The neuromorphic BrainScaleS-2 ASIC comprises mixed-signal neuron and synapse circuits as well as two versatile digital microprocessors. Primarily designed to emulate spiking neural networks, the system can also operate in a vector-matrix multiplication and accumulation mode for artificial neural networks. Analog multiplication is carried out in the synapse circuits, while the results are accumulated on the neurons' membrane capacitors. Designed as an analog, in-memory computing device, it promises high energy efficiency. Fixed-pattern noise and trial-to-trial variations, however, require the implemented networks to cope with a certain level of perturbations. Further limitations are imposed by the digital resolution of the input values (5 bit), matrix weights (6 bit) and resulting neuron activations (8 bit). In this paper, we discuss BrainScaleS-2 as an analog inference accelerator and present calibration as well as optimization strategies, highlighting the advantages of training with hardware in the loop. Among other benchmarks, we classify the MNIST handwritten digits dataset using a two-dimensional convolution and two dense layers. We reach 98.0% test accuracy, closely matching the performance of the same network evaluated in software.
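    The quoted resolution limits can be illustrated with a simplified software model of the multiply-accumulate mode, assuming unsigned 5-bit inputs, signed 6-bit weights and a saturating signed 8-bit read-out; the scaling shift is an arbitrary assumption and the model ignores analog noise entirely.

```python
import numpy as np

# Simplified, assumed model of the quantization constraints, not an
# exact description of the hardware: 5-bit unsigned inputs, 6-bit
# signed weights, 8-bit saturating activations.

def quantized_matvec(x, W):
    x_q = np.clip(np.round(x), 0, 31).astype(np.int32)      # 5-bit input
    W_q = np.clip(np.round(W), -32, 31).astype(np.int32)    # 6-bit weights
    acc = W_q @ x_q                                         # accumulation (exact here)
    # membrane read-out digitized to signed 8 bit with saturation;
    # the >> 5 rescaling is an arbitrary illustrative choice
    return np.clip(acc >> 5, -128, 127).astype(np.int8)

y = quantized_matvec(np.array([31, 31]), np.array([[31, 31], [-32, -32]]))
```

    Networks deployed on such a substrate have to tolerate both this coarse quantization and the analog perturbations, which is why hardware-in-the-loop training helps.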

    Phase Correlations in Turbulent and Unstable Particle Trajectories

    Phase Walk Analysis offers a novel way to trace and quantify nonlinearities and/or turbulent influence in dusty-plasma particle trajectories. In the present work, this is demonstrated on simulated as well as experimental data.

    Phase correlations in leptokurtic time series

    Many time series of nonlinear origin, such as turbulence or financial markets, not only exhibit leptokurtic (also called heavy-tailed or fat-tailed) probability distribution functions (PDFs) but are also classified or characterized by them. In the Fourier representation of these time series, these nonlinearities must be contained in the phase information. Here, we demonstrate how to exploit this information to both synthesize and analyze leptokurtic time series. We first show that empirical data of market indices exhibit linear phase correlations. Based on these findings, we impose a set of linear phase correlations on white Gaussian noise and demonstrate that the so-obtained time series reproduce the scaling properties of the PDF and volatility remarkably well [1]. Next, we introduce a novel nonlinear statistic called phase walk statistics (PWS), which measures, in close analogy to random walk analyses, the deviation of the phases from randomness. Applying this statistic to a number of empirical and synthetic leptokurtic time series reveals that nonlinearities can be detected with unprecedented significance. A surrogate-assisted analysis further shows that PWS is able to disentangle different classes of nonlinearities that have indistinguishable PDFs [2]. In summary, the study of Fourier phases offers new and refined insights into nonlinearities in time series and thus opens new ways of better understanding and modelling the underlying nonlinear processes.
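    The synthesis step, imposing linear phase correlations on white Gaussian noise, can be sketched as follows; the specific correlation rule (a constant phase increment plus small jitter) and all parameter values are illustrative assumptions, not the rule used in [1].

```python
import numpy as np

# Illustrative sketch of phase-correlation synthesis: keep the Fourier
# amplitudes of white Gaussian noise, but replace the random phases with
# linearly correlated ones, then transform back. The resulting series
# tends to be strongly peaked, i.e. leptokurtic.

def linear_phase_surrogate(n=4096, slope=0.3, jitter=0.05, seed=0):
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(n)
    spectrum = np.fft.rfft(noise)
    amplitudes = np.abs(spectrum)
    k = np.arange(spectrum.size)
    # assumed linear phase correlation: phi_k = slope * k + small jitter
    phases = slope * k + jitter * rng.standard_normal(spectrum.size)
    correlated = amplitudes * np.exp(1j * phases)
    return np.fft.irfft(correlated, n)

series = linear_phase_surrogate()
```

    Because nearly aligned phases concentrate the signal's energy into a few large excursions while the amplitude spectrum is unchanged, the surrogate's PDF becomes heavy-tailed relative to the Gaussian input.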

    Phase correlations in nonlinear time series analysis

    Nonlinear time series properties are represented by the Fourier phases of the time series, and a comprehensive description of nonlinear data must therefore include a quantification of the Fourier phase information. We introduce a novel method that traces nonlinearities by detecting correlations among the Fourier phases. The basic idea of the so-called phase walk statistics (PWS) is to measure, in close analogy to random walks, the deviation from randomness of the unwrapped phase differences. The method compares distributions of phases from experimental and simulated time series with distributions that follow the null hypothesis of totally random, uniformly distributed steps. The obtained significances of nonlinear influence are shown to be comparable to those of well-established measures; however, the results are obtained with orders of magnitude less computational effort, enabling much more refined statistical analyses. The method can even identify the specific scales on which nonlinear effects occur, a feature not yet reached by any conventional method. A surrogate-assisted analysis further shows that PWS is able to disentangle different classes of nonlinearities that have indistinguishable PDFs. The presented algorithms can be used to quantify nonlinear influence on time series of any physical, biological or financial system. We demonstrate the performance of the new approach via examples from econophysics (Dow Jones time series), plasma physics (trajectories of dust particles in vortices) and aerodynamics (wind data with intermittencies).
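    A minimal sketch of the phase walk idea, assuming a simple normalization that is not necessarily the published definition: unwrap the phase differences, center them, and examine their cumulative sum, which under the null hypothesis of random phases behaves like an unbiased random walk; systematic phase correlations show up as anomalously small or large excursions of that walk.

```python
import numpy as np

# Illustrative sketch of phase walk statistics (PWS). The centering and
# the sqrt normalization below are assumptions for demonstration, not
# the exact published procedure.

def phase_walk(series):
    phases = np.angle(np.fft.rfft(series))
    steps = np.unwrap(np.diff(phases))
    # subtract the mean step so a pure linear phase trend (a time shift)
    # does not dominate the walk
    return np.cumsum(steps - steps.mean())

def max_excursion(series):
    """Largest deviation of the walk, scaled by the random-walk scale
    sqrt(number of steps); compare against phase-randomized surrogates."""
    walk = phase_walk(series)
    return np.abs(walk).max() / np.sqrt(len(walk))
```

    In practice one would compare the excursion of the data against an ensemble of surrogates with randomized phases; only the ranking within that ensemble, not the raw value, carries significance.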