
    Neuroinspired unsupervised learning and pruning with subquantum CBRAM arrays.

    Resistive RAM crossbar arrays offer an attractive solution to minimize off-chip data transfer and parallelize on-chip computations for neural networks. Here, we report a hardware/software co-design approach based on low energy subquantum conductive bridging RAM (CBRAM®) devices and a network pruning technique to reduce network-level energy consumption. First, we demonstrate low energy subquantum CBRAM devices exhibiting gradual switching characteristics important for implementing weight updates in hardware during unsupervised learning. Then we develop a network pruning algorithm that can be employed during training, different from previous network pruning approaches applied for inference only. Using a 512 kbit subquantum CBRAM array, we experimentally demonstrate high recognition accuracy on the MNIST dataset for digital implementation of unsupervised learning. Our hardware/software co-design approach can pave the way towards resistive memory based neuro-inspired systems that can autonomously learn and process information in power-limited settings.
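    The pruning-during-training idea can be sketched in plain Python. Note this is an illustrative magnitude-pruning loop under assumed hyperparameters, not the paper's algorithm, and all names here are hypothetical:

```python
# Illustrative sketch only: one combined "update then prune" step, as
# opposed to inference-only pruning applied after training has finished.
def prune_during_training(weights, grads, lr=0.1, prune_frac=0.2):
    """Apply a gradient step to surviving weights, then zero out the
    smallest-magnitude fraction so later updates (and the energy they
    would cost on a crossbar array) are skipped for pruned weights."""
    # Gradient step on unpruned weights only; pruned weights stay zero
    updated = [w - lr * g if w != 0.0 else 0.0
               for w, g in zip(weights, grads)]
    # Rank surviving weights by magnitude and prune the smallest ones
    alive = sorted(abs(w) for w in updated if w != 0.0)
    n_prune = int(len(alive) * prune_frac)
    threshold = alive[n_prune] if n_prune < len(alive) else float("inf")
    return [w if abs(w) >= threshold else 0.0 for w in updated]

pruned = prune_during_training([0.5, -0.8, 0.05, 0.3, -0.02],
                               [0.0] * 5, prune_frac=0.4)
print(pruned)  # the two smallest-magnitude weights are now exactly 0.0
```

    Repeating this step each training iteration gradually sparsifies the network while it is still learning, which is the property the abstract contrasts with inference-only pruning.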

    A Comprehensive Workflow for General-Purpose Neural Modeling with Highly Configurable Neuromorphic Hardware Systems

    In this paper we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim at the establishment of this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware-experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: the integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter are proven with a variety of experimental results.
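    The simulator independence that PyNN provides can be illustrated by the usual pattern of writing the model once against a backend-agnostic interface and swapping the backend underneath it. The stub below is a simplified stand-in (its method set is an assumption, not the actual PyNN API) for real backend modules such as `pyNN.nest` or a hardware interface:

```python
# Sketch of the backend-swapping pattern: the same model-description
# function drives a software simulator or a hardware-configuration
# translator. `StubBackend` is a hypothetical stand-in for real PyNN
# backend modules; its interface is deliberately minimal.
class StubBackend:
    def __init__(self, name):
        self.name = name
        self.populations = []

    def setup(self, timestep=0.1):
        self.timestep = timestep

    def Population(self, n, cell_type):
        # A real backend would allocate neurons on the simulator or
        # map them onto hardware resources here.
        self.populations.append((n, cell_type))

    def run(self, duration_ms):
        return (f"{self.name}: ran {duration_ms} ms with "
                f"{len(self.populations)} population(s)")

def build_and_run(sim, n_neurons=100, duration_ms=1000.0):
    """Backend-independent model description: identical code targets
    either a reference software simulation or the hardware system."""
    sim.setup(timestep=0.1)
    sim.Population(n_neurons, "IF_cond_exp")
    return sim.run(duration_ms)

print(build_and_run(StubBackend("software")))
print(build_and_run(StubBackend("hardware")))
```

    Running the same `build_and_run` against both backends and comparing the outputs is, in miniature, the evaluation scheme the abstract describes for checking hardware results against reference software simulations.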

    Closing the loop between neural network simulators and the OpenAI Gym

    Since the enormous breakthroughs in machine learning over the last decade, functional neural network models are of growing interest for many researchers in the field of computational neuroscience. One major branch of research is concerned with biologically plausible implementations of reinforcement learning, with a variety of different models developed in recent years. However, most studies in this area are conducted with custom simulation scripts and manually implemented tasks. This makes it hard for other researchers to reproduce and build upon previous work and nearly impossible to compare the performance of different learning architectures. In this work, we present a novel approach to solve this problem, connecting benchmark tools from the field of machine learning and state-of-the-art neural network simulators from computational neuroscience. This toolchain enables researchers in both fields to make use of well-tested high-performance simulation software supporting biologically plausible neuron, synapse and network models, and allows them to evaluate and compare their approaches on the basis of standardized environments of varying complexity. We demonstrate the functionality of the toolchain by implementing a neuronal actor-critic architecture for reinforcement learning in the NEST simulator and successfully training it on two different environments from the OpenAI Gym.
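    The closed loop such a toolchain implements (environment observation in, network-generated action out, reward fed back into plasticity) can be sketched with a toy tabular actor-critic. The Gym-style `reset`/`step` interface is the real convention; the two-state environment, learning rates, and update rules here are illustrative, not the paper's NEST implementation:

```python
import math
import random

class TwoStateEnv:
    """Minimal Gym-style environment: action 1 yields reward 1."""
    def reset(self):
        self.state = 0
        return self.state

    def step(self, action):
        reward = 1.0 if action == 1 else 0.0
        return self.state, reward, False, {}

def softmax_action(prefs):
    """Sample an action from softmax over actor preferences."""
    z = [math.exp(p) for p in prefs]
    r = random.random() * sum(z)
    for a, w in enumerate(z):
        r -= w
        if r <= 0:
            return a
    return len(z) - 1

def train(steps=2000, alpha=0.1, beta=0.1, gamma=0.9, seed=1):
    random.seed(seed)
    env = TwoStateEnv()
    value = {0: 0.0}          # critic: state-value estimate
    prefs = {0: [0.0, 0.0]}   # actor: per-action preferences
    s = env.reset()
    for _ in range(steps):
        a = softmax_action(prefs[s])
        s2, r, done, _ = env.step(a)
        td = r + gamma * value[s2] - value[s]  # TD error
        value[s] += alpha * td                 # critic update
        prefs[s][a] += beta * td               # actor update
        s = s2
    return prefs

prefs = train()
print(prefs[0][1] > prefs[0][0])  # True: actor learns to prefer action 1
```

    The point of the toolchain is that the environment side of this loop stays a standardized Gym interface, while the agent side is replaced by a spiking network in a simulator such as NEST.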

    Monitoring and Control of Hydrocyclones by Use of Convolutional Neural Networks and Deep Reinforcement Learning

    The use of convolutional neural networks for monitoring hydrocyclones from underflow images was investigated. Proof-of-concept and applied industrial considerations for hydrocyclone state detection and underflow particle size inference sensors were demonstrated. The behaviour and practical considerations of model-free reinforcement learning, incorporating the additional information provided by the sensors developed, were also discussed in a mineral processing context.

    An Early History of Optimization Technology for Automated Design of Microwave Circuits

    This paper outlines the early history of optimization technology for the design of microwave circuits—a personal journey filled with aspirations, academic contributions, and commercial innovations. Microwave engineers have evolved from being consumers of mathematical optimization algorithms to originators of exciting concepts and technologies that have spread far beyond the boundaries of microwaves. From the early days of simple direct search algorithms based on heuristic methods, through gradient-based electromagnetic optimization, to space mapping technology, we arrive at today's surrogate methodologies. Our path finally connects to today's multi-physics, system-level, and measurement-based optimization challenges exploiting confined and feature-based surrogates, cognition-driven space mapping, Bayesian approaches, and more. Our story recognizes visionaries such as William J. Getsinger of the 1960s and Robert Pucel of the 1980s, and highlights a seminal decades-long collaboration with mathematician Kaj Madsen. We address not only academic contributions that provide proof of concept, but also indicate early formative milestones in the development of commercially competitive software specifically featuring optimization technology.