
    Experimental study of artificial neural networks using a digital memristor simulator

    This paper presents a fully digital implementation of a memristor hardware simulator, as the core of an emulator, based on a behavioral model of voltage-controlled threshold-type bipolar memristors. Compared to other, analog solutions, the proposed digital design is compact, easily reconfigurable, matches the underlying mathematical model closely, and complies with all the required features for memristor emulators. We validated its functionality using Altera Quartus II and ModelSim tools, targeting low-cost yet powerful field-programmable gate array (FPGA) families. We tested its suitability for complex memristive circuits as well as its synapse functioning in artificial neural networks (ANNs), implementing examples of associative memory and unsupervised learning of spatio-temporal correlations in parallel input streams using a simplified STDP rule. We provide the full circuit schematics of all our digital circuit designs and comment on the required hardware resources and their scaling trends, thus presenting a design framework for applications based on our hardware simulator.
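
    A minimal sketch of what such a behavioral model can look like is given below. The piecewise state-update rule, thresholds, resistance bounds, and rate constant are illustrative assumptions, not the paper's exact model or its digital FPGA implementation.

```python
# Minimal sketch of a voltage-controlled, threshold-type bipolar memristor
# (illustrative parameters; not the paper's exact behavioral model).

class ThresholdMemristor:
    def __init__(self, r_on=1e3, r_off=1e5, v_th=1.0, x0=0.5, rate=1e4):
        self.r_on = r_on      # low-resistance bound (ohms)
        self.r_off = r_off    # high-resistance bound (ohms)
        self.v_th = v_th      # switching threshold (volts)
        self.x = x0           # internal state in [0, 1]
        self.rate = rate      # state-change rate per volt-second

    def step(self, v, dt):
        """Update the internal state: the device only switches when |v| > v_th."""
        if v > self.v_th:
            self.x += self.rate * (v - self.v_th) * dt
        elif v < -self.v_th:
            self.x += self.rate * (v + self.v_th) * dt
        self.x = min(max(self.x, 0.0), 1.0)  # clip state to [0, 1]
        return self.resistance()

    def resistance(self):
        """Resistance interpolates between r_off (x = 0) and r_on (x = 1)."""
        return self.r_off + (self.r_on - self.r_off) * self.x


m = ThresholdMemristor()
for _ in range(100):          # apply a 1.5 V pulse train in 1 us steps
    r = m.step(1.5, 1e-6)
print(f"resistance after SET pulses: {r:.0f} ohms")
```

    The threshold-type property is that the state, and hence the resistance, only changes while the applied voltage exceeds the threshold in either polarity; sub-threshold reads leave the device unchanged.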

    Memristors for the Curious Outsiders

    We present both an overview and a perspective of recent experimental advances and proposed new approaches to performing computation using memristors. A memristor is a 2-terminal passive component whose dynamic resistance depends on an internal parameter. We provide a brief historical introduction, as well as an overview of the physical mechanisms that lead to memristive behavior. This review is meant to guide nonpractitioners in the field of memristive circuits and their connection to machine learning and neural computation. Comment: Perspective paper for MDPI Technologies; 43 pages.
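
    The definition above has a compact standard form. As a sketch, the generic current-controlled memristive system of Chua and Kang, which the abstract's wording paraphrases, couples a state-dependent resistance to an internal state variable x:

```latex
% Generic current-controlled memristive system (Chua-Kang form):
% the voltage-current relation is resistive at every instant, but the
% resistance R depends on an internal state x that evolves in time.
\begin{align}
  v(t) &= R\bigl(x(t)\bigr)\, i(t),\\
  \frac{dx}{dt} &= f\bigl(x(t), i(t)\bigr).
\end{align}
```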

    An Attractor-Based Complexity Measurement for Boolean Recurrent Neural Networks

    We provide a novel refined attractor-based complexity measurement for Boolean recurrent neural networks that assesses their computational power in terms of the significance of their attractor dynamics. The measurement is obtained by first proving a computational equivalence between Boolean recurrent neural networks and a specific class of ω-automata, and then translating the most refined classification of ω-automata into the Boolean neural network context. The result is a hierarchical classification of Boolean neural networks based on their attractor dynamics. These results provide new theoretical insight into the computational and dynamical capabilities of neural networks according to their attractive potentialities. We illustrate an application of our findings by analyzing the dynamics of a simplified model of the basal ganglia-thalamocortical network simulated by a Boolean recurrent neural network. This example shows the significance of measuring network complexity, and how our results provide new foundational elements for understanding the complexity of real brain circuits.
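
    Since the state space of an n-unit Boolean network is finite, every trajectory eventually enters a cycle, and that cycle is the attractor. The sketch below finds it by iterating a synchronous threshold update until a state repeats; the random weight matrix and firing rule are hypothetical stand-ins, not the paper's basal ganglia-thalamocortical model.

```python
# Illustrative sketch: finding the attractor (limit cycle) reached by a
# Boolean recurrent network from a given initial state.
import numpy as np

def boolean_step(W, x):
    """One synchronous update: a unit fires if its weighted input is positive."""
    return (W @ x > 0).astype(int)

def find_attractor(W, x0, max_steps=10_000):
    """Iterate until a state repeats; return the cycle (the attractor)."""
    seen, trajectory = {}, []
    x = x0
    for t in range(max_steps):
        key = tuple(x)
        if key in seen:                    # state revisited: cycle closed
            return trajectory[seen[key]:]  # attractor = states in the loop
        seen[key] = t
        trajectory.append(x)
        x = boolean_step(W, x)
    raise RuntimeError("no attractor found within max_steps")

rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(8, 8))       # random weights in {-1, 0, 1}
x0 = rng.integers(0, 2, size=8)            # random binary initial state
cycle = find_attractor(W, x0)
print(f"attractor period: {len(cycle)}")
```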

    Toward bio-inspired information processing with networks of nano-scale switching elements

    Unconventional computing explores multi-scale platforms that connect molecular-scale devices into networks, with the aim of developing scalable neuromorphic architectures, often based on new materials and components with new functionalities. We review work investigating the functionality of locally connected networks of different types of switching elements as computational substrates. In particular, we discuss reservoir computing with networks of nonlinear nanoscale components. In the usual neuromorphic paradigm, the network's synaptic weights are adjusted through a training/learning process. In reservoir computing, the nonlinear network acts as a dynamical system that mixes and spreads the input signals over a large state space, and only a readout layer is trained. We illustrate the most important concepts with a few examples, featuring memristor networks with time-dependent and history-dependent resistances.
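
    The readout-only training scheme can be stated concretely. In the sketch below, a fixed random tanh network (an echo-state reservoir, standing in here for a memristor network) mixes an input stream, and only a linear readout is fit by ridge regression to a delayed-recall target; all sizes and parameters are illustrative assumptions.

```python
# Sketch of the reservoir-computing idea: a fixed random nonlinear network
# mixes the input; only a linear readout is trained.
import numpy as np

rng = np.random.default_rng(1)
n_res, T = 100, 1000
W_in = rng.normal(0, 0.5, size=(n_res, 1))        # fixed input weights
W = rng.normal(0, 1.0, size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))          # spectral radius < 1

u = rng.uniform(-1, 1, size=T)                     # random input stream
y_target = np.roll(u, 3)                           # task: recall input 3 steps back

states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W @ x + W_in[:, 0] * u[t])         # fixed nonlinear dynamics
    states[t] = x

# Train only the readout: ridge regression from reservoir states to the target.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ y_target)
pred = states @ W_out
print(f"readout MSE: {np.mean((pred[10:] - y_target[10:])**2):.4f}")
```

    Because the recurrent weights stay fixed, training reduces to a single linear least-squares solve, which is why reservoir computing suits hardware substrates whose internal dynamics are hard to program directly.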