
Spatio-temporal Learning with Arrays of Analog Nanosynapses

Abstract

Emerging nanodevices such as resistive memories are being considered for hardware realizations of a variety of artificial neural networks (ANNs), including highly promising online variants of the learning approaches known as reservoir computing (RC) and the extreme learning machine (ELM). We propose an RC/ELM-inspired learning system built with nanosynapses that performs both on-chip projection and regression operations. To address time-dynamic tasks, the hidden neurons of our system perform spatio-temporal integration and can be further enhanced with variable sampling or multiple activation windows. We detail the system and show its use in conjunction with a highly analog nanosynapse device on a standard task with intrinsic timing dynamics: the TI-46 battery of spoken digits. The system achieves nearly perfect (99%) accuracy at sufficient hidden-layer size, which compares favorably with software results. The model is further extended to a larger dataset, the MNIST database of handwritten digits; by translating the database into the time domain and using variable integration windows, up to 95% classification accuracy is achieved. In addition to an intrinsically low-power programming style, the proposed architecture learns very quickly and can easily be converted into a spiking system with negligible loss in performance, all features that confer significant energy efficiency.

Comment: 6 pages, 3 figures. Presented at the 2017 IEEE/ACM Symposium on Nanoscale Architectures (NANOARCH).
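As a rough illustration of the RC/ELM-style pipeline the abstract describes (a fixed random projection, leaky spatio-temporal integration in the hidden layer, and a trained linear regression readout), the sketch below implements the idea in plain NumPy. The leak rate, ridge parameter, layer sizes, and toy input sequences are illustrative assumptions, not values or data from the paper, and the software model does not capture the analog nanosynapse devices themselves.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_temporal_features(x_seq, w_in, leak=0.3):
    """Project an input sequence through fixed random weights and
    leaky-integrate the hidden activations over time (spatio-temporal
    integration), returning the final hidden state as the feature vector."""
    h = np.zeros(w_in.shape[0])
    for x_t in x_seq:                      # one input frame per time step
        h = (1.0 - leak) * h + leak * np.tanh(w_in @ x_t)
    return h

def train_readout(H, Y, ridge=1e-3):
    """Fit the linear readout (regression stage) by ridge-regularised
    least squares: W_out = Y^T H (H^T H + ridge * I)^{-1}."""
    n_feat = H.shape[1]
    return np.linalg.solve(H.T @ H + ridge * np.eye(n_feat), H.T @ Y).T

# Toy usage with random placeholder sequences standing in for speech frames.
n_inputs, n_hidden, n_classes = 39, 200, 10
w_in = rng.normal(scale=1.0 / np.sqrt(n_inputs), size=(n_hidden, n_inputs))

sequences = [rng.normal(size=(rng.integers(20, 40), n_inputs)) for _ in range(50)]
labels = rng.integers(0, n_classes, size=50)

H = np.stack([elm_temporal_features(s, w_in) for s in sequences])
Y = np.eye(n_classes)[labels]              # one-hot targets
w_out = train_readout(H, Y)

pred = np.argmax(H @ w_out.T, axis=1)
print("training accuracy:", np.mean(pred == labels))
```

In this sketch the projection weights stay fixed and only the readout is learned, which mirrors the fast, single-pass training that RC/ELM approaches rely on; variable sampling or multiple activation windows could be approximated by collecting the hidden state at several points in the sequence rather than only at the end.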
