    Linear Memory Networks

    Recurrent neural networks can learn complex transduction problems that require maintaining and actively exploiting a memory of their inputs. Such models traditionally treat the memory and the input-output functionality as indissolubly entangled. We introduce a novel recurrent architecture based on a conceptual separation between the functional input-output transformation and the memory mechanism, showing how the two can be implemented by different neural components. Building on this conceptualization, we introduce the Linear Memory Network, a recurrent model comprising a feedforward neural network, which realizes the non-linear functional transformation, and a linear autoencoder for sequences, which implements the memory component. The resulting architecture can be trained efficiently by building on closed-form solutions to linear optimization problems. Further, by exploiting equivalence results between feedforward and recurrent neural networks, we devise a pretraining scheme for the proposed architecture. Experiments on polyphonic music datasets show competitive results against gated recurrent networks and other state-of-the-art models.
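    As a rough illustration of the separation the abstract describes, here is a minimal NumPy sketch of one recurrent step: a nonlinear feedforward component reads the current input and the previous memory state, while the memory itself is updated purely linearly. The exact update equations, layer sizes, and weight names (W_xh, W_mh, W_hm, W_mm, W_o) are illustrative assumptions, not the paper's precise formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_mem, n_out = 8, 16, 16, 4

# Illustrative weights (names are assumptions, not from the paper).
W_xh = rng.normal(0, 0.1, (n_hid, n_in))   # input -> hidden (functional part)
W_mh = rng.normal(0, 0.1, (n_hid, n_mem))  # memory -> hidden (functional part)
W_hm = rng.normal(0, 0.1, (n_mem, n_hid))  # hidden -> memory (linear encoding)
W_mm = rng.normal(0, 0.1, (n_mem, n_mem))  # memory -> memory (linear recurrence)
W_o  = rng.normal(0, 0.1, (n_out, n_mem))  # memory -> output readout

def lmn_step(x_t, m_prev):
    """One step: nonlinear functional transformation + linear memory update."""
    h_t = np.tanh(W_xh @ x_t + W_mh @ m_prev)  # feedforward, non-linear
    m_t = W_hm @ h_t + W_mm @ m_prev           # strictly linear memory update
    return h_t, m_t

# Run over a toy input sequence; the output is read from the memory state.
m = np.zeros(n_mem)
for x in rng.normal(size=(5, n_in)):
    h, m = lmn_step(x, m)
    y = W_o @ m
```

    Keeping the recurrence linear is what makes closed-form solutions (and the autoencoder-based pretraining the abstract mentions) applicable to the memory component, while the nonlinearity is confined to the feedforward part.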

    Analog Neural Programmable Optimizers in CMOS VLSI Technologies

    A 3-μm CMOS IC is presented that demonstrates the concept of an analog neural system for constrained optimization. A serial, time-multiplexed, general-purpose architecture is introduced for the real-time solution of this class of problems in MOS VLSI. The architecture is fully programmable and reconfigurable, exploiting switched-capacitor (SC) techniques for the analog part and making extensive use of digital techniques for programmability.
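    Since the chip realizes continuous-time dynamics in analog hardware, the closest software analogue is a discretized gradient flow. The sketch below Euler-integrates penalty-based descent dynamics for a small quadratic program; the problem instance, penalty weight, and step size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy constrained problem (assumed for illustration):
#   minimize 1/2 x'Qx + c'x   subject to  A x <= b
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -5.0])
A = np.array([[1.0, 1.0]])   # single constraint: x1 + x2 <= 2
b = np.array([2.0])

mu, dt = 50.0, 1e-3          # penalty weight and Euler integration step
x = np.zeros(2)

# Discretized gradient flow: dx/dt = -(grad f(x) + mu * penalty gradient).
for _ in range(20000):
    violation = np.maximum(A @ x - b, 0.0)   # active-constraint term only
    grad = Q @ x + c + mu * A.T @ violation  # objective + penalty gradient
    x -= dt * grad

print(x)  # settles near the constrained optimum, approx. (0.25, 1.75)
```

    In the analog circuit this descent is not iterated digitally; the SC network integrates the equivalent dynamics in continuous time, with the digital part configuring which problem the dynamics solve.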