    Predicting Sequential Data with LSTMs Augmented with Strictly 2-Piecewise Input Vectors

    Abstract: Recurrent neural networks such as Long Short-Term Memory (LSTM) networks are often used to learn from various kinds of time-series data, especially data that involves long-distance dependencies. We introduce a vector representation for the Strictly 2-Piecewise (SP-2) formal languages, which encode certain kinds of long-distance dependencies using subsequences. These vectors are added to the LSTM architecture as an additional input. Through experiments with the problems in the SPiCe dataset, …
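    An SP-2 language is characterized by which length-2 subsequences (ordered symbol pairs a…b, not necessarily adjacent) it permits, so a natural vector representation of a string prefix is a binary indicator over all such pairs. The sketch below illustrates this idea; it is a minimal reading of the abstract, not the paper's exact construction, and the function name, binary encoding, and alphabet handling are all assumptions for illustration:

    from itertools import product

    def sp2_vector(prefix, alphabet):
        """Binary indicator vector over ordered symbol pairs (a, b):
        the entry for (a, b) is 1.0 if the length-2 subsequence ab
        occurs in the prefix, i.e. some b follows some earlier a."""
        pairs = list(product(alphabet, repeat=2))
        seen = set()      # symbols observed so far in the prefix
        subseqs = set()   # length-2 subsequences observed so far
        for sym in prefix:
            for earlier in seen:
                subseqs.add((earlier, sym))
            seen.add(sym)
        return [1.0 if p in subseqs else 0.0 for p in pairs]

    # Example: over the alphabet {a, b, c}, the prefix "ba" contains
    # the 2-subsequence (b, a) but not (a, b).
    vec = sp2_vector("ba", alphabet="abc")

    At each time step such a vector would be concatenated with the ordinary symbol input (e.g. a one-hot encoding) before being fed to the LSTM, which is one plausible reading of "added to the LSTM architecture as an additional input."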