1,312 research outputs found

    A space-time neural network

    Introduced here is a novel technique which adds the dimension of time to the well-known backpropagation neural network algorithm. Several reasons are cited why the inclusion of automated spatial and temporal associations is crucial to effective systems modeling. An overview of other works that also model spatiotemporal dynamics is furnished. A detailed description is given of the processes necessary to implement the space-time network algorithm. Several demonstrations that illustrate the capabilities and performance of this new architecture are given.
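
    Since the abstract stays high level, a minimal sketch of one plausible reading of the architecture is given below: each connection of an otherwise ordinary feed-forward layer carries a short tapped delay line (an FIR filter) rather than a single scalar weight, so a forward pass combines the current inputs (space) with their recent history (time). The class name SpaceTimeLayer, the tap count, and the NumPy formulation are illustrative assumptions, not the paper's implementation.

        import numpy as np

        class SpaceTimeLayer:
            """One possible 'space-time' layer: every input reaches the layer
            through a small FIR filter (a few delay taps), so the output depends
            on both the current inputs and their recent history.
            Hypothetical sketch; names and tap count are assumptions."""

            def __init__(self, n_in, n_out, n_taps=3, rng=None):
                rng = rng or np.random.default_rng(0)
                # One weight per (output unit, input unit, delay tap).
                self.w = rng.normal(scale=0.1, size=(n_out, n_in, n_taps))
                self.b = np.zeros(n_out)
                # Delay buffer holding the last n_taps input vectors.
                self.history = np.zeros((n_taps, n_in))

            def forward(self, x):
                # Shift the delay line and insert the newest input at position 0.
                self.history = np.roll(self.history, 1, axis=0)
                self.history[0] = x
                # Weighted sum over both inputs and delay taps, then a sigmoid.
                z = np.einsum('oit,ti->o', self.w, self.history) + self.b
                return 1.0 / (1.0 + np.exp(-z))

        # Feeding a short sequence sample by sample:
        layer = SpaceTimeLayer(n_in=2, n_out=1)
        for t in range(5):
            x_t = np.array([np.sin(0.5 * t), np.cos(0.5 * t)])
            print(t, layer.forward(x_t))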

    Incremental construction of LSTM recurrent neural network

    Long Short-Term Memory (LSTM) is a recurrent neural network that uses structures called memory blocks to allow the net to remember significant events far back in the input sequence, in order to solve long time lag tasks where other RNN approaches fail. Throughout this work we have performed experiments using LSTM networks extended with growing abilities, which we call GLSTM. Four methods of training growing LSTM networks have been compared. These methods include cascade and fully connected hidden layers, as well as two different levels of freezing previous weights in the cascade case. GLSTM has been applied to a forecasting problem in a biomedical domain, where the input/output behavior of five controllers of the Central Nervous System has to be modelled. We have compared growing LSTM results against other neural network approaches and against our previous work applying conventional LSTM to the task at hand.
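
    As a rough illustration of the incremental-construction idea, the sketch below stacks ("cascades") a fresh LSTM layer on top of an already trained one and optionally freezes the weights learned so far, so that only the new parts are trained. The class name GrowingLSTM, the use of PyTorch, and the layer-wise reading of "cascade" are assumptions made for illustration, not the GLSTM implementation described in the paper.

        import torch
        import torch.nn as nn

        class GrowingLSTM(nn.Module):
            """Sketch of an incrementally grown (cascade) LSTM: new hidden LSTM
            layers are stacked on top of the existing ones, optionally freezing
            the weights learned so far. Names and details are assumptions."""

            def __init__(self, n_in, hidden, n_out):
                super().__init__()
                self.hidden = hidden
                self.layers = nn.ModuleList([nn.LSTM(n_in, hidden, batch_first=True)])
                self.readout = nn.Linear(hidden, n_out)

            def grow(self, freeze_previous=True):
                # Optionally freeze everything trained so far (loosely, one of the
                # two "levels of freezing" compared in the paper).
                if freeze_previous:
                    for p in self.layers.parameters():
                        p.requires_grad = False
                # Cascade a fresh LSTM layer on top and replace the readout so the
                # new layer's output drives the prediction.
                self.layers.append(nn.LSTM(self.hidden, self.hidden, batch_first=True))
                self.readout = nn.Linear(self.hidden, self.readout.out_features)

            def forward(self, x):
                for lstm in self.layers:
                    x, _ = lstm(x)
                return self.readout(x[:, -1])    # predict from the last time step

        # Train the single-layer net first, then grow and train only the new parts.
        net = GrowingLSTM(n_in=5, hidden=16, n_out=5)
        x = torch.randn(8, 30, 5)                # batch of 8 sequences, 30 steps
        print(net(x).shape)                      # torch.Size([8, 5])
        net.grow(freeze_previous=True)
        trainable = [p for p in net.parameters() if p.requires_grad]
        opt = torch.optim.Adam(trainable, lr=1e-3)   # only the new parts get optimized
        print(net(x).shape)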

    Multiresolution FIR neural-network-based learning algorithm applied to network traffic prediction

    No full text available

    Learning characteristics of a space-time neural network as a tether skiprope observer

    The Software Technology Laboratory at the Johnson Space Center is testing a Space Time Neural Network (STNN) for observing tether oscillations present during retrieval of a tethered satellite. Proper identification of tether oscillations, known as 'skiprope' motion, is vital to safe retrieval of the tethered satellite. Our studies indicate that the STNN has certain learning characteristics that must be understood properly to utilize this type of neural network for the tethered satellite problem. We present our findings on these learning characteristics, including a learning rate versus momentum performance table.
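
    The learning rate versus momentum performance table mentioned above amounts to a small grid search; the sketch below shows how such a table might be produced, with a toy quadratic objective standing in for the STNN training error (an assumption, since the paper's network and data are not reproduced here).

        import numpy as np

        def final_error(lr, momentum, steps=200):
            """Gradient descent with momentum on a toy quadratic objective.
            The quadratic is a stand-in for the STNN training error (assumption)."""
            w = np.array([3.0, -2.0])
            v = np.zeros_like(w)
            H = np.diag([1.0, 10.0])      # ill-conditioned quadratic: 0.5 * w^T H w
            for _ in range(steps):
                grad = H @ w
                v = momentum * v - lr * grad
                w = w + v
            return 0.5 * w @ H @ w

        rates = [0.01, 0.05, 0.1]
        momenta = [0.0, 0.5, 0.9]
        # Print the learning rate versus momentum performance table.
        print("lr \\ momentum" + "".join(f"{m:>10.1f}" for m in momenta))
        for lr in rates:
            row = "".join(f"{final_error(lr, m):10.2e}" for m in momenta)
            print(f"{lr:<13}" + row)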

    Advanced Methods for Time Series Prediction Using Recurrent Neural Networks

    Recurrent Neural Networks for Temporal Data Processing, InTech, pp. 15-36, ISBN 978-953-307-685-0.

    Performance prediction of the full-scale Bardenpho process using a genetic adapted time-delay neural network (GATDNN)

    Wastewater treatment systems are characterized by large temporal variability of inflow, variable concentrations of components in the incoming wastewater to the plant, and highly variable biological reactions within the process. The behavior of observed process variables within a wastewater treatment plant (WWTP) at a certain time instant is the combined effect of various processes initiated at different moments in the past. This is called a time-delay effect in the system. Due to their strong nonlinear mapping capability, neural networks provide advantages as a modelling and identification tool over structure-based models. However, determining the architecture of artificial neural networks (ANNs) and selecting key input variables with a time delay is not easy. In our research, a genetic adapted time-delay neural network (GATDNN), which is a combination of a time-delay neural network (TDNN) and genetic algorithms (GAs), was developed and applied to the full-scale Bardenpho advanced sewage treatment process. In a GATDNN, a three-step modelling procedure was performed: (1) selection of significant input variables to maximise the predictive accuracy for each specific output; (2) finding a suitable network topology for the ANN-based process estimator; (3) sensitivity analysis. The results demonstrate that the modelling technique presented using a GATDNN provides a valuable tool for predicting the outputs with high levels of accuracy and identifying key operating variables. This work will permit the development of a reliable control strategy, thus reducing the burden of the process engineer.
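
    To illustrate the flavor of step (1), the sketch below runs a tiny genetic algorithm over binary masks that select (variable, delay) pairs, scoring each mask by the validation error of a cheap ridge-regression estimator built on the selected delayed inputs. The synthetic data and the ridge stand-in for the TDNN estimator are assumptions; the paper's GATDNN evaluates candidates with an actual time-delay neural network.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic plant data standing in for WWTP measurements (assumption):
        # the output depends on input 0 delayed by 2 steps and input 1 delayed by 5.
        T, n_vars, max_delay = 400, 3, 6
        X = rng.normal(size=(T, n_vars))
        y = 0.8 * np.roll(X[:, 0], 2) - 0.5 * np.roll(X[:, 1], 5) + 0.05 * rng.normal(size=T)

        def build_features(mask):
            """Columns are candidate (variable, delay) pairs selected by the binary mask."""
            cols = [np.roll(X[:, v], d) for v in range(n_vars) for d in range(max_delay)]
            F = np.stack(cols, axis=1)[max_delay:]        # drop wrapped-around rows
            return F[:, mask.astype(bool)], y[max_delay:]

        def fitness(mask):
            """Validation error of a ridge regression on the selected delayed inputs
            (a cheap stand-in for training a TDNN on them; this is an assumption)."""
            if mask.sum() == 0:
                return 1e9
            F, t = build_features(mask)
            split = len(t) // 2
            A = F[:split].T @ F[:split] + 1e-3 * np.eye(F.shape[1])
            w = np.linalg.solve(A, F[:split].T @ t[:split])
            return np.mean((F[split:] @ w - t[split:]) ** 2)

        # A tiny genetic algorithm over binary masks of length n_vars * max_delay.
        n_genes, pop_size = n_vars * max_delay, 30
        pop = rng.integers(0, 2, size=(pop_size, n_genes))
        for gen in range(40):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(scores)[: pop_size // 2]]    # selection
            cut = rng.integers(1, n_genes, size=pop_size // 2)    # one-point crossover
            kids = np.array([np.concatenate([parents[i % len(parents)][:c],
                                             parents[(i + 1) % len(parents)][c:]])
                             for i, c in enumerate(cut)])
            flip = rng.random(kids.shape) < 0.05                  # mutation
            kids = np.where(flip, 1 - kids, kids)
            pop = np.vstack([parents, kids])

        best = pop[np.argmin([fitness(ind) for ind in pop])]
        print("selected (variable, delay) pairs:",
              [(i // max_delay, i % max_delay) for i in np.flatnonzero(best)])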

    A new adaptive backpropagation algorithm based on Lyapunov stability theory for neural networks

    A new adaptive backpropagation (BP) algorithm based on Lyapunov stability theory for neural networks is developed in this paper. A candidate Lyapunov function V(k) of the tracking error between the output of a neural network and the desired reference signal is chosen first, and the weights of the neural network are then updated, from the output layer to the input layer, such that ΔV(k) = V(k) - V(k-1) < 0. The output tracking error can then asymptotically converge to zero according to Lyapunov stability theory. Unlike gradient-based BP training algorithms, the new Lyapunov adaptive BP algorithm is not used to search for the global minimum point along the cost-function surface in the weight space; instead, it aims to construct an energy surface with a single global minimum point through adaptive adjustment of the weights as time goes to infinity. Even when the neural network is subject to bounded input disturbances, the effects of the disturbances can be eliminated and asymptotic error convergence can be obtained. The new Lyapunov adaptive BP algorithm is then applied to the design of an adaptive filter in a simulation example to show the fast error convergence and strong robustness with respect to large bounded input disturbances.
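
    A minimal sketch of the idea in the adaptive-filter setting mentioned at the end: take V(k) = e(k)^2 and choose the weight update so the a posteriori error shrinks by a fixed factor kappa, which forces ΔV(k) < 0 at every step. The linear FIR filter, the factor kappa, and the regularizer eps are illustrative assumptions; the paper derives the update layer by layer for a full neural network.

        import numpy as np

        rng = np.random.default_rng(0)

        def lyapunov_adaptive_filter(x, d, n_taps=4, kappa=0.5, eps=1e-8):
            """Adaptive FIR filter whose update is chosen so that the a posteriori
            error equals kappa times the a priori error, which makes V(k) = e(k)^2
            strictly decreasing (Delta V(k) < 0). Sketch under assumptions; not the
            paper's exact layer-by-layer update for a full neural network."""
            w = np.zeros(n_taps)
            errors = []
            for k in range(n_taps - 1, len(x)):
                u = x[k - n_taps + 1:k + 1][::-1]   # delay line, newest sample first
                e_prior = d[k] - w @ u              # a priori tracking error
                # Choose the step so that d[k] - (w + step * u) @ u = kappa * e_prior.
                w = w + u * (1.0 - kappa) * e_prior / (u @ u + eps)
                errors.append(d[k] - w @ u)         # a posteriori error ~ kappa * e_prior
            return w, np.array(errors)

        # Identify an unknown 4-tap system from noisy input/output data.
        true_w = np.array([0.5, -0.3, 0.2, 0.1])
        x = rng.normal(size=2000)
        d = np.convolve(x, true_w)[:len(x)] + 0.01 * rng.normal(size=len(x))
        w_hat, errors = lyapunov_adaptive_filter(x, d)
        print("estimated taps:", np.round(w_hat, 3))
        print("final a posteriori |error|:", abs(errors[-1]))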