3 research outputs found

    Hybrid Physics-informed Neural Networks for Dynamical Systems

    Ordinary differential equations can describe many dynamic systems. When the physics is well understood, the time-dependent responses are easily obtained numerically; the particular numerical method used for integration depends on the application. Unfortunately, when the physics is not fully understood, the discrepancies between predictions and observed responses can be large and unacceptable. In this thesis, we show how to directly implement integration of ordinary differential equations through recurrent neural networks using Python. We leveraged modern machine learning frameworks, such as TensorFlow and Keras. Besides offering basic modeling capabilities (such as multilayer perceptrons and recurrent neural networks) and optimization methods, these frameworks offer powerful automatic differentiation. With that, our approach's main advantage is that one can implement hybrid models combining physics-informed and data-driven kernels, where data-driven kernels are used to reduce the gap between predictions and observations. To illustrate our approach, we used two case studies. The first consisted of performing fatigue crack growth integration through Euler's forward method using a hybrid model combining a data-driven stress intensity range model with a physics-based crack length increment model. The second consisted of performing model parameter identification of a dynamic two-degree-of-freedom system through Runge-Kutta integration. Additionally, we performed a numerical experiment for fleet prognosis with hybrid models. The problem consists of predicting fatigue crack length for a fleet of aircraft. The hybrid models are trained using full input observations (far-field loads) and very limited output observations (crack length data for only a portion of the fleet). The results demonstrate that our proposed physics-informed recurrent neural network can model fatigue crack growth even when the observed distribution of crack length does not match the fleet distribution.
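The Euler-forward crack-growth integration described above can be sketched as a recurrent update in which the crack length is the hidden state and the far-field load is the input at each cycle. This is only a minimal illustration: the Paris-law constants, geometry factor, and the analytical surrogate standing in for the thesis's data-driven stress-intensity-range kernel are all assumptions, not values from the work.

```python
import numpy as np

# Hypothetical Paris-law constants (illustrative values, not from the thesis)
C, m = 1.5e-11, 3.0

def delta_k(a, delta_s):
    """Placeholder for the data-driven stress-intensity-range kernel.
    Here a simple analytical surrogate: dK = F * dS * sqrt(pi * a)."""
    F = 1.0  # geometry factor (assumed)
    return F * delta_s * np.sqrt(np.pi * a)

def euler_crack_growth(a0, loads):
    """Euler forward integration of Paris' law: a_{t+1} = a_t + C * dK**m.
    Structurally this is a recurrent cell: the crack length a is the hidden
    state, and the far-field stress range is the per-cycle input."""
    a = a0
    history = [a]
    for ds in loads:
        a = a + C * delta_k(a, ds) ** m  # physics-based increment
        history.append(a)
    return np.array(history)

loads = np.full(1000, 100.0)      # constant-amplitude far-field stress (assumed)
a_hist = euler_crack_growth(0.005, loads)
```

In the hybrid model of the thesis, `delta_k` would be replaced by a trainable data-driven layer while the increment step stays physics-based, so automatic differentiation can propagate gradients through the whole integration.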

    ForeNet: Fourier recurrent neural networks for time series prediction.

    Ying-Qian Zhang. Thesis (M.Phil.)--Chinese University of Hong Kong, 2001. Includes bibliographical references (leaves 115-124). Abstracts in English and Chinese.

    Contents:
    Abstract --- p.i
    Acknowledgement --- p.iii
    Chapter 1 --- Introduction --- p.1
    1.1 --- Background --- p.1
    1.2 --- Objective --- p.2
    1.3 --- Contributions --- p.3
    1.4 --- Thesis Overview --- p.4
    Chapter 2 --- Literature Review --- p.6
    2.1 --- Takens' Theorem --- p.6
    2.2 --- Linear Models for Prediction --- p.7
    2.2.1 --- Autoregressive Model --- p.7
    2.2.2 --- Moving Average Model --- p.8
    2.2.3 --- Autoregressive-moving Average Model --- p.9
    2.2.4 --- Fitting a Linear Model to a Given Time Series --- p.9
    2.2.5 --- State-space Reconstruction --- p.10
    2.3 --- Neural Network Models for Time Series Processing --- p.11
    2.3.1 --- Feed-forward Neural Networks --- p.11
    2.3.2 --- Recurrent Neural Networks --- p.14
    2.3.3 --- Training Algorithms for Recurrent Networks --- p.18
    2.4 --- Combining Neural Networks and Other Approximation Techniques --- p.22
    Chapter 3 --- ForeNet: Model and Representation --- p.24
    3.1 --- Fourier Recursive Prediction Equation --- p.24
    3.1.1 --- Fourier Analysis of Time Series --- p.25
    3.1.2 --- Recursive Form --- p.25
    3.2 --- Fourier Recurrent Neural Network Model (ForeNet) --- p.27
    3.2.1 --- Neural Networks Representation --- p.28
    3.2.2 --- Architecture of ForeNet --- p.29
    Chapter 4 --- ForeNet: Implementation --- p.32
    4.1 --- Improvement on ForeNet --- p.33
    4.1.1 --- Number of Hidden Neurons --- p.33
    4.1.2 --- Real-valued Outputs --- p.34
    4.2 --- Parameters Initialization --- p.37
    4.3 --- Application of ForeNet: the Process of Time Series Prediction --- p.38
    4.4 --- Some Implications --- p.39
    Chapter 5 --- ForeNet: Initialization --- p.40
    5.1 --- Unfolded Form of ForeNet --- p.40
    5.2 --- Coefficients Analysis --- p.43
    5.2.1 --- Analysis of the Coefficients Set, vn --- p.43
    5.2.2 --- Analysis of the Coefficients Set, μn(d) --- p.44
    5.3 --- Experiments of ForeNet Initialization --- p.47
    5.3.1 --- Objective and Experiment Setting --- p.47
    5.3.2 --- Prediction of Sunspot Series --- p.49
    5.3.3 --- Prediction of Mackey-Glass Series --- p.53
    5.3.4 --- Prediction of Laser Data --- p.56
    5.3.5 --- Three More Series --- p.59
    5.4 --- Some Implications on the Proposed Initialization Method --- p.63
    Chapter 6 --- ForeNet: Learning Algorithms --- p.67
    6.1 --- Complex Real Time Recurrent Learning (CRTRL) --- p.68
    6.2 --- Batch-mode Learning --- p.70
    6.3 --- Time Complexity --- p.71
    6.4 --- Property Analysis and Experimental Results --- p.72
    6.4.1 --- Efficient Initialization: Compared with Random Initialization --- p.74
    6.4.2 --- Complex-valued Network: Compared with Real-valued Network --- p.78
    6.4.3 --- Simple Architecture: Compared with Ring-structure RNN --- p.79
    6.4.4 --- Linear Model: Compared with Nonlinear ForeNet --- p.80
    6.4.5 --- Small Number of Hidden Units --- p.88
    6.5 --- Comparison with Some Other Models --- p.89
    6.5.1 --- Comparison with AR Model --- p.91
    6.5.2 --- Comparison with TDNN Networks and FIR Networks --- p.93
    6.5.3 --- Comparison to a Few More Results --- p.94
    6.6 --- Summarization --- p.95
    Chapter 7 --- Learning and Prediction: On-Line Training --- p.98
    7.1 --- On-Line Learning Algorithm --- p.98
    7.1.1 --- Advantages and Disadvantages --- p.98
    7.1.2 --- Training Process --- p.99
    7.2 --- Experiments --- p.101
    7.3 --- Predicting Stock Time Series --- p.105
    Chapter 8 --- Discussions and Conclusions --- p.109
    8.1 --- Limitations of ForeNet --- p.109
    8.2 --- Advantages of ForeNet --- p.111
    8.3 --- Future Works --- p.112
    Bibliography --- p.11
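The Fourier recursive prediction equation at the heart of ForeNet (Chapter 3) can be illustrated with a small complex-valued recurrent loop. This is a sketch under stated assumptions: it uses a sliding-DFT-style rotation with exponential forgetting and a fixed uniform readout, whereas in the thesis the recurrent and output weights are initialized from the Fourier coefficients and then trained (e.g. with CRTRL). All names and constants below are illustrative.

```python
import numpy as np

def forenet_style_predict(series, n_hidden=8, decay=0.95):
    """Minimal complex-valued recurrent predictor in the spirit of ForeNet.

    Hidden unit k rotates by e^{2*pi*i*k/n_hidden} each step, so each unit
    tracks one Fourier frequency of the input; `decay` adds exponential
    forgetting so the state stays bounded. The readout averages the real
    parts of the hidden states (a fixed stand-in for trainable weights).
    """
    freqs = np.exp(2j * np.pi * np.arange(n_hidden) / n_hidden)
    h = np.zeros(n_hidden, dtype=complex)
    preds = []
    for x in series:
        h = decay * freqs * h + x          # recurrent Fourier-style update
        preds.append(np.real(h).mean())    # fixed linear readout (assumed)
    return np.array(preds)

t = np.arange(200)
series = np.sin(2 * np.pi * t / 8)   # period matching one hidden frequency
preds = forenet_style_predict(series)
```

The appeal of this initialization, as the thesis argues, is that the network starts from a meaningful spectral decomposition of the series rather than from random weights, which the experiments in Chapters 5 and 6 compare directly against random initialization.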