5,789 research outputs found

    Fleet Prognosis with Physics-informed Recurrent Neural Networks

    Servicing and providing warranties for large fleets of engineering assets is a very profitable business. The success of companies in that area is often related to predictive maintenance driven by advanced analytics. Therefore, accurate modeling, as a way to understand how the complex interactions between operating conditions and component capability define useful life, is key for services profitability. Unfortunately, building prognosis models for large fleets is a daunting task as factors such as duty cycle variation, harsh environments, inadequate maintenance, and problems with mass production can lead to large discrepancies between designed and observed useful lives. This paper introduces a novel physics-informed neural network approach to prognosis by extending recurrent neural networks to cumulative damage models. We propose a new recurrent neural network cell designed to merge physics-informed and data-driven layers. With that, engineers and scientists have the chance to use physics-informed layers to model parts that are well understood (e.g., fatigue crack growth) and use data-driven layers to model parts that are poorly characterized (e.g., internal loads). A simple numerical experiment is used to present the main features of the proposed physics-informed recurrent neural network for damage accumulation. The test problem consists of predicting fatigue crack length for a synthetic fleet of airplanes subject to different mission mixes. The model is trained using full observation of inputs (far-field loads) and very limited observation of outputs (crack length at inspection for only a portion of the fleet). The results demonstrate that our proposed hybrid physics-informed recurrent neural network is able to accurately model fatigue crack growth even when the observed distribution of crack length does not match the (unobservable) fleet distribution.
    Comment: Data and code (including our implementation of the multi-layer perceptron, the stress intensity and Paris law layers, the cumulative damage cell, as well as Python driver scripts) used in this manuscript are publicly available on GitHub at https://github.com/PML-UCF/pinn. The data and code are released under the MIT License.
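    For intuition, the minimal sketch below shows the idea behind such a cumulative-damage recurrent cell: the hidden state is the crack length, the physics-informed part is a Paris-law update, and a data-driven layer (an MLP in the paper) would replace the poorly characterized mapping from far-field loads to local stress range. This is not the authors' TensorFlow implementation from the linked repository; the Paris-law constants and the identity load_to_stress placeholder are made up for illustration.

    import numpy as np

    def paris_law_increment(a, delta_sigma, C=1.5e-11, m=3.8, F=1.0):
        """Physics-informed layer: crack growth per cycle from Paris' law,
        with stress intensity range dK = F * dS * sqrt(pi * a)."""
        delta_K = F * delta_sigma * np.sqrt(np.pi * a)
        return C * delta_K ** m

    def cumulative_damage_rnn(a0, far_field_loads, load_to_stress=lambda s: s):
        """Unroll the hybrid cell: the hidden state is the crack length a.
        `load_to_stress` stands in for a data-driven (e.g. MLP) layer mapping
        far-field loads to the local stress range; identity here for simplicity."""
        a = a0
        crack = []
        for load in far_field_loads:
            delta_sigma = load_to_stress(load)           # data-driven part of the cell
            a = a + paris_law_increment(a, delta_sigma)  # physics-informed state update
            crack.append(a)
        return np.array(crack)

    # Illustrative run: 10,000 cycles at a 100 MPa far-field stress range,
    # starting from a 5 mm crack (all constants are placeholders, not from the paper).
    crack_history = cumulative_damage_rnn(a0=0.005, far_field_loads=np.full(10_000, 100.0))
    print(f"final crack length: {crack_history[-1] * 1e3:.2f} mm")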

    Finite Size Effects in Separable Recurrent Neural Networks

    We perform a systematic analytical study of finite size effects in separable recurrent neural network models with sequential dynamics, away from saturation. We find two types of finite size effects: thermal fluctuations, and disorder-induced 'frozen' corrections to the mean-field laws. The finite size effects are described by equations that correspond to a time-dependent Ornstein-Uhlenbeck process. We show how the theory can be used to understand and quantify various finite size phenomena in recurrent neural networks, with and without detailed balance.
    Comment: 24 pages LaTeX, with 4 postscript figures included
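    As a rough illustration of the kind of process involved, the sketch below simulates a generic one-dimensional Ornstein-Uhlenbeck process with an Euler-Maruyama scheme; the drift and noise parameters are arbitrary placeholders, not the time-dependent mean-field corrections derived in the paper.

    import numpy as np

    def simulate_ou(x0, theta, mu, sigma, dt, n_steps, rng=None):
        """Euler-Maruyama simulation of dX = theta * (mu - X) dt + sigma dW."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.empty(n_steps + 1)
        x[0] = x0
        for t in range(n_steps):
            dw = rng.normal(0.0, np.sqrt(dt))          # Wiener increment over dt
            x[t + 1] = x[t] + theta * (mu - x[t]) * dt + sigma * dw
        return x

    # Example: relaxation of fluctuations toward mu = 0 with arbitrary parameters.
    path = simulate_ou(x0=1.0, theta=2.0, mu=0.0, sigma=0.3, dt=0.01, n_steps=1000)
    print(path[-1])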

    Evaluation of maize and sorghum cultivars for forage production [Avaliação de cultivares de milho e sorgo para produção de forragem].
