146 research outputs found

    Dynamic surrogate modelling for multistep-ahead prediction of multivariate nonlinear chemical processes

    This work proposes a methodology for multivariate dynamic modeling and multistep-ahead prediction of nonlinear systems using surrogate models, with application to nonlinear chemical processes. The methodology provides a systematic and robust procedure for developing data-driven dynamic models capable of predicting the process outputs over long time horizons. It is based on using surrogate models to construct several nonlinear autoregressive exogenous (NARX) models, each approximating the future behavior of one process output as a function of the current and previous process inputs and outputs. The developed dynamic models are employed in a recursive scheme to predict the future process outputs over several time steps (multistep-ahead prediction). The methodology handles two different scenarios: (1) one in which only a set of input–output signals collected from the process is available for training, and (2) another in which a mathematical model of the process is available and can be used to generate specific datasets for training. For the latter, the proposed methodology includes a specific procedure for selecting training data for dynamic modeling based on design of computer experiments (DOCE) techniques. The proposed methodology is applied to case studies from the process industry presented in the literature. The results show very high prediction accuracies over long time horizons. Also, owing to the flexibility, robustness, and computational efficiency of surrogate modeling, the methodology can deal with a wide range of situations that would be difficult to address using first-principles models.
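    As a rough illustration of the recursive multistep-ahead scheme described in this abstract, the sketch below builds a NARX-style regressor from lagged inputs and outputs and feeds each one-step prediction back as a lagged output. The toy single-input, single-output process, the lag order, and the use of a scikit-learn Gaussian process as the surrogate are illustrative assumptions, not the models used in the paper.

```python
# Minimal sketch of recursive multistep-ahead prediction with a NARX-style
# surrogate (single-input, single-output; the paper treats the multivariate case).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def build_narx_dataset(u, y, n_lags=2):
    """Stack lagged inputs/outputs into regressors X and one-step targets T."""
    X, T = [], []
    for k in range(n_lags, len(y)):
        X.append(np.concatenate([u[k - n_lags:k + 1], y[k - n_lags:k]]))
        T.append(y[k])
    return np.array(X), np.array(T)

def multistep_predict(model, u_future, u_hist, y_hist, n_lags=2):
    """Recursive scheme: each one-step prediction is fed back as a lagged output."""
    u, y, preds = list(u_hist), list(y_hist), []
    for uk in u_future:
        u.append(uk)
        x = np.concatenate([u[-(n_lags + 1):], y[-n_lags:]]).reshape(1, -1)
        y_next = float(model.predict(x)[0])
        y.append(y_next)          # the prediction replaces the unknown measurement
        preds.append(y_next)
    return np.array(preds)

# Toy identification data from a nonlinear first-order process (illustrative only)
rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 300)
y = np.zeros(300)
for k in range(1, 300):
    y[k] = 0.8 * y[k - 1] + 0.4 * np.tanh(u[k - 1]) + 0.01 * rng.standard_normal()

X, T = build_narx_dataset(u, y, n_lags=2)
surrogate = GaussianProcessRegressor().fit(X, T)
y_hat = multistep_predict(surrogate, u_future=rng.uniform(-1, 1, 20),
                          u_hist=u[-3:], y_hist=y[-2:], n_lags=2)
```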

    Development of reduced polynomial chaos-Kriging metamodel for uncertainty quantification of computational aerodynamics

    Computational fluid dynamics (CFD) simulations are a critical component of the design and development of aerodynamic bodies. However, as engineers attempt to capture more detailed physics, the computational cost of simulations increases. This limits the ability of engineers to use robust or multidisciplinary design methodologies for practical engineering applications, because the computational model is too expensive to evaluate for uncertainty quantification studies and off-design performance analysis. Metamodels (surrogate models) are closed-form mathematical approximations fitted to only a few simulation responses; they can remedy this situation by estimating off-design performance and stochastic responses of the CFD simulation at far lower computational cost. The development of a reduced polynomial chaos-Kriging (RPC-K) metamodel is another step towards eliminating simulation gridlock by capturing the relevant physics of the problem in a cheap-to-evaluate metamodel using fewer CFD simulations. The RPC-K metamodel improves on existing approaches because its model reduction methodology eliminates the design parameters that contribute little variance to the problem before fitting a high-fidelity metamodel to the remaining data. The metamodel can capture non-linear physics because it includes both the long-range trend information of a polynomial chaos expansion and the local variations in the simulation data captured by Kriging. In this thesis, the RPC-K metamodel is developed, validated on a convection-diffusion-reaction problem, and applied to the NACA 4412 airfoil and aircraft engine nacelle problems. This research demonstrates the metamodel's effectiveness over existing polynomial chaos and Kriging metamodels for aerodynamics applications because of its ability to fit non-linear fluid flows with far fewer CFD simulations. It will allow aerospace engineers to take fuller advantage of detailed CFD simulations in the development of next-generation aerodynamic bodies while saving computational cost.
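    The reduce-then-fit idea can be sketched as follows: a cheap first-order polynomial chaos fit ranks the inputs by the output variance they explain, low-contribution inputs are dropped, and a Kriging (Gaussian process) model is fitted on the retained dimensions. The test function, the polynomial degree, and the 5% variance threshold are illustrative assumptions; this is not the RPC-K implementation from the thesis.

```python
# Sketch of variance-based input reduction followed by a Kriging fit.
import numpy as np
from numpy.polynomial import legendre
from sklearn.gaussian_process import GaussianProcessRegressor

def pce_variance_shares(X, y, degree=3):
    """Per-input variance shares from a first-order (no interactions) PCE on [-1, 1]^d."""
    n, d = X.shape
    cols, index = [np.ones(n)], []          # index records (input, polynomial order)
    for j in range(d):
        for p in range(1, degree + 1):
            c = np.zeros(p + 1); c[p] = 1.0
            cols.append(legendre.legval(X[:, j], c))
            index.append((j, p))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    shares = np.zeros(d)
    for (j, p), a in zip(index, coef[1:]):
        shares[j] += a**2 / (2 * p + 1)     # Var of Legendre P_p under U(-1, 1)
    return shares / shares.sum()

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (80, 6))             # 6 design parameters, 80 "simulations"
y = np.sin(np.pi * X[:, 0]) + X[:, 1]**2 + 0.01 * X[:, 5]   # only the first two matter

shares = pce_variance_shares(X, y)
keep = np.where(shares > 0.05)[0]           # drop parameters below a 5% variance share
kriging = GaussianProcessRegressor(normalize_y=True).fit(X[:, keep], y)
```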

    Multielement polynomial chaos Kriging-based metamodelling for Bayesian inference of non-smooth systems

    This paper presents a surrogate modelling technique based on domain partitioning for Bayesian parameter inference of highly nonlinear engineering models. In order to alleviate the computational burden typically involved in Bayesian inference applications, a multielement Polynomial Chaos Expansion based Kriging metamodel is proposed. The developed surrogate model combines, in a piecewise function, an array of local Polynomial Chaos based Kriging metamodels constructed on a finite set of non-overlapping subdomains of the stochastic input space. In this way, non-smoothness in the response of the forward model (e.g., nonlinearities and sparseness) can be reproduced by the proposed metamodel at minimal computational cost owing to its local adaptation capabilities. The model parameter inference is conducted through a Markov chain Monte Carlo approach comprising adaptive exploration and delayed rejection. The efficiency and accuracy of the proposed approach are validated through two case studies: an analytical benchmark and a numerical case study. The latter involves the partial differential equation governing the hydrogen diffusion phenomenon in metallic materials in Thermal Desorption Spectroscopy tests.
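    A minimal sketch of the piecewise construction is given below: the input domain is split into non-overlapping elements, a local Kriging (Gaussian process) model is fitted on each element, and prediction dispatches to the element containing the query point. The one-dimensional non-smooth test function, the element boundaries, and the scikit-learn models are illustrative stand-ins for the local Polynomial Chaos based Kriging metamodels used in the paper.

```python
# Piecewise (multielement) surrogate over non-overlapping subdomains, 1-D case.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def forward_model(x):
    return np.where(x < 0.5, np.sin(8 * x), 2.0 + 0.5 * x)   # jump at x = 0.5

edges = np.array([0.0, 0.5, 1.0])           # two non-overlapping elements
rng = np.random.default_rng(2)
x_train = rng.uniform(0.0, 1.0, 40)
y_train = forward_model(x_train)

local_models = []
for lo, hi in zip(edges[:-1], edges[1:]):
    mask = (x_train >= lo) & (x_train < hi)
    gp = GaussianProcessRegressor(normalize_y=True)
    gp.fit(x_train[mask, None], y_train[mask])   # local Kriging model on this element
    local_models.append(gp)

def surrogate(x):
    """Evaluate the piecewise surrogate by dispatching to the containing element."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    elem = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, len(local_models) - 1)
    return np.array([local_models[e].predict([[xi]])[0] for e, xi in zip(elem, x)])

# The cheap surrogate can then stand in for the forward model inside an MCMC loop.
print(surrogate([0.25, 0.75]))
```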

    Regression Models and Experimental Designs: A Tutorial for Simulation Analysts

    This tutorial explains the basics of linear regression models, especially low-order polynomials, and the corresponding statistical designs, namely designs of resolution III, IV, V, and Central Composite Designs (CCDs). The tutorial assumes 'white noise', which means that the residuals of the fitted linear regression model are normally, independently, and identically distributed with zero mean. The tutorial gathers statistical results that are scattered throughout the literature on mathematical statistics and presents them in a form that is understandable to simulation analysts.
    Keywords: metamodels; fractional factorial designs; Plackett-Burman designs; factor interactions; validation; cross-validation
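    As a small worked example of the designs and models the tutorial covers, the sketch below evaluates a simulated response at the points of a two-factor Central Composite Design and fits a second-order polynomial regression metamodel by least squares. The response function, the rotatable axial distance alpha = sqrt(2), and the number of centre replicates are illustrative assumptions.

```python
# Two-factor Central Composite Design and second-order regression metamodel.
import numpy as np

alpha = np.sqrt(2.0)
design = np.array([                         # coded units
    [-1, -1], [1, -1], [-1, 1], [1, 1],     # 2^2 factorial points
    [-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha],   # axial points
    [0, 0], [0, 0], [0, 0],                 # replicated centre points
])

def simulate(x):
    """Stand-in for the simulation model (white-noise residuals assumed)."""
    rng = np.random.default_rng(3)
    x1, x2 = x[:, 0], x[:, 1]
    return 5 + 2 * x1 - x2 + 1.5 * x1 * x2 + 0.8 * x1**2 + rng.normal(0, 0.1, len(x))

y = simulate(design)
x1, x2 = design[:, 0], design[:, 1]
X = np.column_stack([np.ones(len(y)), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta holds the fitted intercept, main effects, interaction, and quadratic terms
print(dict(zip(["b0", "b1", "b2", "b12", "b11", "b22"], beta.round(3))))
```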

    OpenTURNS: An industrial software for uncertainty quantification in simulation

    The need to assess robust performance of complex systems and to meet tighter regulatory requirements (security, safety, environmental control, health impacts, etc.) has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. A generic methodology has therefore emerged from the joint effort of several industrial companies and academic institutions. EDF R&D, Airbus Group and Phimeca Engineering started a collaboration at the beginning of 2005, joined by IMACS in 2014, to develop an open source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS for Open source Treatment of Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial challenges attached to uncertainties, namely transparency, genericity, modularity and multi-accessibility. This paper focuses on OpenTURNS and presents its main features: OpenTURNS is open source software under the LGPL license, provided as a C++ library and a Python TUI, and it runs under Linux and Windows. All the methodological tools are described in the different sections of this paper: uncertainty quantification, uncertainty propagation, sensitivity analysis and metamodeling. A section also explains the generic wrapper mechanism used to link OpenTURNS to any external code. The paper illustrates the methodological tools as much as possible on an educational example that simulates the height of a river and compares it to the height of a dyke that protects industrial facilities. Finally, it gives an overview of the main developments planned for the next few years.
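    A minimal sketch of a plain Monte Carlo uncertainty propagation with the OpenTURNS Python API is shown below, loosely modelled on the river-height example mentioned in the abstract. The input distributions and the analytical height formula are illustrative stand-ins, not the exact model used in the paper.

```python
# Monte Carlo propagation of input uncertainties through an analytical model
# with OpenTURNS (illustrative distributions and formula).
import openturns as ot

# Uncertain inputs: flow rate Q, Strickler coefficient Ks, river-bed levels Zv, Zm
Q  = ot.Uniform(500.0, 3000.0)
Ks = ot.Normal(30.0, 5.0)
Zv = ot.Uniform(49.0, 51.0)
Zm = ot.Uniform(54.0, 56.0)
inputs = ot.ComposedDistribution([Q, Ks, Zv, Zm])

# Analytical model for the water height H (symbolic, so no external wrapper is needed)
model = ot.SymbolicFunction(
    ["Q", "Ks", "Zv", "Zm"],
    ["(Q / (Ks * 300.0 * sqrt((Zm - Zv) / 5000.0)))^0.6"])

# Plain Monte Carlo propagation
sample_in = inputs.getSample(10000)
sample_out = model(sample_in)
print("mean height:", sample_out.computeMean())
print("95% quantile:", sample_out.computeQuantilePerComponent(0.95))
```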