
    Emulating dynamic non-linear simulators using Gaussian processes

    The dynamic emulation of non-linear deterministic computer codes whose output is a time series, possibly multivariate, is examined. Such computer models simulate the evolution of some real-world phenomenon over time, for example the climate or the functioning of the human brain. The models of interest are highly non-linear and exhibit tipping points, bifurcations and chaotic behaviour. However, each simulation run can be too time-consuming to permit analyses that require many runs, such as quantifying the variation in model output with respect to changes in the inputs. Gaussian process emulators are therefore used to approximate the output of the code. To do this, the flow map of the system under study is emulated over a short time period and then applied iteratively to predict the whole time series. A number of ways are proposed to account for the uncertainty in the emulators' inputs after the fixed initial conditions, and for the correlation between those inputs through the time series. The methodology is illustrated with two examples: the highly non-linear dynamical systems described by the Lorenz and Van der Pol equations. In both cases the predictive performance is relatively high, and the measure of uncertainty provided by the method reflects the extent of predictability in each system.
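    A minimal Python sketch of the flow-map idea follows (our illustration, not the authors' code): a GP is fitted to the one-step map of the Lorenz system and then iterated forward from a fixed initial condition. Only the mean prediction is propagated here; the paper's treatment of input uncertainty and correlation through the time series is not reproduced. All names, ranges, and settings are assumptions.

        # Illustrative sketch (not the authors' code): emulate the one-step flow map
        # of the Lorenz system with a GP, then iterate the emulator to predict a
        # trajectory. Settings are invented for illustration.
        import numpy as np
        from scipy.integrate import solve_ivp
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = s
            return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

        dt = 0.05                      # short period over which the flow map is emulated
        rng = np.random.default_rng(0)

        # Training design: random states paired with the state the flow map sends them to.
        X = rng.uniform([-20, -25, 0], [20, 25, 50], size=(200, 3))
        Y = np.array([solve_ivp(lorenz, (0, dt), x0, t_eval=[dt]).y[:, -1] for x0 in X])

        # One independent GP per output coordinate of the flow map.
        kernel = ConstantKernel(1.0) * RBF(length_scale=[10.0, 10.0, 10.0])
        gps = [GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, Y[:, j])
               for j in range(3)]

        # Iterate the emulated flow map from a fixed initial condition to build a time series.
        state = np.array([1.0, 1.0, 1.0])
        trajectory = [state]
        for _ in range(100):
            state = np.array([gp.predict(state.reshape(1, -1))[0] for gp in gps])
            trajectory.append(state)
        trajectory = np.array(trajectory)   # emulated time series: 101 steps of the 3-d state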

    Predicting the Output From a Stochastic Computer Model When a Deterministic Approximation is Available

    The analysis of computer models can be aided by the construction of surrogate models, or emulators, that statistically model the numerical computer model. Increasingly, computer models are becoming stochastic, yielding different outputs each time they are run even if the same input values are used. Stochastic computer models are more difficult to analyse and more difficult to emulate, often requiring substantially more computer model runs to fit. We present a method of using deterministic approximations of the computer model to better construct an emulator. The method is applied to numerous toy examples, as well as an idealised epidemiology model and a model from the building performance field.
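    One simple way to exploit a deterministic approximation, sketched below in Python as an illustration rather than the authors' exact formulation, is to emulate the discrepancy between the stochastic simulator and its cheap deterministic approximation, with a white-noise kernel absorbing the intrinsic stochasticity. Both models and all settings here are hypothetical.

        # Illustrative sketch (not necessarily the paper's method): fit a GP to the
        # residuals of a stochastic simulator about a cheap deterministic approximation.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(1)

        def deterministic_model(x):            # hypothetical cheap approximation
            return np.sin(3 * x) + 0.5 * x

        def stochastic_model(x):               # hypothetical expensive stochastic simulator
            return deterministic_model(x) + 0.2 * x**2 + rng.normal(0.0, 0.3, size=x.shape)

        # Design points and noisy simulator runs.
        X = rng.uniform(0, 2, size=(40, 1))
        y = stochastic_model(X[:, 0])

        # Model the discrepancy (stochastic output minus deterministic approximation)
        # with a GP whose white-noise term captures the intrinsic stochasticity.
        residual = y - deterministic_model(X[:, 0])
        gp = GaussianProcessRegressor(kernel=RBF(0.5) + WhiteKernel(0.1),
                                      normalize_y=True).fit(X, residual)

        # Prediction: deterministic approximation plus emulated discrepancy.
        x_new = np.linspace(0, 2, 5).reshape(-1, 1)
        mean, sd = gp.predict(x_new, return_std=True)
        prediction = deterministic_model(x_new[:, 0]) + mean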

    GPfit: An R package for Gaussian Process Model Fitting using a New Optimization Algorithm

    Gaussian process (GP) models are commonly used statistical metamodels for emulating expensive computer simulators. Fitting a GP model can be numerically unstable if any pair of design points in the input space are close together. Ranjan, Haynes, and Karsten (2011) proposed a computationally stable approach for fitting GP models to deterministic computer simulators, using a genetic-algorithm-based approach to maximize the likelihood that is robust but computationally intensive. This paper implements a slightly modified version of the model proposed by Ranjan et al. (2011) as the new R package GPfit. A novel parameterization of the spatial correlation function and a new multi-start gradient-based optimization algorithm yield optimization that is robust and typically faster than the genetic-algorithm-based approach. We present two examples with R code to illustrate the usage of the main functions in GPfit, and several test functions are used for performance comparison with the popular R package mlegp. GPfit is free software distributed under the General Public License as part of the R software project (R Development Core Team 2012).
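    GPfit itself is an R package; the Python sketch below is only a loose analogue of the ideas summarised above, namely correlation parameters optimised on a log scale, a small nugget for numerical stability when design points sit close together, and multi-start gradient-based likelihood maximisation. The data and settings are invented for illustration.

        # Loose Python analogue (not GPfit's own code) of fitting a GP emulator to a
        # deterministic simulator with a nugget and multi-start gradient-based optimisation.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        rng = np.random.default_rng(2)
        X = rng.uniform(0, 1, size=(30, 2))                 # design for a deterministic simulator
        y = np.sin(6 * X[:, 0]) * np.cos(4 * X[:, 1])       # hypothetical simulator output

        # scikit-learn optimises kernel hyperparameters on a log scale (loosely analogous
        # to GPfit's reparameterization of the correlation function); alpha adds a small
        # nugget so near-coincident design points do not make the correlation matrix singular.
        kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
        gp = GaussianProcessRegressor(kernel=kernel,
                                      alpha=1e-8,               # nugget for numerical stability
                                      n_restarts_optimizer=10,  # multi-start L-BFGS-B on the likelihood
                                      normalize_y=True).fit(X, y)
        print(gp.kernel_)                                       # fitted hyperparameters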

    Emulating complex dynamical simulators with random Fourier features

    A Gaussian process (GP)-based methodology is proposed to emulate complex dynamical computer models (or simulators). The method relies on emulating the short-time numerical flow map of the system, where the flow map is a function that returns the solution of a dynamical system at a certain time point, given initial conditions. Here, the flow map is emulated via a GP whose kernel is approximated with random Fourier features. This yields a random predictor whose realisations are approximations to the flow map. To predict a given time series (i.e., the model output), a single realisation of the approximate flow map is taken and used to iterate from the initial condition ahead in time. Repeating this procedure with multiple realisations from the distribution of approximate flow maps creates a distribution over the time series, whose mean and variance serve as the model output prediction and the associated uncertainty, respectively. The proposed method is applied to emulate several dynamic nonlinear simulators, including the well-known Lorenz and van der Pol models. The results suggest that our approach has high predictive performance and that the associated uncertainty can capture the dynamics of the system accurately. Additionally, our approach has potential for "embarrassingly" parallel implementations, in which the iterative predictions performed by each realisation can be carried out on a single computing node.
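    A minimal Python sketch of the random-Fourier-feature idea follows (our illustration, not the authors' implementation): the RBF kernel is approximated with random features, a Bayesian linear model stands in for the GP posterior over the flow map, and sampled weight realisations are iterated from the same initial condition so that the spread across realisations quantifies uncertainty. The van der Pol settings and all names are assumptions.

        # Illustrative sketch: random Fourier features plus a Bayesian linear model as an
        # approximate GP of the van der Pol flow map; realisations are iterated in time.
        import numpy as np
        from scipy.integrate import solve_ivp
        from sklearn.kernel_approximation import RBFSampler
        from sklearn.linear_model import BayesianRidge

        def van_der_pol(t, s, mu=1.0):
            x, v = s
            return [v, mu * (1 - x**2) * v - x]

        dt, rng = 0.1, np.random.default_rng(3)
        X = rng.uniform([-3, -3], [3, 3], size=(300, 2))    # training states
        Y = np.array([solve_ivp(van_der_pol, (0, dt), x0, t_eval=[dt]).y[:, -1] for x0 in X])

        # Random Fourier features approximating an RBF kernel; one Bayesian linear model
        # per state coordinate, whose weight posterior stands in for the GP posterior.
        features = RBFSampler(gamma=0.5, n_components=200, random_state=0).fit(X)
        models = [BayesianRidge().fit(features.transform(X), Y[:, j]) for j in range(2)]

        def sample_flow_map():
            """Draw one realisation of the approximate flow map by sampling the weights."""
            weights = [rng.multivariate_normal(m.coef_, m.sigma_) for m in models]
            intercepts = [m.intercept_ for m in models]
            return lambda s: np.array([features.transform(s.reshape(1, -1))[0] @ w + b
                                       for w, b in zip(weights, intercepts)])

        # Iterate several realisations from the same initial condition; their spread
        # provides the uncertainty on the predicted time series.
        trajectories = []
        for _ in range(20):
            flow, state = sample_flow_map(), np.array([1.0, 0.0])
            path = [state]
            for _ in range(100):
                state = flow(state)
                path.append(state)
            trajectories.append(path)
        trajectories = np.array(trajectories)       # (20 realisations, 101 steps, 2 states)
        mean, var = trajectories.mean(axis=0), trajectories.var(axis=0)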

    Preliminary Implementation of the Next Generation Cannulation Simulator

    Extracorporeal Membrane Oxygenation (ECMO) is a highly complex and critical lifesaving procedure used to support patients with cardiac and respiratory issues. Patients on ECMO are monitored 24/7 by a highly trained ECMO team comprising nurses, physicians, respiratory therapists, and perfusionists, who intervene promptly in any potential emergency. Simulation-Based Training (SBT) allows clinicians to experience and practice realistic hands-on procedures and scenarios without any risk. In ECMO, cannulation is a critical procedure performed to externally reroute the blood flow so that it can be re-oxygenated by the ECMO machine before being recirculated through the patient's body. In close collaboration with Hamad Medical Corporation (HMC), this project aims to develop a cost-effective, realistic, and user-friendly ECMO simulator focusing on the venous and arterial cannulation procedure. The main features of this simulator include cannulation emergencies caused by low pressure flow, excessive force, recirculation, or a mispositioned wire or cannula. The ECMO cannulation simulator will therefore not only contribute greatly to the initial and ongoing local training of HMC ECMO clinicians, but also improve patient care by lowering the risks associated with the cannulation process.

    Emulation of multivariate simulators using thin-plate splines with application to atmospheric dispersion

    It is often desirable to build a statistical emulator of a complex computer simulator in order to perform analysis which would otherwise be computationally infeasible. We propose methodology to model multivariate output from a computer simulator taking into account output structure in the responses. The utility of this approach is demonstrated by applying it to a chemical and biological hazard prediction model. Predicting the hazard area which results from an accidental or deliberate chemical or biological release is imperative in civil and military planning and also in emergency response. The hazard area resulting from such a release is highly structured in space and we therefore propose the use of a thin-plate spline to capture the spatial structure and fit a Gaussian process emulator to the coefficients of the resultant basis functions. We compare and contrast four different techniques for emulating multivariate output: dimension-reduction using (i) a fully Bayesian approach with a principal component basis, (ii) a fully Bayesian approach with a thin-plate spline basis, assuming that the basis coefficients are independent, and (iii) a “plug-in” Bayesian approach with a thin-plate spline basis and a separable covariance structure; and (iv) a functional data modeling approach using a tensor-product (separable) Gaussian process. We develop methodology for the two thin-plate spline emulators and demonstrate that these emulators significantly outperform the principal component emulator. Further, the separable thin-plate spline emulator, which accounts for the dependence between basis coefficients, provides substantially more realistic quantification of uncertainty, and is also computationally more tractable, allowing fast emulation. For high resolution output data, it also offers substantial predictive and computational advantages over the tensor-product Gaussian process emulator.
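    As a minimal illustration of the basis-coefficient emulation strategy described above (not the authors' code), the Python sketch below uses a principal component basis, one of the four approaches compared, in place of thin-plate splines, and fits an independent GP to each basis coefficient; the output field is a synthetic stand-in for a dispersion model's hazard field, and all names and settings are assumptions.

        # Illustrative sketch: reduce a multivariate (spatial) simulator output to a few
        # basis coefficients and emulate each coefficient with an independent GP.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(4)
        n_runs, n_grid = 50, 200
        X = rng.uniform(0, 1, size=(n_runs, 3))                 # simulator inputs
        grid = np.linspace(0, 1, n_grid)
        # Hypothetical spatially structured output field for each run (a stand-in for a
        # hazard field on a grid).
        Y = np.array([np.exp(-((grid - x[0]) ** 2) / (0.05 + 0.1 * x[1])) * (1 + x[2])
                      for x in X])

        # Basis projection: keep a handful of components that capture the spatial structure.
        basis = PCA(n_components=5).fit(Y)
        coeffs = basis.transform(Y)                             # (n_runs, 5) coefficients

        # Independent GP per basis coefficient, as a function of the simulator inputs.
        gps = [GaussianProcessRegressor(kernel=RBF([0.3, 0.3, 0.3]), normalize_y=True)
               .fit(X, coeffs[:, j]) for j in range(coeffs.shape[1])]

        # Emulate the full field at a new input: predict the coefficients, then map them
        # back through the basis.
        x_new = np.array([[0.4, 0.5, 0.6]])
        c_new = np.array([gp.predict(x_new)[0] for gp in gps])
        field_new = basis.inverse_transform(c_new.reshape(1, -1))[0]   # emulated output field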