
    Calibration of Computational Models with Categorical Parameters and Correlated Outputs via Bayesian Smoothing Spline ANOVA

    It has become commonplace to use complex computer models to predict outcomes in regions where data do not exist. Typically these models need to be calibrated and validated using some experimental data, which often consist of multiple correlated outcomes. In addition, some of the model parameters may be categorical in nature, such as a pointer variable to alternate models (or submodels) for some of the physics of the system. Here we present a general approach for calibration in such situations, where an emulator of the computationally demanding models and a discrepancy term from the model to reality are represented within a Bayesian Smoothing Spline (BSS) ANOVA framework. The BSS-ANOVA framework has several advantages over the traditional Gaussian process, including ease of handling categorical inputs and correlated outputs, and improved computational efficiency. Finally, this framework is applied to the problem that motivated its design: the calibration of a computational fluid dynamics model of a bubbling fluidized bed used as an absorber in a CO2 capture system.
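    The calibration structure described above (field data modeled as emulator output plus a model-to-reality discrepancy, with a categorical submodel pointer among the inputs) can be illustrated with a deliberately minimal sketch. The simulator, parameter range, noise level, and grid search below are all invented stand-ins for the paper's Bayesian BSS-ANOVA machinery:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "computer model": one continuous parameter theta plus a
# categorical pointer selecting between two physics submodels.
def simulator(x, theta, submodel):
    if submodel == 0:
        return theta * np.sin(x)            # submodel A
    return theta * np.sin(x) + 0.1 * x      # submodel B

# Synthetic "reality": submodel B physics with theta = 1.3 plus noise.
x_obs = np.linspace(0.0, 3.0, 20)
y_obs = 1.3 * np.sin(x_obs) + 0.1 * x_obs + rng.normal(0.0, 0.05, x_obs.size)

# Crude grid-based maximum-likelihood calibration over (theta, submodel)
# under a Gaussian error model.
thetas = np.linspace(0.5, 2.0, 151)
sigma = 0.05
best = None
for m in (0, 1):
    for t in thetas:
        resid = y_obs - simulator(x_obs, t, m)
        loglik = -0.5 * np.sum(resid**2) / sigma**2
        if best is None or loglik > best[0]:
            best = (loglik, t, m)

_, theta_hat, m_hat = best
print(f"calibrated theta = {theta_hat:.2f}, selected submodel = {m_hat}")
```

In the full approach, the grid search would be replaced by posterior sampling over the BSS-ANOVA emulator and discrepancy terms, with the categorical pointer treated alongside the continuous parameters.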

    Uncertainty quantification of an effective heat transfer coefficient within a numerical model of a bubbling fluidized bed with immersed horizontal tubes

    This study investigates sources of steady-state computational uncertainty in an effective heat transfer coefficient (HTC) within a non-reacting bubbling fluidized bed with immersed horizontal heat-conducting tubes. The methodical evaluation of this variation, or Uncertainty Quantification (UQ), is a critical step in the experimental analysis process, and is particularly important when the values of input physical parameters are unknown or experimental data are sparse. While the concept applies broadly to all studies, this application investigates a 2D unit-cell analogue of a bubbling fluidized bed designed for large-scale carbon capture applications. Without adequate characterization of simulation uncertainties in the HTC, bed operating characteristics, including the thermal efficiency, carbon capture efficiency, and sorbent half-life, cannot be well understood. We focus on three primary parameters: the solid-solid coefficient of restitution, the solid-wall coefficient of restitution, and the turbulence model, and consider how their influences vary at different bed solid fractions. This is accomplished via sensitivity analysis and the Bayesian Smoothing Spline (BSS) Analysis of Variance (ANOVA) framework. Results indicate that uncertainties approach 20% at high gas fractions, with the turbulence model accounting for 80% of this variation and the solid-solid coefficient of restitution accounting for the remaining 20%.
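    The variance attributions above come from a functional-ANOVA decomposition of the response. The idea can be sketched on a toy additive response surface; the inputs, ranges, and coefficients below are invented, and the real study fits the decomposition with BSS-ANOVA rather than assuming a known closed-form response:

```python
import numpy as np

# Invented additive response surface for the HTC over two uncertain inputs:
# a two-level categorical turbulence model and the solid-solid coefficient
# of restitution.
turb = np.array([0.0, 1.0])              # two candidate turbulence models
e_ss = np.linspace(0.8, 0.99, 50)        # restitution coefficient range

T, E = np.meshgrid(turb, e_ss, indexing="ij")
htc = 100.0 + 20.0 * T + 60.0 * (E - 0.9)   # assumed response (W/m^2/K)

# Functional-ANOVA main effects: average the other input out.
grand_mean = htc.mean()
main_T = htc.mean(axis=1) - grand_mean   # turbulence-model main effect
main_E = htc.mean(axis=0) - grand_mean   # restitution main effect

# Variance shares (first-order sensitivity indices); for an additive
# response on a product grid these sum to one.
var_total = htc.var()
share_T = main_T.var() / var_total
share_E = main_E.var() / var_total
print(f"turbulence share = {share_T:.3f}, restitution share = {share_E:.3f}")
```

The categorical turbulence model enters the decomposition exactly like the continuous input, which is the property the abstract highlights for the BSS-ANOVA framework.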

    Introduction to metamodeling for reducing computational burden of advanced analyses with health economic models: a structured overview of metamodeling methods in a 6-step application process

    Metamodels can be used to reduce the computational burden associated with computationally demanding analyses of simulation models, though applications within health economics are still scarce. Besides a lack of awareness of their potential within health economics, the absence of guidance on the conceivably complex and time-consuming process of developing and validating metamodels may contribute to their limited uptake. To address these issues, this paper introduces metamodeling to the wider health economic audience and presents a process for applying metamodeling in this context, including suitable methods and directions for their selection and use. General (i.e., non-health-economic-specific) metamodeling literature, clinical prediction modeling literature, and a previously published literature review were exploited to consolidate a process and to identify candidate metamodeling methods. Methods were considered applicable to health economics if they are able to account for mixed (i.e., continuous and discrete) input parameters and continuous outcomes. Six steps were identified as relevant for applying metamodeling methods within health economics: 1) identification of a suitable metamodeling technique, 2) simulation of datasets according to a design of experiments, 3) fitting of the metamodel, 4) assessment of metamodel performance, 5) conducting the required analysis using the metamodel, and 6) verification of the results. Different methods are discussed to support each step, including their characteristics, directions for use, key references, and relevant R and Python packages. To address challenges regarding the selection of metamodeling methods, a first guide was developed towards using metamodels to reduce the computational burden of analyses of health economic models. This guidance may increase applications of metamodeling in health economics, enabling increased use of state-of-the-art analyses, e.g. value of information analysis, with computationally burdensome simulation models.
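    The six steps can be made concrete with a small sketch in which an inexpensive quadratic function stands in for the expensive health economic model; the function, parameter ranges, and sample sizes are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a computationally expensive health economic model with two
# continuous input parameters (hypothetical cost output).
def expensive_model(p1, p2):
    return 1000.0 + 500.0 * p1 + 300.0 * p2**2 + 50.0 * p1 * p2

# Step 2: simulate a dataset according to a design of experiments
# (here simple uniform random sampling).
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = expensive_model(X[:, 0], X[:, 1])

# Step 3: fit the metamodel, here a quadratic regression (Step 1 being the
# choice of a technique able to handle the inputs and outcome at hand).
def features(X):
    p1, p2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(p1), p1, p2, p1**2, p2**2, p1 * p2])

beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# Step 4: assess metamodel performance on a fresh validation design.
Xv = rng.uniform(0.0, 1.0, size=(100, 2))
yv = expensive_model(Xv[:, 0], Xv[:, 1])
pred = features(Xv) @ beta
r2 = 1.0 - np.sum((yv - pred) ** 2) / np.sum((yv - yv.mean()) ** 2)
print(f"validation R^2 = {r2:.4f}")

# Step 5: run the otherwise burdensome analysis on the cheap metamodel,
# e.g. a large probabilistic sensitivity analysis; Step 6 would verify
# key results against the original model.
Xmc = rng.uniform(0.0, 1.0, size=(100_000, 2))
mean_cost = (features(Xmc) @ beta).mean()
print(f"mean cost over 100,000 draws = {mean_cost:.1f}")
```

For a real model the simulation budget in Step 2, not the metamodel fit, dominates the cost, which is exactly why the 100,000-draw analysis is run on the metamodel instead.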

    Spatial--temporal mesoscale modeling of rainfall intensity using gage and radar data

    Gridded estimated rainfall intensity values at very high spatial and temporal resolution levels are needed as main inputs for weather prediction models to obtain accurate precipitation forecasts, and to verify the performance of precipitation forecast models. These gridded rainfall fields are also the main driver for hydrological models that forecast flash floods, and they are essential for disaster prediction associated with heavy rain. Rainfall information can be obtained from rain gages that provide relatively accurate estimates of the actual rainfall values at point-referenced locations, but they do not characterize well enough the spatial and temporal structure of the rainfall fields. Doppler radar data offer better spatial and temporal coverage, but Doppler radar measures effective radar reflectivity (Ze) rather than rainfall rate (R). Thus, rainfall estimates from radar data suffer from various uncertainties due to their measuring principle and the conversion from Ze to R. We introduce a framework to combine radar reflectivity and gage data by writing the different sources of rainfall information in terms of an underlying unobservable spatial-temporal process with the true rainfall values. We use spatial logistic regression to model the probability of rain for both sources of data in terms of the latent true rainfall process. We characterize the different sources of bias and error in the gage and radar data, and we estimate the true rainfall intensity with its posterior predictive distribution, conditioning on the observed data. Comment: Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) at http://dx.doi.org/10.1214/08-AOAS166 by the Institute of Mathematical Statistics (http://www.imstat.org).
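    The Ze-to-R conversion that the abstract flags as a source of uncertainty is conventionally done with a power-law Z-R relation, the standard example being Marshall-Palmer with Z = 200 R^1.6. A sketch of that point-estimate conversion follows; the paper itself treats the relation as uncertain within its latent-process framework rather than fixing a and b:

```python
# Convert radar reflectivity in dBZ to rain rate R (mm/h) by inverting the
# Marshall-Palmer relation Z = a * R^b; a and b vary by climate and storm
# type, so these defaults are only the conventional point estimate.
def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    z = 10.0 ** (dbz / 10.0)        # dBZ -> linear reflectivity (mm^6 m^-3)
    return (z / a) ** (1.0 / b)     # invert Z = a * R^b for R

for dbz in (20.0, 30.0, 40.0, 50.0):
    print(f"{dbz:.0f} dBZ -> {rain_rate_from_dbz(dbz):.2f} mm/h")
```

The steep power law means small reflectivity errors translate into large rain-rate errors at high dBZ, which is one reason the paper anchors the radar field to gage observations.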