
    Comparison of data-driven uncertainty quantification methods for a carbon dioxide storage benchmark scenario

    A variety of methods is available to quantify uncertainties arising within the modeling of flow and transport in carbon dioxide storage, but thorough comparisons are lacking. Usually, raw data from such storage sites can hardly be described by theoretical statistical distributions, since only very limited data are available. Hence, exact information on distribution shapes for all uncertain parameters is very rare in realistic applications. We discuss and compare four different methods tested for data-driven uncertainty quantification based on a benchmark scenario of carbon dioxide storage. In the benchmark, for which we provide data and code, carbon dioxide is injected into a saline aquifer modeled by the nonlinear capillarity-free fractional flow formulation for two incompressible fluid phases, namely carbon dioxide and brine. To cover different aspects of uncertainty quantification, we incorporate various sources of uncertainty, such as uncertainty in boundary conditions, in conceptual model definitions, and in material properties. We consider recent versions of the following non-intrusive and intrusive uncertainty quantification methods: arbitrary polynomial chaos, spatially adaptive sparse grids, kernel-based greedy interpolation, and hybrid stochastic Galerkin. The performance of each approach is demonstrated by assessing the expectation value and standard deviation of the carbon dioxide saturation against a reference statistic based on Monte Carlo sampling. We compare the convergence of all methods, reporting accuracy with respect to the number of model runs and resolution. Finally, we offer suggestions about the methods' advantages and disadvantages that can guide the modeler for uncertainty quantification in carbon dioxide storage and beyond.
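The Monte Carlo reference statistic described above can be illustrated with a minimal sketch: sample an uncertain input, run the model, and estimate the expectation value and standard deviation of the output. The `toy_saturation` function below is an invented stand-in, not the fractional-flow benchmark model, and the uniform permeability range is an assumption for illustration only.

```python
import random
import statistics

def toy_saturation(permeability):
    # Invented stand-in for a CO2 saturation response; NOT the
    # fractional-flow benchmark model from the paper.
    return 1.0 / (1.0 + permeability ** 2)

random.seed(42)
# Uncertain input: permeability sampled uniformly (assumed range)
runs = [toy_saturation(random.uniform(0.5, 1.5)) for _ in range(10_000)]

mean_sat = statistics.fmean(runs)  # expectation value estimate
std_sat = statistics.stdev(runs)   # standard deviation estimate
print(mean_sat, std_sat)
```

The surrogate-based methods compared in the paper aim to reproduce these two statistics with far fewer model runs than the plain sampling shown here.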

    Bayesian design of experiments for complex chemical systems

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Chemical Engineering, 2011. Cataloged from PDF version of thesis. Includes bibliographical references (p. 317-322). Engineering design work relies on the ability to predict system performance. A great deal of effort is spent producing models that incorporate knowledge of the underlying physics and chemistry in order to understand the relationship between system inputs and responses. Although models can provide great insight into the behavior of the system, actual design decisions cannot be made based on predictions alone. In order to make properly informed decisions, it is critical to understand uncertainty. Otherwise, there can be no quantitative assessment of which predictions are reliable and which inputs are most significant. To address this issue, a new design method is required that can quantify the complex sources of uncertainty that influence model predictions and the corresponding engineering decisions. Design of experiments is traditionally defined as a structured procedure to gather information. This thesis reframes design of experiments as a problem of quantifying and managing uncertainties. The process of designing experimental studies is treated as a statistical decision problem using Bayesian methods. This perspective follows from the realization that the primary role of engineering experiments is not simply to gain knowledge but to gather the information necessary to make future design decisions. To do this, experiments must be designed to reduce the uncertainties relevant to the future decision. The necessary components are: a model of the system, a model of the observations taken from the system, and an understanding of the sources of uncertainty that impact the system.
While the Bayesian approach has previously been attempted in various fields, including Chemical Engineering, the true benefit has been obscured by the use of linear system models, simplified descriptions of uncertainty, and the lack of emphasis on the decision theory framework. With the recent development of techniques for Bayesian statistics and uncertainty quantification, including Markov Chain Monte Carlo, Polynomial Chaos Expansions, and a prior sampling formulation for computing utility functions, such simplifications are no longer necessary. In this work, these methods have been integrated into the decision theory framework to allow the application of Bayesian Designs to more complex systems. The benefits of the Bayesian approach to design of experiments are demonstrated on three systems: an air mill classifier, a network of chemical reactions, and a process simulation based on unit operations. These case studies quantify the impact of rigorous modeling of uncertainty in terms of the reduced number of experiments as compared to the currently used Classical Design methods. Fewer experiments translate to less time and resources spent, while reducing the important uncertainties relevant to decision makers. In an industrial setting, this represents real-world benefits for large research projects in reducing development costs and time-to-market. Besides identifying the best experiments, the Bayesian approach also allows a prediction of the value of experimental data, which is crucial in the decision making process. Finally, this work demonstrates the flexibility of the decision theory framework and the feasibility of Bayesian Design of Experiments for the complex process models commonly found in the field of Chemical Engineering. by Kenneth T. Hu, Ph.D.
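The prior sampling formulation for computing utility functions mentioned above can be sketched for a toy problem: estimate the expected information gain of a candidate design by nested Monte Carlo over prior samples. The linear `simulate` model, the standard-normal prior, and the noise level are all illustrative assumptions, not taken from the thesis.

```python
import math
import random

def simulate(theta, design):
    # Hypothetical response: linear in the unknown parameter theta,
    # scaled by the chosen design point (an illustrative assumption).
    return theta * design

def log_normal_pdf(y, mu, sigma):
    return -0.5 * ((y - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2.0 * math.pi))

def expected_information_gain(design, n_outer=300, n_inner=300, sigma=0.1):
    # Nested Monte Carlo over prior samples:
    # EIG = E_y[ log p(y|theta) - log p(y) ]
    rng = random.Random(0)
    inner_thetas = [rng.gauss(0.0, 1.0) for _ in range(n_inner)]
    total = 0.0
    for _ in range(n_outer):
        theta = rng.gauss(0.0, 1.0)                      # draw from the prior
        y = simulate(theta, design) + rng.gauss(0.0, sigma)  # simulated datum
        log_lik = log_normal_pdf(y, simulate(theta, design), sigma)
        # Marginal likelihood p(y) estimated by averaging over prior samples
        liks = [math.exp(log_normal_pdf(y, simulate(t, design), sigma))
                for t in inner_thetas]
        log_evidence = math.log(sum(liks) / n_inner + 1e-300)
        total += log_lik - log_evidence
    return total / n_outer

# A design with a stronger signal should be more informative
print(expected_information_gain(0.1), expected_information_gain(1.0))
```

Ranking candidate designs by such an expected-utility estimate, rather than by a classical alphabetic optimality criterion, is the core of the decision-theoretic framing.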

    Lessons in uncertainty quantification for turbulent dynamical systems


    Design and optimization under uncertainty of Energy Systems

    In many engineering design and optimisation problems, the presence of uncertainty in data and parameters is a central and critical issue. The analysis and design of advanced complex energy systems is generally performed starting from a single operating condition and assuming a series of design and operating parameters as fixed values. However, many of the variables on which the design is based are subject to uncertainty, because they cannot be determined with adequate precision, and they can affect both performance and cost. Uncertainties stem naturally from our limitations in measurements, predictions and manufacturing, and any system used in engineering is subject to some degree of uncertainty. Different fields of engineering describe this uncertainty in different ways and adopt a variety of techniques to approach the problem. The past decade has seen significant growth of research and development in uncertainty quantification methods to analyse the propagation of uncertain inputs through a system. The main challenges in this field are identifying the sources of uncertainty that potentially affect the outcomes, and propagating these uncertainties efficiently from the sources to the quantities of interest, especially when there are many sources of uncertainty. Hence, the level of rigor in uncertainty analysis depends on the quality of the uncertainty quantification method. The main obstacle to this analysis is often the computational effort, because the representative model is typically highly non-linear and complex. Therefore, it is necessary to have a robust tool that can perform the uncertainty propagation through a non-intrusive approach with as few evaluations as possible. The primary goal of this work is to show a robust method for uncertainty quantification applied to energy systems.
The first step in this direction was a study of the uncertainties affecting a recuperator for micro gas turbines, carried out with the Monte Carlo and Response Sensitivity Analysis methodologies. However, when considering more complex energy systems, one of the main weaknesses of uncertainty quantification methods arises: the extremely high computational effort needed. For this reason, the application of a so-called metamodel was found necessary and useful. This approach was applied to perform a complete analysis under uncertainty of a solid oxide fuel cell hybrid system, starting from the evaluation of the impact of several uncertainties on the system up to a robust design including a multi-objective optimization. The response surfaces allowed the authors to account for the uncertainties in the system while performing an acceptable number of simulations. These response surfaces were then used to perform a Monte Carlo simulation to evaluate the impact of the uncertainties on the monitored outputs, giving insight into the spread of the resulting probability density functions and thus into which outputs should be considered more carefully during the design phase. Finally, the analysis of a complex combined cycle with a flue gas condensing heat pump subject to market uncertainties was performed. To account for the uncertainty in the electricity price, which directly impacts the revenues of the system, a statistical study of the behaviour of this price over the years was performed. From the data obtained it was possible to create a probability density function for each hour of the day representing its behaviour, and those distributions were then used to analyze the variability of the system in terms of revenues and emissions.
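The metamodel idea described above can be sketched in a few lines: fit a cheap response surface to a handful of expensive runs, then run the Monte Carlo on the surface instead of the full model. The `expensive_model` function below is a fabricated placeholder, not one of the energy-system models from the thesis, and the quadratic interpolant is just one simple choice of response surface.

```python
import math
import random
import statistics

def expensive_model(x):
    # Fabricated placeholder for a costly plant simulation.
    return math.exp(-x) * (1.0 + 0.5 * x)

# A handful of "expensive" training runs
train_x = [0.0, 0.5, 1.0]
train_y = [expensive_model(x) for x in train_x]

def surrogate(x):
    # Quadratic Lagrange interpolant through the three training runs:
    # the cheap response surface used in place of the full model.
    result = 0.0
    for i, (xi, yi) in enumerate(zip(train_x, train_y)):
        term = yi
        for j, xj in enumerate(train_x):
            if i != j:
                term *= (x - xj) / (xi - xj)
        result += term
    return result

# Monte Carlo on the surrogate: thousands of cheap evaluations,
# but only three runs of the expensive model.
random.seed(7)
samples = [surrogate(random.uniform(0.0, 1.0)) for _ in range(20_000)]
print(statistics.fmean(samples), statistics.stdev(samples))
```

The accuracy of the propagated statistics is then limited by how well the response surface approximates the full model over the uncertain input range, which is why the thesis pairs the metamodel with a careful choice of training points.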

    Uncertainty-aware Validation Benchmarks for Coupling Free Flow and Porous-Medium Flow

    A correct choice of interface conditions and useful model parameters for coupled free-flow and porous-medium systems is vital for physically consistent modeling and accurate numerical simulations. We consider the Stokes--Darcy problem with different models for the porous-medium compartment and corresponding coupling strategies: the standard averaged model based on Darcy's law with classical or generalized interface conditions, as well as the pore-network model. We study the coupled flow problems' behavior for a benchmark case in which a pore-scale resolved model provides the reference solution, and we quantify the uncertainties in the models' parameters and the reference data. To achieve this, we apply a statistical framework that incorporates a probabilistic modeling technique using a fully Bayesian approach. A Bayesian perspective on a validation task yields an optimal bias-variance trade-off against the reference data and provides an integrative metric for model validation that incorporates parameter and conceptual uncertainty. Additionally, a model reduction technique, namely Bayesian Sparse Polynomial Chaos Expansion, is employed to accelerate the calibration and validation processes for the computationally demanding Stokes--Darcy models with different coupling strategies. We perform uncertainty-aware validation, demonstrate each model's predictive capabilities, and compare the models using a Bayesian validation metric.
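The fully Bayesian calibration step described above can be sketched with a toy grid posterior: given reference data and a model with one uncertain parameter, the posterior weights every parameter value by how well it reproduces the data. The reference values, the identity-style toy model, and the noise level below are invented for illustration and unrelated to the actual Stokes--Darcy setup.

```python
import math

# Invented reference data (imagine pore-scale resolved observations)
reference = [0.95, 1.02, 0.99, 1.05]
noise_std = 0.05  # assumed measurement-noise standard deviation

def model_prediction(k):
    # Toy model: the prediction equals the uncertain parameter itself.
    return k

# Discretized uniform prior over the parameter k in [0.5, 1.5]
grid = [0.5 + 0.001 * i for i in range(1001)]

def log_likelihood(k):
    pred = model_prediction(k)
    return sum(-0.5 * ((y - pred) / noise_std) ** 2 for y in reference)

# Flat prior, so the posterior is proportional to the likelihood
log_post = [log_likelihood(k) for k in grid]
max_lp = max(log_post)                         # subtract max for stability
weights = [math.exp(lp - max_lp) for lp in log_post]
total = sum(weights)
posterior_mean = sum(k * w for k, w in zip(grid, weights)) / total
print(posterior_mean)
```

For the actual Stokes--Darcy models, each likelihood evaluation requires an expensive forward solve, which is why the paper substitutes a Bayesian Sparse Polynomial Chaos Expansion surrogate before calibrating and validating.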

    Methodology for technology evaluation under uncertainty and its application in advanced coal gasification processes

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Chemical Engineering, 2011. Cataloged from PDF version of thesis. Includes bibliographical references (p. 273-287). Integrated gasification combined cycle (IGCC) technology has attracted interest as a cleaner alternative to conventional coal-fired power generation processes. While a number of pilot projects have been launched to experimentally test IGCC technologies, mathematical simulation remains a central part of the ongoing research efforts. A major challenge in modeling an IGCC power plant is the lack of real experience and reliable data. It is critical to properly understand the state of knowledge and evaluate the impact of uncertainty in every phase of the R&D process. A rigorous investigation of the effect of uncertainty on an IGCC system requires accurate quantification of input uncertainty and efficient propagation of uncertainty through system models. This thesis proposes several uncertainty quantification methods which expand the sources of information that can be used for parameter estimation. Key features of these methods include the use of entropy maximization to translate subjective opinions into probability distribution functions, and a more flexible probability model that easily captures anomalies associated with small-sample data. In addition, Bayesian estimation is extended to dynamic models. Aided by a computationally efficient algorithm, termed the sequential Monte Carlo method, the Bayesian approach is shown to be an effective way to estimate time-variant parameters. Uncertainty propagation is performed using the deterministic equivalent modeling method (DEMM), which is based on polynomial chaos representation of random variables and a probabilistic collocation algorithm. One major issue often overlooked in the analysis of IGCC models is the representation of correlation in the input parameters.
This thesis proposes the use of principal component analysis (PCA) to represent correlated random variables. The resulting formulation is the same as the truncated Karhunen-Loève expansion. Explicit incorporation of correlation not only improves the accuracy of the approximation but also reduces the overall computational time. A comprehensive study of the MIT-BP IGCC model is carried out to determine the uncertainties of the key measures of performance and cost, including energy output, thermal efficiency, CO2 emissions, plant capital cost, and cost of electricity. Whenever possible, the probability distributions of input parameters are estimated from realistic data. Experts' judgments are solicited when data acquisition is infeasible. Uncertainty analysis is conducted in a three-step approach. First, technology-related input parameters are taken into account to determine uncertainties in plant performance. Second, cost uncertainties are determined with only economic inputs in order to identify important economic parameters. Finally, the plant model is integrated with the cost model, and the two are evaluated with the key technical and economic inputs identified in the previous steps. Our study indicates that the properties of the coal feed have a substantial impact on the energy production of the IGCC plant, and subsequently on the cost of electricity. Immature technologies such as gasification and the gas turbine have an important bearing on model performance and hence need to be addressed in future research. by Bo Gong, Ph.D.
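The PCA representation of correlated inputs can be sketched for two variables: diagonalize the covariance matrix, then drive the principal components with independent standard normals, as in a truncated Karhunen-Loève expansion. The 2x2 covariance values below are made up for illustration and are not from the IGCC study.

```python
import math
import random

# Made-up 2x2 covariance for two correlated inputs: cov = [[a, b], [b, c]]
a, b, c = 1.0, 0.6, 1.0

# Eigendecomposition of the symmetric 2x2 covariance (the PCA step)
half_trace = (a + c) / 2.0
radius = math.sqrt(((a - c) / 2.0) ** 2 + b ** 2)
lam1, lam2 = half_trace + radius, half_trace - radius  # eigenvalues
theta = 0.5 * math.atan2(2.0 * b, a - c)               # principal-axis angle
v1 = (math.cos(theta), math.sin(theta))                # orthonormal
v2 = (-math.sin(theta), math.cos(theta))               # eigenvectors

def sample_correlated(rng):
    # Independent standard normals drive the principal components;
    # the eigenvector combination restores the target correlation.
    z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    x = math.sqrt(lam1) * z1 * v1[0] + math.sqrt(lam2) * z2 * v2[0]
    y = math.sqrt(lam1) * z1 * v1[1] + math.sqrt(lam2) * z2 * v2[1]
    return x, y

rng = random.Random(0)
pairs = [sample_correlated(rng) for _ in range(50_000)]
cov_xy = sum(x * y for x, y in pairs) / len(pairs)
print(cov_xy)  # sample covariance should approach the target 0.6
```

Truncating the expansion to the components with the largest eigenvalues is what reduces the stochastic dimension, and with it the number of collocation points the propagation method needs.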