Sensitivity analysis of expensive black-box systems using metamodeling
Simulations are becoming ever more common as a tool for designing complex
products. Sensitivity analysis techniques can be applied to these simulations
to gain insight, or to reduce the complexity of the problem at hand. However,
these simulators are often expensive to evaluate, and sensitivity analysis
typically requires a large number of evaluations. Metamodeling has been
applied successfully in the past to reduce the number of evaluations required
for design tasks such as optimization and design space exploration. In this
paper, we propose a novel sensitivity analysis algorithm for variance- and
derivative-based indices using sequential sampling and metamodeling. Several
stopping criteria are proposed and investigated to keep the total number of
evaluations minimal. The results show that both variance and derivative based
techniques can be accurately computed with a minimal amount of evaluations
using fast metamodels and FLOLA-Voronoi or density sequential sampling
algorithms.
Comment: proceedings of the Winter Simulation Conference 201
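The variance-based indices mentioned in this abstract can be sketched with a standard Saltelli pick-freeze estimator evaluated on a cheap function. Here the Ishigami benchmark stands in for a fitted metamodel; the function choice and sample sizes are illustrative assumptions, not the paper's FLOLA-Voronoi setup.

```python
import numpy as np

def surrogate(x):
    # Ishigami function, a common sensitivity-analysis benchmark, used here
    # as a stand-in for a fast metamodel of an expensive simulator.
    a, b = 7.0, 0.1
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

def first_order_sobol(f, d, n, rng):
    # Two independent sample matrices on the input domain [-pi, pi]^d.
    A = rng.uniform(-np.pi, np.pi, (n, d))
    B = rng.uniform(-np.pi, np.pi, (n, d))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]  # "pick-freeze": swap only column i
        # Saltelli (2010) first-order estimator.
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S

rng = np.random.default_rng(0)
S = first_order_sobol(surrogate, d=3, n=100_000, rng=rng)
print(S)  # x3 has a (near-)zero first-order effect for the Ishigami function
```

Because the surrogate is cheap, the many evaluations this estimator needs cost almost nothing, which is the point of metamodel-based sensitivity analysis.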
Uncertainty quantification of coal seam gas production prediction using Polynomial Chaos
A surrogate model approximates a computationally expensive solver. Polynomial
Chaos is a method to construct surrogate models by summing combinations of
carefully chosen polynomials. The polynomials are chosen to respect the
probability distributions of the uncertain input variables (parameters); this
allows for both uncertainty quantification and global sensitivity analysis.
In this paper we apply these techniques to a commercial solver for the
estimation of peak gas rate and cumulative gas extraction from a coal seam gas
well. The polynomial expansion is shown to honour the underlying geophysics
with low error when compared to a much more complex and computationally slower
commercial solver. We make use of advanced numerical integration techniques to
achieve this accuracy using relatively small amounts of training data.
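The construction described above can be illustrated in one dimension: for a uniform input on [-1, 1], Legendre polynomials are orthogonal with respect to the input density, so a least-squares fit of their coefficients yields a surrogate whose variance decomposes analytically. The target function below is a placeholder, not the coal seam gas solver.

```python
import numpy as np
from numpy.polynomial import legendre

def model(x):
    return np.exp(x) * np.sin(2 * x)  # placeholder "expensive" solver

deg = 10
rng = np.random.default_rng(1)
x_train = rng.uniform(-1, 1, 200)  # relatively small training set
y_train = model(x_train)

# Design matrix of Legendre polynomials P_0..P_deg at the training points.
Phi = legendre.legvander(x_train, deg)
coef, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

# Orthogonality: E[P_k(X)^2] = 1/(2k+1) for X ~ U(-1, 1), so the surrogate
# variance is sum_{k>=1} c_k^2 / (2k+1); c_0 is the mean.
k = np.arange(deg + 1)
pce_var = np.sum(coef[1:] ** 2 / (2 * k[1:] + 1))

# Validate the surrogate against the solver on held-out points.
x_test = np.linspace(-1, 1, 1000)
err = np.max(np.abs(legendre.legval(x_test, coef) - model(x_test)))
print(pce_var, err)
```

In higher dimensions the same coefficient-squared sums, grouped by which inputs each basis polynomial involves, give the Sobol' indices directly, which is what makes Polynomial Chaos attractive for global sensitivity analysis.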
Surrogate-based Global Sensitivity Analysis of ADM1-based Anaerobic Digestion Model
To calibrate the model parameters, sensitivity analysis routines are needed to rank the parameters by relevance and to fix the least influential factors at nominal values. Despite the large number of works based on ADM1, very few address sensitivity analysis. In this study, Global Sensitivity Analysis (GSA) and Uncertainty Quantification (UQ) for an ADM1-based Anaerobic Digestion Model have been performed. The modified ADM1-based model selected in this study was presented by Esposito and co-authors in 2013. Unlike the first version of ADM1, which focused on sewage sludge degradation, Esposito's model targets the digestion of the organic fraction of municipal solid waste. It is recalled that in many applications hydrolysis is considered the bottleneck of the overall anaerobic digestion process when the input substrate consists of complex organic matter. Esposito's model introduces a surface-based kinetic approach for the disintegration of complex organic matter, which better models the disintegration step by taking into account the effect of particle size distribution on the digestion process. This model therefore needs GSA and UQ to pave the way for further improvements and to reach a deeper understanding of the main processes and leading input factors. Due to the large number of parameters to be analyzed, a preliminary screening analysis with the Morris method has been conducted. Since two quantities of interest (QoI) have been considered, the initial screening has been performed twice, obtaining two sets of parameters containing the factors most influential in determining the value of each QoI. A surrogate of the ADM1 model has been defined for the two quantities of interest. The output of the surrogate model has been analyzed with Sobol' indices for the quantitative GSA. Finally, uncertainty quantification has been performed.
By adopting kernel smoothing techniques, the probability density function of each quantity of interest has been estimated.
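The Morris screening step mentioned above can be sketched as follows: random one-at-a-time trajectories yield elementary effects per factor, whose mean absolute value (mu*) ranks influence and whose standard deviation (sigma) flags nonlinearity or interactions. The three-input test model is a placeholder, not ADM1.

```python
import numpy as np

def model(x):
    # Placeholder QoI: x0 strong and linear, x1 nonlinear, x2 inert.
    return 4.0 * x[0] + 3.0 * x[1] ** 2 + 0.0 * x[2]

def morris(f, d, r, delta=0.5, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    effects = np.zeros((r, d))
    for t in range(r):
        x = rng.uniform(0, 1 - delta, d)  # base point keeps x + delta in [0, 1]
        y = f(x)
        for i in rng.permutation(d):      # perturb one factor at a time
            x2 = x.copy()
            x2[i] += delta
            y2 = f(x2)
            effects[t, i] = (y2 - y) / delta  # elementary effect of factor i
            x, y = x2, y2                     # continue the trajectory
    mu_star = np.abs(effects).mean(axis=0)    # ranks factor influence
    sigma = effects.std(axis=0)               # flags nonlinearity/interactions
    return mu_star, sigma

mu_star, sigma = morris(model, d=3, r=50, rng=np.random.default_rng(3))
print(mu_star, sigma)  # x2's mu* is ~0: a candidate to fix at its nominal value
```

Factors with small mu* can be fixed at nominal values, shrinking the parameter set before the more expensive Sobol' analysis on the surrogate.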
Reliability-based design optimization using kriging surrogates and subset simulation
The aim of the present paper is to develop a strategy for solving
reliability-based design optimization (RBDO) problems that remains applicable
when the performance models are expensive to evaluate. Starting with the
premise that simulation-based approaches are not affordable for such problems,
and that most-probable-failure-point-based approaches do not allow one to
quantify the error in the estimated failure probability, an approach
based on both metamodels and advanced simulation techniques is explored. The
kriging metamodeling technique is chosen in order to surrogate the performance
functions because it allows one to genuinely quantify the surrogate error. The
surrogate error on the limit-state surfaces is propagated to the failure
probability estimates in order to provide an empirical error measure. This
error is then sequentially reduced by means of a population-based adaptive
refinement technique until the kriging surrogates are accurate enough for
reliability analysis. This original refinement strategy makes it possible to
add several observations in the design of experiments at the same time.
Reliability and reliability sensitivity analyses are performed by means of the
subset simulation technique for the sake of numerical efficiency. The adaptive
surrogate-based strategy for reliability estimation is finally embedded in a
classical gradient-based optimization algorithm in order to solve the RBDO
problem. The kriging surrogates are built in a so-called augmented reliability
space thus making them reusable from one nested RBDO iteration to the other.
The strategy is compared to other approaches available in the literature on
three academic examples in the field of structural mechanics.
Comment: 20 pages, 6 figures, 5 tables. Preprint submitted to Springer-Verla
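The subset simulation step used above for numerical efficiency can be sketched in the standard Au–Beck form: the small failure probability P[g(X) <= 0] is written as a product of larger conditional probabilities, each estimated with modified-Metropolis sampling. The linear limit state g(x) = 3.5 - x1 with standard normal inputs is an assumption chosen because its exact probability, Phi(-3.5) ≈ 2.3e-4, is known; it is not one of the paper's examples.

```python
import numpy as np

def g(x):
    return 3.5 - x[:, 0]  # failure when x1 >= 3.5; exact P = Phi(-3.5)

def subset_simulation(g, d, n=2000, p0=0.1, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    x = rng.standard_normal((n, d))   # crude Monte Carlo at the first level
    y = g(x)
    prob = 1.0
    for _ in range(20):               # cap on the number of levels
        thresh = np.quantile(y, p0)   # intermediate threshold
        if thresh <= 0:
            return prob * np.mean(y <= 0)  # final level reached
        prob *= p0
        seeds = x[y <= thresh]
        cur_x, cur_y = seeds, y[y <= thresh]
        xs, ys = [cur_x], [cur_y]
        # Modified Metropolis: grow chains from the seeds, conditional on
        # staying inside the current intermediate failure domain.
        for _ in range(int(np.ceil(n / len(seeds))) - 1):
            cand = cur_x + 0.8 * rng.standard_normal(cur_x.shape)
            # Component-wise standard-normal Metropolis acceptance.
            ratio = np.exp(0.5 * (cur_x ** 2 - cand ** 2))
            acc = rng.uniform(size=cur_x.shape) < ratio
            prop = np.where(acc, cand, cur_x)
            py = g(prop)
            ok = py <= thresh          # reject moves that leave the level
            cur_x = np.where(ok[:, None], prop, cur_x)
            cur_y = np.where(ok, py, cur_y)
            xs.append(cur_x)
            ys.append(cur_y)
        x = np.concatenate(xs)[:n]
        y = np.concatenate(ys)[:n]
    return prob * np.mean(y <= 0)

p = subset_simulation(g, d=2, n=2000, p0=0.1, rng=np.random.default_rng(4))
print(p)  # should be of the order of Phi(-3.5), i.e. a few 1e-4
```

Each level only needs to estimate a probability around p0 = 0.1, so far fewer samples are required than direct Monte Carlo would need for a 1e-4 event; in the paper's setting the limit state g would itself be a kriging surrogate.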