Computing derivative-based global sensitivity measures using polynomial chaos expansions
In the field of computer experiments sensitivity analysis aims at quantifying
the relative importance of each input parameter (or combinations thereof) of a
computational model with respect to the model output uncertainty. Variance
decomposition methods leading to the well-known Sobol' indices are recognized
as accurate techniques, albeit at a rather high computational cost. The use of
polynomial chaos expansions (PCE) to compute Sobol' indices has alleviated
this computational burden. However, when dealing with high-dimensional
input vectors, it is good practice to first use screening methods
in order to discard unimportant variables. The {\em derivative-based global
sensitivity measures} (DGSM) have been developed recently in this respect. In
this paper we show how polynomial chaos expansions may be used to compute
DGSMs analytically as a mere post-processing step. This requires the analytical
derivation of derivatives of the orthonormal polynomials which enter PC
expansions. The efficiency of the approach is illustrated on two well-known
benchmark problems in sensitivity analysis.
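For illustration, the DGSM of input x_i is the quantity nu_i = E[(df/dx_i)^2]. The paper derives these analytically from the PCE coefficients; the sketch below instead estimates the same quantities by plain Monte Carlo on the Ishigami function, a standard sensitivity-analysis benchmark, using its closed-form partial derivatives. The choice of benchmark, parameter values and sample size here are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 7.0, 0.1
n = 100_000
# Ishigami inputs are i.i.d. uniform on [-pi, pi]
x = rng.uniform(-np.pi, np.pi, size=(n, 3))

# Closed-form partial derivatives of the Ishigami function
# f(x) = sin(x1) + a*sin(x2)**2 + b*x3**4*sin(x1)
d1 = (1.0 + b * x[:, 2] ** 4) * np.cos(x[:, 0])
d2 = a * np.sin(2.0 * x[:, 1])
d3 = 4.0 * b * x[:, 2] ** 3 * np.sin(x[:, 0])

# DGSM: nu_i = E[(df/dx_i)^2], estimated by Monte Carlo
nu = [np.mean(d ** 2) for d in (d1, d2, d3)]
print(nu)
```

The PCE route described in the abstract replaces this sampling with exact expectations of the squared derivative polynomials, so no extra model runs are needed beyond those used to build the expansion.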
Global sensitivity analysis for stochastic simulators based on generalized lambda surrogate models
Global sensitivity analysis aims at quantifying the impact of input
variability on the variation of the response of a computational model. It has
been widely applied to deterministic simulators, for which a set of input
parameters has a unique corresponding output value. Stochastic simulators,
however, have intrinsic randomness due to their use of (pseudo)random numbers,
so they give different results when run twice with the same input parameters
but non-common random numbers. Due to this random nature, conventional Sobol'
indices, used in global sensitivity analysis, can be extended to stochastic
simulators in different ways. In this paper, we discuss three possible
extensions and focus on those that depend only on the statistical dependence
between input and output. This choice ignores the detailed data generating
process involving the internal randomness, and can thus be applied to a wider
class of problems. We propose to use the generalized lambda model to emulate
the response distribution of stochastic simulators. Such a surrogate can be
constructed without the need for replications. The proposed method is applied
to three examples including two case studies in finance and epidemiology. The
results confirm the convergence of the approach for estimating the sensitivity
indices even with the presence of strong heteroskedasticity and small
signal-to-noise ratio.
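To make one of the possible extensions concrete: the simplest treats the simulator's internal randomness as an extra input that is redrawn at every call, after which first-order Sobol' indices can be estimated by a standard pick-freeze scheme. The toy simulator, input distributions and sample size below are illustrative assumptions; the generalized lambda surrogate itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulator(x1, x2):
    """Toy stochastic simulator: fresh internal noise at every call."""
    return x1 + 2.0 * x2 + rng.normal(0.0, 0.5, size=x1.shape)

n = 200_000
x = rng.uniform(0.0, 1.0, size=(n, 2))   # base sample
xp = rng.uniform(0.0, 1.0, size=(n, 2))  # independent resample

y = simulator(x[:, 0], x[:, 1])

def first_order_index(i):
    # "Pick-freeze": keep column i, resample the other input
    # (the internal noise is implicitly resampled too)
    xi = xp.copy()
    xi[:, i] = x[:, i]
    yi = simulator(xi[:, 0], xi[:, 1])
    return (np.mean(y * yi) - np.mean(y) * np.mean(yi)) / np.var(y)

S = [first_order_index(i) for i in range(2)]
print(S)  # the noise counts as unexplained variance, so S1 + S2 < 1
```

Because the noise variance enters the denominator, the indices sum to less than one; replication-free surrogates of the full response distribution, as proposed in the abstract, aim at the same target without brute-force sampling.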
Metamodel-based importance sampling for the simulation of rare events
In the field of structural reliability, the Monte Carlo estimator is
considered the reference probability estimator. However, it remains
intractable for real engineering cases since it requires a high number of runs
of the model. In order to reduce the number of computer experiments, many other
approaches known as reliability methods have been proposed. One such approach
consists in replacing the original model by a surrogate which is much
faster to evaluate. Nevertheless, it is often difficult (or even impossible) to
quantify the error made by this substitution. In this paper an alternative
approach is developed. It takes advantage of the kriging meta-modeling and
importance sampling techniques. The proposed alternative estimator is finally
applied to a finite element based structural reliability analysis.

Comment: 8 pages, 3 figures, 1 table. Preprint submitted to ICASP11,
Mini-symposia entitled "Meta-models/surrogate models for uncertainty
propagation, sensitivity and reliability analysis".
Metamodel-based importance sampling for structural reliability analysis
Structural reliability methods aim at computing the probability of failure of
systems with respect to some prescribed performance functions. In modern
engineering, evaluating such functions usually requires running an
expensive-to-evaluate computational model (e.g. a finite element model). In this
respect, simulation methods, which may require a large number of model runs,
cannot be used directly. Surrogate
models such as quadratic response surfaces, polynomial chaos expansions or
kriging (which are built from a limited number of runs of the original model)
are then introduced as a substitute for the original model to cope with the
computational cost. In practice it is almost impossible to quantify the error
made by this substitution though. In this paper we propose to use a kriging
surrogate of the performance function as a means to build a quasi-optimal
importance sampling density. The probability of failure is eventually obtained
as the product of an augmented probability computed by substituting the
meta-model for the original performance function and a correction term which
ensures that there is no bias in the estimation even if the meta-model is not
fully accurate. The approach is applied to analytical and finite element
reliability problems and proves efficient up to 100 random variables.

Comment: 20 pages, 7 figures, 2 tables. Preprint submitted to Probabilistic
Engineering Mechanics.
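The product structure described in the abstract, P_f = P_f_eps * alpha_corr, can be sketched numerically. Below, a deliberately offset surrogate plays the role of the meta-model, and a logistic smoothing of its failure indicator stands in for the kriging-based probabilistic classifier (an assumption for self-containedness; the paper builds it from the kriging mean and variance). The augmented probability uses many cheap surrogate calls, while the correction term needs only a few calls to the "expensive" model.

```python
import math
import numpy as np

rng = np.random.default_rng(3)
beta = 3.5

def g(x):       # "expensive" true performance function (toy)
    return beta - (x[:, 0] + x[:, 1]) / math.sqrt(2.0)

def g_hat(x):   # deliberately crude surrogate of g (offset by 0.2)
    return g(x) + 0.2

def pi(x, s=0.3):
    # Smoothed failure indicator: stand-in for the kriging-based
    # probabilistic classifier P[G_hat(x) <= 0]
    return 1.0 / (1.0 + np.exp(g_hat(x) / s))

mu = np.array([beta / math.sqrt(2.0)] * 2)   # shift toward the design point
n1, n2 = 50_000, 500                         # cheap vs expensive budget

xq = rng.standard_normal((n1, 2)) + mu
# Self-normalized weights for h(x) ∝ pi(x) f(x), sampled via proposal N(mu, I)
w = pi(xq) * np.exp(-xq @ mu + 0.5 * mu @ mu)

# Augmented probability P_f_eps = E_f[pi(X)]: surrogate calls only
p_eps = np.mean(w)

# Correction term alpha = E_h[1{g<=0} / pi]: resample ~h, few true-model calls
idx = rng.choice(n1, size=n2, p=w / w.sum())
xs = xq[idx]
alpha = np.mean((g(xs) <= 0.0) / pi(xs))

p_hat = p_eps * alpha
p_exact = 0.5 * math.erfc(beta / math.sqrt(2.0))  # = Phi(-beta)
print(p_hat, p_exact)
```

The point of the decomposition is budget allocation: the smoothed classifier is evaluated n1 times at negligible cost, while the unbiasedness-restoring correction consumes only n2 expensive runs, and the correction factor drifts away from one exactly when the surrogate is inaccurate.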