Sequential design of computer experiments for the estimation of a probability of failure
This paper deals with the problem of estimating the volume of the excursion
set of a function above a given threshold,
under a probability measure that is assumed to be known. In
the industrial world, this corresponds to the problem of estimating a
probability of failure of a system. When only an expensive-to-simulate model of
the system is available, the budget for simulations is usually severely limited
and therefore classical Monte Carlo methods ought to be avoided. One of the
main contributions of this article is to derive SUR (stepwise uncertainty
reduction) strategies from a Bayesian-theoretic formulation of the problem of
estimating a probability of failure. These sequential strategies use a Gaussian
process model of the function and aim at performing evaluations as efficiently as
possible to infer the value of the probability of failure. We compare these
strategies to other strategies also based on a Gaussian process model for
estimating a probability of failure.
Comment: This is an author-generated postprint version. The published version is available at http://www.springerlink.co
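The SUR idea can be illustrated with a deliberately simplified criterion: evaluate the expensive function next wherever the Gaussian process posterior is most ambiguous about whether the threshold is exceeded. This is only a sketch in that spirit, not the paper's actual SUR criterion, and all names below are illustrative:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def misclassification_prob(mu, sigma, threshold):
    # p * (1 - p), where p is the posterior probability that the function
    # lies below the threshold; maximal where the classification is most
    # ambiguous (posterior mean near the threshold, large variance).
    p = norm_cdf((threshold - mu) / sigma)
    return p * (1.0 - p)

def next_evaluation_point(candidates, mu, sigma, threshold):
    # Pick the candidate with the largest classification ambiguity,
    # a simple stand-in for a full stepwise-uncertainty-reduction criterion.
    scores = [misclassification_prob(m, s, threshold)
              for m, s in zip(mu, sigma)]
    return candidates[scores.index(max(scores))]
```

Here `mu` and `sigma` stand for the Gaussian process posterior mean and standard deviation at each candidate point, which a real implementation would obtain from a fitted GP model.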
Bayesian Subset Simulation: a kriging-based subset simulation algorithm for the estimation of small probabilities of failure
The estimation of small probabilities of failure from computer simulations is
a classical problem in engineering, and the Subset Simulation algorithm
proposed by Au & Beck (Prob. Eng. Mech., 2001) has become one of the most
popular methods to solve it. Subset simulation has been shown to provide
significant savings in the number of simulations to achieve a given accuracy of
estimation, with respect to many other Monte Carlo approaches. The number of
simulations remains quite high, however, and this method can be
impractical for applications where an expensive-to-evaluate computer model is
involved. We propose a new algorithm, called Bayesian Subset Simulation, that
takes the best from the Subset Simulation algorithm and from sequential
Bayesian methods based on kriging (also known as Gaussian process modeling).
The performance of this new algorithm is illustrated using a test case from the
literature. We are able to report promising results. In addition, we provide a
numerical study of the statistical properties of the estimator.
Comment: 11th International Probabilistic Assessment and Management Conference (PSAM11) and The Annual European Safety and Reliability Conference (ESREL 2012), Helsinki, Finland (2012)
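Stripped of the kriging layer, the subset-splitting backbone of such algorithms can be sketched as follows. The one-step Metropolis move, the fixed level cap, and all parameter choices are simplifications for illustration, not the authors' implementation:

```python
import math
import random

def subset_simulation(f, dim, threshold, n=1000, p0=0.1, seed=0):
    # Split the rare event {f(x) > threshold} into nested events defined by
    # adaptive intermediate thresholds, each of conditional probability ~p0,
    # for standard normal inputs of dimension `dim`.
    rng = random.Random(seed)
    samples = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n)]
    prob = 1.0
    for _ in range(50):  # cap on the number of levels
        values = [f(x) for x in samples]
        order = sorted(range(n), key=lambda i: values[i], reverse=True)
        keep = order[: int(p0 * n)]
        level = values[order[int(p0 * n)]]  # adaptive intermediate threshold
        if level >= threshold:
            # Final level: count exceedances of the true threshold.
            prob *= sum(v > threshold for v in values) / n
            return prob
        prob *= p0
        # Re-populate from the survivors with a single random-walk
        # Metropolis step targeting the conditional distribution.
        survivors = [samples[i] for i in keep]
        samples = []
        while len(samples) < n:
            x = rng.choice(survivors)
            y = [xi + rng.gauss(0, 0.5) for xi in x]
            if f(y) > level and rng.random() < math.exp(
                0.5 * (sum(xi * xi for xi in x) - sum(yi * yi for yi in y))
            ):
                samples.append(y)
            else:
                samples.append(x)
    return prob
```

For example, estimating P(X > 3) for a standard normal X (about 1.35e-3) needs only a few levels of conditional probability p0 each, instead of millions of crude Monte Carlo samples.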
Gaussian process surrogates for failure detection: a Bayesian experimental design approach
An important task of uncertainty quantification is to identify the
probability of undesired events, in particular, system failures, caused by
various sources of uncertainties. In this work we consider the construction of
Gaussian process surrogates for failure detection and failure probability
estimation. In particular, we consider the situation that the underlying
computer models are extremely expensive, and in this setting, determining the
sampling points in the state space is of essential importance. We formulate the
problem as an optimal experimental design for Bayesian inferences of the limit
state (i.e., the failure boundary) and propose an efficient numerical scheme to
solve the resulting optimization problem. In particular, the proposed
limit-state inference method is capable of determining multiple sampling points
at a time, and thus it is well suited for problems where multiple computer
simulations can be performed in parallel. The accuracy and performance of the
proposed method are demonstrated by both academic and practical examples.
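The batch-selection idea can be caricatured with a greedy rule: repeatedly pick the candidate whose classification with respect to the limit state is most ambiguous under the surrogate, while keeping the batch points spread apart. This is a hypothetical stand-in for the paper's optimal-experimental-design criterion, with illustrative names and a one-dimensional distance rule:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def select_batch(candidates, mu, sigma, k=3, min_dist=0.5):
    # Greedy batch selection: repeatedly take the candidate closest to the
    # limit state (most ambiguous sign of the performance function g),
    # skipping candidates too close to points already in the batch.
    scores = []
    for m, s in zip(mu, sigma):
        p = norm_cdf(-m / s)  # posterior probability that g <= 0
        scores.append(p * (1.0 - p))
    order = sorted(range(len(candidates)), key=lambda i: -scores[i])
    batch = []
    for i in order:
        if all(abs(candidates[i] - candidates[j]) >= min_dist
               for j in batch):
            batch.append(i)
        if len(batch) == k:
            break
    return [candidates[i] for i in batch]
```

Returning several points at once is what makes the approach compatible with running multiple computer simulations in parallel.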
Bayesian subset simulation
We consider the problem of estimating a probability of failure,
defined as the volume of the excursion set of a function above a given threshold, under a given
probability measure. In this article, we combine the popular
subset simulation algorithm (Au and Beck, Probab. Eng. Mech. 2001) and our
sequential Bayesian approach for the estimation of a probability of failure
(Bect, Ginsbourger, Li, Picheny and Vazquez, Stat. Comput. 2012). This makes it
possible to estimate this probability when the number of evaluations of the function is very
limited and the probability itself is very small. The resulting algorithm is called Bayesian
subset simulation (BSS). A key idea, as in the subset simulation algorithm, is
to estimate the probabilities of a sequence of excursion sets of the function above
intermediate thresholds, using a sequential Monte Carlo (SMC) approach. A
Gaussian process prior on the function is used to define the sequence of densities
targeted by the SMC algorithm, and to drive the selection of evaluation points
used to estimate the intermediate probabilities. Adaptive procedures are
proposed to determine the intermediate thresholds and the number of evaluations
to be carried out at each stage of the algorithm. Numerical experiments
illustrate that BSS achieves significant savings in the number of function
evaluations with respect to other Monte Carlo approaches.
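The role of the Gaussian process in defining the SMC targets can be sketched with a toy reweighting step, where the (unnormalized) target at each stage is taken as the posterior probability of exceeding the current intermediate threshold. The names, and the omission of the prior factor and resampling step, are simplifications:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def reweight(mu, sigma, u_old, u_new):
    # SMC reweighting when the intermediate threshold moves from u_old to
    # u_new: each particle's weight is the ratio of the GP posterior
    # exceedance probabilities under the new and old thresholds.
    w = []
    for m, s in zip(mu, sigma):
        num = norm_cdf((m - u_new) / s)
        den = norm_cdf((m - u_old) / s)
        w.append(num / den if den > 0 else 0.0)
    total = sum(w)
    w = [wi / total for wi in w]
    ess = 1.0 / sum(wi * wi for wi in w)  # effective sample size
    return w, ess
```

In practice the effective sample size is what an adaptive procedure would monitor to decide how far the next intermediate threshold can be pushed.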
Bounding rare event probabilities in computer experiments
We are interested in bounding probabilities of rare events in the context of
computer experiments. These rare events depend on the output of a physical
model with random input variables. Since the model is only known through an
expensive black box function, standard efficient Monte Carlo methods designed
for rare events cannot be used. We then propose a strategy to deal with this
difficulty based on importance sampling methods. This proposal relies on
Kriging metamodeling and is able to achieve sharp upper confidence bounds on
the rare event probabilities. The variability due to the Kriging metamodeling
step is properly taken into account. The proposed methodology is applied to a
toy example and compared to more standard Bayesian bounds. Finally, a
challenging real case study is analyzed. It consists of finding an upper bound
of the probability that the trajectory of an airborne load will collide with
the aircraft that has released it.
Comment: 21 pages, 6 figures
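Setting the kriging metamodeling step aside, the importance-sampling backbone for a Gaussian tail probability looks roughly like this (a textbook sketch with illustrative names, not the paper's estimator):

```python
import math
import random

def importance_sampling_estimate(threshold, n=20000, shift=None, seed=0):
    # Estimate P(X > threshold) for X ~ N(0, 1) by sampling from a normal
    # proposal centered on the threshold and reweighting each exceedance
    # by the likelihood ratio p(x) / q(x).
    rng = random.Random(seed)
    shift = threshold if shift is None else shift
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)
        if x > threshold:
            # phi(x) / phi(x - shift) = exp(shift^2 / 2 - shift * x)
            total += math.exp(0.5 * shift * shift - shift * x)
    return total / n
```

Centering the proposal on the rare-event boundary makes roughly half the samples hit the event, whereas crude Monte Carlo would waste almost all of its budget; the metamodel-based methods in the abstract push this idea further by steering the proposal with the kriging surrogate.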
Reliability-based design optimization using kriging surrogates and subset simulation
The aim of the present paper is to develop a strategy for solving
reliability-based design optimization (RBDO) problems that remains applicable
when the performance models are expensive to evaluate. Starting with the
premise that simulation-based approaches are not affordable for such problems,
and that the most-probable-failure-point-based approaches do not make it possible to
quantify the error in the estimation of the failure probability, an approach
based on both metamodels and advanced simulation techniques is explored. The
kriging metamodeling technique is chosen in order to surrogate the performance
functions because it allows one to genuinely quantify the surrogate error. The
surrogate error on the limit-state surfaces is propagated to the failure
probability estimates in order to provide an empirical error measure. This
error is then sequentially reduced by means of a population-based adaptive
refinement technique until the kriging surrogates are accurate enough for
reliability analysis. This original refinement strategy makes it possible to
add several observations in the design of experiments at the same time.
Reliability and reliability sensitivity analyses are performed by means of the
subset simulation technique for the sake of numerical efficiency. The adaptive
surrogate-based strategy for reliability estimation is finally embedded in a
classical gradient-based optimization algorithm in order to solve the RBDO
problem. The kriging surrogates are built in a so-called augmented reliability
space thus making them reusable from one nested RBDO iteration to the other.
The strategy is compared to other approaches available in the literature on
three academic examples in the field of structural mechanics.
Comment: 20 pages, 6 figures, 5 tables. Preprint submitted to Springer-Verlag
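A crude version of such an empirical error measure classifies each Monte Carlo point three times, by the kriging mean and by the mean shifted by plus or minus k standard deviations, to bracket the failure probability estimate. The exact construction in the paper may differ; this is a minimal sketch with illustrative names:

```python
def failure_probability_bounds(mu, sigma, k=1.96):
    # mu, sigma: kriging mean and standard deviation of the performance
    # function g at each Monte Carlo point; failure is the event g <= 0.
    # Shifting the mean by +/- k*sigma gives a pessimistic and an
    # optimistic classification, bracketing the plug-in estimate.
    n = len(mu)
    p_hat = sum(m <= 0 for m in mu) / n
    p_low = sum(m + k * s <= 0 for m, s in zip(mu, sigma)) / n
    p_high = sum(m - k * s <= 0 for m, s in zip(mu, sigma)) / n
    return p_low, p_hat, p_high
```

A refinement strategy would then add design points until the gap between the lower and upper fractions is small enough for the reliability analysis at hand.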
Quantile-based optimization under uncertainties using adaptive Kriging surrogate models
Uncertainties are inherent to real-world systems. Taking them into account is
crucial in industrial design problems and this might be achieved through
reliability-based design optimization (RBDO) techniques. In this paper, we
propose a quantile-based approach to solve RBDO problems. We first transform
the safety constraints usually formulated as admissible probabilities of
failure into constraints on quantiles of the performance criteria. In this
formulation, the quantile level controls the degree of conservatism of the
design. Starting with the premise that industrial applications often involve
high-fidelity and time-consuming computational models, the proposed approach
makes use of Kriging surrogate models (a.k.a. Gaussian process modeling).
Thanks to the Kriging variance (a measure of the local accuracy of the
surrogate), we derive a procedure with two stages of enrichment of the design
of computer experiments (DoE) used to construct the surrogate model. The first
stage globally reduces the Kriging epistemic uncertainty and adds points in the
vicinity of the limit-state surfaces describing the system performance to be
attained. The second stage locally checks, and if necessary, improves the
accuracy of the quantiles estimated along the optimization iterations.
Applications to three analytical examples and to the optimal design of a car
body subsystem (minimal mass under mechanical safety constraints) show the
accuracy and the remarkable efficiency brought by the proposed procedure.
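The bracketing idea behind the second (local) enrichment stage can be sketched by comparing the empirical quantile of the surrogate mean predictions with the quantiles of the mean shifted by plus or minus k standard deviations; a wide bracket would trigger further enrichment of the DoE. The names and the simple plug-in quantile rule are illustrative, not the paper's procedure:

```python
def quantile_with_uncertainty(mu, sigma, level=0.95, k=1.96):
    # mu, sigma: Kriging mean and standard deviation of the performance
    # criterion at a Monte Carlo sample of the random inputs.
    def quantile(values):
        v = sorted(values)
        idx = min(int(level * len(v)), len(v) - 1)
        return v[idx]
    q_mid = quantile(mu)
    q_low = quantile([m - k * s for m, s in zip(mu, sigma)])
    q_high = quantile([m + k * s for m, s in zip(mu, sigma)])
    return q_low, q_mid, q_high
```

The quantile level plays the role described in the abstract: raising it makes the design more conservative, while the width of the (q_low, q_high) bracket measures how much the surrogate's epistemic uncertainty contaminates the quantile estimate.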