Sequential design of computer experiments for the estimation of a probability of failure
This paper deals with the problem of estimating the volume of the excursion set of a function f above a given threshold, under a probability measure on the input space that is assumed to be known. In
the industrial world, this corresponds to the problem of estimating a
probability of failure of a system. When only an expensive-to-simulate model of
the system is available, the budget for simulations is usually severely limited
and therefore classical Monte Carlo methods ought to be avoided. One of the
main contributions of this article is to derive SUR (stepwise uncertainty
reduction) strategies from a Bayesian-theoretic formulation of the problem of
estimating a probability of failure. These sequential strategies use a Gaussian process model of f and aim at performing evaluations of f as efficiently as possible to infer the value of the probability of failure. We compare these
strategies to other strategies also based on a Gaussian process model for
estimating a probability of failure.
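As a concrete illustration of this family of strategies, the sketch below sequentially evaluates the candidate point whose sign relative to the threshold is most uncertain under the current Gaussian process posterior. This is a simple member of the family, not the specific SUR criteria derived in the paper; the toy function f, threshold u, candidate grid, and uniform input measure are all illustrative assumptions.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    f = lambda x: np.sin(3 * x) + 0.5 * x          # expensive model (toy stand-in)
    u = 0.8                                         # failure threshold
    X_cand = np.linspace(0, 3, 300).reshape(-1, 1)  # candidate evaluation points

    X = rng.uniform(0, 3, (5, 1))                   # small initial design
    y = f(X).ravel()

    for _ in range(15):
        gp = GaussianProcessRegressor(kernel=RBF(0.5), alpha=1e-8).fit(X, y)
        mu, sd = gp.predict(X_cand, return_std=True)
        # Probability of misclassifying each candidate w.r.t. the threshold u
        tau = norm.cdf(-np.abs(mu - u) / np.maximum(sd, 1e-12))
        x_new = X_cand[np.argmax(tau)].reshape(1, -1)   # most ambiguous point
        X = np.vstack([X, x_new])
        y = np.append(y, f(x_new).ravel())

    # Plug-in estimate of P(f(X) > u) under a uniform input measure on [0, 3]
    gp = GaussianProcessRegressor(kernel=RBF(0.5), alpha=1e-8).fit(X, y)
    mu, _ = gp.predict(X_cand, return_std=True)
    print("estimated failure probability:", np.mean(mu > u))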
Bayesian Subset Simulation: a kriging-based subset simulation algorithm for the estimation of small probabilities of failure
The estimation of small probabilities of failure from computer simulations is
a classical problem in engineering, and the Subset Simulation algorithm
proposed by Au & Beck (Prob. Eng. Mech., 2001) has become one of the most popular methods for solving it. Subset Simulation has been shown to provide significant savings, compared with many other Monte Carlo approaches, in the number of simulations needed to achieve a given estimation accuracy. The number of simulations remains quite high, however, and the method can be impractical for applications where an expensive-to-evaluate computer model is
involved. We propose a new algorithm, called Bayesian Subset Simulation, that
takes the best from the Subset Simulation algorithm and from sequential
Bayesian methods based on kriging (also known as Gaussian process modeling).
The performance of this new algorithm is illustrated using a test case from the
literature. We report promising results. In addition, we provide a
numerical study of the statistical properties of the estimator.
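For context, the sketch below implements plain Subset Simulation with a simple Metropolis kernel in a standard-normal input space; the Bayesian variant proposed in the paper would additionally replace expensive limit-state evaluations with kriging predictions, which is not shown here. The toy limit-state function g and all tuning constants are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)

    def g(x):          # toy limit-state function; failure when g(x) > u
        return x[..., 0] + x[..., 1]

    def subset_simulation(g, u, d=2, n=2000, p0=0.1, max_levels=20):
        """Plain Subset Simulation (Au & Beck, 2001) with a crude Metropolis kernel."""
        x = rng.standard_normal((n, d))      # standard-normal input space assumed
        y = g(x)
        prob = 1.0
        for _ in range(max_levels):
            u_k = np.quantile(y, 1 - p0)     # intermediate threshold
            if u_k >= u:
                return prob * np.mean(y > u)
            prob *= p0
            seeds = x[y > u_k]               # samples conditioned on the current level
            # Repopulate the level from the seeds, then mix with Metropolis moves
            x = np.repeat(seeds, int(np.ceil(n / len(seeds))), axis=0)[:n]
            for _ in range(5):
                prop = x + 0.5 * rng.standard_normal(x.shape)
                accept = np.exp(0.5 * (np.sum(x**2, 1) - np.sum(prop**2, 1)))
                move = (rng.random(n) < accept) & (g(prop) > u_k)
                x[move] = prop[move]
            y = g(x)
        return prob * np.mean(y > u)

    p_hat = subset_simulation(g, u=6.0)
    print("estimated failure probability:", p_hat)  # exact: P(N(0, 2) > 6) ~ 1.1e-5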
Differentiating the multipoint Expected Improvement for optimal batch design
This work deals with parallel optimization of expensive objective functions
which are modeled as sample realizations of Gaussian processes. The study is
formalized as a Bayesian optimization problem, or continuous multi-armed bandit
problem, where a batch of q > 0 arms is pulled in parallel at each iteration.
Several algorithms have been developed for choosing batches by trading off
exploitation and exploration. As of today, the maximum Expected Improvement
(EI) and Upper Confidence Bound (UCB) selection rules appear as the most
prominent approaches for batch selection. Here, we build upon recent work on
the multipoint Expected Improvement criterion, for which an analytic expansion
relying on Tallis' formula was recently established. Since the computational burden of this selection rule remains an issue in applications, we derive a closed-form expression for the gradient of the multipoint Expected Improvement, which facilitates its maximization by gradient-based ascent algorithms. Substantial computational savings are demonstrated in applications. In
addition, our algorithms are tested numerically and compared to
state-of-the-art UCB-based batch-sequential algorithms. Combining UCB-based starting designs with gradient-based local optimization of EI ultimately appears to be a sound option for batch design in distributed Gaussian process optimization.
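The closed-form gradient itself relies on Tallis' formula and is too long to reproduce here; as a stand-in, the sketch below estimates the multipoint EI criterion by Monte Carlo with common random numbers, which makes the estimator smooth enough for a stable numerical gradient. The posterior mean mu, covariance K, and incumbent y_best are illustrative placeholders, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(2)
    Z = rng.standard_normal((4096, 3))        # fixed draws: common random numbers

    def qEI(mu, K, y_best, Z=Z):
        """Monte Carlo estimate of the multipoint Expected Improvement
        E[max(0, max_i Y_i - y_best)] with Y ~ N(mu, K) (maximization convention)."""
        L = np.linalg.cholesky(K + 1e-10 * np.eye(len(mu)))
        Y = mu + Z @ L.T                      # samples of the batch outputs
        return np.mean(np.maximum(Y.max(axis=1) - y_best, 0.0))

    # Toy posterior over a batch of q = 3 points (illustrative numbers only)
    mu = np.array([0.1, 0.3, 0.2])
    K = np.array([[1.0, 0.5, 0.2],
                  [0.5, 1.0, 0.4],
                  [0.2, 0.4, 1.0]])
    print("q-EI:", qEI(mu, K, y_best=0.5))

    # Finite-difference gradient w.r.t. the posterior mean; with common random
    # numbers the estimator is smooth in mu, so the differences are stable.
    eps, grad = 1e-5, np.zeros(3)
    for i in range(3):
        e = np.zeros(3); e[i] = eps
        grad[i] = (qEI(mu + e, K, 0.5) - qEI(mu - e, K, 0.5)) / (2 * eps)
    print("d(q-EI)/d(mu):", grad)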
Bounding rare event probabilities in computer experiments
We are interested in bounding probabilities of rare events in the context of
computer experiments. These rare events depend on the output of a physical
model with random input variables. Since the model is only known through an
expensive black box function, standard efficient Monte Carlo methods designed
for rare events cannot be used. We therefore propose a strategy, based on importance sampling methods, to deal with this difficulty. This proposal relies on Kriging metamodeling and achieves sharp upper confidence bounds on
the rare event probabilities. The variability due to the Kriging metamodeling
step is properly taken into account. The proposed methodology is applied to a
toy example and compared to more standard Bayesian bounds. Finally, a
challenging real case study is analyzed. It consists of finding an upper bound
of the probability that the trajectory of an airborne load will collide with
the aircraft that has released it.
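A minimal sketch of the idea, assuming a Kriging posterior mean m(x) and standard deviation s(x) are already available: counting as failures all points whose Kriging upper quantile exceeds the threshold yields a conservative indicator, whose expectation is then estimated by importance sampling. The toy m and s, the quantile level, and the proposal shift are assumptions; the full methodology accounts for the Kriging variability more carefully than this.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)

    # Toy stand-ins for an already-fitted Kriging model of the black box
    m = lambda x: x - 4.0
    s = lambda x: 0.3 * np.ones_like(x)
    u = 0.0                                   # failure: model output > u

    # Conservative indicator: count x as "possibly failing" whenever the
    # Kriging 97.5% upper bound m(x) + 1.96 s(x) exceeds u, so the resulting
    # estimate upper-bounds the plug-in probability.
    upper_fail = lambda x: (m(x) + 1.96 * s(x) > u).astype(float)

    # Importance sampling: inputs are N(0, 1), proposal shifted toward failure
    shift = 3.5
    x = rng.standard_normal(100_000) + shift
    w = norm.pdf(x) / norm.pdf(x, loc=shift)  # likelihood ratio f(x) / q(x)
    p_upper = np.mean(upper_fail(x) * w)
    print("upper bound on failure probability:", p_upper)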
Gaussian process surrogates for failure detection: a Bayesian experimental design approach
An important task of uncertainty quantification is to identify the probability of undesired events, in particular system failures, caused by various sources of uncertainty. In this work we consider the construction of Gaussian process surrogates for failure detection and failure probability estimation. In particular, we consider the situation where the underlying
computer models are extremely expensive, and in this setting, determining the
sampling points in the state space is of essential importance. We formulate the
problem as an optimal experimental design for Bayesian inferences of the limit
state (i.e., the failure boundary) and propose an efficient numerical scheme to
solve the resulting optimization problem. In particular, the proposed
limit-state inference method is capable of determining multiple sampling points
at a time, and thus it is well suited for problems where multiple computer
simulations can be performed in parallel. The accuracy and performance of the proposed method are demonstrated on both academic and practical examples.
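The sketch below illustrates the batch idea with a greedy surrogate criterion rather than the paper's optimal-design formulation: it repeatedly picks the candidate most ambiguous with respect to the limit state under a Gaussian process posterior, then damps the scores of nearby candidates so the q parallel runs spread out. The toy limit-state function, repulsion radius, and batch size are assumptions.

    import numpy as np
    from scipy.stats import norm
    from scipy.spatial.distance import cdist
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(4)
    f = lambda X: X[:, 0] ** 2 + X[:, 1] - 1.0     # toy limit-state function
    u = 0.0
    X_cand = rng.uniform(-2, 2, (500, 2))

    X = rng.uniform(-2, 2, (8, 2))
    y = f(X)
    gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-8).fit(X, y)
    mu, sd = gp.predict(X_cand, return_std=True)
    score = norm.cdf(-np.abs(mu - u) / np.maximum(sd, 1e-12))  # ambiguity near the limit state

    # Greedy batch selection: pick the most ambiguous candidate, then
    # down-weight candidates close to it so the parallel runs do not cluster.
    batch = []
    for _ in range(4):                              # q = 4 parallel simulations
        i = int(np.argmax(score))
        batch.append(X_cand[i])
        d = cdist(X_cand, X_cand[i:i + 1]).ravel()
        score = score * (1 - np.exp(-(d / 0.5) ** 2))  # repulsion around chosen point
    print(np.array(batch))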
Screening and metamodeling of computer experiments with functional outputs. Application to thermal-hydraulic computations
To perform uncertainty, sensitivity, or optimization analysis on scalar variables calculated by a CPU-time-expensive computer code, a widely accepted methodology consists in first identifying the most influential uncertain inputs (by screening techniques), and then replacing the expensive model by an inexpensive mathematical function, called a metamodel. This paper
extends this methodology to the functional output case, for instance when the
model output variables are curves. The screening approach is based on the
analysis of variance and principal component analysis of output curves. The
functional metamodeling consists of a curve classification step, a dimension reduction step, and then a classical metamodeling step. An industrial nuclear
reactor application (dealing with uncertainties in the pressurized thermal
shock analysis) illustrates all these steps.
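A minimal sketch of the dimension-reduction and metamodeling steps (omitting the curve classification step), assuming synthetic curves in place of the thermal-hydraulic code: principal component analysis compresses each output curve to a few scores, and one Gaussian process metamodel is fitted per retained score. All data and hyperparameters are illustrative.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(5)

    # Toy data: n runs of a code whose output is a curve on t; inputs are
    # d uncertain scalars (stand-in for the expensive code).
    n, d, t = 60, 3, np.linspace(0, 1, 101)
    X = rng.uniform(0, 1, (n, d))
    curves = np.array([np.sin(2 * np.pi * (t + x[0])) * (1 + x[1]) + x[2] for x in X])

    # Dimension reduction: keep principal components explaining 99% of variance
    pca = PCA(n_components=0.99).fit(curves)
    scores = pca.transform(curves)

    # Classical metamodeling step: one GP per retained component score
    gps = [GaussianProcessRegressor(kernel=RBF(0.3), alpha=1e-6).fit(X, scores[:, j])
           for j in range(scores.shape[1])]

    def predict_curve(x_new):
        """Predict the full output curve at a new input point."""
        z = np.array([gp.predict(x_new.reshape(1, -1))[0] for gp in gps])
        return pca.inverse_transform(z.reshape(1, -1))[0]

    print(predict_curve(np.array([0.5, 0.5, 0.5]))[:5])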