Polynomial-Chaos-based Kriging
Computer simulation has become the standard tool in many engineering fields
for designing and optimizing systems, as well as for assessing their
reliability. To cope with demanding analyses such as optimization and
reliability assessment, surrogate models (a.k.a. meta-models) have been
increasingly
investigated in the last decade. Polynomial Chaos Expansions (PCE) and Kriging
are two popular non-intrusive meta-modelling techniques. PCE approximates the
computational model by a series of orthonormal polynomials in the input
variables, where the polynomials are chosen consistently with the probability
distributions of those input variables. On the other hand, Kriging assumes that
the computer model behaves as a realization of a Gaussian random process whose
parameters are estimated from the available computer runs, i.e. input vectors
and response values. These two techniques have been developed more or less in
parallel so far with little interaction between the researchers in the two
fields. In this paper, PC-Kriging is derived as a new non-intrusive
meta-modeling approach combining PCE and Kriging. A sparse set of orthonormal
polynomials (PCE) approximates the global behavior of the computational model
whereas Kriging manages the local variability of the model output. An adaptive
algorithm similar to the least angle regression algorithm determines the
optimal sparse set of polynomials. PC-Kriging is validated on various benchmark
analytical functions which are easy to sample for reference results. From the
numerical investigations it is concluded that PC-Kriging performs better than,
or at least as well as, the two distinct meta-modeling techniques taken
separately. The gain in accuracy is largest when the experimental design has a
limited size, which is an asset when dealing with demanding computational
models.
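The combination described above can be sketched in a few lines of numpy. The following is a hedged illustration, not the paper's actual method: the toy model, design size, polynomial degree, and kernel hyperparameters are all illustrative assumptions, and the sparse LAR-based polynomial selection is replaced by a plain least-squares fit. A Hermite PCE captures the global trend; simple Kriging of the residuals with a fixed Gaussian kernel handles the local variability.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermeval

# Toy model standing in for an expensive simulator (assumption)
def model(x):
    return np.sin(3 * x) + 0.3 * x**2

rng = np.random.default_rng(0)
X = rng.standard_normal(12)        # experimental design, standard normal input
y = model(X)

# PCE trend: least-squares fit on probabilists' Hermite polynomials,
# orthonormal w.r.t. the standard normal input distribution
degree = 4
def herm_basis(x, p):
    cols = []
    for k in range(p + 1):
        c = np.zeros(k + 1)
        c[k] = 1.0
        cols.append(hermeval(x, c) / math.sqrt(math.factorial(k)))
    return np.column_stack(cols)

A = herm_basis(X, degree)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ coef               # local variability missed by the trend

# Kriging of the residuals: Gaussian kernel with fixed (assumed) parameters
theta, nugget = 1.0, 1e-8
kern = lambda a, b: np.exp(-theta * (a[:, None] - b[None, :])**2)
alpha = np.linalg.solve(kern(X, X) + nugget * np.eye(len(X)), resid)

def pc_kriging(x_new):
    return herm_basis(x_new, degree) @ coef + kern(x_new, X) @ alpha
```

By construction the predictor nearly interpolates the experimental design, while the polynomial part governs its behavior away from the data.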
Meta-models for structural reliability and uncertainty quantification
A meta-model (or a surrogate model) is the modern name for what was
traditionally called a response surface. It is intended to mimic the behaviour
of a computational model M (e.g. a finite element model in mechanics) while
being inexpensive to evaluate, in contrast to the original model which may take
hours or even days of computer processing time. In this paper various types of
meta-models that have been used in the last decade in the context of structural
reliability are reviewed. More specifically, classical polynomial response
surfaces, polynomial chaos expansions and kriging are addressed. It is shown
how the need for error estimates and adaptivity in their construction has
brought these approaches to a high level of efficiency. A new technique that
solves the problem of potential bias in the estimation of a probability of
failure through the use of meta-models is finally presented.
Comment: Keynote lecture, Fifth Asian-Pacific Symposium on Structural
Reliability and its Applications (5th APSSRA), May 2012, Singapore
Metamodel-based importance sampling for structural reliability analysis
Structural reliability methods aim at computing the probability of failure of
systems with respect to some prescribed performance functions. In modern
engineering such functions usually resort to running an expensive-to-evaluate
computational model (e.g. a finite element model). In this respect, simulation
methods, which may require a large number of model runs, cannot be used
directly. Surrogate
models such as quadratic response surfaces, polynomial chaos expansions or
kriging (which are built from a limited number of runs of the original model)
are then introduced as a substitute of the original model to cope with the
computational cost. In practice it is almost impossible to quantify the error
made by this substitution though. In this paper we propose to use a kriging
surrogate of the performance function as a means to build a quasi-optimal
importance sampling density. The probability of failure is eventually obtained
as the product of an augmented probability computed by substituting the
meta-model for the original performance function and a correction term which
ensures that there is no bias in the estimation even if the meta-model is not
fully accurate. The approach is applied to analytical and finite element
reliability problems and proves efficient for up to 100 random variables.
Comment: 20 pages, 7 figures, 2 tables. Preprint submitted to Probabilistic
Engineering Mechanics
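The estimator described in the abstract, an augmented probability computed on the surrogate times a correction term, can be illustrated on a one-dimensional toy problem. Everything below is an assumption for illustration: the linear limit state, the deliberately biased surrogate, and the Gaussian importance density centered at the surrogate's design point stand in for the paper's kriging-based construction.

```python
import numpy as np
from math import erf, sqrt

def Phi(u):
    # standard normal CDF
    return 0.5 * (1.0 + erf(u / sqrt(2.0)))

# Toy limit state: failure when g(x) <= 0, with x ~ N(0, 1)
beta = 3.0
g = lambda x: beta - x               # true performance function
g_hat = lambda x: beta - 1.05 * x    # surrogate with a slight bias (assumption)

rng = np.random.default_rng(1)

# Augmented probability: cheap Monte Carlo on the surrogate alone
x_mc = rng.standard_normal(200_000)
pf_meta = np.mean(g_hat(x_mc) <= 0)

# Importance sampling density h: standard normal shifted to the surrogate's
# design point; weights w = f(x) / h(x)
mu = beta / 1.05
x_is = rng.standard_normal(20_000) + mu
w = np.exp(-0.5 * x_is**2) / np.exp(-0.5 * (x_is - mu)**2)

# Correction term: ratio of true to surrogate failure indicators under h,
# which removes the bias even though g_hat is inexact
num = np.mean((g(x_is) <= 0) * w)
den = np.mean((g_hat(x_is) <= 0) * w)
pf = pf_meta * (num / den)

pf_exact = 1.0 - Phi(beta)           # analytic reference for this toy case
```

In this example the surrogate alone overestimates the failure probability, and the correction factor pulls the estimate back toward the exact value without requiring the surrogate to be accurate everywhere.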
An Efficient Entropy-Based Method for Reliability Assessment by Combining Kriging Meta-Models
Meta-models or surrogate models are convenient tools for reliability assessment of problems with time-consuming numerical models. Recently, an adaptive method called AK-MCS has been widely used for reliability analysis; it combines Monte Carlo simulation with a Kriging surrogate model. The AK-MCS method usually uses a constant regression as the Kriging trend. However, other regression trends may perform better for some problems. Therefore, a method is proposed that combines multiple Kriging meta-models with various trends. The proposed method selects training samples based on the maximum entropy of the predictions. Using multiple Kriging models reduces the sensitivity to the choice of regression trend, so the proposed method performs well across a range of problems. The proposed method is applied to several examples to demonstrate its efficiency.
Quantile-based optimization under uncertainties using adaptive Kriging surrogate models
Uncertainties are inherent to real-world systems. Taking them into account is
crucial in industrial design problems and this might be achieved through
reliability-based design optimization (RBDO) techniques. In this paper, we
propose a quantile-based approach to solve RBDO problems. We first transform
the safety constraints usually formulated as admissible probabilities of
failure into constraints on quantiles of the performance criteria. In this
formulation, the quantile level controls the degree of conservatism of the
design. Starting with the premise that industrial applications often involve
high-fidelity and time-consuming computational models, the proposed approach
makes use of Kriging surrogate models (a.k.a. Gaussian process modeling).
Thanks to the Kriging variance (a measure of the local accuracy of the
surrogate), we derive a procedure with two stages of enrichment of the design
of computer experiments (DoE) used to construct the surrogate model. The first
stage globally reduces the Kriging epistemic uncertainty and adds points in the
vicinity of the limit-state surfaces describing the system performance to be
attained. The second stage locally checks, and if necessary, improves the
accuracy of the quantiles estimated along the optimization iterations.
Applications to three analytical examples and to the optimal design of a car
body subsystem (minimal mass under mechanical safety constraints) show the
accuracy and the remarkable efficiency brought by the proposed procedure.
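The constraint reformulation at the heart of this approach, turning an admissible probability of failure into a quantile constraint, can be checked numerically on a toy case. The performance criterion below and the sample sizes are illustrative assumptions; the sampling would run on a Kriging surrogate rather than the true function in the actual procedure.

```python
import numpy as np

# Reformulation: P[perf(X) <= 0] <= alpha  <=>  Q_alpha[perf(X)] >= 0,
# where Q_alpha is the alpha-quantile of the performance criterion
alpha = 0.05
rng = np.random.default_rng(3)

def performance(d, x):
    # toy performance criterion (assumption): positive means safe,
    # d is the design variable, x the random input
    return d - 1.5 * np.abs(x)

def alpha_quantile(d, n=100_000):
    x = rng.standard_normal(n)
    return np.quantile(performance(d, x), alpha)

d = 3.0
q = alpha_quantile(d)                       # quantile-based constraint value

# Cross-check against the probabilistic formulation by plain Monte Carlo
x = rng.standard_normal(100_000)
pf = np.mean(performance(d, x) <= 0)
```

For this design, q comes out positive and the empirical failure probability stays below alpha, confirming (up to Monte Carlo noise) that the two formulations encode the same safety requirement, with the quantile level acting as the knob for conservatism.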