Metamodel-based importance sampling for the simulation of rare events
In the field of structural reliability, the Monte-Carlo estimator is
considered the reference probability estimator. However, it remains
intractable for real engineering cases since it requires a large number of
runs of the model. In order to reduce the number of computer experiments, many
other approaches, known as reliability methods, have been proposed. One such
approach consists in replacing the original experiment with a surrogate which
is much faster to evaluate. Nevertheless, it is often difficult (or even
impossible) to quantify the error made by this substitution. In this paper an
alternative approach is developed. It takes advantage of kriging
meta-modeling and importance sampling techniques. The proposed alternative
estimator is finally applied to a finite-element-based structural reliability
analysis.
Comment: 8 pages, 3 figures, 1 table. Preprint submitted to the ICASP11
mini-symposium entitled "Meta-models/surrogate models for uncertainty
propagation, sensitivity and reliability analysis"
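The cost argument in this abstract can be made concrete: the crude Monte-Carlo estimator needs on the order of 100 / p_f model runs to reach a 10% coefficient of variation, which is what makes it intractable for expensive models. A minimal sketch on a toy linear performance function (the function g and all constants below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x):
    # Toy "expensive" performance function; failure is the event g(x) <= 0.
    return 4.0 - x[:, 0] - x[:, 1]

n = 200_000
X = rng.standard_normal((n, 2))             # standard-normal input model
p_hat = np.mean(g(X) <= 0.0)                # crude Monte-Carlo estimator
cov = np.sqrt((1.0 - p_hat) / (n * p_hat))  # estimator coefficient of variation

# Rule of thumb: n >= (1 - p) / (0.1**2 * p) ~ 100 / p runs for a 10% c.o.v.
n_needed = (1.0 - p_hat) / (0.01 * p_hat)
print(p_hat, cov, n_needed)
```

Each of those runs is a full call to the model, so for a finite element model taking minutes per run the rule of thumb immediately rules out crude Monte Carlo for small failure probabilities.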
The safety case and the lessons learned for the reliability and maintainability case
This paper examines the safety case and the lessons learned for the reliability and maintainability case.
Metamodel-based importance sampling for structural reliability analysis
Structural reliability methods aim at computing the probability of failure of
systems with respect to some prescribed performance functions. In modern
engineering such functions usually resort to running an expensive-to-evaluate
computational model (e.g. a finite element model). In this respect simulation
methods, which may require a large number of model runs, cannot be used
directly. Surrogate
models such as quadratic response surfaces, polynomial chaos expansions or
kriging (which are built from a limited number of runs of the original model)
are then introduced as a substitute of the original model to cope with the
computational cost. In practice it is almost impossible to quantify the error
made by this substitution though. In this paper we propose to use a kriging
surrogate of the performance function as a means to build a quasi-optimal
importance sampling density. The probability of failure is eventually obtained
as the product of an augmented probability computed by substituting the
meta-model for the original performance function and a correction term which
ensures that there is no bias in the estimation even if the meta-model is not
fully accurate. The approach is applied to analytical and finite element
reliability problems and proves efficient up to 100 random variables.
Comment: 20 pages, 7 figures, 2 tables. Preprint submitted to Probabilistic
Engineering Mechanics
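The unbiasedness mechanism described in this abstract, where the surrogate only proposes the sampling density while the true model enters the estimator, can be sketched in a few lines. Here the kriging surrogate is replaced by a cheap least-squares fit, and the importance density is a Gaussian centred at the surrogate's approximate most-probable failure point; all functions and constants are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # True "expensive" performance function; failure when g(x) <= 0.
    return 4.0 - x[:, 0] - x[:, 1]

# 1) Fit a cheap surrogate from a small design of experiments
#    (a stand-in for the kriging meta-model of the paper).
X_doe = rng.standard_normal((20, 2))
A = np.column_stack([np.ones(20), X_doe])
coef, *_ = np.linalg.lstsq(A, g(X_doe), rcond=None)  # g_hat(x) = c0 + w.x

# 2) Use the surrogate to centre the importance density N(x*, I) at its
#    approximate most-probable failure point x* on {g_hat = 0}.
c0, w = coef[0], coef[1:]
x_star = -c0 * w / (w @ w)

# 3) Importance-sampling estimator: the TRUE model appears in the indicator,
#    so the estimate stays unbiased even if the surrogate is inaccurate.
n = 20_000
X = x_star + rng.standard_normal((n, 2))
log_w = 0.5 * (x_star @ x_star) - X @ x_star   # phi(x) / h(x) in log form
p_hat = np.mean((g(X) <= 0.0) * np.exp(log_w))
print(p_hat)   # exact value here: Phi(-4/sqrt(2)) ~ 2.3e-3
```

In the paper the correction term plays this role for a density built from the kriging probabilistic classifier; the sketch keeps only the structural idea that surrogate error shifts variance, not bias.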
Stochastic simulation methods for structural reliability under mixed uncertainties
Uncertainty quantification (UQ) has been widely recognized as one of the most important, yet challenging, tasks in both structural engineering and system engineering. Current research mainly concerns the proper treatment of the different types of uncertainties, resulting from either natural randomness or lack of information, in all related sub-problems of UQ, such as uncertainty characterization, uncertainty propagation, sensitivity analysis, model updating, model validation, and risk and reliability analysis. It has been widely accepted that these uncertainties can be grouped as either aleatory or epistemic, depending on whether they are reducible. To deal with this challenge, many non-traditional uncertainty characterization models have been developed; these models can be grouped as either imprecise probability models (e.g., the probability-box model, evidence theory, the second-order probability model, and the fuzzy probability model) or non-probabilistic models (e.g., the interval/convex model and fuzzy set theory).
This thesis concerns the efficient numerical propagation of these three kinds of uncertainty characterization models; for simplicity, the precise probability model, the distributional probability-box model, and the interval model are taken as examples. The target is to develop efficient numerical algorithms for learning the functional behavior of the probabilistic responses (e.g., response moments and failure probability) with respect to the epistemic parameters of the model inputs, which is especially useful for making reliable decisions even when the available information on the model inputs is imperfect.
To achieve this target, my thesis presents three main developments for improving Non-intrusive Imprecise Stochastic Simulation (NISS), a general methodological framework for propagating imprecise probability models with only one stochastic simulation. The first development generalizes the NISS methods to problems whose inputs include both imprecise probability models and non-probabilistic models. The algorithm is established by combining Bayes' rule and kernel density estimation, and the sensitivity indices of the epistemic parameters are produced as by-products. The NASA Langley UQ challenge is then successfully solved with the generalized NISS method. The second development injects classical line sampling into the NISS framework so as to substantially improve the efficiency of the algorithm for rare-failure-event analysis; two strategies, based on different interpretations of line sampling, are developed. The first strategy is based on hyperplane approximations, while the second is derived from one-dimensional integrals. Both strategies can be regarded as post-processing of classical line sampling, but the results show that the resulting NISS estimators perform differently. The third development aims at further improving the efficiency of line sampling and its suitability for highly nonlinear problems, targeting complex structures and systems where one deterministic simulation may take hours. To this end, an active learning strategy based on Gaussian process regression is embedded into the line sampling procedure to accurately estimate the intersection point on each sample line with only a small number of deterministic simulations.
The above three developments have largely improved the suitability and efficiency of the NISS methods, especially for real-world engineering applications. The efficiency and effectiveness of these developments are illustrated with toy examples and thoroughly demonstrated on real-world test cases in system engineering, civil engineering, and mechanical engineering.
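The line sampling idea underlying the second and third developments can be illustrated compactly: each standard-normal sample is projected onto the hyperplane orthogonal to an assumed important direction, the intersection of the remaining one-dimensional line with the limit state {g = 0} is found by root search, and each line contributes Phi(-c_i) to the failure-probability estimate. Everything below (the performance function, the direction alpha) is an illustrative assumption, not taken from the thesis:

```python
import math
import numpy as np

rng = np.random.default_rng(2)

def g(x):
    # Toy performance function; failure when g(x) <= 0.
    return 4.0 - x[0] - x[1]

def Phi(z):
    # Standard-normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

alpha = np.array([1.0, 1.0]) / math.sqrt(2.0)   # assumed important direction

def intersection(x_perp, lo=0.0, hi=10.0):
    # Bisection for the distance c with g(x_perp + c * alpha) = 0:
    # the intersection point of this sample line with the limit state.
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if g(x_perp + mid * alpha) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

n_lines = 50
p_hat = 0.0
for _ in range(n_lines):
    z = rng.standard_normal(2)
    x_perp = z - (z @ alpha) * alpha      # project onto hyperplane orth. to alpha
    p_hat += Phi(-intersection(x_perp))   # each line contributes Phi(-c)
p_hat /= n_lines
print(p_hat)   # for this linear g, every line gives Phi(-4/sqrt(2)) ~ 2.3e-3
```

The root search on each line is exactly where the thesis's active-learning Gaussian process helps: it replaces most of the expensive g-evaluations inside `intersection` with surrogate predictions.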
Simplified Estimation of Economic Seismic Risk for Buildings
A seismic risk assessment is often performed on behalf of a buyer of
commercial buildings in seismically active regions. One outcome of the assessment is that a probable maximum loss (PML) is computed. PML is of
limited use to real-estate investors as it has no place in a standard financial
analysis and reflects too long a planning period. We introduce an alternative
to PML called probable frequent loss (PFL), defined as the mean loss resulting from shaking with 10% exceedance probability in 5 years. PFL is approximately related to expected annualized loss (EAL) through a site economic hazard coefficient (H) introduced here. PFL and EAL offer three
advantages over PML: (1) a meaningful planning period; (2) applicability in financial analysis (making seismic risk a potential market force); and (3) the ability to
be estimated using a single linear structural analysis, via a simplified method
called linear assembly-based vulnerability (LABV) that is presented in this
work. We also present a simple decision-analysis framework for real-estate
investments in seismic regions, accounting for risk aversion. We show that
market risk overwhelms uncertainty in seismic risk, allowing one to consider
only expected consequences in seismic risk. We illustrate using 15 buildings,
including a 7-story nonductile reinforced-concrete moment-frame building in
Van Nuys, California, and 14 buildings from the CUREE-Caltech Woodframe Project
- …
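The quantities in this abstract can be reproduced on a toy loss model: PFL is the loss whose exceedance probability over 5 years is 10% (an annual rate of -ln(0.9)/5 under a Poisson occurrence model), EAL is the area under the annual loss-exceedance-rate curve, and H is their ratio. The exponential curve below is an illustrative assumption, not data from the paper:

```python
import numpy as np

# Assumed toy site loss model: mean annual rate of exceeding a loss l (USD).
def annual_rate(l):
    return 0.05 * np.exp(-l / 1.0e6)

# PFL: loss with 10% exceedance probability in 5 years.  With Poisson
# occurrences, P(at least one exceedance in t years) = 1 - exp(-rate * t).
t_years, p_exc = 5.0, 0.10
rate_target = -np.log(1.0 - p_exc) / t_years         # ~ 0.021 per year

losses = np.linspace(0.0, 5.0e6, 100_001)
rates = annual_rate(losses)
pfl = np.interp(rate_target, rates[::-1], losses[::-1])  # invert the curve

# EAL: expected annualized loss = area under the exceedance-rate curve
# (trapezoidal rule).
eal = float(np.sum(0.5 * (rates[1:] + rates[:-1]) * np.diff(losses)))

H = eal / pfl    # site economic hazard coefficient: EAL ~ H * PFL
print(pfl, eal, H)
```

The point of the paper's H coefficient is visible here: once H is tabulated for a site, an investor can go from the single-number PFL to an annualized figure (EAL) that slots directly into a standard financial analysis.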