
    Reliability-based design optimization using kriging surrogates and subset simulation

    The aim of the present paper is to develop a strategy for solving reliability-based design optimization (RBDO) problems that remains applicable when the performance models are expensive to evaluate. Starting with the premise that simulation-based approaches are not affordable for such problems, and that most-probable-failure-point-based approaches do not make it possible to quantify the error on the estimated failure probability, an approach based on both metamodels and advanced simulation techniques is explored. The kriging metamodeling technique is chosen to surrogate the performance functions because it allows one to genuinely quantify the surrogate error. The surrogate error on the limit-state surfaces is propagated to the failure probability estimates in order to provide an empirical error measure. This error is then sequentially reduced by means of a population-based adaptive refinement technique until the kriging surrogates are accurate enough for reliability analysis. This original refinement strategy makes it possible to add several observations to the design of experiments at the same time. Reliability and reliability sensitivity analyses are performed by means of the subset simulation technique for the sake of numerical efficiency. The adaptive surrogate-based strategy for reliability estimation is finally embedded in a classical gradient-based optimization algorithm in order to solve the RBDO problem. The kriging surrogates are built in a so-called augmented reliability space, thus making them reusable from one nested RBDO iteration to the next. The strategy is compared to other approaches from the literature on three academic examples in the field of structural mechanics.
    Comment: 20 pages, 6 figures, 5 tables. Preprint submitted to Springer-Verlag.
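
    Since subset simulation carries the reliability and sensitivity estimates in this strategy, a bare-bones sketch may help fix ideas. The NumPy-only implementation below assumes a standard normal input space and a toy linear limit state, and replaces the component-wise modified Metropolis step with a simplified random-walk variant; the sample sizes and the limit state are illustrative, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Toy limit state: failure when x1 + x2 >= 6, i.e. g(x) <= 0.
    return 6.0 - x[:, 0] - x[:, 1]

def subset_simulation(g, dim=2, n=2000, p0=0.1, max_levels=20):
    """Estimate P[g(X) <= 0] for X ~ N(0, I) by subset simulation."""
    x = rng.standard_normal((n, dim))
    y = g(x)
    p_f = 1.0
    for _ in range(max_levels):
        thr = np.quantile(y, p0)                 # intermediate threshold
        if thr <= 0.0:                           # failure domain reached
            return p_f * np.mean(y <= 0.0)
        p_f *= p0
        keep = y <= thr
        x, y = x[keep], y[keep]
        # Grow the population back to n, conditional on {g <= thr}.
        while len(x) < n:
            cand = x + 0.8 * rng.standard_normal(x.shape)
            # Metropolis ratio for the standard normal target.
            log_r = 0.5 * ((x**2).sum(1) - (cand**2).sum(1))
            acc = rng.random(len(x)) < np.exp(np.minimum(log_r, 0.0))
            cand[~acc] = x[~acc]
            y_c = g(cand)
            out = y_c > thr                      # left the subset: stay put
            cand[out], y_c[out] = x[out], y[out]
            x, y = np.vstack([x, cand]), np.concatenate([y, y_c])
        x, y = x[:n], y[:n]
    return p_f * np.mean(y <= 0.0)

# Reference value: P[x1 + x2 >= 6] = Phi(-6 / sqrt(2)) ~ 1.1e-5.
print(subset_simulation(g))
```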

    Non-intrusive stochastic analysis with parameterized imprecise probability models: II. Reliability and rare events analysis

    Structural reliability analysis for rare failure events in the presence of hybrid uncertainties is a challenging task drawing increasing attention in both academia and engineering. Based on the new imprecise stochastic simulation framework developed in the companion paper, this work aims at developing efficient methods to estimate failure probability functions for rare failure events, with the hybrid uncertainties characterized by imprecise probability models. The imprecise stochastic simulation methods are first improved by an active learning procedure so as to reduce the computational costs. For the more challenging rare failure events, two extended subset-simulation-based sampling methods are proposed to provide better performance in both local and global parameter spaces. The computational costs of both methods are the same as those of the classical subset simulation method. These two methods are also combined with the active learning procedure to further reduce the computational costs substantially. The estimation errors of all the methods are analyzed based on sensitivity indices and the statistical properties of the developed estimators. All these new developments enrich the imprecise stochastic simulation framework. The feasibility and efficiency of the proposed methods are demonstrated on numerical and engineering test examples.
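
    The central object here, a failure probability function over imprecise distribution parameters, can be illustrated with a deliberately simple reweighting sketch: one fixed sample from an envelope density is reused to estimate P_f(theta) for every theta. This captures the spirit of reusing samples across the parameter space, not the paper's actual estimators; the limit state, envelope, and parameter range are invented.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def g(x):
    return 4.0 - x                    # failure when x >= 4

# One sample, drawn once, from an envelope density h = N(0, 2^2).
x = rng.normal(0.0, 2.0, size=100_000)
fail = g(x) <= 0.0

def p_f(theta):
    """P[g(X) <= 0] for X ~ N(theta, 1), by importance reweighting."""
    w = norm.pdf(x, loc=theta, scale=1.0) / norm.pdf(x, loc=0.0, scale=2.0)
    return np.mean(w * fail)

# Sweep the imprecise mean over its interval without new model runs.
for t in np.linspace(-1.0, 1.0, 5):
    print(f"theta = {t:+.2f}  P_f ~ {p_f(t):.2e}")
```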

    Structural Reliability Assessment under Fire.

    Structural safety under fire has received significant attention in recent years. Current approaches to structural fire design are based on prescriptive codes that emphasize insulation of steel members to achieve adequate fire resistance. The prescriptive approach fails to give a measure of the true performance of structural systems in fire and gives no indication of the level of reliability provided by the structure in the face of uncertainty. The performance-based design methodology overcomes many of the limitations of the prescriptive approach. The quantification of structural reliability is a key component of performance-based design, as it provides an objective manner of comparing alternative design solutions. In this study, a probabilistic framework is established to evaluate structural reliability under fire, considering the uncertainties that exist in the system. The structural performance under realistic fires is estimated by numerical simulations of sequentially coupled fire, thermal, and structural analyses. In this dissertation, multiple reliability methods (i.e., Latin hypercube simulation, subset simulation, and the first/second-order reliability methods) are extended to investigate structural safety under fire. The reliability analysis of structures in fire involves (i) the identification and characterization of uncertain parameters in the system, (ii) a probabilistic analysis of the thermo-mechanical response of the structure, and (iii) the evaluation of structural reliability based on a suitable limit state function. Several applications are considered involving the response of steel and steel-concrete composite structures subjected to natural fires. Parameters in the fire, thermal, and structural models are characterized, and an improved fire hazard model is proposed that accounts for fire spread to adjacent rooms. The importance of the various parameters is assessed through the response sensitivity, computed by finite difference and direct differentiation methods. The accuracy and efficiency of the various reliability methods, as applied to structures in fire, are compared, and the strengths and weaknesses of each approach are identified.
    PhD, Civil Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/111381/1/qianru_1.pd
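
    As a concrete (and heavily simplified) illustration of the Latin hypercube route, the sketch below propagates two invented random variables, a lognormal fire-load stand-in and a normal capacity-like variable, through a scalar limit state; the dissertation's coupled fire/thermal/structural analyses are replaced by this toy function.

```python
import numpy as np
from scipy.stats import qmc, norm, lognorm

sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(20_000)                      # stratified uniforms in [0, 1]^2

# Invented marginals: a lognormal "fire load" stand-in and a normal
# "capacity" stand-in, both in the same arbitrary units.
q = lognorm(s=0.3, scale=600.0).ppf(u[:, 0])    # demand-side variable
r = norm(loc=900.0, scale=90.0).ppf(u[:, 1])    # capacity-side variable

g = r - q                                       # failure when g <= 0
print(f"P_f ~ {np.mean(g <= 0.0):.3e}")
```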

    Meta-models for structural reliability and uncertainty quantification

    A meta-model (or surrogate model) is the modern name for what was traditionally called a response surface. It is intended to mimic the behaviour of a computational model M (e.g. a finite element model in mechanics) while being inexpensive to evaluate, in contrast to the original model, which may take hours or even days of computer processing time. In this paper, various types of meta-models that have been used in the last decade in the context of structural reliability are reviewed. More specifically, classical polynomial response surfaces, polynomial chaos expansions and kriging are addressed. It is shown how the need for error estimates and adaptivity in their construction has brought this type of approach to a high level of efficiency. A new technique that solves the problem of the potential bias in the estimation of a probability of failure through the use of meta-models is finally presented.
    Comment: Keynote lecture, Fifth Asian-Pacific Symposium on Structural Reliability and its Applications (5th APSSRA), May 2012, Singapore.
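
    To make the central object of the review concrete, here is a bare-bones kriging (Gaussian process) surrogate in plain NumPy, with a fixed correlation length, a zero prior mean, and no hyperparameter tuning; a sketch of the idea, not a usable emulator.

```python
import numpy as np

def kriging_fit(X, y, theta=1.0, sigma2=1.0, nugget=1e-8):
    # Gaussian (squared-exponential) correlation with fixed hyperparameters.
    d2 = ((X[:, None, :] - X[None, :, :])**2).sum(-1)
    K = sigma2 * np.exp(-0.5 * d2 / theta**2) + nugget * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # K^{-1} y
    return X, L, alpha, theta, sigma2

def kriging_predict(model, Xs):
    X, L, alpha, theta, sigma2 = model
    d2 = ((Xs[:, None, :] - X[None, :, :])**2).sum(-1)
    k = sigma2 * np.exp(-0.5 * d2 / theta**2)
    mu = k @ alpha                                        # predictive mean
    v = np.linalg.solve(L, k.T)
    var = sigma2 - (v**2).sum(0)                          # predictive variance
    return mu, np.maximum(var, 0.0)

# Surrogate of a (stand-in) expensive model M from seven runs.
X = np.linspace(-3.0, 3.0, 7)[:, None]
y = np.sin(X[:, 0]) + 0.1 * X[:, 0]**2
model = kriging_fit(X, y)
mu, var = kriging_predict(model, np.array([[0.5], [2.7]]))
print(mu, np.sqrt(var))        # the variance is the built-in error estimate
```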

    Quantile-based optimization under uncertainties using adaptive Kriging surrogate models

    Uncertainties are inherent to real-world systems. Taking them into account is crucial in industrial design problems, and this may be achieved through reliability-based design optimization (RBDO) techniques. In this paper, we propose a quantile-based approach to solve RBDO problems. We first transform the safety constraints, usually formulated as admissible probabilities of failure, into constraints on quantiles of the performance criteria. In this formulation, the quantile level controls the degree of conservatism of the design. Starting with the premise that industrial applications often involve high-fidelity and time-consuming computational models, the proposed approach makes use of Kriging surrogate models (a.k.a. Gaussian process modeling). Thanks to the Kriging variance (a measure of the local accuracy of the surrogate), we derive a procedure with two stages of enrichment of the design of computer experiments (DoE) used to construct the surrogate model. The first stage globally reduces the Kriging epistemic uncertainty and adds points in the vicinity of the limit-state surfaces describing the system performance to be attained. The second stage locally checks, and if necessary improves, the accuracy of the quantiles estimated along the optimization iterations. Applications to three analytical examples and to the optimal design of a car body subsystem (minimal mass under mechanical safety constraints) show the accuracy and the remarkable efficiency brought by the proposed procedure.
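
    The quantile-estimation step with variance-driven enrichment can be sketched as follows, using scikit-learn's Gaussian process in place of a dedicated Kriging code: a large Monte Carlo population is evaluated through the surrogate, the 95% quantile is read off, and the most uncertain candidate near the quantile is added to the DoE. The performance function, sample sizes, and the 2-sigma screening rule are illustrative assumptions, not the paper's exact enrichment criteria.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

def perf(x):                                   # expensive-model stand-in
    return x[:, 0]**2 + np.sin(3.0 * x[:, 1])

X = rng.standard_normal((8, 2))                # initial DoE
y = perf(X)
x_mc = rng.standard_normal((20_000, 2))        # MC population, reused

for _ in range(10):
    gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-8).fit(X, y)
    mu, sd = gp.predict(x_mc, return_std=True)
    q = np.quantile(mu, 0.95)                  # surrogate-based quantile
    near = np.abs(mu - q) < 2.0 * sd           # quantile side is uncertain
    if not near.any():
        break
    x_new = x_mc[near][np.argmax(sd[near])]    # most uncertain candidate
    X = np.vstack([X, x_new])
    y = np.append(y, perf(x_new[None, :]))

print(f"q_0.95 ~ {q:.3f} after {len(X)} true model runs")
```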

    Bayesian Subset Simulation: a kriging-based subset simulation algorithm for the estimation of small probabilities of failure

    The estimation of small probabilities of failure from computer simulations is a classical problem in engineering, and the Subset Simulation algorithm proposed by Au & Beck (Prob. Eng. Mech., 2001) has become one of the most popular methods for solving it. Subset simulation has been shown to provide significant savings in the number of simulations needed to achieve a given estimation accuracy, compared with many other Monte Carlo approaches. The number of simulations remains quite high, however, and the method can be impractical for applications involving an expensive-to-evaluate computer model. We propose a new algorithm, called Bayesian Subset Simulation, that takes the best from the Subset Simulation algorithm and from sequential Bayesian methods based on kriging (also known as Gaussian process modeling). The performance of this new algorithm is illustrated on a test case from the literature, with promising results. In addition, we provide a numerical study of the statistical properties of the estimator.
    Comment: 11th International Probabilistic Assessment and Management Conference (PSAM11) and the Annual European Safety and Reliability Conference (ESREL 2012), Helsinki, Finland (2012).
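
    The quantity that typically lets a kriging surrogate drive a subset simulation level is the pointwise probability of misclassifying the sign of g relative to the current threshold, Phi(-|mu - t| / sigma). The sketch below shows this selection rule in isolation; the mean and standard deviation arrays stand in for a fitted kriging model, and the 5% tolerance is an invented setting rather than the algorithm's actual criterion.

```python
import numpy as np
from scipy.stats import norm

def misclassification_prob(mu, sd, t):
    """P[sign(g - t) is wrongly predicted] under the kriging Gaussian law."""
    return norm.cdf(-np.abs(mu - t) / np.maximum(sd, 1e-12))

mu = np.array([0.3, -0.1, 1.5, 0.05])   # kriging means at MC candidates
sd = np.array([0.2, 0.2, 0.1, 0.3])     # kriging standard deviations
t = 0.0                                  # current subset threshold

p = misclassification_prob(mu, sd, t)
needs_run = p > 0.05                     # call the true model where unsure
print(p.round(3), needs_run)
```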

    A new adaptive response surface method for reliability analysis

    The response surface method is a convenient tool for assessing reliability in a wide range of structural mechanics problems. More specifically, adaptive schemes, which consist in iteratively refining the experimental design close to the limit state, have received much attention. However, it is generally difficult to handle a large number of variables and to properly control the approximation error. The method proposed in this paper addresses these points using a sparse response surface and a relevant criterion for the accuracy of the results. For this purpose, a response surface is built from an initial Latin Hypercube Sampling (LHS), with the most significant terms chosen by statistical criteria and cross-validation. At each step, the LHS is refined in a region of interest defined with respect to an importance level of the probability density at the design point. Two convergence criteria are used in the procedure: the first concerns the localization of the region, and the second the quality of the response surface. Finally, a bootstrap method is used to determine the influence of the response surface error on the estimated probability of failure. The method is applied to several examples and the results are discussed.
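
    A condensed sketch of the ingredients named above: a quadratic response surface fitted by least squares on an LHS design, a Monte Carlo failure probability computed on the surface, and a pairs bootstrap over the design points to see how the response surface error propagates to the estimated probability of failure. The limit state, design size, and full quadratic basis (no sparse term selection here) are illustrative simplifications.

```python
import numpy as np
from scipy.stats import qmc, norm

rng = np.random.default_rng(3)

def g(x):                                    # true limit state (2D)
    return 3.0 - x[:, 0] - 0.2 * x[:, 1]**2 + 0.1 * np.sin(2.0 * x[:, 0])

def features(x):                             # full quadratic basis in 2D
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), x1, x2, x1 * x2, x1**2, x2**2])

u = qmc.LatinHypercube(d=2, seed=3).random(30)
X = 1.5 * norm.ppf(u)                        # LHS design, spread out a bit
y = g(X)

x_mc = rng.standard_normal((200_000, 2))     # MC population for P_f
F_mc = features(x_mc)

def pf_from_fit(idx):
    beta, *_ = np.linalg.lstsq(features(X[idx]), y[idx], rcond=None)
    return np.mean(F_mc @ beta <= 0.0)

p_hat = pf_from_fit(np.arange(len(X)))
boot = [pf_from_fit(rng.integers(0, len(X), len(X))) for _ in range(200)]
print(f"P_f ~ {p_hat:.2e}, bootstrap 90% interval "
      f"[{np.quantile(boot, 0.05):.2e}, {np.quantile(boot, 0.95):.2e}]")
```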

    Bayesian Updating, Model Class Selection and Robust Stochastic Predictions of Structural Response

    A fundamental issue when predicting structural response using mathematical models is how to treat both modeling and excitation uncertainty. A general framework for this is presented which uses probability as a multi-valued conditional logic for quantitative plausible reasoning in the presence of uncertainty due to incomplete information. The fundamental probability models that represent the structure’s uncertain behavior are specified by the choice of a stochastic system model class: a set of input-output probability models for the structure and a prior probability distribution over this set that quantifies the relative plausibility of each model. A model class can be constructed from a parameterized deterministic structural model by stochastic embedding utilizing Jaynes’ Principle of Maximum Information Entropy. Robust predictive analyses use the entire model class, with the probabilistic predictions of each model weighted by its prior probability or, if structural response data is available, by its posterior probability from Bayes’ Theorem for the model class. Additional robustness to modeling uncertainty comes from combining the robust predictions of each model class in a set of competing candidates, weighted by the prior or posterior probability of the model class, the latter being computed from Bayes’ Theorem. This higher-level application of Bayes’ Theorem automatically applies a quantitative Ockham’s razor that penalizes the data-fit of more complex model classes that extract more information from the data. Robust predictive analyses involve integrals over high-dimensional spaces that usually must be evaluated numerically. Published applications have used Laplace’s method of asymptotic approximation or Markov Chain Monte Carlo algorithms.
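
    The Ockham-razor effect of model-class evidence can be shown on a toy problem: two candidate model classes for the same data, with the evidence p(D | M_j) obtained by brute-force integration over each class's parameter. The data and models are invented, and the grid integration stands in for the Laplace or MCMC evaluations mentioned above.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
d = rng.normal(0.3, 1.0, size=20)              # observed "response" data

# Model class M1: N(0, 1), no free parameter -> evidence is the likelihood.
ev1 = np.prod(norm.pdf(d, 0.0, 1.0))

# Model class M2: N(mu, 1) with prior mu ~ N(0, 2^2) -> integrate it out.
mu = np.linspace(-6.0, 6.0, 2001)
like = np.array([np.prod(norm.pdf(d, m, 1.0)) for m in mu])
ev2 = np.sum(like * norm.pdf(mu, 0.0, 2.0)) * (mu[1] - mu[0])

# Posterior probabilities with equal prior plausibility of the two classes.
p1, p2 = ev1 / (ev1 + ev2), ev2 / (ev1 + ev2)
print(f"P(M1|D) ~ {p1:.3f}, P(M2|D) ~ {p2:.3f}")
```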

    Computing derivative-based global sensitivity measures using polynomial chaos expansions

    In the field of computer experiments, sensitivity analysis aims at quantifying the relative importance of each input parameter (or combinations thereof) of a computational model with respect to the model output uncertainty. Variance decomposition methods leading to the well-known Sobol' indices are recognized as accurate techniques, albeit at a rather high computational cost. The use of polynomial chaos expansions (PCE) to compute Sobol' indices has made it possible to alleviate this computational burden. However, when dealing with large-dimensional input vectors, it is good practice to first use screening methods in order to discard unimportant variables. The derivative-based global sensitivity measures (DGSM) have been developed recently for this purpose. In this paper we show how polynomial chaos expansions may be used to compute DGSMs analytically, as mere post-processing. This requires the analytical derivation of the derivatives of the orthonormal polynomials which enter PC expansions. The efficiency of the approach is illustrated on two well-known benchmark problems in sensitivity analysis.
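
    A compact sketch of the idea: fit a Legendre PCE to a model with independent uniform inputs on [-1, 1]^2, then obtain the DGSM, E[(dM/dx_i)^2], analytically from the coefficients, with no further model runs, using the orthogonality of the (here non-normalized) Legendre polynomials. The model and degree are illustrative.

```python
import numpy as np
from numpy.polynomial import legendre as L

rng = np.random.default_rng(5)

def model(x):                                  # computational-model stand-in
    return np.exp(0.5 * x[:, 0]) * (1.0 + 0.3 * x[:, 1]**2)

deg = 5
x = rng.uniform(-1.0, 1.0, size=(2000, 2))
V = L.legvander2d(x[:, 0], x[:, 1], [deg, deg])   # tensor Legendre basis
coef, *_ = np.linalg.lstsq(V, model(x), rcond=None)
c = coef.reshape(deg + 1, deg + 1)                # PCE coefficient matrix

k = np.arange(deg + 1)
w = 1.0 / (2.0 * k + 1.0)                         # E[P_k^2] under U(-1, 1)

def dgsm(c, axis):
    cd = L.legder(c, axis=axis)                   # PCE of dM/dx_i
    pad = [(0, c.shape[0] - cd.shape[0]), (0, c.shape[1] - cd.shape[1])]
    cd = np.pad(cd, pad)
    # Orthogonality: E[(sum cd_kl P_k P_l)^2] = sum cd_kl^2 w_k w_l.
    return float((cd**2 * np.outer(w, w)).sum())

print(f"DGSM_1 ~ {dgsm(c, 0):.4f}, DGSM_2 ~ {dgsm(c, 1):.4f}")
```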