
    Reliability-based design optimization of shells with uncertain geometry using adaptive Kriging metamodels

    Optimal design under uncertainty has gained much attention in the past ten years due to the ever-increasing need for manufacturers to build robust systems at the lowest cost. Reliability-based design optimization (RBDO) allows the analyst to minimize some cost function while ensuring minimum performance levels, cast as admissible failure probabilities for a set of performance functions. In order to address real-world engineering problems in which the performance is assessed through computational models (e.g., finite element models in structural mechanics), metamodeling techniques have been developed in the past decade. This paper introduces adaptive Kriging surrogate models to solve the RBDO problem. The latter is cast in an augmented space that "sums up" the range of the design space and the aleatory uncertainty in the design parameters and the environmental conditions. The surrogate model is used (i) for evaluating robust estimates of the failure probabilities (and for enhancing the computational experimental design by adaptive sampling) in order to achieve the requested accuracy and (ii) for applying a gradient-based optimization algorithm to obtain optimal values of the design parameters. The approach is applied to the optimal design of ring-stiffened cylindrical shells used in submarine engineering under uncertain geometric imperfections. For this application the performance of the structure is related to buckling, which is addressed here by means of a finite element solution based on the asymptotic numerical method.
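    The RBDO formulation above can be illustrated with a deliberately tiny sketch: a single design variable d, a toy analytic limit state g(d, X) = d - X with X ~ N(0, 1) standing in for the Kriging/finite-element model, and a grid search standing in for the gradient-based optimizer. All names and numbers here are illustrative, not the paper's.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def failure_prob(d):
    # Toy limit state g(d, X) = d - X; failure when g <= 0 with X ~ N(0, 1),
    # so P_f(d) = P(X >= d) = 1 - Phi(d) (analytic here, a surrogate in practice).
    return 1.0 - phi(d)

P_ADM = 1e-3  # admissible failure probability (illustrative)
candidates = [round(0.01 * k, 2) for k in range(0, 501)]  # design grid d in [0, 5]
# Cost is simply d here, so the RBDO optimum is the smallest feasible d.
d_opt = min(d for d in candidates if failure_prob(d) <= P_ADM)
print(d_opt)  # 3.1, close to the exact quantile Phi^{-1}(1 - 1e-3) = 3.0902
```

In the paper's setting the analytic failure_prob is replaced by an adaptively refined Kriging estimate, which is the expensive part this sketch elides.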

    Bayesian subset simulation

    We consider the problem of estimating a probability of failure α, defined as the volume of the excursion set of a function f: X ⊆ R^d → R above a given threshold, under a given probability measure on X. In this article, we combine the popular subset simulation algorithm (Au and Beck, Probab. Eng. Mech. 2001) and our sequential Bayesian approach for the estimation of a probability of failure (Bect, Ginsbourger, Li, Picheny and Vazquez, Stat. Comput. 2012). This makes it possible to estimate α when the number of evaluations of f is very limited and α is very small. The resulting algorithm is called Bayesian subset simulation (BSS). A key idea, as in the subset simulation algorithm, is to estimate the probabilities of a sequence of excursion sets of f above intermediate thresholds, using a sequential Monte Carlo (SMC) approach. A Gaussian process prior on f is used to define the sequence of densities targeted by the SMC algorithm, and to drive the selection of evaluation points of f to estimate the intermediate probabilities. Adaptive procedures are proposed to determine the intermediate thresholds and the number of evaluations to be carried out at each stage of the algorithm. Numerical experiments illustrate that BSS achieves significant savings in the number of function evaluations with respect to other Monte Carlo approaches.
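    The level-splitting idea that BSS inherits from subset simulation can be sketched in plain Monte Carlo terms. This is a minimal 1-D sketch with an illustrative threshold and a simple Metropolis move; the Gaussian process ingredient of BSS is not shown.

```python
import math, random

random.seed(0)

def f(x):            # performance function (1-D toy)
    return x

T = 3.5              # final threshold; target p = P(f(X) > T), X ~ N(0, 1)
N, p0 = 2000, 0.1    # samples per level, conditional level probability

xs = [random.gauss(0.0, 1.0) for _ in range(N)]
p_est = 1.0
while True:
    ys = sorted((f(x) for x in xs), reverse=True)
    n_seed = int(p0 * N)
    level = ys[n_seed - 1]           # intermediate threshold (p0-quantile)
    if level >= T:                   # last level: count exceedances of T
        p_est *= sum(y > T for y in ys) / N
        break
    p_est *= p0
    # Regenerate N samples conditional on f(x) > level by Metropolis moves
    # started from the seeds that already exceed the level.
    seeds = [x for x in xs if f(x) >= level][:n_seed]
    xs = []
    for s in seeds:
        x = s
        for _ in range(N // n_seed):
            cand = x + random.gauss(0.0, 1.0)
            # standard-normal acceptance ratio, rejected if below the level
            if random.random() < math.exp((x * x - cand * cand) / 2.0) and f(cand) > level:
                x = cand
            xs.append(x)
print(p_est)   # roughly 2.3e-4 (exact value: 1 - Phi(3.5) = 2.33e-4)
```

BSS replaces the raw f evaluations above with Gaussian-process predictions, spending actual evaluations only where they inform the intermediate probabilities.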

    Reliability and reliability sensitivity analysis of structure by combining adaptive linked importance sampling and Kriging reliability method

    The application of reliability analysis and reliability sensitivity analysis methods to complicated structures faces two main challenges: small failure probabilities (typically less than 10^-5) and time-consuming mechanical models. This paper proposes an improved active learning surrogate model method, which combines the advantages of the classical Active Kriging – Monte Carlo Simulation (AK-MCS) procedure and the Adaptive Linked Importance Sampling (ALIS) procedure. The proposed procedure can, on the one hand, adaptively produce a series of intermediate sampling densities approaching the quasi-optimal Importance Sampling (IS) density and, on the other hand, adaptively generate a set of intermediate surrogate models approaching the true failure surface of the rare failure event. The small failure probability and the corresponding reliability sensitivity indices are then efficiently estimated by their IS estimators based on the quasi-optimal IS density and the surrogate models. Compared with the classical AK-MCS and Active Kriging – Importance Sampling (AK-IS) procedures, the proposed method neither needs to build a very large sample pool even when the failure probability is extremely small, nor needs to estimate the Most Probable Points (MPPs); it is thus computationally more efficient and more applicable, especially for problems with multiple MPPs. The effectiveness and engineering applicability of the proposed method are demonstrated by one numerical test example and two engineering applications.
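    The active-learning ingredient borrowed from AK-MCS is the U learning function of Echard et al., which flags the candidate point whose sign of the performance function is most doubtful under the Kriging prediction. A minimal sketch with made-up predictions follows; the ALIS coupling is not shown.

```python
# U learning function from AK-MCS: U(x) = |mu(x)| / sigma(x).
# A small U means the Kriging mean is close to the limit state g = 0
# relative to its predictive uncertainty, i.e. the sign of g(x) is
# most likely to be misclassified there.
def u_function(mu, sigma):
    return abs(mu) / sigma

# hypothetical surrogate predictions (mu, sigma) at four candidate points
candidates = [(-2.0, 0.1), (0.3, 0.2), (1.5, 1.5), (4.0, 0.5)]
scores = [u_function(m, s) for m, s in candidates]
best = min(range(len(candidates)), key=lambda i: scores[i])
print(best, scores[best])  # index 2 with U = 1.0: the most ambiguous sign
```

The selected point is then evaluated on the true model and the surrogate is retrained, until min U exceeds a stopping threshold (commonly 2 in the AK-MCS literature).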

    Reliability assessment of cutting tool life based on surrogate approximation methods

    A novel approach to estimating the reliability of cutting tools based on advanced approximation methods is proposed. Methods such as stochastic response surfaces and surrogate modeling are tested, starting from a few sample points obtained through fundamental experiments and extending them to models able to estimate tool wear as a function of the key process parameters. Subsequently, different reliability analysis methods are employed, such as Monte Carlo simulation and first- and second-order reliability methods. In the present study, these reliability analysis methods are assessed for estimating the reliability of cutting tools. The results show that the proposed method is an efficient way of assessing the reliability of a cutting tool from a minimum number of experimental results. Experimental verification for the case of high-speed turning confirms the findings of the present study for cutting tools under flank wear.
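    As a sketch of the Monte Carlo ingredient, one can propagate random scatter through a hypothetical power-law flank-wear model and count exceedances of a wear criterion. The model form and all numbers below are illustrative assumptions, not the paper's fitted surrogate.

```python
import math, random

random.seed(1)

# Hypothetical flank-wear model VB(t) = C * t**0.6, with lognormal scatter
# on C representing material and process variability (numbers illustrative).
def flank_wear(t, c):
    return c * t ** 0.6

VB_MAX = 0.3          # flank-wear failure criterion [mm]
T_CUT = 10.0          # cutting time of interest [min]
N = 100_000
failures = 0
for _ in range(N):
    c = math.exp(random.gauss(math.log(0.05), 0.15))   # lognormal C
    if flank_wear(T_CUT, c) >= VB_MAX:
        failures += 1
reliability = 1.0 - failures / N
print(reliability)   # estimated probability that the tool survives T_CUT
```

FORM/SORM, also assessed in the paper, would instead locate the most probable failure point of this limit state and approximate the tail probability analytically.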

    Metamodel-based importance sampling for the simulation of rare events

    In the field of structural reliability, the Monte Carlo estimator is considered the reference probability estimator. However, it remains intractable for real engineering cases since it requires a large number of runs of the model. In order to reduce the number of computer experiments, many other approaches, known as reliability methods, have been proposed. One such approach consists in replacing the original experiment by a surrogate which is much faster to evaluate. Nevertheless, it is often difficult (or even impossible) to quantify the error made by this substitution. In this paper an alternative approach is developed. It takes advantage of Kriging metamodeling and importance sampling techniques. The proposed alternative estimator is finally applied to a finite-element-based structural reliability analysis.
    Comment: 8 pages, 3 figures, 1 table. Preprint submitted to the ICASP11 mini-symposium "Meta-models/surrogate models for uncertainty propagation, sensitivity and reliability analysis".
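    The importance-sampling ingredient can be sketched on a toy rare event with a proposal density shifted toward the failure region. This is plain IS with a hand-picked Gaussian proposal, not the Kriging-built quasi-optimal density of the paper.

```python
import math, random

random.seed(2)

# Estimate p = P(X > 4) for X ~ N(0, 1); true value 1 - Phi(4) = 3.17e-5.
# Proposal q = N(4, 1) is centred on the failure region; each failing sample
# is weighted by the density ratio phi(x) / q(x) to keep the estimator unbiased.
T, MU_Q, N = 4.0, 4.0, 50_000

def log_phi(x, mu=0.0):
    return -0.5 * (x - mu) ** 2 - 0.5 * math.log(2.0 * math.pi)

acc = 0.0
for _ in range(N):
    x = random.gauss(MU_Q, 1.0)
    if x > T:
        acc += math.exp(log_phi(x) - log_phi(x, MU_Q))   # importance weight
p_is = acc / N
print(p_is)   # ~3.2e-5; crude Monte Carlo would need millions of model runs
```

In the paper, the role of the shifted proposal is played by a quasi-optimal density built from the Kriging surrogate, and the weights correct for the surrogate's classification error.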

    Failure Probability Estimation and Detection of Failure Surfaces via Adaptive Sequential Decomposition of the Design Domain

    We propose an algorithm for the optimal adaptive selection of points from the design domain of input random variables that are needed for an accurate estimation of failure probability and the determination of the boundary between the safe and failure domains. The method is particularly useful when each evaluation of the performance function g(x) is very expensive and the function can be characterized as either highly nonlinear, noisy, or even discrete-state (e.g., binary). In such cases, only a limited number of calls is feasible, and gradients of g(x) cannot be used. The input design domain is progressively segmented by expanding and adaptively refining a mesh-like, lock-free geometrical structure. The proposed triangulation-based approach effectively combines the features of simulation and approximation methods. The algorithm performs two independent tasks: (i) the estimation of probabilities through an ingenious combination of deterministic cubature rules and the application of the divergence theorem and (ii) the sequential extension of the experimental design with new points. The sequential selection of points from the design domain for future evaluation of g(x) is carried out through a new learning function, which maximizes the instantaneous information gain in terms of the probability classification of the corresponding local region. The extension may be halted at any time, e.g., when sufficiently accurate estimates are obtained. Due to the use of an exact geometric representation in the input domain, the algorithm is most effective for problems of low dimension, not exceeding eight. The method can handle random vectors with correlated non-Gaussian marginals. The estimation accuracy can be improved by employing a smooth surrogate model. Finally, we define new factors of global sensitivity to failure based on the entire failure surface weighted by the density of the input random vector.
    Comment: 42 pages, 24 figures
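    A one-dimensional analogue of the domain-decomposition idea can be sketched as follows: split the input range into cells, classify each cell from the sign of g at its vertices, and sum the exact probability content of failure cells, marking sign-change cells for refinement. The limit state and numbers are illustrative, and the real method works on simplices in several dimensions.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def g(x):                 # performance function; failure where g(x) <= 0
    return 2.5 - x        # toy limit state: fails for x >= 2.5

lo, hi, n = -6.0, 6.0, 1000
p_fail, p_mixed = 0.0, 0.0
for i in range(n):
    a = lo + (hi - lo) * i / n
    b = lo + (hi - lo) * (i + 1) / n
    ga, gb = g(a), g(b)
    mass = phi(b) - phi(a)      # exact probability content of the cell
    if ga <= 0 and gb <= 0:
        p_fail += mass          # whole cell classified as failure
    elif ga * gb < 0:
        p_mixed += mass         # cell straddles the failure surface: refine here
print(p_fail, p_fail + p_mixed)  # bounds bracketing 1 - Phi(2.5) = 6.21e-3
```

The paper's learning function plays the role of the `p_mixed` bookkeeping above: it directs the next expensive evaluation of g to the region whose classification is most uncertain.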