
    A nonlinear wavelet density-based importance sampling for reliability analysis

    Importance sampling is a commonly used variance-reduction technique for estimating the reliability of a structural system. Its performance depends critically on the choice of the sampling density. In the commonly used adaptive importance sampling method, the sampling density is constructed by kernel-based density estimation; however, the choice of the initial bandwidth of the local windows may heavily affect the accuracy of the kernel method, particularly when the number of samples is not very large. To overcome this difficulty, this study develops a new adaptive importance sampling method based on a nonlinear wavelet thresholding density estimator. The method uses adaptive Markov chain simulation to generate samples that adaptively populate the important region; the importance sampling density is then constructed from these samples by nonparametric wavelet density estimation. The method takes advantage of the attractive properties of the Daubechies wavelet family (e.g., localization, various degrees of smoothness, and fast implementation) to provide good density estimates. Compared with the kernel density estimator, the nonlinear wavelet thresholding estimator offers greater flexibility in convergence rate and smoothness, and its accuracy is only slightly affected by the choice of the initial parameters. Two examples are given to demonstrate the proposed method.
    This paper appears in the proceedings of ICASP12, the 12th International Conference on Applications of Statistics and Probability in Civil Engineering, held in Vancouver, Canada on July 12-15, 2015. Abstracts were peer-reviewed, authors of accepted abstracts were invited to submit full papers, and the full papers were also peer-reviewed. The editor of the collection is Professor Terje Haukaas, Department of Civil Engineering, UBC Vancouver.
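The core importance-sampling estimator the abstract builds on can be sketched in a few lines. Here a simple Gaussian proposal density stands in for the paper's wavelet-thresholding density, and the limit state function is a hypothetical one-dimensional example; only the weighting scheme is the point.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Hypothetical limit state: failure when x > 3 for standard normal input.
    return 3.0 - x

# Proposal density h centred near the failure region (a stand-in for the
# paper's wavelet-thresholding density built from Markov-chain samples).
mu_h, sig_h = 3.0, 1.0
n = 100_000
x = rng.normal(mu_h, sig_h, n)                        # samples from h
phi = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)        # nominal density f
h = np.exp(-0.5 * ((x - mu_h) / sig_h) ** 2) / (sig_h * np.sqrt(2 * np.pi))
w = phi / h                                           # importance weights f/h
pf = np.mean((g(x) <= 0) * w)                         # P_f = E_h[I(g<=0) f/h]
print(f"IS estimate of P_f: {pf:.3e}")                # exact: 1 - Phi(3) = 1.350e-03
```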

    Efficient structural reliability analysis via a weak-intrusive stochastic finite element method

    This paper presents a novel methodology for structural reliability analysis by means of the stochastic finite element method (SFEM). The key issue in structural reliability analysis is to determine the limit state function and the corresponding multidimensional integral, which are usually related to the structural stochastic displacement and/or its derivatives, e.g., the stress and strain. In this paper, a novel weak-intrusive SFEM is first used to calculate the structural stochastic displacements at all spatial positions. In this method, the stochastic displacement is decoupled into a combination of a series of deterministic displacements with random-variable coefficients, and an iterative algorithm is given to solve for the deterministic displacements and the corresponding random variables. Based on the stochastic displacement obtained by the SFEM, the limit state function described by the stochastic displacement (and/or its derivatives) and the corresponding multidimensional integral encountered in reliability analysis can be calculated in a straightforward way. Failure probabilities at all spatial positions can be obtained at once, since the stochastic displacements of all spatial points are already known from the proposed SFEM. Furthermore, the proposed method applies to high-dimensional stochastic problems without any modification, so the curse of dimensionality, one of the most challenging problems in high-dimensional reliability analysis, is circumvented. Three numerical examples, including low- and high-dimensional reliability analyses, are given to demonstrate the good accuracy and high efficiency of the proposed method.
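The decoupling idea can be illustrated with a toy sketch: once the deterministic displacement components and the random coefficients are available (both are hypothetical stand-ins below, not the paper's iterative solution), failure probabilities at every spatial position fall out of one vectorized pass.

```python
import numpy as np

rng = np.random.default_rng(1)

# Suppose the weak-intrusive SFEM iteration has produced k deterministic
# displacement vectors d_i (one entry per node) and random coefficients
# lambda_i(xi); both are illustrative stand-ins here.
n_nodes, k, n_mc = 50, 3, 20_000
d = rng.normal(size=(k, n_nodes))           # deterministic components d_i
xi = rng.normal(size=(n_mc, k))             # samples of the random variables
lam = 0.1 * xi                              # coefficients lambda_i(xi)

# Stochastic displacement at every node for every sample: u = sum_i lam_i d_i
u = lam @ d                                 # shape (n_mc, n_nodes)

# Limit state at each node: failure if |u| exceeds a threshold u_max.
u_max = 0.5
pf_all_nodes = np.mean(np.abs(u) > u_max, axis=0)   # one P_f per node
print(pf_all_nodes.shape)                   # (50,) -- all positions at once
```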

    A new method for stochastic analysis of structures under limited observations

    Reasonable modeling of non-Gaussian system inputs from limited observations and efficient propagation of the system response are of great significance in the uncertainty analysis of real engineering problems. In this paper, we develop a new method for the construction of a non-Gaussian random model and the associated propagation of the response under limited observations. The method first develops a new kernel density estimation (KDE)-based random model built on the Karhunen-Loeve (KL) expansion of observations of the uncertain parameters. By further implementing the arbitrary polynomial chaos (aPC) formulation on the KL vector with a dependent measure, the associated aPC-based response propagation is then developed. The KDE-based model can accurately represent the input parameters from limited observations, since the new KDE of the KL vector incorporates the inherent relation between the marginals of the input parameters and the distribution of the univariate KL variables. In addition, the aPC formulation can be determined effectively by virtue of the mixture representation of the developed KDE of the KL vector. Furthermore, the system response can be propagated in a stable and accurate way with the developed D-optimal weighted regression method, owing to the equivalence between the distribution of the underlying aPC variables and that of the KL vector. In this way, the current work provides an effective framework for reasonable stochastic modeling and efficient response propagation of real-life engineering systems with limited observations. Two numerical examples, including the analysis of structures subjected to random seismic ground motion, are presented to highlight the effectiveness of the proposed method.
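A minimal sketch of the first two steps, under the assumption that the discrete KL expansion is obtained from an eigendecomposition of the sample covariance and that a plain Gaussian KDE stands in for the paper's KDE model of the KL vector:

```python
import numpy as np

rng = np.random.default_rng(2)

# Limited observations: n_obs realisations of a d-dimensional input
# (synthetic AR(1)-correlated data, purely illustrative).
n_obs, d = 30, 8
cov_true = 0.5 ** np.abs(np.subtract.outer(np.arange(d), np.arange(d)))
obs = rng.multivariate_normal(np.zeros(d), cov_true, size=n_obs)

# Discrete Karhunen-Loeve expansion = eigendecomposition of sample covariance.
mean = obs.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(obs, rowvar=False))
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]
m = 3                                        # retained KL modes
eta = (obs - mean) @ evecs[:, :m]            # KL coordinates of each observation

# Gaussian KDE of the KL vector: sampling = pick an observed KL point,
# perturb it by the kernel bandwidth (Scott-type rule of thumb).
bw = n_obs ** (-1.0 / (m + 4))
idx = rng.integers(0, n_obs, size=5000)
eta_new = eta[idx] + bw * eta.std(axis=0) * rng.normal(size=(5000, m))
x_new = mean + eta_new @ evecs[:, :m].T      # new realisations of the input
print(x_new.shape)                           # (5000, 8)
```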

    A new maximum entropy-based importance sampling for reliability analysis

    Importance sampling can be highly efficient if a good importance sampling density is constructed. Although parametric sampling densities centered on the design points are often good choices, determining the design points can itself be a difficult and inefficient task, especially for problems with multiple design points or highly nonlinear limit state functions. This paper introduces a nonparametric importance sampling method based on Markov chain simulation and maximum-entropy density estimation (MEDE). In the proposed method, Markov chain simulation is used to generate samples that are asymptotically distributed according to the optimal importance sampling density. A nonparametric estimate of the optimal importance sampling density is then obtained using the MEDE technique. The conventional MEDE method is difficult to apply to multi-dimensional problems, as it requires solving a set of simultaneous nonlinear integral equations; this paper develops a new MEDE technique for multivariate datasets. The method starts by approximating the density with a histogram. The multi-dimensional histogram is converted into a series of one-dimensional conditional PDFs in each dimension, and the density is reconstructed by means of an orthogonal expansion. The solution of the MEDE problem is thus reduced to a set of coefficients of Legendre polynomials. The new importance sampling method is illustrated and compared with classical kernel-based importance sampling using a number of numerical and structural examples.
    Funders: National Natural Science Foundation of China; Australian Research Council.
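The orthogonal-expansion step can be illustrated in one dimension: a density supported on [-1, 1] is represented by Legendre coefficients estimated directly from samples. The data below are synthetic, and the simple moment-based coefficients stand in for the paper's MEDE solution:

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(3)

# Samples on [-1, 1] whose density we want to reconstruct (toy data here;
# in the paper these would be Markov-chain samples of the optimal IS density).
x = np.clip(rng.normal(0.3, 0.3, 50_000), -1, 1)

# Legendre-series density estimate: f(x) ~ sum_j c_j P_j(x), with
# c_j = (2j + 1)/2 * E[P_j(X)] by orthogonality, estimated by sample means.
J = 12
coef = np.zeros(J + 1)
for j in range(J + 1):
    e = np.zeros(j + 1)
    e[j] = 1.0
    coef[j] = (2 * j + 1) / 2.0 * legendre.legval(x, e).mean()

grid = np.linspace(-1, 1, 201)
f_hat = legendre.legval(grid, coef)
integral = 2.0 * f_hat.mean()       # Riemann check: should be close to 1
print(f"integral of density estimate: {integral:.3f}")
```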

    Scalable risk assessment of large infrastructure systems with spatially correlated components

    Risk assessment of spatially distributed infrastructure systems under natural hazards should treat the performance of individual components as stochastically correlated, because common engineering practice within a community (similarities in building design codes, regulatory practices, construction materials, construction technologies, and the practices of local contractors) induces such correlation. Modelling the spatially correlated damage of an infrastructure system with many components can be computationally expensive. This study addresses the scalability of risk analysis for large systems by developing an interpolation technique. The basic idea is to sample a portion of the components in the system and evaluate their correlated damage accurately, while the damage of the remaining components is interpolated from the sampled components. The new method can handle not only linear systems but also systems with complex connectivity, such as utility networks. Two examples are presented to demonstrate the proposed method: cyclone loss assessment of the building portfolios in a virtual community, and connectivity analysis of an electric power system under a scenario cyclone event.
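The sample-and-interpolate idea can be sketched as follows; the exponential correlation model, the Gaussian-copula damage indicator, and the nearest-neighbour interpolation rule are illustrative assumptions, not the paper's calibrated models:

```python
import numpy as np

rng = np.random.default_rng(4)

# Component locations; damage correlation decays with distance
# (exponential correlation model, illustrative only).
n = 400
xy = rng.uniform(0, 10, size=(n, 2))
dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
corr = np.exp(-dist / 3.0)

# Evaluate correlated damage exactly for a sampled subset of components...
n_s = 80
sampled = rng.choice(n, size=n_s, replace=False)
L = np.linalg.cholesky(corr[np.ix_(sampled, sampled)] + 1e-9 * np.eye(n_s))
z = L @ rng.normal(size=n_s)                 # correlated Gaussian field
damage_sampled = z > 1.0                     # threshold exceedance -> damaged

# ...and interpolate each remaining component from its nearest sampled one.
nearest_pos = np.argmin(dist[:, sampled], axis=1)
damage_all = damage_sampled[nearest_pos]     # sampled components map to themselves
print(damage_all.shape)                      # (400,)
```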

    Formulation of Ice Resistance in Level Ice Using Double-Plates Superposition

    The estimation of ship resistance in ice is a fundamental area of research and poses a substantial challenge for the design and safe operation of ships in ice-covered waters. To estimate ice resistance more reliably, this paper develops an improved Lindqvist formulation for the bending resistance in level ice based on the superposition of double plates. In the developed method, an approximate model of an ice sheet is first presented by idealizing the ice sheet as the combination of a semi-infinite elastic plate and an infinite elastic plate resting on an elastic foundation. The Mohr-Coulomb criterion is then introduced to determine the failure of the ice sheet. Finally, an improved Lindqvist formulation for the estimation of ice resistance is proposed. The accuracy of the developed formulation is validated using full-scale test data from the ship KV Svalbard in Norway, as well as model tests and a numerical method. The effects of ice thickness, stem angle, and bow breadth on ship resistance are further investigated by means of the developed formulation.
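For the plate-on-elastic-foundation idealization, the characteristic length of a floating ice sheet (with the water buoyancy acting as the foundation stiffness) is a standard quantity; a quick sketch with illustrative ice properties, not values from the paper:

```python
# Characteristic length of a floating ice sheet modelled as an elastic
# plate on an elastic foundation, where the foundation stiffness is the
# buoyancy of the displaced water. All values are illustrative.
E = 5.0e9        # Young's modulus of ice [Pa]
h = 0.5          # ice thickness [m]
nu = 0.3         # Poisson's ratio of ice
rho_w = 1025.0   # sea-water density [kg/m^3]
g = 9.81         # gravitational acceleration [m/s^2]

D = E * h**3 / (12.0 * (1.0 - nu**2))   # flexural rigidity of the plate
l_c = (D / (rho_w * g)) ** 0.25         # characteristic length [m]
print(f"l_c = {l_c:.2f} m")
```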

    A new unbiased metamodel method for efficient reliability analysis

    The metamodel method is widely used in structural reliability analysis. A main limitation of this method is that it is difficult, or even impossible, to quantify the model uncertainty caused by the metamodel approximation. This paper develops an improved metamodel method that is unbiased and highly efficient. The new method formulates the probability of failure as the product of a metamodel-based probability of failure and a correction term that accounts for the approximation error of the metamodel. The correction term is constructed and estimated using Markov chain simulation. An iterative scheme is further developed to adaptively improve the accuracy of the metamodel and the associated correction term. The accuracy and efficiency of the new metamodel method are illustrated and compared with the classical Kriging metamodel and high-dimensional model representation methods using a number of numerical and structural examples.
    Funders: National Natural Science Foundation of China; Australian Research Council.
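The product-form estimator can be sketched on a one-dimensional toy problem. The surrogate below is a deliberately biased analytical stand-in for a Kriging metamodel, and this simplified version estimates the correction by plain conditional sampling rather than the paper's Markov chain scheme; it also assumes the surrogate failure region covers the true one:

```python
import numpy as np

rng = np.random.default_rng(5)

def g(x):          # true (expensive) limit state: failure when g <= 0
    return 2.5 - x

def g_hat(x):      # cheap surrogate with a deliberate bias
    return 2.4 - x

n = 200_000
x = rng.normal(size=n)
fail_hat = g_hat(x) <= 0
pf_meta = fail_hat.mean()                    # metamodel-based P_f

# Correction term: P(g <= 0 | g_hat <= 0), estimated only on the much
# smaller set of surrogate-failure samples, so the expensive model g is
# evaluated far less often than in plain Monte Carlo.
correction = (g(x[fail_hat]) <= 0).mean()

pf = pf_meta * correction                    # product-form estimate
print(f"corrected P_f ~ {pf:.2e}")           # exact: 1 - Phi(2.5) = 6.21e-03
```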