
    A Bayesian model updating procedure for dynamic health monitoring

    Structures under dynamic excitation can undergo crack growth and fracture. For safety reasons, it is of key importance to detect and classify these cracks before structural failure occurs without warning. In addition, cracks change the dynamic behaviour of structures, impacting their performance. Here, a Bayesian model updating procedure has been implemented for crack detection, location, and length estimation on a numerical model of a spring suspension arm. A high-fidelity finite element model has been used to simulate experimental data, by inserting cracks of different extent at different locations and obtaining reference frequency response functions. Subsequently, a low-fidelity parametric model has been used in the Bayesian framework to infer the crack location and length by comparing the dynamic responses. It is shown that the proposed methodology can be successfully adopted as a structural health monitoring tool.
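    A minimal sketch of the kind of updating loop described above, assuming a hypothetical low_fidelity_frf(location, length) parametric model and a simulated reference frequency response function; a Gaussian likelihood penalises the discrepancy between the two response curves and a random-walk Metropolis sampler explores the crack parameters. This is an illustration of the general approach, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
freqs = np.linspace(10.0, 200.0, 100)            # frequency axis [Hz]

def low_fidelity_frf(location, length):
    # Hypothetical parametric stand-in for the low-fidelity arm model:
    # a crack softens the structure (shifting a resonance) and adds damping.
    f0 = 120.0 - 30.0 * length * location
    zeta = 0.02 + 0.05 * length
    return 1.0 / np.sqrt((1 - (freqs / f0) ** 2) ** 2 + (2 * zeta * freqs / f0) ** 2)

# "Experimental" reference FRF from an assumed true crack state plus noise
true_loc, true_len = 0.6, 0.3
frf_ref = low_fidelity_frf(true_loc, true_len) + rng.normal(0, 0.05, freqs.size)

def log_likelihood(theta, sigma=0.05):
    loc, length = theta
    if not (0.0 <= loc <= 1.0 and 0.0 <= length <= 1.0):
        return -np.inf                           # uniform prior support
    resid = frf_ref - low_fidelity_frf(loc, length)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis over crack location and length
theta = np.array([0.5, 0.5])
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.05, 2)
    if np.log(rng.uniform()) < log_likelihood(prop) - log_likelihood(theta):
        theta = prop
    samples.append(theta.copy())

post = np.array(samples[1000:])                  # discard burn-in
print("posterior mean location/length:", post.mean(axis=0))
```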

    Sensitivity Analysis of Material and Load Parameters to Fatigue Stresses of an Offshore Wind Turbine Monopile Substructure

    Steel monopiles are the support structures most commonly applied for offshore wind turbines. Their installation is straightforward, in particular in shallow and medium water depths. While the wind turbine tower is primarily affected by wind, wave loads are dominant for the monopile, as it is submerged to a large extent. This study deals with the influence of uncertainties in material and load parameters on the behaviour of such structures. It is investigated how the scattering of material properties (namely Young's modulus of elasticity) affects the structural response. In addition, loads with different characteristics are applied, and it is examined how the changes in loads influence the structural response. The output data of interest are the extreme stresses leading to the accumulation of fatigue damage. To achieve realistic modelling, wave loads are represented by irregular sea states with different wave characteristics (significant wave heights and wave peak periods). The final aim of the analysis is to classify the effects of specific wave characteristics on the stresses by means of a sensitivity analysis. The analysis shows that variations in the wave peak period have the strongest influence on the stress outputs. This effect results from the strong sensitivity of the structural dynamic response to the decreasing difference between the wave peak frequency and the natural frequencies of the structure. © 2017 The Authors. Published by Elsevier Ltd.
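    A minimal sketch of a sampling-based sensitivity study in the spirit of the one described above, assuming a toy stress_response(E, Hs, Tp) stand-in for the monopile model; the relative influence of Young's modulus, significant wave height, and wave peak period on an extreme-stress output is ranked with a simple rank-correlation indicator. All numerical values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Assumed input scatter (illustrative ranges, not design values)
E  = rng.normal(210e9, 5e9, n)        # Young's modulus [Pa]
Hs = rng.uniform(1.0, 6.0, n)         # significant wave height [m]
Tp = rng.uniform(5.0, 14.0, n)        # wave peak period [s]

f_nat = 0.28                          # assumed first natural frequency [Hz]

def stress_response(E, Hs, Tp):
    # Toy stand-in for the monopile extreme stress: wave load grows with Hs,
    # and dynamic amplification grows as the peak frequency 1/Tp nears f_nat.
    f_wave = 1.0 / Tp
    daf = 1.0 / np.abs(1.0 - (f_wave / f_nat) ** 2)
    return 1e-6 * Hs * daf * (210e9 / E)

sigma = stress_response(E, Hs, Tp)

def rank_corr(x, y):
    # Spearman rank correlation as a simple global sensitivity indicator
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

for name, x in [("E", E), ("Hs", Hs), ("Tp", Tp)]:
    print(f"sensitivity of extreme stress to {name}: {rank_corr(x, sigma):+.2f}")
```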

    Model Updating Strategy of the DLR-AIRMOD Test Structure

    Considerable progress has been made in computer-aided engineering for the high-fidelity analysis of structures and systems. Traditionally, computer models are calibrated using deterministic procedures. However, different analysts produce different models based on different modelling approximations and assumptions. In addition, identically constructed structures and systems show different characteristics from each other. Hence, model updating needs to take into account modelling and test-data variability. Stochastic model updating techniques, such as the sensitivity approach and Bayesian updating, are now recognised as powerful approaches able to deal with unavoidable uncertainty and variability. This paper presents a high-fidelity surrogate model that significantly reduces the computational costs associated with the Bayesian model updating technique. A set of artificial neural networks is proposed to replace the multiple non-linear input-output relationships of finite element (FE) models. An application to updating the parameters of the FE model of the DLR-AIRMOD structure is presented. © 2017 The Authors. Published by Elsevier Ltd.
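    A minimal sketch of the surrogate idea described above, assuming a hypothetical expensive_fe_model(theta) in place of the DLR-AIRMOD finite element run; a small neural network is trained on a modest design of experiments and can then stand in for the model inside a Bayesian (or any other sampling-based) updating loop.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

def expensive_fe_model(theta):
    # Hypothetical stand-in for the FE solve: maps two stiffness-like
    # updating parameters to three "modal frequency" outputs.
    k1, k2 = theta
    return np.array([np.sqrt(k1), np.sqrt(k1 + k2), np.sqrt(2 * k1 + 0.5 * k2)])

# Design of experiments over the updating-parameter box
X = rng.uniform([0.5, 0.5], [2.0, 2.0], size=(300, 2))
Y = np.array([expensive_fe_model(x) for x in X])

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
surrogate.fit(X, Y)

# Once trained, the surrogate replaces the FE model in the updating loop
theta_test = np.array([[1.3, 0.9]])
print("surrogate :", surrogate.predict(theta_test)[0])
print("FE model  :", expensive_fe_model(theta_test[0]))
```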

    Resilience decision-making for complex systems

    Complex systems, such as gas turbines, industrial plants, and infrastructure networks, are of paramount importance to modern societies. However, these systems are subject to various threats. Novel research focuses not only on monitoring and improving the robustness and reliability of systems but also on their recovery from adverse events. The concept of resilience encompasses these developments. Appropriate quantitative measures of resilience can support decision-makers seeking to improve or to design complex systems. In this paper, we develop comprehensive and widely adaptable instruments for resilience-based decision-making. Integrating an appropriate resilience metric with a suitable systemic risk measure, we design numerically efficient tools that aid decision-makers in balancing different resilience-enhancing investments. The approach allows for a direct comparison between failure prevention arrangements and recovery improvement procedures, leading to optimal trade-offs with respect to the resilience of a system. In addition, the method is capable of dealing with the monetary aspects involved in the decision-making process. Finally, a grid search algorithm for systemic risk measures significantly reduces the computational effort. In order to demonstrate its wide applicability, the suggested decision-making procedure is applied to a functional model of a multistage axial compressor and to the U-Bahn and S-Bahn system of Germany's capital Berlin. Copyright © 2020 by ASME.
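    A minimal sketch, not the authors' actual metric, of the trade-off the paper formalises: a simple "area under the performance curve" resilience measure for a toy disruption-and-recovery profile, with a grid search over how a fixed budget is split between failure prevention (a smaller performance drop) and recovery improvement (a faster restoration).

```python
import numpy as np

t = np.linspace(0.0, 100.0, 1001)       # time horizon
t_event = 20.0                          # assumed disruption time
budget = 1.0                            # total investment to allocate

def performance(t, prevention, recovery):
    # Toy performance profile: prevention investment limits the drop depth,
    # recovery investment shortens the restoration time constant.
    drop = 0.6 * np.exp(-2.0 * prevention)
    tau = 25.0 * np.exp(-1.5 * recovery)
    q = np.ones_like(t)
    after = t >= t_event
    q[after] = 1.0 - drop * np.exp(-(t[after] - t_event) / tau)
    return q

def resilience(prevention, recovery):
    # Time-averaged performance, i.e. normalised area under the curve
    return performance(t, prevention, recovery).mean()

# Grid search over budget splits between prevention and recovery
best = max((resilience(p, budget - p), p) for p in np.linspace(0, budget, 101))
print(f"best resilience {best[0]:.4f} with prevention share {best[1]:.2f}")
```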

    Mapping slope deposits depth by means of cluster analysis: a comparative assessment

    In this work, a comparison among slope deposit (SD) depth maps obtained by integrating field measurements of SD depth and cluster analysis of morphometric data has been performed. Three SD depth maps have been obtained for the same area (SA1) by using different approaches. Two maps have been produced by implementing both the supervised and unsupervised approaches and exploiting a dataset of SD depths previously collected in a region (SA2) characterized by the same bedrock lithology, although located 35 km from SA1. The results have been validated against a reference map based on SD depth measurements acquired during this work within SA1 and mapped by unsupervised clustering. The outcome of the study shows the feasibility of the proposed methodology for obtaining SD depth maps. Nevertheless, the very low map accuracy suggests that the relationships between the main morphometric variables and slope deposit depth are not constant at the regional scale, even when considering areas characterized by the same bedrock lithology. Hence, maps of SD depth should be based on depth data specifically acquired within the area under study. In order to improve the exploitation of SD depth datasets outside their provenance area, further research is necessary on the performance of clustering algorithms, as well as on additional morphometric and environmental variables to be employed in the spatial analysis.
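    A minimal sketch of the unsupervised branch of the workflow above, with synthetic stand-ins for both the morphometric grid and the measured SD depths: terrain cells are clustered on their morphometric attributes, and each cluster is then assigned the mean of the depth measurements that fall inside it. The attributes and values are hypothetical placeholders, not the study's data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)

# Synthetic morphometric attributes per terrain cell (slope [deg],
# curvature, elevation [m]) standing in for the real raster of SA1
n_cells = 5000
morpho = np.column_stack([
    rng.uniform(0, 40, n_cells),
    rng.normal(0, 0.5, n_cells),
    rng.uniform(200, 900, n_cells),
])

# Standardise attributes so each variable weighs equally in the clustering
z = (morpho - morpho.mean(axis=0)) / morpho.std(axis=0)
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(z)

# A handful of field measurements of SD depth at known cells (synthetic here)
measured_cells = rng.choice(n_cells, 60, replace=False)
measured_depth = rng.uniform(0.2, 3.0, 60)       # depth [m]

# Assign each cluster the mean measured depth of its member observations
depth_map = np.full(n_cells, np.nan)
for c in range(5):
    mask = km.labels_[measured_cells] == c
    if mask.any():
        depth_map[km.labels_ == c] = measured_depth[mask].mean()

print("cells with an estimated SD depth:", np.isfinite(depth_map).sum())
```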

    The role of the Bhattacharyya distance in stochastic model updating

    The Bhattacharyya distance is a stochastic measure of the separation between two samples that takes into account their probability distributions. The objective of this work is to further generalize the application of the Bhattacharyya distance as a novel uncertainty quantification metric by developing an approximate Bayesian computation model updating framework in which the Bhattacharyya distance is fully embedded. The Bhattacharyya distance between sample sets is evaluated via a binning algorithm, and the approximate likelihood function built upon this distance is then developed in a two-step Bayesian updating framework, where the Euclidean and Bhattacharyya distances are utilized in the first and second steps, respectively. The performance of the proposed procedure is demonstrated with two exemplary applications: a simulated mass-spring example and a rather challenging benchmark problem for uncertainty treatment. These examples demonstrate a gain in the quality of the stochastic updating by utilizing the superior features of the Bhattacharyya distance, which represents a convenient, efficient, and capable metric for stochastic model updating and uncertainty characterization.
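    A minimal sketch of the two ingredients named above: a binning-based estimate of the Bhattacharyya distance between an observed and a simulated sample set, and an approximate likelihood built on that distance (here a simple exponential kernel with a hypothetical width epsilon, not necessarily the form used in the paper).

```python
import numpy as np

def bhattacharyya_distance(x, y, n_bins=20):
    # Bin both sample sets on a common grid and compare the resulting
    # empirical probability masses via the Bhattacharyya coefficient.
    lo, hi = min(x.min(), y.min()), max(x.max(), y.max())
    edges = np.linspace(lo, hi, n_bins + 1)
    px, _ = np.histogram(x, bins=edges)
    py, _ = np.histogram(y, bins=edges)
    px = px / px.sum()
    py = py / py.sum()
    bc = np.sum(np.sqrt(px * py))          # Bhattacharyya coefficient
    return -np.log(max(bc, 1e-12))         # distance, guarded against log(0)

def approximate_log_likelihood(x_obs, x_sim, epsilon=0.05):
    # ABC-style approximate likelihood: small distance -> high likelihood
    return -bhattacharyya_distance(x_obs, x_sim) / epsilon

# Toy demonstration: observed samples vs. two candidate model outputs
rng = np.random.default_rng(4)
obs = rng.normal(1.0, 0.5, 1000)
good = rng.normal(1.05, 0.5, 1000)
poor = rng.normal(2.0, 1.0, 1000)
print("log-likelihood, good candidate:", approximate_log_likelihood(obs, good))
print("log-likelihood, poor candidate:", approximate_log_likelihood(obs, poor))
```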

    Comparison of Bayesian and interval uncertainty quantification: application to the AIRMOD test structure

    This paper concerns the comparison of two inverse methods for the quantification of uncertain model parameters, based on experimentally obtained measurement data of the model's responses. Specifically, Bayesian inference is compared to a novel method for the quantification of multivariate interval uncertainty. The comparison is made by applying both methods to the AIRMOD measurement data set and comparing their results critically in terms of the obtained information and the computational expense. Since the computational cost of applying both methods to high-dimensional problems and realistic numerical models can become intractable, an artificial neural network (ANN) surrogate is used for both methods. The application of this ANN proves to limit the computational cost to a large extent, even when the generation of the training dataset is taken into account. Concerning the comparison of the two methods, it is found that the Bayesian identification provides less over-conservative bounds on the uncertainty in the responses of the AIRMOD model.
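    A minimal sketch of the kind of comparison made above, on a trivial one-output stand-in for the AIRMOD surrogate: interval propagation gives the full response range over a parameter box, while a Bayesian posterior (here simply assumed as samples) gives a narrower credible interval on the same response. The surrogate, interval, and posterior below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

def surrogate_response(k):
    # Hypothetical cheap surrogate: one "modal frequency" from one parameter
    return 10.0 * np.sqrt(k)

# Interval approach: propagate a parameter interval through the surrogate
k_interval = (0.8, 1.4)
k_grid = np.linspace(*k_interval, 200)
resp = surrogate_response(k_grid)
print(f"interval bounds on response : [{resp.min():.2f}, {resp.max():.2f}]")

# Bayesian approach: assume posterior samples of the same parameter
k_post = rng.normal(1.1, 0.08, 5000)
resp_post = surrogate_response(k_post)
lo, hi = np.percentile(resp_post, [2.5, 97.5])
print(f"95% credible interval       : [{lo:.2f}, {hi:.2f}]")
```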

    Do we have enough data? Robust reliability via uncertainty quantification

    A generalised probabilistic framework is proposed for reliability assessment and uncertainty quantification under a lack of data. The developed computational tool allows the effect of epistemic uncertainty to be quantified and has been applied to assess the reliability of an electronic circuit and a power transmission network. The strengths and weaknesses of the proposed approach are illustrated by comparison to traditional probabilistic approaches. In the presence of both aleatory and epistemic uncertainty, classic probabilistic approaches may lead to misleading conclusions and a false sense of confidence that may not fully represent the quality of the available information. In contrast, generalised probabilistic approaches are versatile and powerful when linked to a computational tool that permits their applicability to realistic engineering problems.
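    A minimal sketch of the generalised (double-loop) idea described above: an epistemic interval on a distribution parameter is propagated through a Monte Carlo reliability estimate, yielding bounds on the failure probability rather than a single, possibly over-confident value. The demand/capacity model and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def failure_probability(mu_demand, n_samples=100_000):
    # Aleatory layer: Monte Carlo over a demand with known scatter
    # against a fixed capacity (illustrative values only).
    capacity = 10.0
    demand = rng.normal(mu_demand, 1.5, n_samples)
    return np.mean(demand > capacity)

# Epistemic layer: the demand mean is only known to lie in an interval
mu_interval = np.linspace(6.0, 8.0, 21)
pf = [failure_probability(mu) for mu in mu_interval]

print(f"failure probability bounds: [{min(pf):.2e}, {max(pf):.2e}]")
```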