
    A new adaptive response surface method for reliability analysis

    The response surface method is a convenient tool for assessing reliability in a wide range of structural mechanics problems. More specifically, adaptive schemes, which consist in iteratively refining the experimental design close to the limit state, have received much attention. However, it is generally difficult to take a large number of variables into account and to handle the approximation error properly. The method proposed in this paper addresses these points using a sparse response surface and a relevant criterion for the accuracy of the results. For this purpose, a response surface is built from an initial Latin Hypercube Sampling (LHS), where the most significant terms are chosen using statistical criteria and a cross-validation method. At each step, the LHS is refined in a region of interest defined with respect to an importance level on the probability density at the design point. Two convergence criteria are used in the procedure: the first concerns the localization of the region of interest and the second the quality of the response surface. Finally, a bootstrap method is used to determine the influence of the response-surface error on the estimated probability of failure. The method is applied to several examples and the results are discussed.
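
    As a rough illustration of the ingredients described above (an LHS design, a sparse polynomial surrogate selected by cross-validation, and a bootstrap on the surrogate error), the following Python sketch estimates a failure probability for a hypothetical two-variable limit state. It omits the adaptive refinement around the design point, and the limit-state function, sample sizes, and polynomial degree are illustrative assumptions, not taken from the paper.

        # Minimal sketch, assuming a hypothetical limit state where g(x) < 0 means failure.
        import numpy as np
        from scipy.stats import norm, qmc

        rng = np.random.default_rng(0)

        def g(x):
            # Hypothetical two-variable limit-state function (not from the paper).
            return 3.0 - x[:, 0] - 0.5 * x[:, 1] ** 2

        # 1) Initial experimental design: Latin Hypercube Sampling mapped to standard normal space.
        n_design = 30
        u = norm.ppf(qmc.LatinHypercube(d=2, seed=0).random(n_design))
        y = g(u)

        def basis(u):
            # Full quadratic basis; the sparse surface keeps only some of these columns.
            return np.column_stack([np.ones(len(u)), u[:, 0], u[:, 1],
                                    u[:, 0] ** 2, u[:, 1] ** 2, u[:, 0] * u[:, 1]])

        def loo_error(A, y, cols):
            # Leave-one-out cross-validation error for a given set of basis columns.
            err = 0.0
            for i in range(len(y)):
                mask = np.arange(len(y)) != i
                coef, *_ = np.linalg.lstsq(A[mask][:, cols], y[mask], rcond=None)
                err += (y[i] - A[i, cols] @ coef) ** 2
            return err / len(y)

        # 2) Sparse selection: keep a higher-order term only if it lowers the CV error.
        A, cols = basis(u), [0, 1, 2]
        for j in [3, 4, 5]:
            if loo_error(A, y, cols + [j]) < loo_error(A, y, cols):
                cols = cols + [j]
        coef, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)

        # 3) Failure probability on the surrogate by crude Monte Carlo, plus a bootstrap
        #    of the fitted coefficients to see how surrogate error shifts the estimate.
        B_mc = basis(rng.standard_normal((200_000, 2)))
        pf = np.mean(B_mc[:, cols] @ coef < 0)
        pf_boot = []
        for _ in range(200):
            idx = rng.integers(0, n_design, n_design)
            c_b, *_ = np.linalg.lstsq(A[idx][:, cols], y[idx], rcond=None)
            pf_boot.append(np.mean(B_mc[:, cols] @ c_b < 0))
        print(f"pf ~ {pf:.4f}, bootstrap 90% band: "
              f"[{np.percentile(pf_boot, 5):.4f}, {np.percentile(pf_boot, 95):.4f}]")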

    Probabilistic structural mechanics research for parallel processing computers

    Aerospace structures and spacecraft are a complex assemblage of structural components that are subjected to a variety of complex, cyclic, and transient loading conditions. Significant modeling uncertainties are present in these structures, in addition to the inherent randomness of material properties and loads. To properly account for these uncertainties in evaluating and assessing the reliability of these components and structures, probabilistic structural mechanics (PSM) procedures must be used. Much research has focused on basic theory development and on approximate analytic solution methods in random vibrations and structural reliability. Practical application of PSM methods has been hampered by their computationally intensive nature. Solution of PSM problems requires repeated analyses of structures that are often large and exhibit nonlinear and/or dynamic response behavior. These methods are all inherently parallel and ideally suited to implementation on parallel processing computers. New hardware architectures, innovative control software, and solution methodologies are needed to make the solution of large-scale PSM problems practical.
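
    The point that PSM workloads are inherently parallel can be shown with a minimal Python sketch: each Monte Carlo realisation is an independent structural analysis, so samples can simply be farmed out to worker processes. The analyse() function below is a stand-in closed-form margin, not a real structural solver, and all distributions are assumed for illustration.

        # Minimal sketch of an "embarrassingly parallel" Monte Carlo reliability run.
        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def analyse(seed):
            # One independent structural analysis with random resistance and load (assumed distributions).
            rng = np.random.default_rng(seed)
            resistance = rng.normal(10.0, 1.5)
            load = rng.normal(7.0, 2.0)
            return resistance - load

        if __name__ == "__main__":
            n = 20_000
            with ProcessPoolExecutor() as pool:   # samples are distributed across worker processes
                margins = np.fromiter(pool.map(analyse, range(n), chunksize=500), dtype=float)
            print(f"Estimated probability of failure: {np.mean(margins < 0.0):.4f}")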

    An efficient methodology to estimate probabilistic seismic damage curves

    Incremental dynamic analysis (IDA) is a powerful methodology that can be easily extended to calculate probabilistic seismic damage curves. These curves are key data for assessing the seismic risk of structures. Although this methodology requires considerable computational effort, it should be the reference for correctly estimating the seismic risk of structures. Nevertheless, it would be of high practical interest to have a simpler methodology, based for instance on pushover analysis (PA), that yields results similar to those based on IDA. In this article, PA is used to obtain probabilistic seismic damage curves from the stiffness degradation and the energy of the nonlinear part of the capacity curve. A fully probabilistic methodology is tackled by means of Monte Carlo simulations with the purpose of establishing that the results based on the simplified proposed approach are compatible with those obtained with IDA. Comparisons between the results of both approaches are included for a low- to mid-rise reinforced concrete building. The proposed methodology significantly reduces the computational effort when calculating probabilistic seismic damage curves.
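
    The Python sketch below is a schematic illustration only, not the paper's formulation: it samples capacity-curve parameters by Monte Carlo and converts an assumed damage index (a weighted mix of a stiffness-degradation term and an energy term, echoing the ingredients named above) into a probabilistic damage curve, i.e. the probability of exceeding a damage threshold as a function of seismic demand. The capacity model, distributions, weights, and threshold are all assumptions.

        # Schematic sketch only; all numbers and functional forms are assumptions.
        import numpy as np

        rng = np.random.default_rng(1)
        demands = np.linspace(0.05, 1.0, 20)      # seismic demand, e.g. spectral displacement (m)
        n_sim = 5_000
        threshold = 0.5                           # assumed damage-state threshold on the index

        def damage_index(demand, d_y, d_u, alpha=0.7):
            # Assumed index in [0, 1]: a weighted mix of a stiffness-degradation term
            # and a dissipated-energy term, both growing with inelastic demand.
            ductility = np.clip((demand - d_y) / (d_u - d_y), 0.0, 1.0)
            return alpha * ductility + (1.0 - alpha) * ductility ** 2

        # Monte Carlo sampling of capacity-curve parameters (yield and ultimate displacement).
        d_y = rng.lognormal(mean=np.log(0.05), sigma=0.15, size=n_sim)
        d_u = d_y * rng.lognormal(mean=np.log(4.0), sigma=0.20, size=n_sim)

        # Probabilistic damage curve: exceedance probability of the threshold vs. demand.
        for s in demands:
            p = np.mean(damage_index(s, d_y, d_u) >= threshold)
            print(f"Sd = {s:.2f} m  ->  P(DI >= {threshold}) = {p:.3f}")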

    Synergistic combination of systems for structural health monitoring and earthquake early warning for structural health prognosis and diagnosis

    Earthquake early warning (EEW) systems are currently operating nationwide in Japan and are in beta testing in California. Such a system detects the initiation of an earthquake using online signals from a seismic sensor network and broadcasts a warning of the predicted location and magnitude a few seconds to a minute or so before strong shaking hits a site. Such a system can be used synergistically with installed structural health monitoring (SHM) systems to enhance pre-event prognosis and post-event diagnosis of structural health. For pre-event prognosis, the EEW information can be used to make probabilistic predictions of the anticipated damage to a structure using seismic loss estimation methodologies from performance-based earthquake engineering. These predictions can support decision-making regarding the activation of appropriate mitigation systems, such as stopping traffic from entering a bridge that has a predicted high probability of damage. Since the time between the warning and the arrival of strong shaking is very short, the probabilistic predictions must be calculated rapidly and the decision-making for mitigation actions must be automated. For post-event diagnosis, the SHM sensor data can be used for Bayesian updating of the probabilistic damage predictions, with the EEW-based predictions serving as the prior. Appropriate Bayesian methods for SHM have been published. In this paper, we use pre-trained surrogate models (or emulators) based on machine learning methods to make fast damage and loss predictions, which are then used in a cost-benefit decision framework for activating a mitigation measure. A simple illustrative example of an infrastructure application is presented.
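
    A hedged sketch of the decision step only is given below: given a surrogate-predicted probability of damage from the EEW magnitude and distance estimates, mitigation is activated when the expected loss with the action is lower than without it. The logistic "emulator" stands in for a pre-trained machine-learning surrogate, and the cost figures are made-up illustrative numbers, not values from the paper.

        # Hedged sketch of the activation decision; the emulator and all costs are assumptions.
        import numpy as np

        def surrogate_damage_probability(magnitude, distance_km):
            # Placeholder for a pre-trained emulator: P(damage) rises with magnitude, falls with distance.
            z = 1.8 * (magnitude - 6.0) - 0.04 * distance_km
            return 1.0 / (1.0 + np.exp(-z))

        def activate_mitigation(p_damage, loss_if_damaged, cost_of_action, loss_reduction=0.8):
            # Act if the expected loss with the action (its cost plus the residual damage loss)
            # is lower than the expected loss of doing nothing.
            expected_no_action = p_damage * loss_if_damaged
            expected_action = cost_of_action + p_damage * (1.0 - loss_reduction) * loss_if_damaged
            return expected_action < expected_no_action

        p = surrogate_damage_probability(magnitude=6.8, distance_km=25.0)
        decision = activate_mitigation(p, loss_if_damaged=5e6, cost_of_action=2e4)
        print(f"P(damage) = {p:.2f}; activate mitigation (e.g. close the bridge): {decision}")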

    Scoping study on the significance of mesh resolution vs. scenario uncertainty in the CFD modelling of residential smoke control systems

    Computational fluid dynamics (CFD) modelling is a commonly adopted tool to support the specification and design of common corridor ventilation systems in UK residential buildings. Inputs for the CFD modelling of common corridor ventilation systems are typically premised on a ‘reasonable worst case’, i.e. no specific uncertainty quantification process is undertaken to evaluate the safety level. As such, where the performance of a specific design sits on a probability spectrum is not defined. Furthermore, the mesh cell sizes adopted are typically c. 100–200 mm. For a large eddy simulation (LES) based CFD code, this is considered coarse for this application and creates a further uncertainty with respect to capturing key behaviours in the CFD model. Both co-existing practices summarised above create uncertainty, either due to parameter choice or due to the (computational fire and smoke) model. What is not clear is the relative importance of these uncertainties. This paper summarises a scoping study that subjects the noted common corridor CFD application to a probabilistic risk assessment (PRA) using the MaxEnt method. The uncertainty associated with the performance of a reference design is considered at different grid scales (achieving different ‘a posteriori’ mesh quality indicators), with the aim of quantifying the relative importance of uncertainties associated with inputs and scenarios versus the fidelity of the CFD model. For the specific case considered herein, it is found that parameter uncertainty has a more significant impact on the confidence in a given design solution than grid resolution does, for grid sizes of 100 mm or less. Above this grid resolution, the uncertainty associated with the model dominates. Given the specific ventilation arrangement modelled in this work, care should be taken in generalising these conclusions.
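
    As a toy illustration of the comparison the study makes (not the MaxEnt PRA itself), the Python sketch below contrasts the spread in a synthetic "CFD outcome" caused by sampled fire scenarios at a fixed grid size with the spread caused by changing the cell size at a fixed scenario. The outcome function, parameter ranges, and grid-bias term are all assumptions.

        # Toy variance comparison; the outcome function and every number below are assumptions.
        import numpy as np

        rng = np.random.default_rng(2)

        def smoke_outcome(fire_kw, door_gap_m, cell_mm):
            # Stand-in for a CFD result (say, time to untenable conditions in the corridor, in s),
            # with a grid-dependent bias that shrinks as the mesh is refined.
            return 120.0 - 0.02 * fire_kw - 40.0 * door_gap_m + 0.08 * (cell_mm - 100.0)

        # Spread due to parameters/scenarios at one grid size (100 mm).
        fire = rng.uniform(500.0, 2500.0, 2_000)      # fire size (kW), illustrative range
        gap = rng.uniform(0.05, 0.25, 2_000)          # door gap (m), illustrative range
        param_var = np.var(smoke_outcome(fire, gap, 100.0))

        # Spread due to grid resolution at one fixed scenario.
        grids = np.array([50.0, 100.0, 150.0, 200.0])
        grid_var = np.var(smoke_outcome(1500.0, 0.15, grids))

        print(f"variance from parameters/scenarios: {param_var:.1f}")
        print(f"variance from grid resolution:      {grid_var:.1f}")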