104 research outputs found

    Can we verify and intrinsically validate risk assessment results? What progress is being made to increase QRA trustworthiness?

    The purpose of a risk assessment is to decide whether the risk of a given situation is acceptable and, if not, how it can be reduced to a tolerable level. In many cases this can be done in a semi-quantitative fashion; more complex or problematic cases require a quantitative approach. Anybody who has been involved in such a study is aware of the difficulties and pitfalls. Despite proven software, many parameter choices must be made and many uncertainties remain. The thoroughness of the study can make quite a difference in the result: working independently, analysts can arrive at results that differ by orders of magnitude, especially if uncertainties are not included. Because important decisions on capital projects always have proponents and opponents, there is often a tense situation in which conflict is looming. The paper first briefly reviews a standard procedure introduced for safety cases on products that must more or less guarantee that the risk of use is below a certain value. Next come the various approaches to dealing with uncertainties in a quantitative risk assessment and the follow-on decision process. Over the last few years several new developments have been made to achieve, to a certain extent, a hold on so-called deep uncertainty. Expert elicitation and its limitations are another aspect. The paper concludes with some practical recommendations.
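The uncertainty propagation the paper surveys can be sketched with a minimal Monte Carlo run. The two-branch event tree, the lognormal parameterisation and every number below are assumptions for illustration, not values from the paper:

```python
import math
import random

random.seed(1)

def sample_lognormal(median, gsd):
    # Draw from a lognormal given its median and geometric standard
    # deviation, a common way to express uncertainty in failure
    # frequencies in quantitative risk assessment.
    return median * math.exp(random.gauss(0.0, math.log(gsd)))

def annual_risk_samples(n=10_000):
    # Hypothetical event tree: release frequency times ignition
    # probability; both treated as uncertain inputs.
    samples = []
    for _ in range(n):
        release_freq = sample_lognormal(1e-4, 3.0)       # events/year (assumed)
        p_ignition = min(1.0, sample_lognormal(0.1, 2.0))  # assumed
        samples.append(release_freq * p_ignition)
    return samples

risk = sorted(annual_risk_samples())
median_risk = risk[len(risk) // 2]
p95_risk = risk[int(0.95 * len(risk))]
```

Reporting a percentile band rather than a single point estimate is one way two independent analysts can discover that their order-of-magnitude disagreement lies inside the shared uncertainty.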

    Towards offshore wind digital twins: Application to jacket substructures


    Opportunities and Challenges in Health Monitoring of Constructed Systems by Modal Analysis

    Dynamic testing of constructed systems was initiated in the 1960s by civil engineers interested in earthquake hazard mitigation research. During the 1970s, mechanical engineers interested in experimental structural dynamics developed the art of modal analysis. More recently, in the 1990s, engineers from different disciplines embarked on an exploration of health monitoring as a research area. The senior writer started research on dynamic testing of buildings and bridges during the 1970s, and in the 1980s collaborated with colleagues in mechanical engineering who were leading modal analysis research to transform and adapt modal analysis tools for structural identification of constructed systems. In the 1990s the writer and his associates participated in applications of the health monitoring concept to constructed systems. In this paper, the writers share their experiences in dynamic testing of large constructed systems, namely MIMO impact testing as well as output-only modal analysis, in conjunction with associated laboratory studies. The writers try to contribute to answering some questions that have been discussed in the modal analysis and health monitoring community for more than a decade: (a) What is the reliability of results from dynamic testing of constructed systems? (b) Can these tests serve for health monitoring of constructed systems? (c) Are there any additional benefits that may be expected from dynamic testing of constructed systems? (d) What are the best practices, constraints and future developments needed for a reliable implementation of MIMO testing and output-only modal analysis of constructed systems for health monitoring and other purposes?
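Output-only modal analysis can be illustrated in miniature: simulate the ambient response of a hypothetical one-degree-of-freedom structure driven by unmeasured white noise, then recover its natural frequency by DFT peak-picking. All parameter values below are assumptions for illustration:

```python
import cmath
import math
import random

random.seed(0)

def ambient_response(f_n=2.0, zeta=0.02, fs=50.0, n=1024):
    # Output-only data: a 1-DOF oscillator (natural frequency f_n in Hz,
    # damping ratio zeta) excited by white noise, stepped with
    # semi-implicit Euler. Only the response is recorded, as in ambient
    # vibration testing where the excitation is not measured.
    m = 1.0
    k = (2.0 * math.pi * f_n) ** 2 * m
    c = 2.0 * zeta * math.sqrt(k * m)
    dt = 1.0 / fs
    x = v = 0.0
    out = []
    for _ in range(n):
        a = (random.gauss(0.0, 1.0) - c * v - k * x) / m
        v += a * dt
        x += v * dt
        out.append(x)
    return out

def dominant_frequency(signal, fs=50.0, f_max=5.0):
    # Peak-picking: the DFT bin with the largest amplitude below f_max
    # is taken as the estimate of the first natural frequency.
    n = len(signal)
    best_k, best_amp = 1, 0.0
    for kf in range(1, int(f_max * n / fs)):
        amp = abs(sum(s * cmath.exp(-2j * math.pi * kf * t / n)
                      for t, s in enumerate(signal)))
        if amp > best_amp:
            best_k, best_amp = kf, amp
    return best_k * fs / n

f_est = dominant_frequency(ambient_response())
```

Real constructed systems have many closely spaced modes, measurement noise, and environmental variability, which is exactly why the reliability questions posed above are non-trivial; this sketch only shows the core identification idea.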

    System identification of constructed civil engineering structures and uncertainty

    Characterization of constructed civil engineering structures through system identification has gained increasing attention in recent years due to its tremendous potential for optimum infrastructure asset management and performance-based engineering. However, the lack of reliability in system identification, especially when applied to large-scale complex constructed systems, poses a major challenge for its widespread implementation. It is believed that this primarily stems from epistemic uncertainty associated with identification processes, due to unknown or less understood structural behaviors as well as the interaction of the system with its environment. The objective of this thesis is to investigate the effects of epistemic uncertainty on the reliability of identification and to develop solutions to recognize and mitigate these uncertainties. The research undertaken included laboratory and field investigation as the primary components. First, a cantilever beam with two test configurations was designed and constructed in the laboratory as a test bed. By comparing different identification scenarios, the impact of modeling uncertainty with an epistemic mechanism on the field-calibrated analytical model was evaluated. Feasible techniques were developed to recognize and mitigate the significant epistemic modeling uncertainty which controls the test-analysis discrepancy. In applications of system identification to real-life structural systems, the temporal, frequency and spatial incompatibility between a detailed finite element model and the information contained in test measurements often further complicates the identification process. It was demonstrated through the Henry Hudson Bridge that it was possible to characterize the fundamental behaviors of large-scale complex structures by integrating heuristics and conventional techniques. Measurements to assess the adequacy of the field-calibrated models were proposed to ensure that significant epistemic modeling uncertainty was efficiently reduced and critical physical mechanisms were properly conceptualized.
    Ph.D., Structural Engineering -- Drexel University, 200
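The calibration step that drives down test-analysis discrepancy can be sketched as a toy grid search. The one-parameter model f_i = c_i * sqrt(k) and every number below are hypothetical, not from the thesis:

```python
import math

def calibrate_stiffness(measured_freqs, mode_factors, k_grid):
    # Toy structural identification: choose the stiffness k that
    # minimises the squared discrepancy between measured natural
    # frequencies and an analytical model f_i = c_i * sqrt(k).
    best_k, best_err = None, float("inf")
    for k in k_grid:
        err = sum((f - c * math.sqrt(k)) ** 2
                  for f, c in zip(measured_freqs, mode_factors))
        if err < best_err:
            best_k, best_err = k, err
    return best_k

# Synthetic "measurements" generated with k = 4.0, then recovered by the
# search. With no modeling error the true value is found exactly; the
# epistemic-uncertainty problem above arises precisely when the model
# form itself is wrong, so no k reproduces all measurements.
c = [1.0, 2.7, 4.9]
measured = [ci * math.sqrt(4.0) for ci in c]
k_hat = calibrate_stiffness(measured, c, [k / 10.0 for k in range(10, 81)])
```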

    Landfill lining engineering designs: a probabilistic approach

    Uncertainty and variability are prevalent in any engineering design. In this study, the uncertainty of input parameters for the stability of a landfill veneer cover soil and the integrity of a lining system were treated probabilistically using Monte Carlo simulation. The statistical information required to postulate the distribution types of the input parameters, taken as random variables, was identified and characterised using available data from a literature survey and a designed laboratory repeatability testing programme. The variability and uncertainty of interface shear strengths (τ) and the derived strength parameters for three generic interfaces, commonly found in a landfill lining system, were computed and compared using these types of information. The variability of τ computed using the combined global database was three to five times higher than that from laboratory repeatability test results, and could reach up to seven times higher for the derived strength parameters. Additionally, a normal distribution was recommended for interface shear strengths and derived parameters (except interface adhesion, with its high COV) for good quality data, based on subjective and objective statistical test methods. [Continues.]
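A minimal sketch of the Monte Carlo treatment described above, using an infinite-slope factor of safety for a cohesionless veneer cover, FS = tan(δ)/tan(β), with the interface friction angle δ treated as a normal random variable. The slope angle, mean and COV are illustrative assumptions, not the thesis's values:

```python
import math
import random

random.seed(42)

def veneer_fs_samples(n=20_000, beta_deg=18.4):
    # Monte Carlo simulation of the veneer factor of safety:
    # FS = tan(delta) / tan(beta), cohesionless interface assumed.
    # delta ~ Normal(mean 25 deg, COV 10%) -- illustrative numbers only.
    fs = []
    for _ in range(n):
        delta = random.gauss(25.0, 25.0 * 0.10)
        fs.append(math.tan(math.radians(delta)) / math.tan(math.radians(beta_deg)))
    return fs

fs = veneer_fs_samples()
mean_fs = sum(fs) / len(fs)
p_failure = sum(1 for v in fs if v < 1.0) / len(fs)
```

The point the abstract makes is visible here: replacing the laboratory-derived COV with the three-to-seven-times-larger global-database COV would widen the FS distribution and raise the computed probability of failure substantially, even at the same mean.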

    Systems Engineering

    The book "Systems Engineering: Practice and Theory" is a collection of articles written by developers and researchers from all around the globe. Most present methodologies for individual Systems Engineering processes; others consider issues of adjacent knowledge areas and sub-areas that significantly contribute to systems development, operation, and maintenance. Case studies include aircraft, spacecraft, and space systems development, and post-analysis of data collected during the operation of large systems. The collection addresses important issues related to "bottlenecks" of Systems Engineering, such as the complexity, reliability, and safety of different kinds of systems; the creation, operation and maintenance of services; system-human communication; and the management tasks carried out during system projects. This book is for people who are interested in the modern state of the Systems Engineering knowledge area and for systems engineers involved in different activities of the area. Some articles may be a valuable source for university lecturers and students; most of the case studies can be used directly in Systems Engineering courses as illustrative materials.

    Statistical analysis of the error associated with the simplification of the stratigraphy in geotechnical models

    The uncertainty in the determination of the stratigraphic profile of natural soils is one of the main problems in geotechnics, in particular for landslide characterization and modeling. The study deals with a new approach in geotechnical modeling which relies on the stochastic generation of different soil layer distributions following a Boolean logic; the method has thus been called BoSG (Boolean Stochastic Generation). In this way, it is possible to randomize the presence of a specific material interdigitated in a uniform matrix. In building a geotechnical model it is common to discard some stratigraphic data in order to simplify the model itself, assuming that the significance of the results of the modeling procedure will not be affected. With the proposed technique it is possible to quantify the error associated with this simplification. Moreover, it can be used to determine the most significant zones, where further investigations and surveys would be most effective in building the geotechnical model of the slope. The commercial software FLAC was used for the 2D and 3D geotechnical models. The distribution of the materials was randomized through a specifically coded MATLAB program that automatically generates text files, each of them representing a specific soil configuration. In addition, a routine was designed to automate the FLAC computations with the different data files in order to maximize the sample size. The methodology is applied to a simplified slope in 2D, a simplified slope in 3D and an actual landslide, namely the Mortisa mudslide (Cortina d'Ampezzo, BL, Italy).
However, it could be extended to numerous different cases, especially for hydrogeological analysis and landslide stability assessment, in different geological and geomorphological contexts.
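The BoSG idea, stamping a second material at random into a uniform matrix and writing each realization out as a text file for the solver, can be sketched in miniature. Grid size, lens geometry and the text format are assumptions, and plain Python stands in for the study's MATLAB driver:

```python
import random

random.seed(7)

def bosg_realization(nx=30, ny=12, n_lenses=4, lens_w=6, lens_h=2):
    # Boolean Stochastic Generation: start from a uniform matrix
    # (material 0) and stamp randomly located rectangular lenses of a
    # second material (1), mimicking interdigitated soil layers.
    grid = [[0] * nx for _ in range(ny)]
    for _ in range(n_lenses):
        x0 = random.randrange(0, nx - lens_w)
        y0 = random.randrange(0, ny - lens_h)
        for y in range(y0, y0 + lens_h):
            for x in range(x0, x0 + lens_w):
                grid[y][x] = 1
    return grid

def to_text(grid):
    # Serialise one realization as a text-file body, one grid row per
    # line, as input for the numerical model (format is illustrative).
    return "\n".join(" ".join(str(v) for v in row) for row in grid)

realization = bosg_realization()
```

Running the slope model over many such realizations, versus once on the simplified uniform stratigraphy, is what lets the method quantify the error introduced by the simplification.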

    Advancing probabilistic risk assessment of offshore wind turbines on monopiles

    Offshore Wind Turbines (OWTs) are a unique type of engineered structure. Their design spans all engineering disciplines, ranging from structural engineering for the substructure and foundation to electrical or mechanical engineering for the generating equipment. Consequently, the different components of an OWT are commonly designed independently using codified standards. Within the OWT design process, financial cost plays an important role as a constraint on decision making, because of the competition between prospective wind farm operators and with other forms of electricity generation. However, the current, independent design process does not allow for a combined assessment of OWT system financial loss. Nor does it allow for quantification of the uncertainties (e.g., wind and wave loading, material properties) that characterise an OWT's operations and which may have a strong impact on decision making. This thesis proposes quantifying financial losses associated with an OWT exposed to stochastic wind and wave conditions using a probabilistic risk modelling framework, as a first step towards evaluating Offshore Wind Farm (OWF) resilience. The proposed modelling framework includes a number of novel elements, including the development of site-specific fragility functions (relationships between the likelihood of different levels of damage experienced by an OWT over a range of hazard intensities), which account for uncertainties in both structural capacity and demands. As a further element of novelty, fragility functions are implemented in a closed-form assessment of financial loss, based on a combinatorial system reliability approach, which considers both structural and non-structural components.
    Two important structural performance objectives (or limit states) are evaluated in this thesis: 1) the Ultimate Limit State (ULS), which assesses the collapse of an OWT due to extreme wind and wave conditions, such as those resulting from hurricanes; and 2) the Fatigue Limit State (FLS), which addresses the cumulative effects of operational loading, i.e., cracks growing over the life of the structure until they threaten its integrity. This latter limit state is assessed using a machine learning technique, Gaussian Process (GP) regression, to develop a computationally efficient surrogate model that emulates the output of computationally expensive time-domain structural analyses. The consequence of the OWT failing is evaluated by computing annualised financial losses for the full OWT system. This provides a metric which is easily communicable to project stakeholders, and can also be used to compare the relative importance of different components and design strategies. Illustrative applications at case-study sites are presented as a walk-through of the calculation steps in the proposed framework and its various components. The calculation of losses provides a foundation from which a more detailed assessment of OWT and OWF resilience could be developed.
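The fragility-to-annualised-loss chain described above can be written down compactly. A lognormal fragility curve is a standard choice in probabilistic risk assessment, but the medians, dispersions, hazard intensities and loss values below are illustrative assumptions, not the thesis's figures:

```python
import math

def fragility(im, median, beta):
    # Lognormal fragility: probability that a damage state is reached or
    # exceeded at hazard intensity `im` (e.g. hub-height wind speed),
    # with lognormal median `median` and dispersion `beta`.
    z = (math.log(im) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def annual_loss(frequencies, p_fail, loss_given_failure):
    # Annualised loss: sum over hazard levels of
    # (annual frequency of level) x (failure probability) x (loss).
    return sum(f * p * loss_given_failure for f, p in zip(frequencies, p_fail))

# Hypothetical ULS check at three storm intensities (mean wind speed, m/s),
# each with an assumed annual frequency of occurrence.
ims = [30.0, 40.0, 50.0]
freqs = [0.10, 0.01, 0.001]
p = [fragility(v, median=45.0, beta=0.3) for v in ims]
eal = annual_loss(freqs, p, loss_given_failure=1.0e6)
```

A GP-regression surrogate enters the same pipeline at the fragility stage: instead of evaluating `fragility` analytically, the surrogate replaces the expensive time-domain analyses used to estimate the failure probability at each intensity level.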