
    An Efficient Multifidelity Model for Assessing Risk Probabilities in Power Systems under Rare Events

    Risk assessment of power system failures induced by low-frequency, high-impact rare events is of paramount importance to power system planners and operators. In this paper, we develop a cost-effective multi-surrogate method based on a multifidelity model for assessing risks in probabilistic power-flow analysis under rare events. Specifically, multiple polynomial-chaos-expansion-based surrogate models are constructed to reproduce power system responses to stochastic load changes and the random occurrence of component outages. These surrogates then propagate a large number of samples at negligible computational cost and thus efficiently screen out the samples associated with high-risk rare events. The results generated by the surrogates, however, may be biased for samples located in the low-probability tail regions that are critical to power system risk assessment. To resolve this issue, the original high-fidelity power system model is used to fine-tune the estimates of the low-fidelity surrogates by reevaluating only a small portion of the samples. This multifidelity approach greatly improves the computational efficiency of the traditional Monte Carlo method for computing risk-event probabilities under rare events without sacrificing accuracy.
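
    As a rough illustration of the screening idea described above, the sketch below uses a cheap (deliberately biased) surrogate to flag candidate tail samples and re-evaluates only those with the expensive model. The limit-state function, surrogate, threshold, and safety margin are all hypothetical stand-ins, not the authors' power-flow models.

```python
import numpy as np

# Hypothetical high-fidelity model: expensive "power-flow" response (toy stand-in).
def high_fidelity(x):
    return np.sum(x**2, axis=-1)

# Hypothetical low-fidelity surrogate, e.g. a truncated polynomial-chaos
# expansion fitted offline; here a deliberately biased approximation.
def surrogate(x):
    return 0.95 * np.sum(x**2, axis=-1) + 0.1

rng = np.random.default_rng(0)
N = 1_000_000                      # cheap surrogate evaluations
X = rng.standard_normal((N, 4))    # stochastic loads / outages (toy inputs)

threshold = 25.0                   # "risk event" level (assumed)
y_lo = surrogate(X)                # negligible cost per sample

# Screen: only samples the surrogate flags as near or beyond the threshold
# are re-evaluated with the expensive model, correcting the tail bias.
margin = 2.0                       # safety band around the threshold (assumed)
suspect = y_lo > threshold - margin
y_hi = high_fidelity(X[suspect])   # small fraction of N expensive runs

p_fail = np.count_nonzero(y_hi > threshold) / N
print(f"re-evaluated {suspect.mean():.2%} of samples, P(event) = {p_fail:.2e}")
```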

    Projecting Climate Dependent Coastal Flood Risk With a Hybrid Statistical Dynamical Model

    Numerical models for tides, storm surge, and wave runup have demonstrated the ability to accurately define spatially varying flood surfaces. However, these models are typically too computationally expensive to dynamically simulate the full parameter space of future oceanographic, atmospheric, and hydrologic conditions that will constructively compound in the nearshore to cause both extreme-event and nuisance flooding during the 21st century. A surrogate modeling framework for waves, winds, and tides is developed in this study to efficiently predict spatially varying nearshore and estuarine water levels contingent on any combination of offshore forcing conditions. The surrogate models are coupled with a time-dependent stochastic climate emulator that provides efficient downscaling for hypothetical iterations of offshore conditions. Together, the hybrid statistical-dynamical framework can assess present-day and future coastal flood risk, including the chronological characteristics of individual flood and wave-induced dune overtopping events and their changes into the future. The framework is demonstrated at Naval Base Coronado in San Diego, CA, utilizing the regional Coastal Storm Modeling System (CoSMoS; composed of Delft3D and XBeach) as the dynamic simulator and Gaussian process regression as the surrogate modeling tool. Validation of the framework uses both in-situ tide gauge observations within San Diego Bay and a cross-shore array of pressure sensors deployed in the open-beach surf zone. The framework reveals the relative influence of large-scale climate variability on future coastal flood resilience metrics relevant to the management of an open-coast artificial berm, as well as the stochastic nature of future total water levels.

    This work was funded by the Strategic Environmental Research Development Program (DOD/SERDP RC-2644). Any use of trade, firm, or product names is for descriptive purposes only and does not imply endorsement by the U.S. Government. F. J. Mendez, A. Rueda, and L. Cagigal acknowledge partial funding from the Spanish Ministry of Science and Innovation, project Beach4cast PID2019-107053RB-I00. The authors thank the Scripps Center for Coastal Studies for their efforts to deploy, recover, and process the surf zone pressure sensor data used as validation in this study. The authors thank Melisa Menendez for sharing GOW2 hindcast data for Southern California. The authors thank the sea-level rise projection authors for developing and making the sea-level rise projections available, the multiple funding agencies that supported the development of the projections, and the NASA Sea-Level Change Team for developing and hosting the IPCC AR6 Sea-Level Projection Tool.
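
    The surrogate step can be pictured with an off-the-shelf Gaussian process regressor. The sketch below is a minimal illustration under wholly synthetic forcing data; the feature set and response are toy stand-ins for CoSMoS inputs and nearshore water levels, not the study's trained emulator.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Synthetic training set standing in for dynamic-simulator runs:
# columns ~ offshore Hs, Tp, wave direction, surge, tide (toy units).
X_train = rng.uniform(0.0, 1.0, size=(200, 5))
# Toy "nearshore total water level" response in place of CoSMoS output.
y_train = (1.5 * X_train[:, 0] + 0.5 * X_train[:, 3] + X_train[:, 4]
           + 0.05 * rng.standard_normal(200))

# GP surrogate: smooth kernel plus a noise term for simulator/obs. error.
kernel = 1.0 * RBF(length_scale=np.ones(5)) + WhiteKernel(noise_level=1e-3)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

# A climate emulator would stream hypothetical offshore conditions here;
# the surrogate returns water levels with predictive uncertainty cheaply.
X_new = rng.uniform(0.0, 1.0, size=(3, 5))
mean, std = gp.predict(X_new, return_std=True)
print(np.round(mean, 3), np.round(std, 3))
```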

    Rare event probability estimation for groundwater inverse problems with a two-stage Sequential Monte Carlo approach

    Bayesian inversions followed by estimation of rare event probabilities are often needed to analyse groundwater hazards. Instead of focusing on the posterior distribution of model parameters, the main interest then lies in the distribution of a specific quantity of interest contingent upon these parameters. To address the associated methodological challenges, we introduce a two-stage Sequential Monte Carlo approach. In the first stage, it generates particles that approximate the posterior distribution; in the second stage, it employs subset sampling techniques to assess the probability of the rare event of interest. By considering two hydrogeological problems of increasing complexity, we demonstrate the efficiency and accuracy of the resulting PostRisk-SMC method for estimating rare event probabilities related to groundwater hazards. We compare the performance of the PostRisk-SMC method with a traditional Monte Carlo approach that relies on Markov chain Monte Carlo samples. Our estimates align with those of the traditional method, but the coefficients of variation are notably lower for the same computational budget when targeting rarer events. Furthermore, the PostRisk-SMC method allows estimating rare event probabilities approaching one in a billion using fewer than one hundred thousand forward simulations. Although the presented examples are related to groundwater hazards, the methodology is well suited for addressing a wide range of topics in the geosciences and beyond.
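
    The second-stage subset sampling can be illustrated generically. The following sketch is a textbook subset-simulation estimator for a toy Gaussian quantity of interest, not the PostRisk-SMC algorithm itself; the dimension, target level, and tuning constants are assumed.

```python
import numpy as np

rng = np.random.default_rng(2)

def g(x):
    # Toy scalar quantity of interest; the rare event is {g(X) > LEVEL}.
    return x.sum(axis=-1)

DIM, LEVEL, N, P0 = 10, 14.0, 2000, 0.1    # assumed problem and tuning values

X = rng.standard_normal((N, DIM))          # stage-1 stand-in: i.i.d. prior samples
p_est, nseed = 1.0, int(P0 * N)

for _ in range(20):                        # cap on the number of subset levels
    y = g(X)
    order = np.argsort(y)[::-1]            # sort descending by g
    thresh = min(y[order[nseed - 1]], LEVEL)
    if thresh >= LEVEL:
        p_est *= np.mean(y >= LEVEL)
        break
    p_est *= P0
    seeds = X[order[:nseed]]
    # Component-wise Metropolis moves that leave N(0, I) restricted to
    # {g > thresh} invariant; each seed spawns a short chain.
    chains, cur = [seeds], seeds.copy()
    for _ in range(N // nseed - 1):
        prop = cur + 0.8 * rng.standard_normal(cur.shape)
        # Accept per coordinate under the standard-normal prior ...
        accept = rng.random(cur.shape) < np.exp(0.5 * (cur**2 - prop**2))
        cand = np.where(accept, prop, cur)
        # ... then keep the move only if it stays in the exceedance region.
        ok = g(cand) > thresh
        cur = np.where(ok[:, None], cand, cur)
        chains.append(cur.copy())
    X = np.concatenate(chains)

print(f"P(g > {LEVEL}) = {p_est:.3e}")     # exact: 1 - Phi(14/sqrt(10)) ~ 4.8e-6
```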

    An efficient framework for adequacy evaluation through extraction of rare load curtailment events in composite power systems

    With the growing robustness of modern power systems, load curtailment events are becoming less frequent. The simulation of these events therefore constitutes a challenge in the assessment of adequacy indices. Because of the rarity of load curtailment events, the standard Monte Carlo simulation (MCS) estimator of adequacy indices is impractical. Therefore, a framework based on an enhanced cross-entropy-based importance sampling (ECE-IS) method is introduced in this paper for computing the adequacy indices. The framework comprises two stages. In the first stage, the proposed ECE-IS method identifies the samples, or states of nodal generation and load, that contribute most to the adequacy index estimators. In the second stage, the density of the input variables conditional on the load curtailment domain, obtained in the first stage, is used to compute the nodal and system adequacy indices. The performance of the ECE-IS method is verified through comparison with the standard MCS method and recent rare-event simulation techniques from the literature. The results confirm that the proposed method provides accurate estimates of the nodal and system adequacy indices (loss of load probability (LOLP) and expected power not supplied (EPNS)) with good convergence and low computation time.
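
    The cross-entropy idea behind ECE-IS can be sketched on a toy problem: iteratively tilt a Gaussian sampling density toward the curtailment region, then estimate the probability with importance weights. Everything below (the curtailment margin, input model, and tuning parameters) is an assumed stand-in, not the paper's ECE-IS.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def curtailment(x):
    # Toy stand-in for "load minus available generation": the rare
    # load-curtailment event is {curtailment(x) > 0}.
    return x.sum(axis=-1) - 12.0

DIM, N, RHO = 8, 5000, 0.1                  # assumed tuning parameters
mu = np.zeros(DIM)                          # nominal N(0, I) input model

# Stage 1: adapt the IS density (mean-shifted Gaussian) toward the event.
for _ in range(15):
    X = mu + rng.standard_normal((N, DIM))
    y = curtailment(X)
    gamma = min(np.quantile(y, 1 - RHO), 0.0)
    elite = X[y >= gamma]
    # Likelihood ratios (nominal / proposal) for the elite samples.
    w = np.exp(stats.norm.logpdf(elite).sum(1)
               - stats.norm.logpdf(elite, loc=mu).sum(1))
    mu = (w[:, None] * elite).sum(0) / w.sum()   # CE update of the mean
    if gamma >= 0.0:
        break

# Stage 2: final IS estimate of the LOLP-like probability.
X = mu + rng.standard_normal((N, DIM))
y = curtailment(X)
w = np.exp(stats.norm.logpdf(X).sum(1) - stats.norm.logpdf(X, loc=mu).sum(1))
p = np.mean(w * (y > 0))
print(f"P(curtailment) = {p:.3e}")          # exact: 1 - Phi(12/sqrt(8)) ~ 1.1e-5
```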

    INTEGRATED DETERMINISTIC AND PROBABILISTIC SAFETY ANALYSIS: CONCEPTS, CHALLENGES, RESEARCH DIRECTIONS

    Integrated deterministic and probabilistic safety analysis (IDPSA) is conceived as a way to analyze the evolution of accident scenarios in complex dynamic systems, such as nuclear, aerospace, and process systems, accounting for the mutual interactions between the failure and recovery of system components, the evolving physical processes, the control and operator actions, and the software and firmware. In spite of the potential offered by IDPSA, several challenges need to be effectively addressed for its development and practical deployment. In this paper, we give an overview of these challenges and discuss the related implications in terms of research perspectives.

    Efficient resilience analysis and decision-making for complex engineering systems

    Modern societies around the world are increasingly dependent on the smooth functionality of progressively more complex systems, such as infrastructure systems, digital systems like the internet, and sophisticated machinery. They form the cornerstones of our technologically advanced world, and their efficiency is directly related to our well-being and the progress of society. However, these important systems are constantly exposed to a wide range of threats of natural, technological, and anthropogenic origin. The emergence of global crises such as the COVID-19 pandemic and the ongoing threat of climate change have starkly illustrated the vulnerability of these widely ramified and interdependent systems, as well as the impossibility of predicting threats entirely. The pandemic, with its widespread and unexpected impacts, demonstrated how an external shock can bring even the most advanced systems to a standstill, while ongoing climate change continues to produce unprecedented risks to system stability and performance. These global crises underscore the need for systems that can not only withstand disruptions but also recover from them efficiently and rapidly. The concept of resilience and related developments encompass these requirements: analyzing, balancing, and optimizing the reliability, robustness, redundancy, adaptability, and recoverability of systems, from both technical and economic perspectives. This cumulative dissertation therefore focuses on developing comprehensive and efficient tools for resilience-based analysis and decision-making for complex engineering systems. The newly developed resilience decision-making procedure is at the core of these developments. It is based on an adapted systemic risk measure, a time-dependent probabilistic resilience metric, and a grid search algorithm, and it represents a significant innovation in that it enables decision-makers to identify an optimal balance between different types of resilience-enhancing measures while taking monetary aspects into account. Increasingly, system components have significant inherent complexity, requiring them to be modeled as systems themselves; this leads to systems-of-systems with a high degree of complexity. To address this challenge, a novel methodology is derived by extending the previously introduced resilience framework to multidimensional use cases and synergistically merging it with an established concept from reliability theory, the survival signature. The new approach combines the advantages of both original components: a direct comparison of different resilience-enhancing measures from a multidimensional search space, leading to an optimal trade-off in terms of system resilience, and a significant reduction in computational effort due to the separation property of the survival signature. Once a subsystem structure has been computed, a typically computationally expensive process, any characterization of the probabilistic failure behavior of components can be validated without recomputing the structure. In reality, measurements, expert knowledge, and other sources of information are subject to multiple uncertainties. For this purpose, an efficient method based on the combination of the survival signature, fuzzy probability theory, and non-intrusive stochastic simulation (NISS) is proposed. This results in an efficient approach to quantifying the reliability of complex systems that takes the entire uncertainty spectrum into account.
The new approach, which synergizes the advantageous properties of its original components, achieves a significant decrease in computational effort due to the separation property of the survival signature. In addition, it attains a dramatic reduction in sample size due to the adapted NISS method: only a single stochastic simulation is required to account for uncertainties. The novel methodology not only represents an innovation in the field of reliability analysis but can also be integrated into the resilience framework. For a resilience analysis of existing systems, the consideration of continuous component functionality is essential. This is addressed in a further novel development. By introducing the continuous survival function and the concept of the Diagonal Approximated Signature as a corresponding surrogate model, the existing resilience framework can be usefully extended without compromising its fundamental advantages. In the context of the regeneration of complex capital goods, a comprehensive analytical framework is presented to demonstrate the transferability and applicability of all developed methods to complex systems of any type. The framework integrates the previously developed resilience, reliability, and uncertainty analysis methods. It provides decision-makers with a basis for identifying resilient regeneration paths in two ways: first, regeneration paths with inherent resilience, and second, regeneration paths that lead to maximum system resilience, taking into account technical and monetary factors affecting the complex capital good under analysis. In summary, this dissertation offers innovative contributions to efficient resilience analysis and decision-making for complex engineering systems. It presents universally applicable methods and frameworks that are flexible enough to consider system types and performance measures of any kind. This is demonstrated in numerous case studies ranging from arbitrary flow networks and functional models of axial compressors to substructured infrastructure systems with several thousand individual components.
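
    The separation property of the survival signature mentioned above can be made concrete with a toy example: the signature is computed once from the structure, after which any component lifetime model plugs in without re-touching the structure. The 2-out-of-3 system and lifetime distributions below are assumptions for illustration only.

```python
import numpy as np
from itertools import combinations
from math import comb

# Toy structure function: a 2-out-of-3 system of exchangeable components.
def works(state):
    return sum(state) >= 2

M = 3  # number of components (single type, assumed exchangeable)

# Survival signature Phi[l] = P(system works | exactly l components work).
# Computing it enumerates the structure once, the expensive step in general.
Phi = np.zeros(M + 1)
for l in range(M + 1):
    subsets = list(combinations(range(M), l))
    up = sum(works([1 if i in s else 0 for i in range(M)]) for s in subsets)
    Phi[l] = up / len(subsets)

# Separation property: any component survival curve plugs in afterwards
# without recomputing Phi.
def R_system(t, surv):
    p = surv(t)   # component survival probability at time t
    return sum(comb(M, l) * p**l * (1 - p)**(M - l) * Phi[l]
               for l in range(M + 1))

t = 1.0
print(R_system(t, lambda t: np.exp(-t)))               # exponential lifetimes
print(R_system(t, lambda t: np.exp(-(t / 2.0)**1.5)))  # Weibull lifetimes
```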

    OPERATIONAL RELIABILITY AND RISK EVALUATION FRAMEWORKS FOR SUSTAINABLE ELECTRIC POWER SYSTEMS

    Driven by a confluence of environmental, social, technical, and economic factors, traditional electric power systems are undergoing a momentous transition toward sustainable electric power systems. One important facet of this transformation is the inclusion of a high penetration of variable renewable energy sources, chief among them wind power. The new source of uncertainty that stems from imperfect wind power forecasts, coupled with the traditional uncertainties in electric power systems, such as unplanned component outages, introduces new challenges for power system operators. In particular, the short-term or operational reliability of sustainable electric power systems could be at increased risk, as limited remedial resources are available to operators to handle uncertainties and outages during system operation. Furthermore, as sustainable electric power systems and natural gas networks become increasingly coupled, the impacts of outages in one network can quickly propagate into the other, thereby reducing the operational reliability of integrated electric power-gas networks (IEPGNs). In light of the above, a successful transition to sustainable electric power systems necessitates a new set of tools to assist power system operators in making risk-informed decisions amid multiple sources of uncertainty. Such tools should be able to realistically evaluate the hour-ahead and day-ahead operational reliability and risk indices of sustainable electric power systems in a computationally efficient manner while giving full attention to the uncertainties of wind power and IEPGNs. To this end, research is conducted on five related topics. First, a simulation-based framework is proposed to evaluate the operational reliability indices of generating systems using the fixed-effort generalized splitting approach. Simulations show improved computational performance compared to traditional Monte Carlo simulation (MCS). Second, a hybrid analytical-simulation framework is proposed for the short-term risk assessment of wind-integrated power systems. The area risk method, an analytical technique, is combined with importance sampling (IS)-based MCS to integrate the proposed reliability models of wind speed and calculate the risk indices with a low computational burden. Case studies validate the efficacy of the proposed framework. Third, the IS-based MCS framework is extended to include the proposed data-driven probabilistic models of wind power, avoiding the drawbacks of wind speed models. Fourth, a comprehensive framework for the operational reliability evaluation of IEPGNs is developed. This framework includes new reliability models for natural gas pipelines and natural gas-fired generators with dual-fuel capability. Simulations show the importance of considering the coupling between the two networks when evaluating operational reliability indices. Finally, a new chance-constrained optimization model is proposed that considers operational reliability constraints while determining the optimal operational schedule for microgrids. Case studies show the trade-off between reliability and operating costs when scheduling microgrids.
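
    For context, the baseline that the first topic accelerates can be sketched as a plain Monte Carlo estimator of hour- or day-ahead risk indices for a small generating system. The unit capacities, failure rates, load, and lead time below are invented illustration values, and the estimator shown is crude MCS, not the fixed-effort generalized splitting scheme.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy generating system: capacities (MW) and failure rates are assumed
# values, not data from the dissertation.
cap = np.array([200.0, 200.0, 150.0, 100.0, 100.0])
lam = np.full(cap.size, 1 / 1000.0)   # failures per hour (assumed)
LOAD, HOURS, N = 580.0, 24.0, 200_000

# Baseline Monte Carlo for operational risk over a lead time: sample which
# units survive the lead time, then check for a capacity deficiency.
up = rng.random((N, cap.size)) < np.exp(-lam * HOURS)  # unit survival
avail = up.astype(float) @ cap                          # available capacity
shortfall = np.maximum(LOAD - avail, 0.0)

lolp = np.mean(shortfall > 0)    # loss-of-load probability
epns = np.mean(shortfall)        # expected power not supplied (MW)
print(f"LOLP = {lolp:.4f}, EPNS = {epns:.2f} MW")
```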

    Dynamic structural health monitoring for concrete gravity dams based on the Bayesian inference

    The preservation of concrete dams is a key issue for researchers and practitioners in dam engineering because of the important role these infrastructures play in the sustainability of our society. Since most existing concrete dams were designed without considering their dynamic behaviour, monitoring their structural health is fundamental to achieving proper safety levels. Structural health monitoring systems based on ambient vibrations are thus crucial. However, the high computational burden of numerical models and the numerous uncertainties affecting the results have so far prevented structural health monitoring systems for concrete dams from being developed. This study presents a framework for the dynamic structural health monitoring of concrete gravity dams in a Bayesian setting. The proposed approach has a relatively low computational burden, detects damage, and reduces uncertainties in predicting the structural behaviour of dams, thus improving the reliability of the structural health monitoring system itself. The application of the proposed procedure to an Italian concrete gravity dam demonstrates its feasibility in real cases.
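
    The kind of Bayesian updating described above can be sketched with a single-parameter toy: a random-walk Metropolis sampler updates an equivalent stiffness from one identified natural frequency. The forward model, prior, noise level, and measured value are all synthetic assumptions, not the paper's dam model.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy forward model: first natural frequency of an equivalent SDOF model
# of the dam monolith, f = sqrt(k) / (2*pi), with k a normalized stiffness.
def f1(k):
    return np.sqrt(k) / (2 * np.pi)

f_meas = 2.05   # frequency identified from ambient vibrations (Hz, synthetic)
sigma = 0.05    # identification noise standard deviation (assumed)

def log_post(k):
    if k <= 0:
        return -np.inf
    log_prior = -0.5 * ((k - 170.0) / 30.0) ** 2      # design-based prior (assumed)
    log_like = -0.5 * ((f1(k) - f_meas) / sigma) ** 2
    return log_prior + log_like

# Random-walk Metropolis over the stiffness parameter.
k, lp = 170.0, log_post(170.0)
samples = []
for _ in range(20_000):
    k_new = k + 2.0 * rng.standard_normal()
    lp_new = log_post(k_new)
    if np.log(rng.random()) < lp_new - lp:
        k, lp = k_new, lp_new
    samples.append(k)

post = np.array(samples[5000:])   # drop burn-in
print(f"posterior stiffness: {post.mean():.1f} +/- {post.std():.1f}")
# A sustained posterior shift of k away from its baseline would flag damage.
```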