534 research outputs found

    Development and application of the dynamic system doctor to nuclear reactor probabilistic risk assessments.

    Full text link

    An Adaptive Simulation Framework for the Exploration of Extreme and Unexpected Events in Dynamic Engineered Systems

    Get PDF
    The end states reached by an engineered system during an accident scenario depend not only on the sequence of events composing the scenario, but also on their timing and magnitudes. Including these additional features within an overarching framework can render the analysis infeasible in practical cases, owing to the high dimension of the system state-space and the computational effort needed to explore the possible system evolutions in search of the interesting (and very rare) failure ones. To tackle this hurdle, this article introduces a framework for efficiently probing the space of event sequences of a dynamic system by means of a guided Monte Carlo simulation. The framework is semi-automatic and allows embedding the analyst's prior knowledge about the system and the objectives of the analysis. Specifically, it adaptively and intelligently allocates the simulation effort preferentially to those sequences leading to outcomes of interest for the objectives of the analysis, typically the more safety-critical (and/or rare) ones. The diversification in the filling of the state-space produced by the preference-guided exploration also allows the retrieval of critical system features, which can be useful to analysts and designers for adopting appropriate means of prevention and mitigation of dangerous and/or unexpected consequences. A dynamic gas transmission system is considered as a case study to demonstrate the application of the method.
    Turati, Pietro; Pedroni, Nicola; Zio, Enrico
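A minimal sketch of the kind of preference-guided Monte Carlo exploration the abstract describes, in Python. The two-event toy system, its failure criterion, and the explore/exploit split are all illustrative assumptions, not the authors' actual algorithm:

```python
import random

# Hypothetical 2-event system: a scenario is defined by the times of two
# protection failures; the system reaches a failed end state only if both
# occur early and close together (a rare, safety-critical outcome).
def end_state_is_failure(t1, t2):
    return t1 < 2.0 and t2 < 2.0 and abs(t1 - t2) < 0.5

def guided_exploration(n_batches=20, batch=200, horizon=10.0, seed=0):
    rng = random.Random(seed)
    critical = []  # failure sequences found so far
    hits = 0
    for _ in range(n_batches):
        for _ in range(batch):
            if critical and rng.random() < 0.5:
                # exploit: perturb a previously found critical sequence
                t1, t2 = rng.choice(critical)
                t1 = max(0.0, t1 + rng.gauss(0.0, 0.3))
                t2 = max(0.0, t2 + rng.gauss(0.0, 0.3))
            else:
                # explore: sample event timings uniformly over the mission time
                t1 = rng.uniform(0.0, horizon)
                t2 = rng.uniform(0.0, horizon)
            if end_state_is_failure(t1, t2):
                hits += 1
                critical.append((t1, t2))
        critical = critical[-100:]  # bounded memory of critical sequences
    return hits

print(guided_exploration())
```

Once a failure sequence is found, the exploit branch concentrates further samples around it, so the rare failure region is visited far more often than it would be under purely uniform sampling.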

    Bayesian Network Representing System Dynamics in Risk Analysis of Nuclear Systems.

    Full text link
    A dynamic Bayesian network (DBN) model is used in conjunction with the alternating conditional expectation (ACE) regression method to analyze the risk associated with a loss-of-feedwater accident coupled with a subsequent initiation of the feed-and-bleed operation in the Zion-1 nuclear power plant. The use of the DBN allows the joint probability distribution to be factorized, enabling the analysis to be done on many simpler network structures rather than on one complicated structure. The construction of the DBN model assumes conditional independence relations among certain key reactor parameters. The choice of parameters to model is based on considerations of the macroscopic balance statements governing the behavior of the reactor under a quasi-static assumption. The DBN is used to relate the peak clad temperature to a set of independent variables that are known to be important in determining the success of the feed-and-bleed operation. A simple linear relationship is then used to relate the clad temperature to the core damage probability. To obtain a quantitative relationship among the different nodes in the DBN, surrogates of the RELAP5 reactor transient analysis code are used. These surrogates are generated by applying the ACE algorithm to output data obtained from about 50 RELAP5 cases covering a wide range of the selected independent variables. The surrogates allow important safety parameters such as the fuel clad temperature to be expressed as a function of key reactor parameters such as the coolant temperature and pressure, together with important independent variables such as the scram delay time. The time-dependent core damage probability is calculated by sampling the independent variables from their probability distributions and propagating the information up through the Bayesian network to give the clad temperature.
    With the knowledge of the clad temperature and the assumption that the core damage probability has a one-to-one relationship to it, the core damage probability is calculated as a function of transient time. The use of the DBN model in combination with ACE allows risk analysis to be performed with much less effort than if the analysis were done using standard techniques. Ph.D., Nuclear Engineering & Radiological Sciences, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/89759/1/avarutta_1.pd
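The propagation step described above (sample the independent variables, push them through a surrogate to get a clad temperature, then map that to a core damage probability) can be sketched as follows. The closed-form surrogate, the input distributions, and the damage thresholds are invented stand-ins, not the ACE/RELAP5 models from the thesis:

```python
import random

# Toy stand-in for the ACE-based surrogate: peak clad temperature (K) as a
# function of scram delay (s) and coolant temperature (K). Purely
# illustrative; the real surrogate was fit to RELAP5 output data.
def clad_temp_surrogate(scram_delay, coolant_temp):
    return 600.0 + 8.0 * scram_delay + 0.5 * (coolant_temp - 550.0)

def damage_probability(clad_temp, t_onset=1200.0, t_certain=1600.0):
    # simple linear clad-temperature -> core-damage-probability map, mirroring
    # the one-to-one relationship assumed in the abstract (thresholds assumed)
    return min(1.0, max(0.0, (clad_temp - t_onset) / (t_certain - t_onset)))

def mean_core_damage_probability(n=10_000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        scram_delay = rng.gauss(60.0, 15.0)    # assumed input distribution
        coolant_temp = rng.gauss(560.0, 10.0)  # assumed input distribution
        total += damage_probability(clad_temp_surrogate(scram_delay, coolant_temp))
    return total / n

print(mean_core_damage_probability())
```

Because the surrogate is cheap to evaluate, tens of thousands of samples take milliseconds, which is the point of replacing direct RELAP5 runs in the sampling loop.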

    Uncertainty Quantification for Reactor Safety Analysis.

    Full text link
    The present work developed new methodologies based on code surrogates and deterministic sampling strategies for uncertainty quantification (UQ) of nuclear power plant (NPP) transients in reactor safety analysis. These methodologies take advantage of the additional computational resources now available to perform more simulations with system thermal-hydraulic (TH) codes, obtaining more extensive and more reliable uncertainty information than conventional UQ methods used in reactor safety analysis. The methodologies were demonstrated for a Best Estimate Plus Uncertainty (BEPU) licensing calculation and for the analysis of a dynamic event tree (DET) for a realistic NPP transient. The first methodology uses the Alternating Conditional Expectation (ACE) algorithm, a powerful nonparametric regression technique, to develop a dynamic code surrogate that can accurately simulate the time-dependent, nonlinear TH behavior of an NPP transient considering multiple safety system degradations or failures. A surrogate taking the form of a discrete-time dynamic system model with four input parameters and a recursive relationship was developed to predict the subcooled water level in a reactor core during the recirculation phase of a hot leg large-break loss-of-coolant accident (HL-LBLOCA). The model uncertainty of the ACE surrogate was derived, and the unscented transform (UT), a sampling-based UQ method, was used to propagate model uncertainty in the surrogate predictions. The second methodology demonstrates the applicability of the UT as a general, sampling-based UQ methodology. The UT uses a deterministic sampling algorithm to obtain estimates of the mean and variance of the output parameter of interest with significantly smaller sample sizes than random sampling schemes. The primary advantage of the UT is that the size of the UT sample, which determines the computational expense of the method, scales linearly with the size of the input parameter space.
    Linear scaling keeps the simulation of large complex systems computationally manageable, in contrast to the geometric scaling that is a common constraint in DET analysis of NPPs. Ph.D., Nuclear Engineering & Radiological Sciences, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/107104/1/dfynan_1.pd
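The unscented transform itself is standard and can be illustrated directly. The sketch below builds the 2n+1 sigma points for independent inputs (the simplest UT parameterisation; the thesis may use a different variant) and recovers the mean and variance of a transformed output:

```python
import math

def unscented_moments(f, mean, var, kappa=1.0):
    # 2n+1 sigma points for independent inputs with given means and variances;
    # the number of model evaluations scales linearly with the dimension n.
    n = len(mean)
    scale = math.sqrt(n + kappa)
    points = [list(mean)]
    weights = [kappa / (n + kappa)]
    for i in range(n):
        step = scale * math.sqrt(var[i])
        for sign in (1.0, -1.0):
            p = list(mean)
            p[i] += sign * step
            points.append(p)
            weights.append(1.0 / (2.0 * (n + kappa)))
    ys = [f(p) for p in points]
    y_mean = sum(w * y for w, y in zip(weights, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(weights, ys))
    return y_mean, y_var

# Sanity check on a linear map, for which the UT is exact:
# E[2*x0 + x1] = 4.0 and Var[2*x0 + x1] = 4 * 0.25 + 1.0 = 2.0
m, v = unscented_moments(lambda x: 2.0 * x[0] + x[1], [1.0, 2.0], [0.25, 1.0])
print(m, v)
```

For n inputs the UT needs only 2n + 1 model runs, which is the linear scaling the abstract contrasts with random sampling.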

    e-Sanctuary: open multi-physics framework for modelling wildfire urban evacuation

    Get PDF
    The number of evacuees worldwide during wildfires keeps rising year after year. Fire evacuations at the wildland-urban interface (WUI) pose a serious challenge to fire and emergency services and are a global issue affecting thousands of communities around the world. To date, however, there is a lack of comprehensive tools able to inform, train, or aid the evacuation response and decision-making in case of wildfire. The present work describes a novel framework for modelling wildfire urban evacuations. The framework is based on multi-physics simulations that can quantify evacuation performance. The work argues that an integrated approach requires considering and integrating all three important components of WUI evacuation, namely fire spread, pedestrian movement, and traffic movement. The report includes a systematic review of each model component and the key features needed for their integration into a comprehensive toolkit.

    A Predictive Model of Nuclear Power Plant Crew Decision-Making and Performance in a Dynamic Simulation Environment

    Get PDF
    The safe operation of complex systems such as nuclear power plants requires close coordination between the human operators and plant systems. In order to maintain an adequate level of safety following an accident or other off-normal event, the operators are often called upon to perform complex tasks in dynamic situations with incomplete information. The safety of such complex systems can be greatly improved if the conditions that could lead operators to make poor decisions and commit erroneous actions during these situations can be predicted and mitigated. The primary goal of this research project was the development and validation of a cognitive model capable of simulating nuclear plant operator decision-making during accident conditions. Dynamic probabilistic risk assessment methods can improve the prediction of human error events by providing rich contextual information and an explicit consideration of the feedback arising from man-machine interactions. The Accident Dynamics Simulator paired with the Information, Decision, and Action in a Crew context cognitive model (ADS-IDAC) shows promise for predicting situational contexts that might lead to human error events, particularly knowledge-driven errors of commission. ADS-IDAC generates a discrete dynamic event tree (DDET) by applying simple branching rules that reflect variations in crew responses to plant events and system status changes. Branches can be generated to simulate slow or fast procedure execution, skipping of procedure steps, reliance on memorized information, activation of mental beliefs, variations in control inputs, and equipment failures. Complex operator mental models of plant behavior that guide crew actions can be represented within the ADS-IDAC mental belief framework and used to identify situational contexts that may lead to human error events. This research increased the capabilities of ADS-IDAC in several key areas.
    The ADS-IDAC computer code was improved to support additional branching events and to provide a better representation of the IDAC cognitive model. An operator decision-making engine capable of responding to dynamic changes in situational context was implemented. The IDAC human performance model was fully integrated with a detailed nuclear plant model in order to realistically simulate plant accident scenarios. Finally, the improved ADS-IDAC model was calibrated, validated, and updated using actual nuclear plant crew performance data. This research led to the following general conclusions: (1) A relatively small number of branching rules can efficiently capture a wide spectrum of crew-to-crew variability. (2) Compared to traditional static risk assessment methods, ADS-IDAC can provide a more realistic and integrated assessment of human error events by directly determining the effect of operator behaviors on plant thermal-hydraulic parameters. (3) The ADS-IDAC approach provides an efficient framework for capturing actual operator performance data such as the timing of operator actions, mental models, and decision-making activities.
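The idea of growing a discrete dynamic event tree from a small set of branching rules can be illustrated with a toy enumeration. The rule names and options below are hypothetical labels inspired by the abstract, not ADS-IDAC's actual rule set:

```python
from itertools import product

# Hypothetical branching rules: each decision point spawns one branch per
# option, in the spirit of ADS-IDAC's crew-response variations.
BRANCH_RULES = {
    "procedure_speed": ["fast", "slow"],
    "step_execution": ["performed", "skipped"],
    "information_use": ["panel_reading", "memorized_value"],
}

def expand_ddet(rules):
    # Each leaf of the discrete dynamic event tree is one crew-behaviour
    # scenario: the Cartesian product of the options at every decision point.
    names = list(rules)
    return [dict(zip(names, combo)) for combo in product(*(rules[n] for n in names))]

scenarios = expand_ddet(BRANCH_RULES)
print(len(scenarios))  # three binary rules -> 2**3 = 8 scenarios
```

The geometric growth of leaves with the number of decision points is also visible here, which is why keeping the rule set small matters in practice.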

    APPLICATION OF AN ARTIFICIAL INTELLIGENCE-ENABLED REAL-TIME WARGAMING SYSTEM FOR NAVAL TACTICAL OPERATIONS

    Get PDF
    The Navy is taking advantage of advances in computational technologies and data analytic methods to automate and enhance tactical decisions and support warfighters in highly complex combat environments. Novel automated techniques offer opportunities for tactical warfighter support through enhanced situational awareness, automated reasoning and problem-solving, and faster decision timelines. This capstone project investigated the use of artificial intelligence and game theory to develop real-time wargaming capabilities that enhance warfighters' ability to explore and evaluate the possible consequences of different tactical courses of action (COAs) and thereby improve tactical missions. The project applied a systems analysis approach and developed a conceptual design of a wargaming real-time artificial intelligence decision-aid (WRAID) system capability to support the future tactical warfighter. An operational scenario was developed and used to conduct an operational analysis of the WRAID capability. The project identified requirements for future WRAID capabilities and studied the implementation challenges (including ethical ones) that will need to be addressed. This project was funded in part by the NPS Naval Research Program. Approved for public release. Distribution is unlimited.

    Technologies and Applications for Big Data Value

    Get PDF
    This open access book explores cutting-edge solutions and best practices for big data and data-driven AI applications in the data-driven economy. It provides the reader with a basis for understanding how technical issues can be overcome to offer real-world solutions in major industrial areas. The book starts with an introductory chapter that provides an overview of the book by positioning the following chapters in terms of their contributions to the technology frameworks that are key elements of the Big Data Value Public-Private Partnership and the upcoming Partnership on AI, Data and Robotics. The remainder of the book is arranged in two parts. The first part, “Technologies and Methods”, contains horizontal contributions of technologies and methods that enable data value chains to be applied in any sector. The second part, “Processes and Applications”, details experience reports and lessons from using big data and data-driven approaches in processes and applications. Its chapters are co-authored with industry experts and cover domains including health, law, finance, retail, manufacturing, mobility, and smart cities. Contributions emanate from the Big Data Value Public-Private Partnership and the Big Data Value Association, which have acted as the nucleus of the European data community, bringing together businesses and leading researchers to harness the value of data for the benefit of society, business, science, and industry. The book is of interest to two primary audiences: first, undergraduate and postgraduate students and researchers in various fields, including big data, data science, data engineering, machine learning, and AI; second, practitioners and industry experts engaged in data-driven systems and software design and deployment projects who are interested in employing these advanced methods to address real-world problems.

    Quantification and mitigation of the impacts of extreme weather on power system resilience and reliability

    Get PDF
    Modelling the impact of extreme weather on power systems is a computationally expensive and challenging area of study due to the diversity of threats, the complexity of the modelling, and the data and simulation requirements of the relevant studies. Here the impacts of extreme weather, specifically wind, are considered. Factors such as the distribution of outage probability across lines and the potential correlation with wind power generation during storms are investigated, as is the sensitivity of security assessments involving extreme wind to the relationship assumed between failures and the natural hazard being studied, specifically wind speed. A large-scale simulation ensemble is developed and demonstrated to investigate what are deemed the most significant features of power system simulation during extreme weather events. The challenges associated with modelling high-impact low-probability (HILP) events are studied, and it is demonstrated that the results of security assessments are significantly affected by the granularity of the incident weather data being used and by the corrections or interpolation applied to the source data. A generalizable simulation framework is formulated and deployed to investigate the significance of the relationship between an incident natural hazard, in this case wind, and its corresponding impact on system resilience. Based on this, a large-scale simulation model is developed and demonstrated that takes account of a wide variety of factors which can affect power systems during extreme weather events, including, but not limited to, under-frequency load shedding, line overloads, and high-wind-speed shutdown and its impact on wind generation. A methodology for quantifying and visualising distributed overhead line failure risk is also demonstrated, in tandem with straightforward methods for making wind power projections over transmission systems for security studies.
    The potential correlation between overhead line risk and wind power generation risk is illustrated visually on representations of GB power networks based on real-world data.
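Two of the ingredients mentioned above, an overhead-line fragility relationship and a wind power curve with high-wind-speed shutdown, can be sketched as simple functions. All parameter values are assumed for illustration and are not taken from the thesis:

```python
import math

# Illustrative fragility curve: per-line failure probability as a logistic
# function of gust speed (v50 = speed at 50% failure probability; both
# parameters are assumed).
def line_failure_prob(wind_speed, v50=45.0, k=0.25):
    return 1.0 / (1.0 + math.exp(-k * (wind_speed - v50)))

# Simplified turbine power curve with a hard cut-out modelling
# high-wind-speed shutdown (per-unit output, assumed thresholds).
def wind_power_output(wind_speed, rated=1.0, cut_in=4.0, rated_v=13.0, cut_out=25.0):
    if wind_speed < cut_in or wind_speed >= cut_out:
        return 0.0  # below cut-in, or shut down in high wind
    if wind_speed >= rated_v:
        return rated
    return rated * ((wind_speed - cut_in) / (rated_v - cut_in)) ** 3

for v in (10.0, 30.0, 50.0):
    print(v, line_failure_prob(v), wind_power_output(v))
```

During a storm the two effects compound: line failure probability rises with wind speed while wind generation drops to zero past cut-out, which is the kind of correlated risk the thesis visualises on the GB network.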

    3rd International Conference on Advanced Research Methods and Analytics (CARMA 2020)

    Full text link
    Research methods in economics and the social sciences are evolving with the increasing availability of Internet and Big Data sources of information. As these sources, methods, and applications become more interdisciplinary, the 3rd International Conference on Advanced Research Methods and Analytics (CARMA) is an excellent forum for researchers and practitioners to exchange ideas and advances on how emerging research methods and sources are applied to different fields of the social sciences, as well as to discuss current and future challenges. Doménech I De Soria, J.; Vicente Cuervo, MR. (2020). 3rd International Conference on Advanced Research Methods and Analytics (CARMA 2020). Editorial Universitat Politècnica de València. http://hdl.handle.net/10251/149510