2,007 research outputs found

    Human reliability analysis: exploring the intellectual structure of a research field

    Humans play a crucial role in modern socio-technical systems. Rooted in reliability engineering, the discipline of Human Reliability Analysis (HRA) has been broadly applied in a variety of domains in order to understand, manage and prevent the potential for human error. This paper investigates the existing literature pertaining to HRA and aims to bring clarity to the research field by synthesizing the literature through systematic bibliometric analyses. The multi-method approach followed in this research combines factor analysis, multi-dimensional scaling, and bibliometric mapping to identify the main HRA research areas. The paper reviews over 1,200 contributions, with the ultimate goal of identifying current research streams and outlining the potential for future research via a large-scale analysis of contributions indexed in the Scopus database.
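
    The abstract names the mapping techniques but not their mechanics. The sketch below, in Python and with made-up keyword records rather than Scopus data, illustrates one common co-word variant of such bibliometric mapping: counting keyword co-occurrences, converting the counts to a cosine-based dissimilarity, and projecting the keywords onto a 2D map with multi-dimensional scaling.

    # A minimal sketch with hypothetical records, not Scopus data.
    import numpy as np
    from itertools import combinations
    from sklearn.manifold import MDS

    papers = [                                   # hypothetical indexed records (keyword sets)
        {"HRA", "SPAR-H", "nuclear"},
        {"HRA", "SPAR-H", "maritime"},
        {"HRA", "Bayesian network", "nuclear"},
        {"Bayesian network", "prognostics"},
    ]
    keywords = sorted(set.union(*papers))
    index = {k: i for i, k in enumerate(keywords)}

    # Keyword co-occurrence matrix; diagonal holds keyword frequencies
    co = np.zeros((len(keywords), len(keywords)))
    for paper in papers:
        for a, b in combinations(sorted(paper), 2):
            co[index[a], index[b]] += 1
            co[index[b], index[a]] += 1
    np.fill_diagonal(co, [sum(k in p for p in papers) for k in keywords])

    # Cosine-like association strength turned into a dissimilarity, then MDS
    norm = np.sqrt(np.outer(np.diag(co), np.diag(co)))
    dissimilarity = 1.0 - co / norm
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dissimilarity)
    for keyword, (x, y) in zip(keywords, coords):
        print(f"{keyword:18s} {x:+.2f} {y:+.2f}")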

    Application of a SPAR-H based framework to assess human reliability during emergency response drill for man overboard on ships

    Emergency preparedness is of paramount importance to successful emergency response at sea, and emergency drills are therefore conducted regularly to maintain acceptable levels of preparedness. However, drill operations themselves involve significant risks, and there is no evidence that these risks are appropriately considered when drills are planned. Human error is one of the main contributors to accidents during emergency drill procedures, so the main question is how the overall risk of an emergency drill, including human error, can be evaluated correctly. This paper introduces a new hybrid approach that combines the Standardised Plant Analysis Risk Human Reliability Analysis (SPAR-H) method with a fuzzy multiple attributive group decision-making method. The approach provides a framework for evaluating specific scenarios associated with human error and identifies the contributors that affect human performance. The estimated human error probabilities are then used to assess human reliability through a new approach based on a system reliability block diagram. The rescue boat drill procedure for a man overboard is selected to illustrate the method. The findings show each human error probability and its contributing factors per task; as a result, an overall reliability of 6.06E-01 was obtained for the rescue boat drill operation.
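
    As a rough illustration of the reliability block diagram step described above, the following sketch combines per-task human error probabilities in series; the task names and HEP values are hypothetical, not those reported in the paper.

    # A minimal sketch with illustrative HEP values, not the paper's figures:
    # per-task human error probabilities combined in a series reliability block
    # diagram, where any single error fails the drill task chain.
    heps = {                                   # assumed HEP per rescue-boat drill task
        "muster and brief crew": 0.004,
        "don lifejackets":       0.002,
        "lower rescue boat":     0.015,
        "recover casualty":      0.020,
        "hoist and secure boat": 0.010,
    }

    overall_reliability = 1.0
    for task, hep in heps.items():
        overall_reliability *= (1.0 - hep)     # series structure: every task must succeed

    print(f"Overall drill reliability: {overall_reliability:.3f}")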

    Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success and achieve and maintain high safety standards. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. NASA has also recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable supplement to existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs, and one would expect PRA to be no exception; as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2], and intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment originated in U.S. aerospace and missile programs in the early 1960s; fault tree analysis (FTA) is an example. It would therefore have been a reasonable extrapolation to expect NASA to become the world leader in the application of PRA. That was, however, not to happen. Early in the Apollo program, estimates of the probability of a successful round-trip human mission to the Moon yielded disappointingly low (and suspect) values, and NASA became discouraged from performing quantitative risk analyses until some two decades later, when the methods were more refined, rigorous, and repeatable. Instead, NASA decided to rely primarily on the Hazard Analysis (HA) and Failure Modes and Effects Analysis (FMEA) methods for system safety assessment.

    Bayesian Network Analysis for Diagnostics and Prognostics of Engineering Systems

    Bayesian networks have been applied in many different domains to perform prognostics, reduce risk and ultimately improve decision making. However, these methods have not been applied to military field and human performance data sets in an industrial environment. Such methods frequently rely on a clear understanding of the causal connections leading to an undesirable event and a detailed understanding of system behavior; they may also require large teams of analysts and domain experts, coupled with manual data cleansing and classification. The research utilized machine learning algorithms (such as Bayesian networks) and two existing data sets. The primary objective was to develop a diagnostic and prognostic tool based on Bayesian networks that does not require a detailed causal understanding of the underlying system. The research yielded a predictive method with substantial benefits over reactive methods, and indicated that Bayesian networks can be trained to predict failure of several important components, including potential malfunction codes and downtime, on a real-world Navy data set. The research also considered potential error within the training data set; the results lent credence to the use of Bayesian networks on real field data, which will always contain error that is not easily quantified. The research should be replicated with additional field data sets from other aircraft, and future work should solicit and incorporate domain expertise into subsequent models. Future research should also consider incorporating text-based analytics for text fields, which was considered out of scope for this project.
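
    A minimal sketch of the kind of query such a network supports, assuming a toy three-node chain (usage, component failure, logged malfunction code) with made-up conditional probabilities rather than the Navy data set used in the research:

    # A minimal sketch with hypothetical probabilities: a three-node discrete
    # Bayesian network Usage -> Failure -> MalfunctionCode, queried by
    # brute-force enumeration of the joint distribution.
    from itertools import product

    p_usage = {"heavy": 0.3, "light": 0.7}                      # P(Usage)
    p_fail = {"heavy": 0.10, "light": 0.02}                     # P(Failure=yes | Usage)
    p_code = {True: 0.85, False: 0.05}                          # P(Code logged | Failure)

    def joint(usage, failure, code):
        pf = p_fail[usage] if failure else 1 - p_fail[usage]
        pc = p_code[failure] if code else 1 - p_code[failure]
        return p_usage[usage] * pf * pc

    # P(Failure = yes | a malfunction code was logged), via enumeration over Usage
    num = sum(joint(u, True, True) for u in p_usage)
    den = sum(joint(u, f, True) for u, f in product(p_usage, [True, False]))
    print(f"P(failure | code logged) = {num / den:.3f}")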

    Reliability assessment of manufacturing systems: A comprehensive overview, challenges and opportunities

    Reliability assessment refers to the process of evaluating the reliability of components or systems during their lifespan or prior to their implementation. In the manufacturing industry, system reliability is directly linked to production efficiency, product quality, energy consumption, and other crucial performance indicators; reliability therefore plays a critical role in every aspect of manufacturing. In this review, we provide a comprehensive overview of the most significant advancements and trends in the assessment of manufacturing system reliability, considering the three main facets of reliability analysis of cyber-physical systems: hardware, software, and human-related reliability. Beyond the overview of the literature, we derive challenges and opportunities for the reliability assessment of manufacturing systems. The identified challenges include the availability and quality of failure data, fast-paced technological advancements, and the increasing complexity of manufacturing systems; the opportunities include integrating various assessment methods and leveraging data to automate the assessment process and increase the accuracy of derived reliability models.
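
    One simple reading of "integrating various assessment methods" is to combine the three facets for a single manufacturing cell; the sketch below assumes independent hardware, software and human-related reliabilities arranged in series, with purely illustrative values.

    # A minimal sketch with hypothetical values: the hardware, software and
    # human-related reliabilities of a manufacturing cell treated as
    # independent elements in series.
    r_hardware, r_software, r_human = 0.995, 0.990, 0.970   # assumed per-shift reliabilities
    r_cell = r_hardware * r_software * r_human
    print(f"Combined cell reliability per shift: {r_cell:.4f}")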

    Systematic Human Reliability Analysis (SHRA): A New Approach to Evaluate Human Error Probability (HEP) in a Nuclear Plant

    Emergency management in industrial plants is a fundamental issue in ensuring the safety of operators. Emergency management analyses two fundamental aspects: system reliability and human reliability. System reliability is the capability of ensuring the functional properties within a range of working conditions, considering possible deviations due to unexpected events; however, system reliability is strongly related to the reliability of its weakest component. The complexity of the processes can generate incidental situations, and the worker (human reliability) appears to be the weakest part of the whole system. The complexity of systems also influences the operator's ability to take decisions during emergencies. The aim of the present research is to develop a new approach to evaluate human error probability (HEP), called Systematic Human Reliability Analysis (SHRA). The proposed approach considers the internal and external factors that affect the operator's ability, and is based on Nuclear Action Reliability Assessment (NARA), Simplified Plant Analysis Risk Human Reliability (SPAR-H) and the relationship between Performance Shaping Factors (PSFs). The paper analyses some shortcomings of existing approaches, especially their limitations regarding working time: we estimated the HEP after the standard 8-hour working time under emergency conditions. Combining the advantages of these three methodologies allows a HEP analysis of accident and emergency scenarios. SHRA can be used to estimate human reliability during emergencies and has been applied to a nuclear accident scenario considering 24 hours of working time. The SHRA results highlight the most important internal and external factors that affect the operator's ability.
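
    For readers unfamiliar with the SPAR-H mechanics that SHRA builds on, the sketch below scales a nominal HEP by a composite of Performance Shaping Factor multipliers and applies the standard SPAR-H adjustment that keeps the result below 1; the multiplier values are assumptions for illustration, not figures from the paper.

    # A minimal sketch of a SPAR-H-style HEP calculation. The nominal HEP and
    # PSF multipliers are illustrative assumptions, not values from the paper.
    NHEP = 0.01                      # nominal HEP for a diagnosis task (SPAR-H convention)
    psf = {                          # assumed Performance Shaping Factor multipliers
        "available_time": 10,        # extreme time pressure
        "stress": 2,
        "complexity": 2,
        "fitness_for_duty": 5,       # degraded after a long shift
    }
    composite = 1.0
    for multiplier in psf.values():
        composite *= multiplier

    # SPAR-H adjustment formula, used when several PSFs are negative, keeps HEP <= 1
    hep = (NHEP * composite) / (NHEP * (composite - 1) + 1)
    print(f"Composite PSF = {composite:.0f}, adjusted HEP = {hep:.3f}")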

    Uncertainty Assessment in High-Risk Environments Using Probability, Evidence Theory and Expert Judgment Elicitation

    The level of uncertainty in advanced system design is assessed by comparing the results of expert judgment elicitation to probability and evidence theory. This research shows how one type of monotone measure, namely the Dempster-Shafer theory of evidence, can expand the framework of uncertainty to provide decision makers with a more robust solution space. The issues embedded in this research focus on how the relevant predictive uncertainty produced by similar actions is measured. The methodology uses the established approach from traditional probability theory together with Dempster-Shafer evidence theory to combine two classes of uncertainty, aleatory and epistemic. Probability theory provides the mathematical structure traditionally used to represent aleatory uncertainty; the uncertainty in analysis outcomes is represented by probability distributions and typically summarized as Complementary Cumulative Distribution Functions (CCDFs). The main comparison in this research is between the probability of X in probability theory and the basic probability assignment m(X) in evidence theory. Using this comparison, an epistemic model is developed to obtain the upper limits given by the Complementary Cumulative Plausibility Function (CCPF) and the lower limits given by the Complementary Cumulative Belief Function (CCBF), compared to the traditional probability function. A conceptual design for the Thermal Protection System (TPS) of future Crew Exploration Vehicles (CEV) is used as an initial test case. A questionnaire is tailored to elicit judgment from experts in high-risk environments; based on descriptions and characteristics, the answers to the questionnaire produce information that serves as the qualitative semantics used for the evidence theory functions. The computational mechanism provides a heuristic approach for the compilation and presentation of the results, and a follow-up evaluation validates the findings and provides useful information on consistency and adoptability to other domains. The results of this methodology provide a useful and practical approach in conceptual design to aid the decision maker in assessing the level of uncertainty of the experts. The methodology is well suited to decision makers working with similar conceptual design instruments.
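
    To make the belief/plausibility bounds concrete, the sketch below computes Bel and Pl for a single event from an assumed basic probability assignment; the frame of discernment and masses are invented for illustration and are not taken from the TPS study.

    # A minimal sketch with hypothetical numbers: belief and plausibility for the
    # event "temperature margin is low", computed from a basic probability
    # assignment m over focal sets, illustrating how belief/plausibility (and the
    # CCBF/CCPF built from them) bracket an ordinary probability.
    frame = frozenset({"low", "nominal", "high"})
    m = {                                   # assumed basic probability assignment
        frozenset({"low"}): 0.2,
        frozenset({"low", "nominal"}): 0.3, # evidence that cannot separate the two
        frame: 0.5,                         # total ignorance mass
    }

    def belief(event):
        return sum(mass for focal, mass in m.items() if focal <= event)

    def plausibility(event):
        return sum(mass for focal, mass in m.items() if focal & event)

    event = frozenset({"low"})
    print(f"Bel = {belief(event):.2f} <= P(low) <= Pl = {plausibility(event):.2f}")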

    A Bayesian Framework for Analysis of Pseudo-spatial Models of Comparable Engineered Systems With Application to Spacecraft Anomaly Prediction Based on Precedent Data

    To ensure that estimates of risk and reliability inform design and resource allocation decisions in the development of complex engineering systems, early engagement in the design life cycle is necessary. An unfortunate constraint on the accuracy of such estimates at this stage of concept development is the limited amount of high-fidelity design and failure information available on the actual system under development. Applying the human ability to learn from experience and augment our state of knowledge to evolve better solutions mitigates this limitation. The challenge, however, lies in formalizing a methodology that takes this highly abstract but fundamentally human cognitive ability and extends it to the field of risk analysis while maintaining the tenets of generalization, Bayesian inference, and probabilistic risk analysis. We introduce an integrated framework for inferring the reliability, or other probabilistic measures of interest, of a new system or a conceptual variant of an existing system. Abstractly, our framework is based on learning from the performance of precedent designs and then applying the acquired knowledge, adjusted according to its degree of relevance, to the inference process. This dissertation presents a method for inferring properties of the conceptual variant using a pseudo-spatial model that describes the spatial configuration of the family of systems to which the concept belongs. Through non-metric multidimensional scaling, we formulate the pseudo-spatial model from rank-ordered subjective expert perceptions of design similarity between systems, which elucidate the psychological space of the family. By a novel extension of Kriging methods for the analysis of geospatial data to this "pseudo-space of comparable engineered systems", we develop a Bayesian inference model that allows prediction of the probabilistic measure of interest.
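
    A rough sketch of the two computational ingredients named above, non-metric multidimensional scaling followed by a Kriging-style prediction, using scikit-learn's MDS and an ordinary Gaussian-process regressor as a stand-in for the dissertation's Bayesian extension; the dissimilarity matrix and anomaly rates are invented for illustration.

    # A minimal sketch under stated assumptions: expert-ranked dissimilarities
    # between four precedent spacecraft and one conceptual variant are embedded
    # with non-metric MDS, and a Gaussian process fitted on the precedents
    # predicts the variant's anomaly rate. All numbers are illustrative.
    import numpy as np
    from sklearn.manifold import MDS
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    # Symmetric expert dissimilarity matrix (last row/column = conceptual variant)
    D = np.array([
        [0.0, 1.0, 3.0, 4.0, 2.0],
        [1.0, 0.0, 2.0, 3.0, 2.0],
        [3.0, 2.0, 0.0, 1.0, 3.0],
        [4.0, 3.0, 1.0, 0.0, 4.0],
        [2.0, 2.0, 3.0, 4.0, 0.0],
    ])
    coords = MDS(n_components=2, metric=False, dissimilarity="precomputed",
                 random_state=0).fit_transform(D)

    anomaly_rate = np.array([0.12, 0.10, 0.25, 0.30])   # observed on the precedents
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    gp.fit(coords[:4], anomaly_rate)

    mean, std = gp.predict(coords[4:], return_std=True)
    print(f"Predicted anomaly rate for the variant: {mean[0]:.3f} +/- {std[0]:.3f}")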