
    Practical values and uncertainty in regulatory decision making

    Regulatory science, which generates knowledge relevant for regulatory decision-making, differs from standard academic science in that it is oriented mainly towards the attainment of non-epistemic (practical) aims. The role of uncertainty and the limits to the relevance of academic science are being recognized increasingly explicitly in regulatory decision-making. This has led to the introduction of regulation-specific scientific methodologies in order to generate decision-relevant data. However, recent practical experience with such non-standard methodologies indicates that they, too, may be subject to important limitations. We argue that the attainment of non-epistemic values and aims (such as the protection of human health and the environment) requires not only control of the quality of the data and the methodologies, but also the selection of the level of regulation deemed adequate in each specific case (including a decision about which of the two, under-regulation or over-regulation, would be more acceptable).

    Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations in which subject-matter experts on pesticide risks to surface water organisms validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); its nature (whether the uncertainty is epistemic or aleatory); and its location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. This comprehensive description of uncertainty will enable risk analysts to prioritise phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action.
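    As a rough illustration of how the three dimensions described above could be organised for prioritisation, the following sketch (Python, with invented phases, locations and scores rather than the study's elicitation data) classifies uncertainty records and reports the median level per assessment phase:

```python
# Minimal sketch (hypothetical data and names): organising elicited uncertainties
# by level, nature and location, and summarising the median level per ERA phase
# so the most uncertain phases can be prioritised.
from dataclasses import dataclass
from statistics import median

# Illustrative ordinal scale for the "level" dimension (determinism -> ignorance).
LEVELS = ["determinism", "statistical", "scenario", "recognised ignorance"]

@dataclass
class Uncertainty:
    phase: str      # e.g. "problem formulation", "exposure assessment"
    location: str   # e.g. "data", "model", "context"
    nature: str     # "epistemic" or "aleatory"
    level: str      # one of LEVELS

def median_level_by_phase(items):
    """Return the median uncertainty level (as a label) per assessment phase."""
    by_phase = {}
    for u in items:
        by_phase.setdefault(u.phase, []).append(LEVELS.index(u.level))
    return {phase: LEVELS[int(median(scores))] for phase, scores in by_phase.items()}

# Invented elicitation records, for illustration only:
records = [
    Uncertainty("risk characterisation", "model", "epistemic", "recognised ignorance"),
    Uncertainty("risk characterisation", "data", "epistemic", "scenario"),
    Uncertainty("exposure assessment", "data", "aleatory", "statistical"),
]
print(median_level_by_phase(records))
```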

    Uncertainty Assessment in High-Risk Environments Using Probability, Evidence Theory and Expert Judgment Elicitation

    The level of uncertainty in advanced system design is assessed by comparing the results of expert judgment elicitation to probability and evidence theory. This research shows how one type of monotone measure, namely the Dempster-Shafer theory of evidence, can expand the framework of uncertainty to provide decision makers with a more robust solution space. The issues embedded in this research are focused on how the relevant predictive uncertainty produced by similar action is measured. The methodology uses the established approaches of traditional probability theory and Dempster-Shafer evidence theory to combine two classes of uncertainty, aleatory and epistemic. Probability theory provides the mathematical structure traditionally used in the representation of aleatory uncertainty; the uncertainty in analysis outcomes is represented by probability distributions and typically summarized as Complementary Cumulative Distribution Functions (CCDFs). The central comparison in this research is between the probability of X in probability theory and the basic probability assignment m(X) in evidence theory. Using this comparison, an epistemic model is developed to obtain upper limits, the Complementary Cumulative Plausibility Function (CCPF), and lower limits, the Complementary Cumulative Belief Function (CCBF), around the traditional probability function. A conceptual design for the Thermal Protection System (TPS) of future Crew Exploration Vehicles (CEV) is used as an initial test case. A questionnaire is tailored to elicit judgment from experts in high-risk environments. Based on its description and characteristics, the answers to the questionnaire produce information that serves as the qualitative semantics for the evidence theory functions. The computational mechanism provides a heuristic approach for the compilation and presentation of the results. A follow-up evaluation serves as validation of the findings and provides useful information on consistency and adoptability to other domains. The results of this methodology provide a useful and practical approach in conceptual design to aid the decision maker in assessing the level of uncertainty of the experts. The methodology presented is well suited for decision makers working with similar conceptual design instruments.
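    As an illustration of the bounding idea described above, the following sketch (Python, with an invented basic probability assignment rather than the elicited TPS data) shows how plausibility and belief give upper and lower complementary cumulative limits that bracket any precise probability CCDF consistent with the evidence:

```python
# Minimal sketch (illustrative values, not from the study): comparing a probability
# CCDF with the plausibility (CCPF) and belief (CCBF) bounds that Dempster-Shafer
# evidence theory gives when experts supply a basic probability assignment m over
# intervals instead of precise probabilities.

# Basic probability assignment: focal intervals for some response quantity X
# with the mass assigned to each interval (masses sum to 1). Invented numbers.
bpa = [((0.0, 2.0), 0.3), ((1.0, 4.0), 0.5), ((3.0, 6.0), 0.2)]

def ccpf(x):
    """Plausibility that X exceeds x: mass of every interval that reaches above x."""
    return sum(m for (lo, hi), m in bpa if hi > x)

def ccbf(x):
    """Belief that X exceeds x: mass of every interval lying entirely above x."""
    return sum(m for (lo, hi), m in bpa if lo > x)

# Any precise distribution consistent with the evidence satisfies
# CCBF(x) <= P(X > x) <= CCPF(x).
for x in [0.5, 2.5, 5.0]:
    print(f"x={x}: CCBF={ccbf(x):.2f} <= P(X>x) <= CCPF={ccpf(x):.2f}")
```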

    Building fragility curves of sliding failure of concrete gravity dams integrating natural and epistemic uncertainties

    The proposed procedure combines the concepts of the Electric Power Research Institute (EPRI) guidelines for developing fragility curves in the nuclear industry with existing reliability techniques for computing fragility curves in the context of concrete dam engineering. The procedure has been applied to a dam to illustrate how it can be used in a real case, in such a manner that fragility curves are obtained integrating natural and epistemic uncertainties without losing track of their separate contributions to risk results. (C) 2016 Elsevier Ltd. All rights reserved.
    This paper was published with the support of the research project "INICIA" (Methodology for Assessing Investments on Water Cycle Infrastructures informed on Risk and Energy Efficiency Indicators, BIA2013-48157-C2-1-R, 2014-2016), co-funded by the Spanish Ministry of Economy and Competitiveness (Ministerio de Economía y Competitividad, Programa Estatal de Investigación, Desarrollo e Innovación Orientada a los Retos de la Sociedad) and the European Regional Development Fund (ERDF).
    Morales Torres, A.; Escuder Bueno, I.; Altarejos García, L.; Serrano Lombillo, A. J. (2016). Building fragility curves of sliding failure of concrete gravity dams integrating natural and epistemic uncertainties. Engineering Structures, 125:227-235. https://doi.org/10.1016/j.engstruct.2016.07.006
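    As a rough illustration of keeping the two kinds of uncertainty separate when building fragility curves, the following sketch (Python, using a hypothetical lognormal sliding-capacity model and invented parameter values, not the EPRI procedure or the dam analysed in the paper) samples epistemic parameters in an outer loop and natural variability in an inner loop, yielding a family of conditional failure probability curves with an epistemic confidence band:

```python
# Minimal sketch (hypothetical parameters): a family of fragility curves for a
# sliding limit state, with natural (aleatory) and epistemic uncertainty kept
# separate. Each epistemic sample of the capacity statistics yields one
# conditional-failure-probability curve over the load level.
import numpy as np

rng = np.random.default_rng(0)
loads = np.linspace(0.5, 3.0, 26)          # demand levels (e.g. normalised load)

def fragility_curve(mu_capacity, cov_capacity, n_aleatory=10_000):
    """P(failure | load): capacity sampled from a lognormal (natural variability)."""
    sigma = np.sqrt(np.log(1 + cov_capacity**2))
    mu = np.log(mu_capacity) - 0.5 * sigma**2
    capacity = rng.lognormal(mu, sigma, n_aleatory)
    return np.array([np.mean(capacity < q) for q in loads])

# Epistemic uncertainty: limited knowledge of the capacity statistics themselves.
curves = [fragility_curve(mu_c, 0.2)
          for mu_c in rng.normal(loc=1.8, scale=0.15, size=50)]

median_curve = np.median(curves, axis=0)        # central estimate
band = np.percentile(curves, [5, 95], axis=0)   # epistemic confidence band
print(f"P(failure | load={loads[15]:.1f}): median={median_curve[15]:.3f}, "
      f"5-95% epistemic band=({band[0][15]:.3f}, {band[1][15]:.3f})")
```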

    A cost estimate maturity benchmark method to support early concept design decision-making: a case study application to the small modular nuclear reactor

    Constructing large Nuclear Power Plants (NPPs) is synonymous with significant cost and schedule uncertainty. Innovative Small Modular Reactors (SMRs) have been identified as a way of increasing certainty of delivery whilst also maintaining a competitive Life Cycle Cost (LCC). Previous research into the cost of SMRs has focused on the economics of a design from the perspective of an owner or investor; there is a significant gap in the literature on estimating the cost of SMRs at the early concept development stage from the perspective of a reactor developer. Early design stage cost estimates are inherently uncertain, so design teams need to make decisions that will achieve a cost-competitive product while taking that uncertainty into account. Existing cost uncertainty analysis methods lack standardisation in their application, often relying on the subjective assessment of experts. The central argument presented in this research is that the SMR vendor can make more effective decisions related to achieving cost certainty by understanding the drivers of knowledge uncertainty associated with early design stage cost estimates.
    This thesis describes research spanning the concept design phase of the UK SMR development programme. The research investigation is divided into two distinct phases. The first phase identifies the requirements for cost information from the perspective of the SMR vendor through interviews, a participatory case study investigation and surveys. Limited access to cost information means that early design cost assessment is highly subjective, and cost uncertainty analysis should therefore provide decision makers with an understanding of the level of confidence associated with the estimate. A survey investigating how cost information is interpreted revealed that providing more granular detail about cost uncertainty would give the design team additional rationale for selecting a design option. The main requirement identified from the first phase is the need for a standardised method to identify how sources of cost uncertainty influence the maturity of the estimate at each stage of the design development process.
    The second phase involved a participatory research approach in which the Acceptable Cost Uncertainty Benchmark Assessment (ACUBA) method was developed and then implemented retrospectively on the case study cost data. The ACUBA method uses a qualitative measure to assess the quality and impact of engineering definition, manufacturing process knowledge and supply chain knowledge on cost estimate confidence. The resulting maturity rating is then assessed against a benchmark to determine the acceptability of the estimate's uncertainty range. Focus groups were carried out in the vendor organisation to investigate whether the design team could clarify their reasoning for decisions related to reducing cost uncertainty when given insight into the sources of cost uncertainty. The rationale for a decision is found to be clearer using the ACUBA method than with the existing cost uncertainty analysis methods used by the case study organisation. This research has led to the development of a novel method which standardises and improves the communication of cost information across different functions within a design team. By establishing a benchmark acceptable level of cost maturity for a decision, the cost maturity metric can be employed to measure the performance of the SMR development programme towards achieving product cost maturity. In addition, the ACUBA method supports the more effective allocation of the limited resources available at the early design stage by identifying design activities that could lead to an acceptable cost maturity.
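    As a rough illustration of the benchmark comparison described above, the following sketch (Python, with a made-up ordinal scale and uncertainty-range mapping; the thesis defines the actual ACUBA scales) scores the three knowledge drivers, derives a cost maturity rating, and checks it against an acceptable uncertainty benchmark:

```python
# Minimal sketch (illustrative scales and thresholds, not the thesis's actual ACUBA
# tables): score the three knowledge drivers named above on an ordinal scale, derive
# a cost-maturity rating, and compare the implied uncertainty range with a benchmark
# acceptable for the current design-stage decision.

# Hypothetical mapping from overall maturity (1 = lowest) to an expected cost
# uncertainty range, expressed as +/- a fraction of the point estimate.
MATURITY_TO_RANGE = {1: 0.50, 2: 0.35, 3: 0.25, 4: 0.15, 5: 0.10}

def cost_maturity(engineering_definition, manufacturing_knowledge, supply_chain_knowledge):
    """Overall maturity is limited by the weakest of the three knowledge drivers."""
    return min(engineering_definition, manufacturing_knowledge, supply_chain_knowledge)

def acceptable(maturity, benchmark_range):
    """Is the uncertainty implied by the maturity within the benchmark range?"""
    return MATURITY_TO_RANGE[maturity] <= benchmark_range

# Example: good engineering definition but immature supply chain knowledge.
m = cost_maturity(engineering_definition=4, manufacturing_knowledge=3, supply_chain_knowledge=2)
print(f"maturity={m}, implied range=+/-{MATURITY_TO_RANGE[m]:.0%}, "
      f"acceptable for concept selection: {acceptable(m, benchmark_range=0.30)}")
```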

    Analytic Narratives: What they are and how they contribute to historical explanation

    The expression "analytic narratives" is used to refer to a range of quite recent studies that lie on the boundaries between history, political science, and economics. These studies purport to explain specific historical events by combining the usual narrative approach of historians with the analytic tools that economists and political scientists draw from formal rational choice theories. Game theory, especially of the extensive form version, is currently prominent among these tools, but there is nothing inevitable about such a technical choice. The chapter explains what analytic narratives are by reviewing the studies of the major book Analytic Narratives (1998), which are concerned with the workings of political institutions broadly speaking, as well as several cases drawn from military and security studies, which form an independent source of the analytic narratives literature. At the same time as it gradually develops a definition of analytic narratives, the chapter investigates how they fulfil one of their main purposes, which is to provide explanations of a better standing than those of traditional history. An important principle that will emerge in the course of the discussion is that narration is called upon not only to provide facts and problems, but also to contribute to the explanation itself. The chapter distinguishes between several expository schemes of analytic narratives according to the way they implement this principle. From all the arguments developed here, it seems clear that the current applications of analytic narratives do not exhaust their potential, and in particular that they deserve the attention of economic historians, if only because they are concerned with microeconomic interactions that are not currently their focus of attention
