
    Using graphical models and multi-attribute utility theory for probabilistic uncertainty handling in large systems, with application to nuclear emergency management

    Although many decision-making problems involve uncertainty, uncertainty handling within large decision support systems (DSSs) is challenging. One domain where uncertainty handling is critical is emergency response management, in particular nuclear emergency response, where decision making takes place in an uncertain, dynamically changing environment. Assimilation and analysis of data can help to reduce these uncertainties, but it is critical to do this in an efficient and defensible way. After briefly introducing the structure of a typical DSS for nuclear emergencies, the paper sets up a theoretical structure that enables a formal Bayesian decision analysis to be performed within a DSS architecture for environments like this. In such probabilistic DSSs, many input conditional probability distributions are provided by different sets of experts overseeing different aspects of the emergency. These probabilities are then used by the decision maker (DM) to find her optimal decision. We demonstrate in this paper that unless due care is taken in such a composite framework, coherence and rationality may be compromised, in a sense made explicit in the paper. The technology we describe builds a framework around which Bayesian data updating can be performed in a modular way, ensuring both coherence and efficiency, and provides sufficient unambiguous information to enable the DM to discover her expected-utility-maximizing policy.
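The core computation the abstract describes, combining conditional probability distributions from separate expert panels and then maximizing the DM's expected utility, can be illustrated with a toy sketch. All numbers, variable names, and the two-panel structure here are invented for illustration and are not taken from the paper:

```python
# Hypothetical sketch: two expert panels supply conditional probability
# tables for a toy release scenario; the decision maker (DM) scores each
# countermeasure by expected utility and picks the maximizer.

# Panel 1: marginal P(release level) over {low, high}
p_release = {"low": 0.8, "high": 0.2}

# Panel 2: conditional P(dose | release level) over {minor, severe}
p_dose = {
    "low":  {"minor": 0.95, "severe": 0.05},
    "high": {"minor": 0.40, "severe": 0.60},
}

# DM's utility for (decision, dose outcome); "evacuate" is costly but
# hedges against severe doses.
utility = {
    ("stay",     "minor"): 1.00, ("stay",     "severe"): 0.00,
    ("evacuate", "minor"): 0.70, ("evacuate", "severe"): 0.65,
}

def expected_utility(decision):
    # Chain the panels' inputs: P(dose) = sum_r P(r) * P(dose | r),
    # then average the utility over the dose distribution.
    return sum(
        p_release[r] * p_dose[r][d] * utility[(decision, d)]
        for r in p_release
        for d in ("minor", "severe")
    )

best = max(("stay", "evacuate"), key=expected_utility)
```

The point of the modular framework is that each panel only needs to deliver its own (conditional) distribution; the DM's evaluation composes them without any panel needing to see the others' internals.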

    Bayesian decision support for complex systems with many distributed experts

    Complex decision support systems often consist of component modules which, encoding the judgements of panels of domain experts, each describe a particular sub-domain of the overall system. Ideally, these modules need to be pasted together to provide a comprehensive picture of the whole process. The challenge in building such an integrated system is that, whilst the overall qualitative features are common knowledge to all, the explicit forecasts and their associated uncertainties are expressed only individually by each panel, resulting from its own analysis. The structure of the integrated system therefore needs to facilitate the coherent piecing together of these separate evaluations. If such a system is not available, there is a serious danger that this might drive decision makers to incoherent, and so indefensible, policy choices. In this paper we develop a graphically based framework which embeds a set of conditions, consisting of agreements about certain probability and utility models that are commonly made in practice, that, if satisfied in a given context, are sufficient to ensure that the composite system is truly coherent. Furthermore, we develop new message-passing algorithms, entailing the transmission of expected utility scores between the panels, that enable the uncertainties within each module to be fully accounted for when evaluating the available alternatives in these composite systems.
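The flavour of message passing described here, one panel marginalising its own variables and emitting expected-utility scores rather than its full model, can be sketched with two toy modules. The panel names, distributions, and decision set below are assumptions for illustration, not the paper's construction:

```python
# Hypothetical sketch: Panel B owns P(y | x) and the utility U(decision, y).
# It sends Panel A a table of scores EU_B(decision, x), which is all the
# information A needs; B's internal model never leaves its module.

p_x = {"calm": 0.7, "windy": 0.3}                  # Panel A's model
p_y_given_x = {                                    # Panel B's model
    "calm":  {"safe": 0.9, "unsafe": 0.1},
    "windy": {"safe": 0.5, "unsafe": 0.5},
}
u = {("open", "safe"): 1.0, ("open", "unsafe"): 0.0,
     ("close", "safe"): 0.6, ("close", "unsafe"): 0.6}

def panel_b_message():
    # B marginalises out its own variable y, emitting expected-utility
    # scores keyed by (decision, x).
    return {(d, x): sum(p_y_given_x[x][y] * u[(d, y)]
                        for y in ("safe", "unsafe"))
            for d in ("open", "close") for x in p_x}

def panel_a_evaluate(msg):
    # A folds the incoming scores through its marginal on x to finish
    # the evaluation of each alternative.
    return {d: sum(p_x[x] * msg[(d, x)] for x in p_x)
            for d in ("open", "close")}

scores = panel_a_evaluate(panel_b_message())
```

Because only expected-utility scores cross the module boundary, each panel's uncertainty is fully propagated while its detailed judgements stay local, which is the property the abstract emphasises.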

    Expert judgement combination using moment methods

    Moment methods have been employed in decision analysis, partly to avoid the computational burden that decision models involving continuous probability distributions can suffer from. In the Bayes linear (BL) methodology, prior judgements about uncertain quantities are specified using expectation (rather than probability) as the fundamental notion. BL provides a strong foundation for moment methods, rooted in the work of de Finetti and Goldstein. The main objective of this paper is to discuss how expert assessments of moments can be combined, in a non-Bayesian way, to construct a prior assessment. We show that the linear pool can be justified in an analogous but technically different way to linear pools for probability assessments, and that this linear pool has a very convenient property: a linear pool of experts' assessments of moments is coherent if each of the experts has given coherent assessments. To determine the weights of the linear pool, we give a method of performance-based weighting analogous to Cooke's classical model and explore its properties. Finally, we compare its performance with the classical model on data gathered in applications of the classical model.
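A linear pool of moments can be sketched in a few lines. The experts' assessments and the weights below are invented for illustration (the paper derives performance-based weights in the style of Cooke's classical model; fixed weights are assumed here), but the coherence property shown, that pooling raw moments linearly yields a non-negative pooled variance whenever each expert's moments are coherent, is the one stated in the abstract:

```python
# Hypothetical sketch: linearly pool two experts' first and second
# moments for an uncertain quantity X.

experts = [
    {"mean": 10.0, "var": 4.0},   # expert 1's E[X] and Var[X]
    {"mean": 14.0, "var": 9.0},   # expert 2's E[X] and Var[X]
]
weights = [0.6, 0.4]              # assumed pool weights, summing to 1

# Pool the raw moments linearly:
#   m1 = sum_i w_i E_i[X],   m2 = sum_i w_i E_i[X^2]
m1 = sum(w * e["mean"] for w, e in zip(weights, experts))
m2 = sum(w * (e["var"] + e["mean"] ** 2) for w, e in zip(weights, experts))

# Coherence: the pooled second moment dominates the squared pooled mean,
# so the implied pooled variance is non-negative.
pooled_var = m2 - m1 ** 2
```

Note that the pool operates on raw moments (E[X], E[X^2]), not on the variances directly; pooling variances linearly would discard the between-expert disagreement that `pooled_var` correctly picks up.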