
    Cognitive debiasing 1: Origins of bias and theory of debiasing

    Numerous studies have shown that diagnostic failure depends upon a variety of factors. Psychological factors are fundamental in influencing the cognitive performance of the decision maker. In this first of two papers, we discuss the basics of reasoning and the Dual Process Theory (DPT) of decision making. The general properties of the DPT model, as it applies to diagnostic reasoning, are reviewed. A variety of cognitive and affective biases are known to compromise the decision-making process. They mostly appear to originate in the fast intuitive processes of Type 1 that dominate (or drive) decision making. Type 1 processes work well most of the time, but they may open the door to biases. Removing or at least mitigating these biases would appear to be an important goal. We also review the origins of biases. The consensus is that there are two major sources: innate, hard-wired biases that developed in our evolutionary past, and acquired biases established in the course of development and within our working environments. Both are associated with abbreviated decision making in the form of heuristics. Other work suggests that ambient and contextual factors may create high-risk situations that dispose decision makers to particular biases. Fatigue, sleep deprivation and cognitive overload appear to be important determinants. The theoretical basis of several approaches towards debiasing is then discussed. All share a common feature: a deliberate decoupling from Type 1 intuitive processing and a move to Type 2 analytical processing, so that unexamined intuitive judgments can eventually be submitted to verification. This decoupling step appears to be the critical feature of cognitive and affective debiasing.

    Communicating climate change risks in a skeptical world

    The Intergovernmental Panel on Climate Change (IPCC) has been extraordinarily successful in the task of knowledge synthesis and risk assessment. However, the strong scientific consensus on the detection, attribution, and risks of climate change stands in stark contrast to widespread confusion, complacency and denial among policymakers and the public. Risk communication is now a major bottleneck preventing science from playing an appropriate role in climate policy. Here I argue that the ability of the IPCC to fulfill its mission can be enhanced through better understanding of the mental models of the audiences it seeks to reach, then altering the presentation and communication of results accordingly. Few policymakers are trained in science, and public understanding of basic facts about climate change is poor. But the problem is deeper. Our mental models lead to persistent errors and biases in complex dynamic systems like the climate and economy. Where the consequences of our actions spill out across space and time, our mental models have narrow boundaries and focus on the short term. Where the dynamics of complex systems are conditioned by multiple feedbacks, time delays, accumulations and nonlinearities, we have difficulty recognizing and understanding feedback processes, underestimate time delays, and do not understand basic principles of accumulation or how nonlinearities can create regime shifts. These problems arise not only among laypeople but also among highly educated elites with significant training in science. They arise not only in complex systems like the climate but also in familiar contexts such as filling a bathtub. Therefore, they cannot be remedied merely by providing more information about the climate; they require different modes of communication, including experiential learning environments such as interactive simulations.
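    The "bathtub" accumulation principle the abstract alludes to can be made concrete with a minimal sketch (an illustration of the general stock-flow idea, not code or data from the paper): a stock rises whenever inflow exceeds outflow, so it keeps growing even after the inflow has peaked, which is the intuition most people reportedly get wrong.

    ```python
    # Minimal sketch of stock-flow accumulation (the "bathtub" principle).
    # The stock (water level, or atmospheric CO2 concentration) rises whenever
    # inflow exceeds outflow, regardless of whether the inflow itself is
    # rising or falling. Names and numbers here are illustrative only.

    def simulate(inflows, outflow, stock=0.0):
        """Accumulate a stock over discrete time steps: stock += inflow - outflow."""
        levels = []
        for inflow in inflows:
            stock += inflow - outflow
            levels.append(stock)
        return levels

    # Inflow ramps up, peaks, then falls back; outflow is constant at 2 units/step.
    inflows = [4, 6, 8, 6, 4, 2, 2]
    levels = simulate(inflows, outflow=2.0)
    # levels == [2.0, 6.0, 12.0, 16.0, 18.0, 18.0, 18.0]
    # The inflow peaks at step 3 (value 8), but the stock keeps rising through
    # step 5 and only stabilizes once inflow has fallen to the outflow rate.
    ```

    Under this toy model, stabilizing the stock requires the inflow to drop all the way to the outflow rate, not merely to stop growing, which mirrors the common misperception about emissions and atmospheric concentrations that the abstract describes.
    
    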