    Structural mapping in statistical word problems: A relational reasoning approach to Bayesian inference

    Presenting natural frequencies facilitates Bayesian inferences relative to using percentages. Nevertheless, many people, including highly educated and skilled reasoners, still fail to provide Bayesian responses to these computationally simple problems. We show that the complexity of relational reasoning (e.g., the structural mapping between the presented and requested relations) can help explain the remaining difficulties. With a non-Bayesian inference that required identical arithmetic but afforded a more direct structural mapping, performance was universally high. Furthermore, reducing the relational demands of the task through questions that directed reasoners to use the presented statistics, as compared with questions that prompted the representation of a second, similar sample, also significantly improved reasoning. Distinct error patterns were also observed between these presented- and similar-sample scenarios, which suggested differences in relational-reasoning strategies. On the other hand, while higher numeracy was associated with better Bayesian reasoning, higher-numerate reasoners were not immune to the relational complexity of the task. Together, these findings validate the relational-reasoning view of Bayesian problem solving and highlight the importance of considering not only the presented task structure, but also the complexity of the structural alignment between the presented and requested relations.
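    The format contrast the abstract refers to can be made concrete with a worked version of the classic base-rate task. The following Python sketch is not from the paper; the numbers (1% base rate, 80% hit rate, 9.6% false-alarm rate) are the textbook illustrative values, standing in for whatever statistics the study actually used. It computes the same posterior under the percentage format and the natural-frequency format.

```python
# A minimal sketch (numbers illustrative, not from the paper) of the
# classic base-rate task in two presentation formats.

base_rate = 0.01      # P(disease)
hit_rate = 0.80       # P(positive | disease)
false_alarm = 0.096   # P(positive | no disease)

# Percentage (single-event probability) format: apply Bayes' rule directly.
posterior = (hit_rate * base_rate) / (
    hit_rate * base_rate + false_alarm * (1 - base_rate)
)
print(f"P(disease | positive) = {posterior:.3f}")  # ~0.078

# Natural-frequency format: the same arithmetic over a concrete sample of
# 1,000 people, which exposes the nested-set relations reasoners need.
n = 1000
sick_and_positive = hit_rate * (base_rate * n)              # 8 of the 10 sick
healthy_and_positive = false_alarm * ((1 - base_rate) * n)  # ~95 of the 990 healthy
total_positive = sick_and_positive + healthy_and_positive
print(f"{sick_and_positive:.0f} of {total_positive:.0f} positives are sick "
      f"-> {sick_and_positive / total_positive:.3f}")
```

    Both routes yield the same answer (about 0.078); the frequency version simply makes the relevant subset relations explicit.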

    Whose statistical reasoning is improved by information about causal structure?

    People often struggle when making Bayesian probabilistic estimates on the basis of competing sources of statistical evidence. Recently, Krynski and Tenenbaum (Journal of Experimental Psychology: General, 136, 430–450, 2007) proposed that a causal Bayesian framework accounts for people's errors in Bayesian reasoning and showed that, by clarifying the causal relations among the pieces of evidence, judgments on a classic statistical reasoning problem could be significantly improved. We aimed to understand whose statistical reasoning is facilitated by the causal structure intervention. In Experiment 1, although we observed causal facilitation effects overall, the effect was confined to participants high in numeracy. We did not find an overall facilitation effect in Experiment 2 but did replicate the earlier interaction between numerical ability and the presence or absence of causal content. This effect held when we controlled for general cognitive ability and thinking disposition. Our results suggest that clarifying causal structure facilitates Bayesian judgments, but only for participants with sufficient understanding of basic concepts in probability and statistics.
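    The causal-structure manipulation studied here amounts to replacing a bare false-positive rate with an explicit alternative cause of a positive result. The sketch below illustrates that reparameterization; the benign-cyst cover story follows Krynski and Tenenbaum's materials, but the specific numbers and the simplifying assumption that cysts occur only in cancer-free patients are assumptions for illustration, not figures from either study.

```python
# A hedged sketch of the causal reframing: false positives are attributed
# to a concrete alternative cause (a benign cyst) rather than stated as a
# bare error rate. Numbers are illustrative assumptions.

p_cancer = 0.01
p_cyst = 0.05              # assumed rate of the alternative cause
p_pos_given_cancer = 0.80
p_pos_given_cyst = 1.0     # assumed: a cyst always triggers a positive

# For simplicity, assume cysts occur only in cancer-free patients.
p_pos = (p_pos_given_cancer * p_cancer
         + p_pos_given_cyst * p_cyst * (1 - p_cancer))
posterior = p_pos_given_cancer * p_cancer / p_pos
print(f"P(cancer | positive) = {posterior:.3f}")
```

    The arithmetic is the same Bayes' rule as before; what changes is that each term in the normalizing sum now corresponds to a named cause.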

    Asking the right questions about the psychology of human inquiry: Nine open challenges

    The ability to act on the world with the goal of gaining information is core to human adaptability and intelligence. Perhaps the most successful and influential account of such abilities is the Optimal Experiment Design (OED) hypothesis, which argues that humans intuitively perform experiments on the world similar to the way an effective scientist plans an experiment. The widespread application of this theory within many areas of psychology calls for a critical evaluation of the theory's core claims. Despite many successes, we argue that the OED hypothesis remains lacking as a theory of human inquiry and that research in the area often fails to confront some of the most interesting and important questions. In this critical review, we raise and discuss nine open questions about the psychology of human inquiry.
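    The OED hypothesis is standardly formalized as choosing the query whose answer maximizes expected information gain (the expected reduction in entropy over one's hypotheses). A minimal sketch of that computation follows; the hypotheses, answers, and likelihoods are hypothetical placeholders, not anything from the review.

```python
# A minimal sketch of the OED criterion: score a query by its expected
# information gain (EIG) over a set of hypotheses. All values hypothetical.
import math

def entropy(dist):
    """Shannon entropy (bits) of a dict mapping hypothesis -> probability."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def expected_information_gain(prior, likelihood):
    """likelihood[h][answer] = P(answer | hypothesis h)."""
    prior_entropy = entropy(prior)
    answers = {a for h in likelihood for a in likelihood[h]}
    eig = 0.0
    for a in answers:
        p_a = sum(prior[h] * likelihood[h].get(a, 0.0) for h in prior)
        if p_a == 0:
            continue
        posterior = {h: prior[h] * likelihood[h].get(a, 0.0) / p_a for h in prior}
        eig += p_a * (prior_entropy - entropy(posterior))
    return eig

# Two equiprobable hypotheses and one query that discriminates them imperfectly.
prior = {"h1": 0.5, "h2": 0.5}
query = {"h1": {"yes": 0.9, "no": 0.1}, "h2": {"yes": 0.2, "no": 0.8}}
print(f"EIG = {expected_information_gain(prior, query):.3f} bits")  # ~0.40
```

    An "intuitive scientist" in the OED sense would compute this score for each candidate question and ask the one with the highest EIG; the review's challenges concern whether and when human inquiry actually works this way.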
