    A New Probabilistic Explanation of the Modus Ponens–Modus Tollens Asymmetry

    A consistent finding in research on conditional reasoning is that individuals are more likely to endorse the valid modus ponens (MP) inference than the equally valid modus tollens (MT) inference. This pattern holds for both abstract and probabilistic tasks. The existing explanation for this phenomenon within a Bayesian framework (e.g., Oaksford & Chater, 2008) accounts for the asymmetry by assuming separate probability distributions for MP and MT. We propose a novel explanation within a computational-level Bayesian account of reasoning according to which “argumentation is learning”. We show that the asymmetry must appear for certain prior probability distributions, under the assumption that the conditional inference provides the agent with new information that is integrated into the existing knowledge by minimizing the Kullback-Leibler divergence between the posterior and the prior probability distribution. We also show under which conditions we would expect the opposite pattern, an MT–MP asymmetry.
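
    The mechanism described in the abstract can be illustrated numerically. The sketch below is not the authors' exact model; the prior values, the learned conviction q, and the use of a generic constrained optimizer are all assumptions made here for illustration. Learning the conditional "if A then C" is encoded as a constraint P*(C|A) = q on a joint distribution over the four truth-value combinations of (A, C); the posterior is the distribution satisfying that constraint with minimal KL divergence to the prior; MP and MT endorsements are then read off as P*(C|A) and P*(not-A|not-C).

```python
# Minimal numerical sketch of min-KL updating on a learned conditional.
# NOT the authors' exact model: prior, q, and the SLSQP solver are all
# illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

# Joint states over (A, C), ordered: (A,C), (A,~C), (~A,C), (~A,~C)
prior = np.array([0.40, 0.20, 0.30, 0.10])  # assumed prior
q = 0.9  # learning "if A then C" is encoded as the constraint P*(C|A) = q

def kl(post, prior):
    """Kullback-Leibler divergence D(post || prior)."""
    mask = post > 1e-12  # treat 0 * log 0 as 0
    return float(np.sum(post[mask] * np.log(post[mask] / prior[mask])))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},             # normalization
    {"type": "eq", "fun": lambda p: p[0] - q * (p[0] + p[1])},  # P*(C|A) = q
]
res = minimize(kl, prior, args=(prior,), bounds=[(0.0, 1.0)] * 4,
               constraints=constraints)
post = res.x

mp = post[0] / (post[0] + post[1])  # P*(C | A): endorsement of modus ponens
mt = post[3] / (post[1] + post[3])  # P*(~A | ~C): endorsement of modus tollens
print(f"MP endorsement P*(C|A)   = {mp:.3f}")  # ~0.90 (fixed by the constraint)
print(f"MT endorsement P*(~A|~C) = {mt:.3f}")  # ~0.66, below MP for this prior
```

    For this assumed prior the computed MT endorsement falls well below the MP endorsement of q = 0.9, reproducing the MP–MT asymmetry; shifting prior mass from the (~A,C) cell to the (~A,~C) cell weakens or reverses the effect in this sketch, consistent with the abstract's claim that the direction of the asymmetry depends on the prior distribution.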