27 research outputs found

    Counterfactual conditionals and normative rules

    Funding was awarded to the second, third, and fourth authors. Counterfactual thinking is the consideration of how things could have turned out differently, usually taking the form of counterfactual conditionals. This experiment examined the psychological mechanisms that transform counterfactuals into deontic guidance rules for the future, asking participants to infer deontic conclusions from counterfactual premises. Participants were presented with a vignette and a counterfactual conditional, and were assigned either to a control condition or to a suppression condition in which they were additionally presented with conflicting normative rules. The presence of conflicting norms reduced the likelihood that positive deontic conclusions were endorsed and increased the likelihood that negative deontic conclusions were endorsed. Future intentionality and regret intensity ratings were also reduced in the suppression condition. The same conditions that affect normative inference thus also affect regret and future planning, suggesting that similar cognitive mechanisms underlie these processes.

    Structured decision-making drives guidelines panels' recommendations 'for' but not 'against' health interventions

    Background: The determinants of guideline panels' recommendations remain uncertain. Objective: To investigate the factors considered by members of 8 panels convened by the American Society of Hematology (ASH) to develop guidelines using the GRADE system. Study Design and Setting: Web-based survey of participants in the ASH guidelines panels. Analysis: Two-level hierarchical, random-effect, multivariable regression analysis to explore the relation between GRADE and non-GRADE factors and strength of recommendations (SOR). Results: In the primary analysis, certainty in evidence [OR=1.83 (95% CI 1.45 to 2.31)], balance of benefits and harms [OR=1.49 (95% CI 1.30 to 1.69)], and variability in patients' values and preferences [OR=1.47 (95% CI 1.15 to 1.88)] proved the strongest predictors of SOR. In a secondary analysis, certainty of evidence was associated with a strong recommendation [OR=3.60 (95% CI 2.16 to 6.00)] when panel members recommended "for" interventions but not when they made recommendations "against" [OR=0.98 (95% CI 0.57 to 1.8)], consistent with a "yes" bias. Agreement between individual members and the group in rating SOR varied (kappa ranged from -0.01 to 0.64). Conclusion: GRADE's conceptual framework proved, in general, highly associated with SOR. The failure of certainty of evidence to be associated with SOR against an intervention suggests the need for improvements in the process.
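
    As a quick check on the reported intervals, the Wald arithmetic behind an odds ratio's 95% CI can be reconstructed from the abstract's numbers (a sketch; the standard error is recovered from the interval rather than reported directly):

```python
import math

# Reported interval for certainty of evidence, taken from the abstract:
# OR = 1.83, 95% CI 1.45 to 2.31.
# On the log-odds scale a Wald 95% CI is log(OR) +/- 1.96 * SE,
# so the SE can be recovered from the interval width.
or_point, ci_low, ci_high = 1.83, 1.45, 2.31

se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)

# The geometric mean of the CI bounds should land back on the point estimate.
recovered_or = math.exp((math.log(ci_low) + math.log(ci_high)) / 2)

print(f"SE(log OR) ~= {se:.3f}")            # ~0.119
print(f"recovered OR ~= {recovered_or:.2f}")  # ~1.83
```

    The same arithmetic applied to the "against" interval (0.57 to 1.8) shows why that CI spans 1 and the association vanishes.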

    Thinking styles and regret in physicians

    Background Decision-making relies on both analytical and emotional thinking. Cognitive reasoning styles (e.g. maximizing and satisficing tendencies) heavily influence analytical processes, while affective processes often depend on regret. The relationship between regret and cognitive reasoning styles has not been well studied in physicians, and is the focus of this paper. Methods A regret questionnaire and 6 scales measuring individual differences in cognitive styles (maximizing-satisficing tendencies; analytical vs. intuitive reasoning; need for cognition; intolerance toward ambiguity; objectivism; and cognitive reflection) were administered through a web-based survey to physicians of the University of South Florida. Bonferroni's adjustment was applied to the overall correlation analysis. The correlation analysis was also performed without Bonferroni's correction, given the strong theoretical rationale for treating these as separate hypotheses. We also conducted a multivariate regression analysis to identify the unique influence of predictors on regret. Results 165 trainees and 56 attending physicians (age range 25 to 69) participated in the survey. In the bivariate analysis we found that maximizing tendency positively correlated with regret with respect to both decision difficulty (r=0.673; p<0.001) and alternate search strategy (r=0.239; p=0.002). When Bonferroni's correction was not applied, we also found a negative relationship between satisficing tendency and regret (r=-0.156; p=0.021). In trainees, but not faculty, regret negatively correlated with rational-analytical thinking (r=-0.422; p<0.001), need for cognition (r=-0.340; p<0.001), and objectivism (r=-0.309; p=0.003), and positively correlated with ambiguity intolerance (r=0.285; p=0.012).
    However, in the multivariate regression analysis, regret was positively associated with maximizing only with respect to decision difficulty (r=0.791; p<0.001), while it was negatively associated with satisficing (r=-0.257; p=0.020) and objectivism (r=-0.267; p=0.034). We found no statistically significant relationship between regret and overall accuracy on conditional inferential tasks. Conclusion Regret in physicians is strongly associated with their tendency to maximize; i.e. the tendency to consider more choices among abundant options leads to more regret. However, physicians who exhibit a satisficing tendency - the inclination to accept a "good enough" solution - feel less regret. Our observation that objectivism is a negative predictor of regret indicates that the tendency to seek and use empirical data in decision-making leads to less regret. Therefore, promotion of evidence-based reasoning may lead to lower regret.
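
    A minimal sketch of the Bonferroni adjustment described above (the exact number of comparisons is not reported in the abstract, so m = 12 below is purely illustrative):

```python
# Hedged sketch of a Bonferroni adjustment. The p-values come from the
# bivariate results quoted in the abstract; the padding to m = 12 tests
# is an illustrative assumption, not the study's actual test count.
def bonferroni(p_values, alpha=0.05):
    """Return (adjusted alpha, per-test significance after correction)."""
    adjusted = alpha / len(p_values)
    return adjusted, [p < adjusted for p in p_values]

# decision difficulty, alternate search, satisficing (uncorrected p-values):
ps = [0.001, 0.002, 0.021]
adj, flags = bonferroni(ps + [0.5] * 9)  # pad to a hypothetical m = 12

print(adj)        # 0.05 / 12 ~= 0.00417
print(flags[:3])  # [True, True, False]
```

    Even under this illustrative m, the satisficing correlation (p=0.021) no longer clears the corrected threshold, which is why the abstract reports it only without Bonferroni's correction.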

    When fast logic meets slow belief: Evidence for a parallel-processing model of belief bias.

    Two experiments pitted the default-interventionist account of belief bias against a parallel-processing model. According to the former, belief bias occurs because a fast, belief-based evaluation of the conclusion pre-empts a working-memory demanding logical analysis. In contrast, according to the latter, both belief-based and logic-based responding occur in parallel. Participants were given deductive reasoning problems of variable complexity and were instructed to decide whether the conclusion was valid on half the trials and whether it was believable on the other half. When belief and logic conflict, the default-interventionist view predicts that it should take less time to respond on the basis of belief than logic, and that the believability of a conclusion should interfere with judgments of validity, but not the reverse. The parallel-processing view predicts that beliefs should interfere with logic judgments only if the processing required to evaluate the logical structure exceeds that required to evaluate the knowledge necessary to make a belief-based judgment, and vice versa otherwise. Consistent with the latter view, for the simplest reasoning problems (modus ponens), judgments of belief resulted in lower accuracy than judgments of validity, and believability interfered more with judgments of validity than the converse. For problems of moderate complexity (modus tollens and single-model syllogisms), the interference was symmetrical: validity interfered with belief judgments to the same degree that believability interfered with validity judgments. For the most complex problems (three-term multiple-model syllogisms), conclusion believability interfered more with judgments of validity than vice versa, in spite of significant interference from conclusion validity on judgments of belief.

    Slower is not always better: Response-time evidence clarifies the limited role of miserly information processing in the Cognitive Reflection Test

    We report a study examining the role of 'cognitive miserliness' as a determinant of poor performance on the standard three-item Cognitive Reflection Test (CRT). The cognitive miserliness hypothesis proposes that people often respond incorrectly on CRT items because of an unwillingness to go beyond default, heuristic processing and invest time and effort in analytic, reflective processing. Our analysis (N = 391) focused on people's response times to CRT items to determine whether the predicted associations between miserly thinking and the generation of incorrect, intuitive answers are evident. Evidence indicated only a weak correlation between CRT response times and accuracy. Item-level analyses also failed to demonstrate the predicted response-time differences between correct analytic and incorrect intuitive answers for two of the three CRT items. We question whether participants who give incorrect intuitive answers on the CRT can legitimately be termed cognitive misers, and whether the three CRT items measure the same general construct.
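
    For readers unfamiliar with the CRT, the classic bat-and-ball item illustrates the intuitive-versus-analytic gap the test targets (this is a standard CRT example, not data from this study):

```python
from fractions import Fraction

# Bat-and-ball CRT item: a bat and a ball cost $1.10 together, and the
# bat costs $1.00 more than the ball. The fast intuitive answer ($0.10
# for the ball) is wrong; solving the two equations gives $0.05.
total = Fraction(110, 100)       # $1.10
difference = Fraction(100, 100)  # $1.00

# ball + (ball + difference) = total  =>  ball = (total - difference) / 2
ball = (total - difference) / 2
bat = ball + difference

print(float(ball))  # 0.05
print(float(bat))   # 1.05
```

    The miserliness hypothesis predicts slower responding for the correct $0.05 answer than for the intuitive $0.10; the study above found that prediction held for only some items.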

    Emotion and Reasoning: A Metacognitive Perspective

    The talk will draw on Thompson's (2011) theory of metacognition in reasoning, which aims to identify the mechanisms that trigger effortful (type 2) processing. We will discuss the metacognitive perspective in relation to the role of emotional content, a topic not yet integrated into the theory. We suggest that emotion serves as a metacognitive cue to trigger effortful processing. We will present a conditional inference task using fear-related versus neutral materials, matched for believability. The task uses a simplified version of the two-response paradigm developed by Thompson et al. (2011): participants provide a fast first response and a feeling of rightness (FOR) rating, and then respond again with no time restriction. Changes between the first and second response provide a measure of effortful thinking. The findings suggest that emotion has a dual role. First, it moderates the effect of FOR: FOR and response change correlated only for fear-related materials, an effect that was replicated across items. Second, fear triggers low FOR, which then activates effortful processing: FOR was lower overall for fear-related materials. This effect was mediated by type of inference: for fear-related materials, participants changed their responses more for the denial inferences (MT, DA) than for the affirmation inferences (MP, AC). The opposite was true for neutral materials. We discuss whether the effect is task-specific.
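
    The four conditional inference forms named above (MP, AC, MT, DA) can be checked mechanically by truth table; a small sketch:

```python
from itertools import product

# Truth-table check of the four conditional inference forms: from the
# conditional p -> q plus a second premise, which conclusions follow in
# every row where both premises hold? MP and MT are deductively valid;
# AC and DA are the classic invalid forms.
def implies(p, q):
    return (not p) or q

def valid(premise2, conclusion):
    return all(conclusion(p, q)
               for p, q in product([True, False], repeat=2)
               if implies(p, q) and premise2(p, q))

forms = {
    "MP": valid(lambda p, q: p,     lambda p, q: q),      # affirmation
    "AC": valid(lambda p, q: q,     lambda p, q: p),      # affirmation
    "MT": valid(lambda p, q: not q, lambda p, q: not p),  # denial
    "DA": valid(lambda p, q: not p, lambda p, q: not q),  # denial
}
print(forms)  # {'MP': True, 'AC': False, 'MT': True, 'DA': False}
```

    Note that the denial inferences (MT, DA) both work from negated information, which is what distinguishes them from the affirmation inferences in the response-change result above.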

    On some limits of hypothetical thinking

    Faced with extreme demands, hypothetical thinking runs the danger of total failure. Paradoxical propositions such as the Liar ("I am lying") provide an opportunity to test it to its limits, while the Liar's nonparadoxical counterpart, the Truthteller ("I am telling the truth"), provides a useful comparison. Two experiments are reported, one with abstract materials ("If I am a knave then I live in Emerald City") and one with belief-laden materials (a judge says: "If I am a knave then I enjoy pop music"). In both experiments, conditionals with Truthteller-type antecedents "collapsed" to conditional-probability responses closely resembling estimates for control items. Liar-type antecedents, in contrast, dramatically weakened belief in the conditionals in which they were embedded. The results are discussed in the framework of the theory of hypothetical thinking.
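
    The Liar/Truthteller contrast can be made concrete with a toy knight/knave consistency check (an illustration of the underlying logic, not the authors' experimental materials):

```python
# In knight/knave puzzles, knights always tell the truth and knaves always
# lie. A self-referential utterance has a consistent reading for a role iff
# the utterance's truth value matches what that role would produce.
def consistent_readings(statement):
    """statement(is_knight) -> truth value of the utterance."""
    return [role for role, is_knight in [("knight", True), ("knave", False)]
            if statement(is_knight) == is_knight]

# Truthteller: "I am telling the truth" (i.e. "I am a knight").
truthteller = lambda is_knight: is_knight
# Liar: "I am lying" (i.e. "I am a knave").
liar = lambda is_knight: not is_knight

print(consistent_readings(truthteller))  # ['knight', 'knave'] -> underdetermined
print(consistent_readings(liar))         # [] -> no consistent reading: paradox
```

    The Truthteller is consistent either way (hence uninformative, like a control item), while the Liar admits no consistent reading at all, matching the asymmetry the experiments found.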

    Utilitarian Moral Judgment Exclusively Coheres with Inference from Is to Ought

    Faced with moral choice, people either judge according to pre-existing obligations (deontological judgment) or take into account the consequences of their actions (utilitarian judgment). We propose that the latter coheres with a more general cognitive mechanism, deontic introduction: the tendency to infer normative ('deontic') conclusions from descriptive premises (is-ought inference). Participants were presented with vignettes that allowed either a deontological or a utilitarian choice, asked to draw a range of deontic conclusions, and asked to judge the overall moral rightness of each choice separately. We predicted and found a selective defeasibility pattern, in which manipulations that suppressed deontic introduction also suppressed utilitarian moral judgment but had little effect on deontological moral judgment. Thus, deontic introduction coheres almost exclusively with utilitarian moral judgment. We suggest a family of norm-generating informal inferences in which normative conclusions are drawn from descriptive (although value-laden) premises; this family includes deontic introduction and utilitarian moral judgment as well as other informal inferences. We conclude with a call for greater integration of research on moral judgment with research on deontic reasoning and informal inference.

    Deontic Introduction: A Theory of Inference from Is to Ought

    Humans have a unique ability to generate novel norms. Faced with the knowledge that there are hungry children in Somalia, we easily and naturally infer that we ought to donate to famine relief charities. Although a contentious and lively issue in metaethics, such inference from 'is' to 'ought' has not been systematically studied in the psychology of reasoning. We propose that deontic introduction is the result of a rich chain of pragmatic inference, most of it implicit; specifically, when an action is causally linked to a valenced goal, valence transfers to the action and bridges into a deontic conclusion. Participants in five experiments were presented with utility conditionals in which an action results in a benefit, a cost, or a neutral outcome (If Lisa buys the booklet, she will pass the exam), and were asked to evaluate how strongly deontic conclusions (Lisa should buy the booklet) follow from the premises. Findings show that the direction of the conclusions was determined by outcome valence (Experiments 1a, 1b), whereas their strength was determined by the strength of the causal link between action and outcome (Experiments 1, 2a, and 2b). We also found that deontic introduction is defeasible and can be suppressed by additional premises that interfere with any of the links in the implicit chain of inference (Experiments 2a, 2b, and 3). We propose that deontic introduction is a species-specific generative capacity whose function is to regulate future behaviour.
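
    The proposed inference chain can be caricatured in a few lines of code; everything below (function name, numeric encoding of valence and causal strength) is an illustrative assumption, not the authors' model:

```python
# Toy model of deontic introduction as described in the abstract: outcome
# valence sets the DIRECTION of the deontic conclusion, and the strength of
# the action-outcome causal link sets how strongly the conclusion follows.
def deontic_introduction(causal_strength, outcome_valence):
    """causal_strength in [0, 1]; outcome_valence in {-1, 0, +1}.
    Returns (conclusion, strength)."""
    if outcome_valence > 0:
        return "should", causal_strength
    if outcome_valence < 0:
        return "should not", causal_strength
    return "no deontic conclusion", 0.0

# "If Lisa buys the booklet, she will pass the exam" (benefit, strong link):
print(deontic_introduction(0.9, +1))  # ('should', 0.9)
# Same action with a costly outcome flips the direction:
print(deontic_introduction(0.9, -1))  # ('should not', 0.9)
# A defeater that weakens the causal link lowers the strength, not the direction:
print(deontic_introduction(0.3, +1))  # ('should', 0.3)
```

    The suppression results (Experiments 2a, 2b, and 3) correspond, in this caricature, to additional premises driving causal_strength toward zero.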