15 research outputs found

    The Limits of Rational Belief Revision: A Dilemma for the Darwinian Debunker

    We are fallible creatures, prone to making all sorts of mistakes. So, we should be open to evidence of error. But what constitutes such evidence? And what is it to rationally accommodate it? I approach these questions by considering an evolutionary debunking argument according to which (a) we have good, scientific reason to think our moral beliefs are mistaken, and (b) rationally accommodating this requires revising our confidence in, or altogether abandoning, the suspect beliefs. I present a dilemma for such debunkers, which shows that either we have no reason to worry about our moral beliefs, or we do but we can self-correct. Either way, moral skepticism doesn’t follow. That the evolutionary debunking argument fails is important; also important, however, is what its failure reveals about rational belief revision. Specifically, it suggests that getting evidence of error is a non-trivial endeavor and that we cannot learn that we are likely to be mistaken about some matter from a neutral stance on that matter.

    Deliberation and prediction: it's complicated

    Alan Hájek launches a formidable attack on the idea that deliberation crowds out prediction – that when we are deliberating about what to do, we cannot rationally accommodate evidence about what we are likely to do. Although Hájek rightly diagnoses the problems with some of the arguments for the view, his treatment falls short in crucial ways. In particular, he fails to consider the most plausible version of the view, the best argument for it, and why anyone would ever believe it in the first place. In doing so, he misses a deep puzzle about deliberation and prediction – a puzzle which all of us, as agents, face, and which we may be able to resolve by recognizing the complicated relationship between deliberation and prediction.

    Moral disagreement and moral skepticism


    Open-Mindedness, Rational Confidence, and Belief Change

    It’s intuitive to think that (a) the more sure you are of something, the harder it’ll be to change your mind about it, and (b) you can’t be open-minded about something if you’re very sure about it. If these thoughts are right, then, with minimal assumptions, it follows that you can’t be in a good position both to escape echo chambers and to be rationally resistant to fake news: the former requires open-mindedness, but the latter is inimical to it. I argue that neither thought is true and that believing them will get us all mixed up. I show that you can be open-minded and have confidently held beliefs, and that beliefs in which you are less sure are not, thereby, more fragile. I close with some reflections on the nature of rational belief change and open-mindedness, and a brief sketch of what might actually help us in the fight against misinformation and belief polarization.

    Confidence, Evidence, and Disagreement

    Should learning we disagree about p lead you to reduce confidence in p? Some who think so want to except beliefs in which you are rationally highly confident. I argue that this is wrong; we should reject accounts that rely on this intuitive thought. I then show that quite the opposite holds: factors that justify low confidence in p also make disagreement about p less significant. I examine two such factors: your antecedent expectations about your peers’ opinions and the difficulty of evaluating your evidence. I close by proposing a different way of thinking about disagreement.

    Evolutionary Debunking of Moral Realism


    Evidence: A Guide for the Uncertain

    Assume that it is your evidence that determines what opinions you should have. I argue that since you should take peer disagreement seriously, evidence must have two features. (1) It must sometimes warrant being modest: uncertain what your evidence warrants, and (thus) uncertain whether you’re rational. (2) But it must always warrant being guided: disposed to treat your evidence as a guide. Surprisingly, it is very difficult to vindicate both (1) and (2). But diagnosing why this is so leads to a proposal—Trust—that is weak enough to allow modesty but strong enough to yield many guiding features. In fact, I claim that Trust is the Goldilocks principle—for it is necessary and sufficient to vindicate the claim that you should always prefer to use free evidence. Upshot: Trust lays the foundations for a theory of disagreement and, more generally, an epistemology that permits self-doubt—a modest epistemology.

    Debunking Evolutionary Debunking
