
    A probabilistic analysis of argument cogency

    This paper offers a probabilistic treatment of the conditions for argument cogency as endorsed in informal logic: acceptability, relevance, and sufficiency. Treating a natural language argument as a reason-claim-complex, our analysis identifies content features of defeasible argument on which the RSA conditions depend, namely: change in the commitment to the reason, the reason’s sensitivity and selectivity to the claim, one’s prior commitment to the claim, and the contextually determined thresholds of acceptability for reasons and for claims. Results contrast with, and may indeed serve to correct, the informal understanding and applications of the RSA criteria concerning their conceptual dependence, their function as update-thresholds, and their status as obligatory rather than permissive norms, but they also show how these formal and informal normative approaches can in fact align.
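    One natural Bayesian gloss on the "sensitivity" and "selectivity" mentioned in this abstract reads them as a true-positive and a true-negative rate; that reading, and the numbers below, are illustrative assumptions, not the paper's own definitions.

```python
def posterior(prior, sensitivity, selectivity):
    """P(claim | reason) via Bayes' theorem, under the assumed readings:

    sensitivity  = P(reason | claim)          (true-positive rate)
    selectivity  = P(not reason | not claim)  (true-negative rate)
    """
    p_reason = sensitivity * prior + (1 - selectivity) * (1 - prior)
    return sensitivity * prior / p_reason

# A highly sensitive and selective reason pushes an agnostic prior
# commitment of 0.5 up to 0.95, past a 0.9 acceptability threshold.
print(posterior(prior=0.5, sensitivity=0.95, selectivity=0.95))
```

    On this reading, whether a reason is "sufficient" becomes a comparison between the posterior and a contextually set threshold, which matches the abstract's talk of update-thresholds.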

    Rational understanding: toward a probabilistic epistemology of acceptability

    To understand something involves some sort of commitment to a set of propositions comprising an account of the understood phenomenon. Some take this commitment to be a species of belief; others, such as Elgin and I, take it to be a kind of cognitive policy. This paper takes a step back from debates about the nature of understanding and asks when the commitment involved in understanding is epistemically appropriate, or ‘acceptable’ in Elgin’s terminology. In particular, appealing to lessons from the lottery and preface paradoxes, it is argued that this type of commitment is sometimes acceptable even when it would be rational to assign arbitrarily low probabilities to the relevant propositions. This strongly suggests that the relevant type of commitment is sometimes acceptable in the absence of epistemic justification for belief, which in turn implies that understanding does not require justification in the traditional sense. The paper goes on to develop a new probabilistic model of acceptability, based on the idea that the maximally informative accounts of the understood phenomenon should be optimally probable. Interestingly, this probabilistic model ends up being similar in important ways to Elgin’s proposal to analyze the acceptability of such commitments in terms of ‘reflective equilibrium’.
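    The lottery paradox the abstract appeals to can be made concrete with a toy calculation; the ticket count and the Lockean threshold below are illustrative choices, not the paper's.

```python
# A fair 1000-ticket lottery with exactly one winner: each individual
# proposition "ticket i loses" is highly probable, yet the conjunction
# "every ticket loses" is certainly false.
n = 1000
threshold = 0.99                      # illustrative Lockean threshold
p_ticket_loses = (n - 1) / n          # 0.999 for each ticket

# Each "ticket i loses" clears the belief threshold...
believed = [p_ticket_loses > threshold for _ in range(n)]
print(all(believed))                  # True

# ...but the conjunction has probability 0, since some ticket must win.
p_all_lose = 0.0
print(p_all_lose > threshold)         # False
```

    This is the gap the abstract exploits: attitudes licensed proposition by proposition can be jointly committed to a set whose conjunction is almost certainly, or even certainly, false.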

    Deductive Cogency, understanding, and acceptance

    Deductive Cogency holds that the set of propositions towards which one has, or is prepared to have, a given type of propositional attitude should be consistent and closed under logical consequence. While there are many propositional attitudes that are not subject to this requirement, e.g. hoping and imagining, it is at least prima facie plausible that Deductive Cogency applies to the doxastic attitude involved in propositional knowledge, viz. belief. However, this thought is undermined by the well-known preface paradox, leading a number of philosophers to conclude that Deductive Cogency has at best a very limited role to play in our epistemic lives. I argue here that Deductive Cogency is still an important epistemic requirement, albeit not as a requirement on belief. Instead, building on a distinction between belief and acceptance introduced by Jonathan Cohen and recent developments in the epistemology of understanding, I propose that Deductive Cogency applies to the attitude of treating propositions as given in the context of attempting to understand a given phenomenon. I then argue that this simultaneously accounts for the plausibility of the considerations in favor of Deductive Cogency and avoids the problematic consequences of the preface paradox.

    There Is No Pure Empirical Reasoning

    The justificatory force of empirical reasoning always depends upon the existence of some synthetic, a priori justification. The reasoner must begin with justified, substantive constraints on both the prior probability of the conclusion and certain conditional probabilities; otherwise, all possible degrees of belief in the conclusion are left open given the premises. Such constraints cannot in general be empirically justified, on pain of infinite regress. Nor does subjective Bayesianism offer a way out for the empiricist. Despite often-cited convergence theorems, subjective Bayesians cannot hold that any empirical hypothesis is ever objectively justified in the relevant sense. Rationalism is thus the only alternative to an implausible skepticism.
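    The abstract's central claim, that without constraints on the prior "all possible degrees of belief in the conclusion are left open," has a simple numerical gloss: hold the likelihoods fixed and the posterior can still be driven arbitrarily close to 0 or 1 by choice of prior. The likelihood values below are arbitrary illustrations.

```python
def posterior(prior, lik_h, lik_not_h):
    """P(H | E) by Bayes' theorem, for fixed P(E | H) and P(E | not-H)."""
    return (lik_h * prior) / (lik_h * prior + lik_not_h * (1 - prior))

# Same evidence (likelihood ratio 4:1 in favor of H), three priors:
for prior in (0.001, 0.5, 0.999):
    print(posterior(prior, lik_h=0.8, lik_not_h=0.2))
# The posteriors span from roughly 0.004 to roughly 0.9998, so the
# evidence alone fixes no degree of belief in the conclusion.
```

    This is only the arithmetic behind the abstract's premise; the philosophical work lies in arguing that the needed constraints on priors cannot themselves be empirically justified.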

    Facts, Norms and Expected Utility Functions

    In this paper we explore an argumentative pattern that provides a normative justification for expected utility functions grounded in empirical evidence, showing how it worked in three different episodes of their development. The argument claims that we should prudentially maximize our expected utility, since this is the criterion effectively applied by those who are considered wisest in making risky choices (be they gamblers or businessmen). Yet, to justify the adoption of this rule, it must be shown to be empirically true: i.e., that a given function allows us to predict the choices of that particular class of agents. We show how expected utility functions were introduced and contested in accordance with this pattern in the 18th century, and how the pattern recurred in the 1950s when M. Allais made his case against the neobernoullians.
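    Allais's case against the neobernoullians can be sketched numerically. The two choice pairs below are the standard Allais gambles, and linear utility in money is an assumption made only to keep the arithmetic visible; the point survives under any fixed utility function.

```python
def eu(lottery):
    """Expected utility of [(probability, payoff), ...] with u(x) = x."""
    return sum(p * x for p, x in lottery)

g1a = [(1.00, 1_000_000)]                                   # sure thing
g1b = [(0.89, 1_000_000), (0.10, 5_000_000), (0.01, 0)]
g2a = [(0.11, 1_000_000), (0.89, 0)]
g2b = [(0.10, 5_000_000), (0.90, 0)]

# The EU difference within each pair is identical (the common
# consequence cancels), so an EU maximizer must rank both pairs the
# same way. The commonly observed pattern (1A over 1B, yet 2B over 2A)
# therefore cannot maximize expected utility.
print(eu(g1b) - eu(g1a))   # both differences equal 390000 (up to rounding)
print(eu(g2b) - eu(g2a))
```

    It is exactly this mismatch between the normative rule and observed choices of supposedly prudent agents that the abstract's argumentative pattern turns on.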

    Four Approaches to Supposition

    The primary purpose of this paper is to shed light on the structure of four varieties of normative theories of supposition by systematically explicating the relationships between canonical representatives of each. These include qualitative and quantitative theories of indicative and subjunctive supposition. We approach this project by treating supposition as a form of 'provisional belief revision' in which a person temporarily accepts the supposition as true and makes appropriate changes to her other opinions so as to accommodate it. The idea is that suppositional judgments should reflect an agent's judgments about how things would be in some hypothetical state of affairs satisfying the supposition. Accordingly, our representative qualitative theories of indicative and subjunctive supposition are respectively based on AGM revision and KM update, while our representative quantitative ones are provided by conditionalization and imaging. We rely on a suitably adapted version of the Lockean thesis to generate qualitative judgments from our representative quantitative theories. Ultimately, a number of new results are established that vindicate the often-repeated claim that conditionalization is a probabilistic version of revision, while imaging is a probabilistic version of update.
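    The quantitative indicative theory named in the abstract, conditionalization paired with a Lockean threshold, can be sketched over a finite set of worlds. The three-world model and the threshold value are illustrative choices of mine, not the paper's.

```python
def conditionalize(prior, supposition):
    """Renormalize a prior over worlds onto the supposition's worlds."""
    total = sum(p for w, p in prior.items() if w in supposition)
    return {w: (p / total if w in supposition else 0.0)
            for w, p in prior.items()}

# Three worlds; indicatively suppose "not w3".
prior = {"w1": 0.5, "w2": 0.3, "w3": 0.2}
suppositional = conditionalize(prior, supposition={"w1", "w2"})

# An adapted Lockean thesis: under the supposition, believe whatever
# exceeds the threshold.
threshold = 0.6
beliefs = {w for w, p in suppositional.items() if p > threshold}
print(suppositional["w1"])   # 0.5 / 0.8 = 0.625
print(beliefs)               # only w1 clears the threshold
```

    The subjunctive analogue would replace conditionalization with imaging, which shifts each excluded world's mass to its most similar supposition-world rather than renormalizing uniformly.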

    Hume’s problem, epistemic deductivism and the validation of induction

    Contrary to Owen (2000), Hume's problem is, as has traditionally been supposed, a problem for the justification of inductive inference. But, contrary to tradition, induction on Hume's account is not deductively invalid. Furthermore, on a more modern conception of inductive or ampliative inference, it is a mistake to suppose that the proper construal of an argument explicating the supposed justification for such inferences should in general be non-deductive. On a general requirement for argument cogency (namely, that arguments should be constructed so as to make it clear to the audience that the subject is justified, on whatever basis is cited, in taking toward the hypothesis whatever epistemic attitude the arguer purports to be so justified in taking), arguments in general, fully explicated and properly construed, should be deductively valid. Hume's problem does not prevent such justification, because his crucial argument establishes only that our basic assumptions cannot be justified in the sense of being 'proven', or shown by non-question-begging argument to be just. It does not establish that our basic assumptions, properly explicated, are not just, or that they are not (at least to the satisfaction of most of us) clearly so. Nor does Goodman's 'new riddle' of induction pose a serious problem for the justification of our inductive inferences, as is still commonly suggested, since Jackson solved the riddle thirty years ago. There is a problem analogous to Hume's for the provability of principles or claims of deductive inferability, and if my analysis of the proper construal of argument (in the natural sense) is correct, this blocks Howson's (2000) proposed escape route. Nevertheless, as with induction, the unprovability of basic claims and principles of deductive inferability does not bar their deployment in cogent justifications.

    Walton’s Argumentation Schemes

    The contribution critically discusses Walton's (and Reed's and Macagno's) argumentation scheme approach. On the one hand, its enormous richness and closeness to empirical argumentation material is appreciated; on the other, fundamental conceptual weaknesses are revealed. Although the approach has more recently been declared to strive for “true beliefs and correct choices,” it has not systematically developed the proposed schemes in a way that ensures these goals are reached. Accordingly, many proposed schemes are fallacious from an epistemological standpoint.

    Full & Partial Belief


    The Copernican Principle, Intelligent Extraterrestrials, and Arguments from Evil

    The physicist Richard Gott defends the Copernican principle, which claims that when we have no information about our position along a given dimension among a group of observers, we should consider ourselves to be randomly located among those observers with respect to that dimension. First, I apply Copernican reasoning to the distribution of evil in the universe. I then contend that evidence for intelligent extraterrestrial life strengthens four important versions of the argument from evil. I remain neutral regarding whether this result is a reductio of these arguments from evil or the statement of a genuine evidential relationship.
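    Gott's best-known application of the Copernican principle is his "delta t" argument: if we observe a phenomenon at a random moment of its lifetime, then with 95% confidence its observed age lies between 2.5% and 97.5% of its total lifetime, which bounds its future duration. The sketch below is a minimal rendering of that arithmetic, not of the paper's own argument about evil.

```python
def gott_interval(t_past, confidence=0.95):
    """Bounds on the remaining duration of a phenomenon of age t_past,
    assuming we observe it at a uniformly random point of its lifetime."""
    tail = (1 - confidence) / 2            # 0.025 in each tail at 95%
    lower = t_past * tail / (1 - tail)     # future >= t_past / 39
    upper = t_past * (1 - tail) / tail     # future <= 39 * t_past
    return lower, upper

# Something observed to be 39 years old: with 95% confidence its
# remaining lifetime is between about 1 and about 1521 years.
lo, hi = gott_interval(t_past=39.0)
print(lo, hi)
```

    The same random-location assumption is what the abstract extends from temporal position to our position in the universe's distribution of observers.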