    Coherence as a Test for Truth

    This paper sets out to demonstrate that a contrast can be drawn between coherentism as an account of the structure of justification and coherentism as a method of inquiry. Whereas the former position aims to answer the ‘regress of justification’ problem, the latter claims that coherence plays a vital and indispensable role as a criterion of truth, given the fallibility of cognitive methods such as perception and memory. It is argued that ‘early’ coherentists like Bradley and Blanshard were coherentists of the latter kind, and that this sort of coherentism is not open to certain standard objections that can be raised against justificatory coherentism.

    The evidential weight of considered moral judgments

    The input objection to reflective equilibrium (RE) claims that the method fails as a method of moral justification. According to the objection, considered moral judgments (CMJs) are not truth-conducive: because the method's inputs are not credible, it does not generate justified moral beliefs. The objection is solved by reinterpreting RE using contemporary developments in ethical intuitionism. The first half of the thesis sets up the input objection, explores potential responses to it, and identifies the best way to solve it. The second half solves the input objection by defining key terms, detailing the revised RE procedure, reinserting the notion of a competent moral judge into the method, using intuitionist criteria for identifying genuine moral intuitions, creating three filters capable of sorting good from bad CMJs, and showing how it is possible to assign evidential weight to CMJs so that they can serve as standards against which moral principles are measured and a justified moral theory realized.
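
    Structurally, the revised procedure described above is a screening pipeline: candidate CMJs pass through successive filters, and the survivors receive evidential weight. A minimal sketch of that structure in Python; the three filter predicates are hypothetical placeholders, not the thesis's actual intuitionist criteria:

```python
# Schematic sketch of the filtering step in a revised RE procedure.
# The three checks are invented placeholders, not the thesis's criteria.

from dataclasses import dataclass

@dataclass
class CMJ:
    content: str
    formed_calmly: bool         # judge was not agitated or biased
    adequately_informed: bool   # judge grasped the relevant non-moral facts
    stable_on_reflection: bool  # judgment survives reflective scrutiny

def weigh_cmjs(judgments: list[CMJ]) -> list[tuple[CMJ, float]]:
    """Apply three filters; CMJs passing all of them receive full weight."""
    weighted = []
    for j in judgments:
        passed = [j.formed_calmly, j.adequately_informed, j.stable_on_reflection]
        if all(passed):
            # surviving CMJs become the standards against which
            # candidate moral principles are measured
            weighted.append((j, 1.0))
    return weighted
```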

    Literal Perceptual Inference

    In this paper, I argue that theories of perception that appeal to Helmholtz’s idea of unconscious inference (“Helmholtzian” theories) should be taken literally, i.e., that the inferences appealed to in such theories are inferences in the full sense of the term, as employed elsewhere in philosophy and in ordinary discourse. In the course of the argument, I consider constraints on inference based on the idea that inference is a deliberate action, and on the idea that inferences depend on the syntactic structure of representations. I argue that inference is a personal-level but sometimes unconscious process that cannot in general be distinguished from association on the basis of the structures of the representations over which it is defined. I also critique arguments against representationalist interpretations of Helmholtzian theories, and argue against the view that perceptual inference is encapsulated in a module.
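
    For readers unfamiliar with the Helmholtzian picture, the processes at issue are standardly modeled as probabilistic inferences from proximal sensory signals to their distal causes. A toy illustration in Python (invented numbers; the paper's concern is whether such processes are inferences in the full sense, not this particular model):

```python
# Toy "unconscious inference": infer the most probable distal cause
# (object size) from a proximal cue (retinal image size) via Bayes' rule.
# All numbers are invented for illustration.

priors = {"small_object": 0.7, "large_object": 0.3}   # P(hypothesis)
likelihood = {                                        # P(cue | hypothesis)
    ("large_image", "small_object"): 0.2,
    ("large_image", "large_object"): 0.9,
}

def posterior(cue: str) -> dict[str, float]:
    """Posterior over distal hypotheses given the proximal cue."""
    unnorm = {h: priors[h] * likelihood[(cue, h)] for h in priors}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

print(posterior("large_image"))
# {'small_object': ~0.34, 'large_object': ~0.66}
```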

    Commentary on Godden


    A Comprehensive Framework for Controlled Query Evaluation, Consistent Query Answering and KB Updates in Description Logics

    In this extended abstract we discuss the relationship between confidentiality-preserving frameworks and inconsistency-tolerant repair and update semantics in Description Logics (DL). In particular, we consider the well-known problems of Consistent Query Answering, Controlled Query Evaluation, and Knowledge Base Update in DL and introduce a unifying framework that can be naturally instantiated to capture significant settings for the above problems, previously investigated in the literature.
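
    To give one concrete instance of the settings the framework covers: under Consistent Query Answering, an answer is certain just in case it holds in every repair, i.e. every maximal subset of the data that is consistent with the constraints. A brute-force Python sketch over a toy ABox (invented example; actual DL reasoners avoid this exponential enumeration):

```python
# Brute-force Consistent Query Answering over a toy ABox.
# An atom is certain iff it belongs to every repair (maximal subset
# consistent with the constraints). Invented example for illustration.

from itertools import combinations

abox = {("Teacher", "ann"), ("Student", "ann"), ("Student", "bob")}

def consistent(facts):
    """Single toy constraint: nobody is both a Teacher and a Student."""
    return not any(("Teacher", x) in facts and ("Student", x) in facts
                   for (_, x) in facts)

def repairs(facts):
    """All maximal consistent subsets of the ABox."""
    cons = [frozenset(c) for r in range(len(facts) + 1)
            for c in combinations(facts, r) if consistent(frozenset(c))]
    return [s for s in cons if not any(s < t for t in cons)]

def certain(atom, facts):
    """CQA semantics: atom is entailed iff it survives every repair."""
    return all(atom in rep for rep in repairs(facts))

print(certain(("Student", "bob"), abox))  # True: in every repair
print(certain(("Student", "ann"), abox))  # False: dropped in one repair
```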

    Is the mind Bayesian? The case for agnosticism

    This paper aims to make explicit the methodological conditions that must be satisfied for the Bayesian model to serve as a normative model of human probability judgment. After noting that the psychological literature lacks both a clear definition of Bayesianism and a justification for using it, the paper recalls a classic definition of subjective Bayesianism based on three criteria: an epistemic criterion, a static coherence criterion, and a dynamic coherence criterion. Adopting this framework is then shown to have two kinds of implications. The first concerns the methodology of the experimental study of probability judgment. The Bayesian framework places pragmatic constraints on the methodology, linked to the interpretation of, and the belief in, the information presented or referred to by an experimenter, if that information is to ground a probability judgment by individual participants. It is shown that these constraints have not been satisfied in the past, and the question of whether they can be satisfied in principle is raised and answered negatively. The second kind of implication consists of two limitations in the scope of the Bayesian model, regarding (i) the background of revision (the Bayesian model covers only revising situations, not updating situations) and (ii) the notorious case of null priors. In both cases Lewis' rule is an appropriate alternative to Bayes' rule, but its use faces the same operational difficulties.
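
    The contrast at the end of the abstract is between two update rules: Bayes' rule (conditionalization), which is undefined when the evidence has zero prior probability and can never raise a zero-prior hypothesis above zero, and Lewis' imaging rule, which moves each world's probability to the closest evidence-world and so remains usable in such cases. A toy numerical sketch (worlds, priors, and the similarity map are all invented):

```python
# Conditionalization vs. Lewis' imaging on a three-world toy model.
# Worlds, priors, evidence, and the closest-world map are invented.

worlds = {"w1": 0.5, "w2": 0.5, "w3": 0.0}   # w3 has a null prior
E = {"w2", "w3"}                             # the evidence proposition

def conditionalize(prior, evidence):
    """Bayes' rule: renormalize on E; undefined if P(E) = 0."""
    p_e = sum(p for w, p in prior.items() if w in evidence)
    return {w: (p / p_e if w in evidence else 0.0) for w, p in prior.items()}

def image(prior, evidence, closest):
    """Lewis' rule: move each world's mass to its closest E-world."""
    assert set(closest.values()) <= evidence  # mass may only land on E-worlds
    post = {w: 0.0 for w in prior}
    for w, p in prior.items():
        post[closest[w]] += p
    return post

print(conditionalize(worlds, E))   # {'w1': 0.0, 'w2': 1.0, 'w3': 0.0}
print(image(worlds, E, {"w1": "w3", "w2": "w2", "w3": "w3"}))
# {'w1': 0.0, 'w2': 0.5, 'w3': 0.5}: imaging gives w3 positive probability
# despite its null prior, which conditionalization can never do; and if
# E itself had zero prior, conditionalization would be undefined outright.
```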

    What If Bizet and Verdi Had Been Compatriots?

    Stalnaker argued that conditional excluded middle should be included among the principles governing counterfactuals, on the grounds that intuitions support that principle: there are pairs of competing counterfactuals that appear to be equally acceptable. In doing so, he was forced to introduce semantic vagueness into his system of counterfactuals. In this paper it is argued that there is a simpler, purely epistemic explanation of these cases that avoids the need to introduce semantic vagueness into the semantics for counterfactuals.
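
    The principle at issue and the motivating pair can be stated compactly; a sketch in standard counterfactual notation, assuming the \boxright symbol from the stmaryrd package:

```latex
% Conditional Excluded Middle (CEM), the principle Stalnaker defends:
\[
  (A \boxright C) \lor (A \boxright \lnot C)
\]
% Bizet/Verdi instance: let A = ``Bizet and Verdi are compatriots'' and
% C = ``Bizet and Verdi are French''. Neither (A \boxright C) nor
% (A \boxright \lnot C) seems determinately true, since ``they would
% have been Italian'' is exactly as acceptable as ``they would have
% been French''. This is the datum that both the semantic-vagueness
% explanation and the epistemic explanation must accommodate.
```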

    Rationalizability and Minimal Complexity in Dynamic Games

    This paper presents a formal epistemic framework for dynamic games in which players, during the course of the game, may revise their beliefs about the opponents' utility functions. We impose three key conditions upon the players' beliefs: (a) throughout the game, every move by the opponent should be interpreted as being part of a rational strategy, (b) the belief about the opponents' relative ranking of two strategies should not be revised unless one is certain that the opponent has decided not to choose one of these strategies, and (c) the players' initial beliefs about the opponents' utility functions should agree on a given profile u of utility functions. Types that, throughout the game, respect common belief about these three events are called persistently rationalizable for the profile u of utility functions. It is shown that persistent rationalizability implies the backward induction procedure in generic games with perfect information. We next focus on persistently rationalizable types for u that hold a theory about the opponents of 'minimal complexity', resulting in the concept of minimal rationalizability. For two-player simultaneous-move games, minimal rationalizability is equivalent to the concept of Nash equilibrium strategy. In every outside option game, as defined by van Damme (1989), minimal rationalizability uniquely selects the forward induction outcome.
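
    Since the backward induction procedure is central to the first result, a compact recursive statement may help; the game-tree encoding below is invented for illustration and is not the paper's formalism:

```python
# Backward induction on a finite perfect-information game tree.
# A node is either a payoff tuple (leaf) or a pair (player, moves),
# where moves maps action labels to subtrees. Encoding invented here.

def backward_induction(node):
    """Return the payoff vector reached when each player best-responds
    to the continuation values of the subgames below their node."""
    if isinstance(node[1], dict):            # decision node
        player, moves = node
        return max((backward_induction(sub) for sub in moves.values()),
                   key=lambda payoffs: payoffs[player])
    return node                              # leaf: payoff tuple

# Player 0 moves first; player 1 moves only after "L".
game = (0, {"L": (1, {"l": (2, 1), "r": (0, 0)}),
            "R": (3, 0)})
print(backward_induction(game))  # (3, 0): player 0 plays R, anticipating
                                 # that player 1 would play l after L
```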

    An Epistemic Non-Consequentialism

    Despite the recent backlash against epistemic consequentialism, an explicit systematic alternative has yet to emerge. This paper articulates and defends a novel alternative, Epistemic Kantianism, which rests on a requirement of respect for the truth. §1 tackles some preliminaries concerning the proper formulation of the epistemic consequentialism/non-consequentialism divide, explains where Epistemic Kantianism falls in the dialectical landscape, and shows how it can capture what seems attractive about epistemic consequentialism while yielding predictions that are harder for the latter to secure in a principled way. §2 presents Epistemic Kantianism. §3 argues that it is uniquely poised to satisfy the desiderata set out in §1 for an ideal theory of epistemic justification. §4 gives three further arguments, suggesting that it (i) best explains the objective normative significance of the subject's perspective in epistemology, (ii) follows from the kind of axiology needed to solve the swamping problem, together with modest assumptions about the relation between the evaluative and the deontic, and (iii) illuminates certain asymmetries in epistemic value and obligation. §5 takes stock and reassesses the score in the debate.