    Empirical progress and truth approximation by the 'Hypothetico-Probabilistic Method'

    Three related intuitions are explicated in this paper. The first is that there must be some probabilistic version of the HD-method, a 'Hypothetico-Probabilistic (HP-) method', framed in terms of something like probabilistic consequences instead of deductive consequences. According to the second intuition, the comparative application of this method should be functional for a probabilistic kind of empirical progress, and according to the third, this should in turn be functional for something like probabilistic truth approximation. In all three cases, the guiding idea is to explicate the crucial notions as appropriate 'concretizations' of their deductive analogs, the latter being 'idealizations'. It turns out that the comparative version of the proposed HP-method amounts to the likelihood comparison (LC-) method applied to the cumulated evidence. This method turns out to be functional not only for probabilistic empirical progress but also for probabilistic truth approximation. The latter rests on a probabilistic threshold theorem, which for this reason constitutes the analog of the deductive success theorem.
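    The likelihood comparison (LC-) method the abstract arrives at can be sketched minimally: given cumulated evidence, prefer the theory whose probability distribution assigns that evidence the higher likelihood. The outcome counts and candidate distributions below are invented for illustration.

    ```python
    import math

    def log_likelihood(dist, counts):
        """Log-likelihood of observed outcome counts under a hypothesized distribution."""
        return sum(n * math.log(p) for n, p in zip(counts, dist) if n > 0)

    # Hypothetical cumulated evidence: counts of three possible outcomes.
    evidence = [55, 30, 15]
    theory_a = [0.5, 0.3, 0.2]
    theory_b = [0.2, 0.5, 0.3]

    # LC-method: the theory with the higher likelihood on the cumulated
    # evidence counts as the (comparatively) more successful one.
    better = "A" if log_likelihood(theory_a, evidence) > log_likelihood(theory_b, evidence) else "B"
    print(better)  # "A"
    ```

    Working with log-likelihoods rather than raw likelihoods avoids numerical underflow as evidence accumulates, while preserving the comparison.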

    Approaching probabilistic and deterministic nomic truths in an inductive probabilistic way

    Theories of truth approximation in terms of truthlikeness (or verisimilitude) almost always deal with (non-probabilistically) approaching deterministic truths, either actual or nomic. This paper deals first with approaching a probabilistic nomic truth, viz. a true probability distribution. It assumes a multinomial probabilistic context, hence a lawlike true, but usually unknown, probability distribution. We first show that this true multinomial distribution can be approached by Carnapian inductive probabilities. Next we deal with the corresponding deterministic nomic truth, that is, the set of conceptually possible outcomes with a positive true probability. We introduce Hintikkian inductive probabilities, based on a prior distribution over the relevant deterministic nomic theories and on conditional Carnapian inductive probabilities, and first show that they again enable probabilistic approximation of the true distribution. Finally, we show, in terms of a kind of success theorem based on Niiniluoto’s estimated distance from the truth, in what sense Hintikkian inductive probabilities enable probabilistic approximation of the relevant deterministic nomic truth. In sum, the (realist) truth approximation perspective on Carnapian and Hintikkian inductive probabilities leads to a unification of the inductive probability field and the field of truth approximation.
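    The Carnapian inductive probabilities mentioned here follow Carnap's λ-continuum: after observing n_i outcomes of category i among n trials over k categories, the predictive probability of category i is (n_i + λ/k)/(n + λ). A minimal sketch of how these predictive probabilities approach a true multinomial distribution as evidence accumulates (the true distribution, sample size, and λ = 2 are all invented for illustration):

    ```python
    import random

    def carnap_predictive(counts, lam):
        """Carnap's λ-continuum: P(next outcome is i) = (n_i + λ/k) / (n + λ)."""
        k, n = len(counts), sum(counts)
        return [(ni + lam / k) / (n + lam) for ni in counts]

    random.seed(0)
    true_dist = [0.6, 0.3, 0.1]  # hypothetical true multinomial law
    counts = [0, 0, 0]
    for outcome in random.choices(range(3), weights=true_dist, k=10_000):
        counts[outcome] += 1

    est = carnap_predictive(counts, lam=2.0)
    # est approaches true_dist as evidence accumulates, illustrating
    # probabilistic approximation of the probabilistic nomic truth.
    ```

    Note that every category keeps positive predictive probability for finite evidence, which is what makes the step to the deterministic nomic truth (the set of outcomes with positive true probability) non-trivial.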

    Confirmations and truthlikeness. Reply to Gerhard Schurz

    Arguments Whose Strength Depends on Continuous Variation

    Both the traditional Aristotelian and modern symbolic approaches to logic have seen logic in terms of discrete symbol processing. Yet there are several kinds of argument whose validity depends on some topological notion of continuous variation, which is not well captured by discrete symbols. Examples include extrapolation and slippery slope arguments, sorites, fuzzy logic, and those involving closeness of possible worlds. It is argued that the natural first attempts to analyze these notions and explain their relation to reasoning fail, so that ignorance of their nature is profound.

    Approaching probabilistic truths: introduction to the Topical Collection

    After Karl Popper’s original work, several approaches were developed to provide a sound explication of the notion of verisimilitude. With few exceptions, these contributions have assumed that the truth to be approximated is deterministic. This collection of ten papers addresses the more general problem of approaching probabilistic truths. They include attempts to find appropriate measures for the closeness to probabilistic truth and to evaluate claims about such distances on the basis of empirical evidence. The papers employ multiple analytical approaches, and connect the research to related issues in the philosophy of science.

    Approaching probabilistic laws

    In the general problem of verisimilitude, we try to define the distance of a statement from a target, which is an informative truth about some domain of investigation. For example, the target can be a state description, a structure description, or a constituent of a first-order language (Sect. 1). In the problem of legisimilitude, the target is a deterministic or universal law, which can be expressed by a nomic constituent or a quantitative function involving the operators of physical necessity and possibility (Sect. 2). The special case of legisimilitude, where the target is a probabilistic law (Sect. 3), has been discussed by Roger Rosenkrantz (Synthese, 1980) and Ilkka Niiniluoto (Truthlikeness, 1987, Ch. 11.5). Their basic proposal is to measure the distance between two probabilistic laws by the Kullback-Leibler notion of divergence, which is a semimetric on the space of probability measures. This idea can be applied to probabilistic laws of coexistence and laws of succession, and the examples may involve discrete or continuous state spaces (Sect. 3). In this paper, these earlier studies are elaborated in four directions (Sect. 4). First, even though deterministic laws are limiting cases of probabilistic laws, the target-sensitivity of truthlikeness measures implies that the legisimilitude of probabilistic laws is not easily reducible to the deterministic case. Secondly, the Jensen-Shannon divergence is applied to mixed probabilistic laws which entail some universal laws. Thirdly, a new class of distance measures between probability distributions is proposed, so that their horizontal differences are taken into account in addition to vertical ones (Sect. 5). Fourthly, a solution is given for the epistemic problem of estimating degrees of probabilistic legisimilitude on the basis of empirical evidence (Sect. 6).
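    The two divergences the abstract builds on are easy to state for discrete distributions: Kullback-Leibler divergence D(p‖q) = Σ p_i log(p_i/q_i), and the Jensen-Shannon divergence, which symmetrizes KL via the midpoint distribution and stays finite even when one law assigns zero probability to an outcome. A small sketch (the example "laws" are invented; natural logarithms are used, so the JS divergence is bounded by ln 2):

    ```python
    import math

    def kl_divergence(p, q):
        """Kullback-Leibler divergence D(p || q) for discrete distributions."""
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    def js_divergence(p, q):
        """Jensen-Shannon divergence: symmetrized KL via the midpoint mixture."""
        m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
        return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

    true_law = [0.5, 0.3, 0.2]       # hypothetical true probabilistic law
    candidate = [0.4, 0.4, 0.2]      # hypothetical rival law
    # A deterministic law is the limiting case of a probabilistic one;
    # KL to it would be infinite, but JS remains finite and bounded.
    deterministic_like = [1.0, 0.0, 0.0]

    print(js_divergence(true_law, candidate))
    print(js_divergence(true_law, deterministic_like))
    ```

    The finiteness of JS on distributions with disjoint or partial support is what makes it usable for the mixed probabilistic laws, entailing universal laws, discussed in the paper.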

    A partial consequence account of truthlikeness

    Popper’s original definition of truthlikeness relied on a central insight: that truthlikeness combines truth and information, in the sense that a proposition is closer to the truth the more true consequences and the fewer false consequences it entails. As intuitively compelling as this definition may be, it is untenable, as was proved long ago; still, one can arguably rely on Popper’s intuition to provide an adequate account of truthlikeness. To this aim, we mobilize some classical work on partial entailment in defining a new measure of truthlikeness which satisfies a number of desiderata. The resulting account has some interesting and surprising connections with other accounts on the market, thus shedding new light on current attempts at systematizing different approaches to verisimilitude.