
    The Epistemic Significance of Valid Inference – A Model-Theoretic Approach

    The problem analysed in this paper is whether we can gain knowledge by using valid inferences, and how we can explain this process from a model-theoretic perspective. According to the paradox of inference (Cohen & Nagel 1936/1998, 173), it is logically impossible for an inference to be both valid and for its conclusion to be novel with respect to the premises. I argue in this paper that valid inference has an epistemic significance, i.e., it can be used by an agent to enlarge his knowledge, and that this significance can be accounted for in model-theoretic terms. I will argue, first, that the paradox is based on an equivocation: it arises because logical containment, i.e., logical implication, is identified with epistemological containment, i.e., the thesis that knowledge of the premises entails knowledge of the conclusion. Second, I will argue that a truth-conditional theory of meaning has the necessary resources to explain the epistemic significance of valid inferences. I will explain this epistemic significance starting from Carnap’s semantic theory of meaning and Tarski’s notion of satisfaction. In this way I will counter Prawitz’s (2012b) claim that a truth-conditional theory of meaning is not able to account for the legitimacy of valid inferences, i.e., their epistemic significance.
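    For reference, the model-theoretic notion of consequence underlying this approach is standardly defined in Tarskian terms of satisfaction; the following is a textbook formulation, not the paper's own notation:

    \[
      \Gamma \models \varphi
      \quad\Longleftrightarrow\quad
      \text{for every model } \mathcal{M}:\ \bigl(\mathcal{M} \models \gamma \text{ for all } \gamma \in \Gamma\bigr)\ \Rightarrow\ \mathcal{M} \models \varphi .
    \]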

    What Makes Logical Truths True?

    The concern of deductive logic is generally viewed as the systematic recognition of logical principles, i.e., of logical truths. This paper presents and analyzes different instantiations of the three main interpretations of logical principles, viz. as ontological principles, as empirical hypotheses, and as propositions true in virtue of meanings. I argue in this paper that logical principles are propositions true in virtue of the meanings of the logical terms within a certain linguistic framework. Since these principles also regulate and control the process of deduction in inquiry, i.e., they are prescriptive for the use of language and thought in inquiry, I argue that logic may, and should, be seen as an instrument or as a way of proceeding (modus procedendi) in inquiry.

    More Reflections on Consequence

    This special issue collects together nine new essays on logical consequence: the relation obtaining between the premises and the conclusion of a logically valid argument. The present paper is a partial, and opinionated, introduction to the contemporary debate on the topic. We focus on two influential accounts of consequence, the model-theoretic and the proof-theoretic, and on the seeming platitude that valid arguments necessarily preserve truth. We briefly discuss the main objections these accounts face, as well as Hartry Field’s contention that such objections show consequence to be a primitive, indefinable notion, and that we must reject the claim that valid arguments necessarily preserve truth. We suggest that the accounts in question have the resources to meet the objections standardly thought to herald their demise, and make two main claims: (i) that consequence, as opposed to logical consequence, is the epistemologically significant relation philosophers should be mainly interested in; and (ii) that consequence is a paradoxical notion if truth is.

    On the Computational Meaning of Axioms

    An anti-realist theory of meaning suitable for both logical and proper axioms is investigated. As opposed to other anti-realist accounts, like Dummett-Prawitz verificationism, the standard framework of classical logic is not called into question. In particular, semantical features are not limited solely to inferential ones; computational aspects also play an essential role in the process of determining meaning. In order to deal with such computational aspects, a relaxation of syntax is shown to be necessary. This leads to a general kind of proof theory, where the objects of study are not typed objects like deductions, but rather untyped ones, in which formulas have been replaced by geometrical configurations.

    Groundwork for a Fallibilist Account of Mathematics

    According to the received view, genuine mathematical justification derives from proofs. In this article, I challenge this view. First, I sketch a notion of proof that cannot be reduced to deduction from the axioms but rather is tailored to human agents. Second, I identify a tension between the received view and mathematical practice: in some cases, cognitively diligent, well-functioning mathematicians go wrong. In these cases, it is plausible to think that proof sets the bar for justification too high. I then propose a fallibilist account of mathematical justification. I show that the main function of mathematical justification is to guarantee that the mathematical community can correct the errors that inevitably arise from our fallible practices.

    Epistemic Characterizations of Validity and Level-Bridging Principles

    How should we understand validity? A standard way to characterize validity is in terms of the preservation of truth (or truth in a model). But such characterizations face several problems. An alternative approach is to characterize validity epistemically, for instance in terms of the preservation of an epistemic status. In this paper, I raise a problem for such views. First, I argue that if the relevant epistemic status is factive, such as being in a position to know or having conclusive evidence for, then the account runs into trouble if we endorse certain familiar logical principles. Second, I argue that if the relevant epistemic status is non-factive, such as being rationally committed to or having justification for believing, then a similar problem arises if we endorse the logical principles as well as a sufficiently strong epistemic "level-bridging" principle. Finally, I argue that an analogous problem arises for the most natural characterization of validity in terms of rational credence.
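    Schematically, the contrast between the two styles of characterization can be rendered as follows; this is an illustrative sketch using 'being in a position to know' as the factive status mentioned above, with the symbol $\vdash_e$ introduced only for this illustration and not taken from the paper:

    \[
      \text{Truth-preservation:}\quad \Gamma \models \varphi \ \Longleftrightarrow\ \text{every model of } \Gamma \text{ is a model of } \varphi .
    \]
    \[
      \text{Epistemic (factive):}\quad \Gamma \vdash_e \varphi \ \Longleftrightarrow\ \text{necessarily, if one is in a position to know each } \gamma \in \Gamma \text{, then one is in a position to know } \varphi .
    \]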

    Historical and Conceptual Foundations of Information Physics

    The main objective of this dissertation is to philosophically assess how the use of informational concepts in the field of classical thermostatistical physics has historically evolved from the late 1940s to the present day. I will first analyze in depth the main notions that form the conceptual basis on which 'informational physics' historically unfolded, encompassing (i) the different notions of entropy, probability and information, (ii) their multiple interpretative variations, and (iii) the formal, numerical and semantic-interpretative relationships among them. I will then assess the history of informational thermophysics during the second half of the twentieth century. First, I analyze the intellectual factors that gave rise to this current in the late forties (e.g., the popularization of Shannon's theory and the interest in a naturalized epistemology of science), then study its consolidation in the Brillouinian and Jaynesian programs, and finally show how Carnap (1977) and his disciples sought to criticize this tendency within the scientific community. Next, I evaluate how informational physics became a predominant intellectual current in the scientific community in the nineties, made possible by the convergence of Jaynesianism and Brillouinism in proposals such as those of Tribus and McIrvine (1971) or Bekenstein (1973), and by the application of algorithmic information theory to the thermophysical domain. As a sign of its radicality at this historical stage, I explore the main proposals to include information as part of our physical reality, such as Wheeler’s (1990), Stonier’s (1990) or Landauer’s (1991), detailing the main philosophical arguments (e.g., Timpson 2013; Lombardi et al. 2016a) against those inflationary attitudes towards information. Following this historical assessment, I systematically analyze whether the descriptive exploitation of informational concepts has historically contributed to providing us with knowledge of thermophysical reality via (i) explaining thermal processes such as the approach to equilibrium, (ii) advantageously predicting thermal phenomena, or (iii) enabling understanding of thermal properties such as thermodynamic entropy. I argue that these epistemic shortcomings would make it impossible to draw justified ontological conclusions about the physical nature of information. In conclusion, I will argue that the historical exploitation of informational concepts has not contributed significantly to the epistemic progress of thermophysics. This would lead to characterizing informational proposals as 'degenerate science' (à la Lakatos 1978a) with respect to classical thermostatistical physics, or as theoretically underdeveloped with respect to the study of the cognitive dynamics of scientists in this physical domain.