The bottom of things: essences for explanation
Central to the philosophy of Aristotle is the belief that the aim of serious
enquiry is knowledge of the constitutive essences of a given field.
Modern scientific essentialism claims that this still holds good, and this
thesis aims to support that approach by elucidating and applying the
original concept of essence. Chapter one argues that Aristotle formulated
his theory of essences entirely in the context of the theory of explanation
expounded in Posterior Analytics. The components of that theory are
explained, and the implications of Aristotle’s view for current debate are
considered. Chapter two examines the reasons for the decline of
Aristotelian essentialism during the scientific revolution, the metaphysical
problems which resulted, and Leibniz’s reasons for defending the older
view. Chapter three considers the nature of explanation in a modern
context, starting with the preconditions for any grasp of reality that are
needed to make explanations possible; it is then argued that only
essentialist explanation can occupy the role which these preconditions
entail. Chapter four surveys the components of that picture of reality that
seem explicable, to see how essentialist explanations would actually be
formulated. The theoretical discussion concludes with an account of
what form essences should take, in order to occupy the explanatory role
that has been assigned to them. The final chapter examines the cases of
counting physical objects, explaining abstract axiomatic systems, and the
discovery of the periodic table of elements, showing how attempts at
explanation in these cases all converge on the sorts of essence which
have been delineated in the thesis.
Informal proof, formal proof, formalism
Increases in the use of automated theorem-provers have renewed focus on the relationship between the informal proofs normally found in mathematical research and fully formalised derivations. Whereas some claim that any correct proof will be underwritten by a fully formal proof, sceptics demur. In this paper I look at the relevance of these issues for formalism, construed as an anti-platonistic metaphysical doctrine. I argue that there are strong reasons to doubt that all proofs are fully formalisable, if formal proofs are required to be finitary, but that, on a proper view of the way in which formal proofs idealise actual practice, this restriction is unjustified and formalism is not threatened.
Strict finitism, feasibility, and the sorites
This paper bears on four topics: observational predicates and phenomenal properties, vagueness, strict finitism as a philosophy of mathematics, and the analysis of feasible computability. It is argued that reactions to strict finitism point towards a semantics for vague predicates in the form of nonstandard models of weak arithmetical theories of the sort originally introduced to characterize the notion of feasibility as understood in computational complexity theory. The approach described eschews the use of non-classical logic and related devices like degrees of truth or supervaluation. Like epistemic approaches to vagueness, it may thus be smoothly integrated with the use of classical model theory as widely employed in natural language semantics. But unlike epistemicism, the described approach fails to imply either the existence of sharp boundaries or the failure of tolerance for soritical predicates. Applications of measurement theory (in the sense of Krantz et al. 1971) to vagueness in the nonstandard setting are also explored.
A Semantic Framework for Proof Evidence
Theorem provers produce evidence of proof in many different formats, such as proof scripts, natural deductions, resolution refutations, Herbrand expansions, and equational rewritings. In implemented provers, numerous variants of such formats are actually used: consider, for example, such variants of or restrictions to resolution refutations as binary resolution, hyper-resolution, ordered-resolution, paramodulation, etc. We propose the foundational proof certificates (FPC) framework for defining the semantics of a broad range of proof evidence. This framework allows both producers of proof certificates and the checkers of those certificates to have a clear formal definition of the semantics of a wide variety of proof evidence. Employing the FPC framework will allow one to separate a proof from its provenance and to allow anyone to construct their own proof checker for a given style of proof evidence. The foundation on which FPC relies is that of proof theory, particularly recent work into focused proof systems: such proof systems provide protocols by which a checker extracts information from the certificate (mediated by the so-called clerks and experts) as well as performs various deterministic and non-deterministic computations. While we shall limit ourselves to first-order logic in this paper, we shall not limit ourselves in many other ways. The FPC framework is described for both classical and intuitionistic logics and for proof structures as diverse as resolution refutations, natural deduction, Frege proofs, and equality proofs.
Toward a Kripkean Concept of Number
Saul Kripke once remarked to me that natural numbers cannot be posits inferred from their indispensability to science, since we’ve always had them. This left me wondering whether numbers are objects of Russellian acquaintance, or accessible by analysis, being implied by known general principles about how to reason correctly, or both. To answer this question, I discuss some recent (and not so recent) work on our concepts of number and of particular numbers, by leading psychologists and philosophers. Special attention is paid to Kripke’s theory that numbers possess structural features of the numerical systems that stand for them, and to the relation between his proposal about numbers and his doctrine that there are contingent truths known a priori. My own proposal, to which Kripke is sympathetic, is that numbers are properties of sets. I argue for this by showing the extent to which it can avoid the problems that plague the various views under discussion, including the problems raised by Kripke against Frege. I also argue that while the terms ‘the number of F’s’, ‘natural number’ and ‘0’, ‘1’, ‘2’ etc. are partially understood by the folk, they can only be fully understood by reflection and analysis, including reflection on how to reason correctly. In this last respect my thesis is a retreat position from logicism. I also show how it dovetails with an account of how numbers are actually grasped in practice, via numerical systems, and in virtue of a certain structural affinity between a geometric pattern that we grasp intuitively, and our fully analyzed concepts of numbers. I argue that none of this involves acquaintance with numbers.
Thinking Things Through
A photocopy of Thinking Things Through, Princeton University Press, 198
Ontologies on the semantic web
As an informational technology, the World Wide Web has enjoyed spectacular success. In just ten years it has transformed the way information is produced, stored, and shared in arenas as diverse as shopping, family photo albums, and high-level academic research. The “Semantic Web” was touted by its developers as equally revolutionary but has not yet achieved anything like the Web’s exponential uptake. This 17,000-word survey article explores why this might be so, from a perspective that bridges both philosophy and IT.
Advances in Proof-Theoretic Semantics
Logic; Mathematical Logic and Foundations; Mathematical Logic and Formal Language
Making sense of genre: The logic of videogame genre organization
Despite the importance that the dimension of genre holds in media studies, its very definition in the field of videogames remains a matter without consensus. This study intends to outline the logic that underlies the constitution of videoludic genres, understanding them as formal devices configured according to the different thought functions described in Piaget's cognitive psychology theory. This project proposes a formalist approach from a cybersemiotic perspective. It seeks to establish a cardinal set of relations for understanding the compatibilities and incompatibilities traceable in the syntactic functional order of the different videogame genres. Furthermore, a corpus of 43 genres is used to test the solidity of this theoretical approach, oriented towards establishing foundations for a praxis of the human-machine ludic relation in fields such as game design and media studies, with the performative character of function as a guiding principle.