Varieties of Mathematics in Economics - A Partial View
Real analysis, founded on the Zermelo-Fraenkel axioms and buttressed by the axiom of choice, is the dominant variety of mathematics utilized in the formalization of economic theory. The accident of history that led to this dominance was not inevitable, especially in an age when the digital computer seems to be ubiquitous in research, teaching and learning. At least three other varieties of mathematics, each underpinned by its own mathematical logic, have come to be used in the formalization of economics in more recent years. To set theory, model theory, proof theory and recursion theory correspond, roughly speaking, real analysis, non-standard analysis, constructive analysis and computable analysis. These other varieties, we claim, are more consistent with the intrinsic nature and ontology of economic concepts. In this paper we discuss aspects of the way real analysis dominates the mathematical formalization of economic theory, and the prospects for overcoming this dominance.
The Problem of “Ultimate Grounding” in the Perspective of Hegel’s Logic
What corresponds to the present-day “transcendental-pragmatic” concept of ultimate grounding in Hegel is his claim to the absoluteness of logic. Hegel’s fundamental intuition is that of a “backward-going grounding” that retrieves the initially unproved presuppositions, thereby “wrapping itself into a circle”: the project of the self-grounding of logic, understood as the self-explication of logic by logical means. Yet this is not about one of the multiple “logics”, which as formal constructs cannot claim absoluteness. It is rather a fundamental logic that makes logical textures possible at all and thus has a transcendental character. The principle of non-contradiction is an example: what is essential is that it is efficient “under cover” as soon as meaningful concepts are used. Self-explication of the fundamental logic then means explicating its implicit, under-cover validity, and doing so by means of the fundamental logic itself. As is shown, this is the business of dialectic, which is thereby to be understood as the ultimate grounding of the fundamental logic. This is analyzed in detail using the example of the being/non-being dialectic. As is demonstrated, each explication step generates a new implicit issue and therewith a new explication discrepancy, inducing an antinomical structure that drives the explication procedure onward anew. The procedure is thus entirely determined by itself. Decisive for the ultimate-grounding argumentation is that an objectively verifiable procedure is thereby found, which is apparently possible only in a Hegelian framework. By contrast, the immediate evidence of a speech act claimed by the transcendental-pragmatic position has only a private character, which is irrelevant from a grounding-theoretic point of view.
To the Beat of a Different Drummer... Freedom, Anarchy and Conformism in Research
In this paper I attempt to make a case for promoting the courage of rebels within the citadels of orthodoxy in academic research environments. Wicksell in Macroeconomics, Brouwer in the Foundations of Mathematics, Turing in Computability Theory, Sraffa in the Theories of Value and Distribution are, in my own fields of research, paradigmatic examples of rebels, adventurers and non-conformists of the highest calibre in scientific research within University environments. In what sense, and how, can such rebels, adventurers and non-conformists be fostered in the current University research environment, dominated by the cult of picking winners? This is the motivational question lying behind the historical outlines of the work of Wicksell, Brouwer, Hilbert, Bishop, Veronese, Gödel, Turing and Sraffa that I describe in this paper. The debate between freedom in research and teaching, and the naked imposition of correct thinking on potential dissenters of the mind, is of serious concern in this age of austerity of material facilities. It is a debate that has occupied some of the finest minds working at the deepest levels of foundational issues in mathematics, metamathematics and economic theory. By making some of the issues explicit, I hope it is possible to encourage dissenters to remain courageous in the face of current dogmas.
Keywords: non-conformist research, macroeconomics, foundations of mathematics, intuitionism, constructivism, formalism, Hilbert’s Dogma, Hilbert’s Program, computability theory
Computability and Algorithmic Complexity in Economics
This is an outline of the origins and development of the way computability theory and algorithmic complexity theory were incorporated into economic and finance theories. We try to place, in the context of the development of computable economics, some of the classics of the subject as well as those that have, from time to time, been credited with having contributed to the advancement of the field. Speculative thoughts on where the frontiers of computable economics are, and how to move towards them, conclude the paper. In a precise sense - both historically and analytically - it would not be an exaggeration to claim that both the origins of computable economics and its frontiers are defined by two classics, both by Banach and Mazur: the one-page masterpiece by Banach and Mazur ([5]), built on the foundations of Turing’s own classic, and the unpublished Mazur conjecture of 1928, with its unpublished proof by Banach ([38], ch. 6 & [68], ch. 1, #6). For the undisputed original classic of computable economics is Rabin’s effectivization of the Gale-Stewart game ([42]; [16]); the frontiers, as I see them, are defined by recursive analysis and constructive mathematics, underpinning computability over the computable and constructive reals and providing computable foundations for the economist’s Marshallian penchant for curve-sketching ([9]; [19]; and, in general, the contents of Theoretical Computer Science, Vol. 219, Issues 1-2). The former work has its roots in the Banach-Mazur game (cf. [38], especially p. 30), at least in one reading of it; the latter in ([5]), as well as in other, earlier, contributions, not least by Brouwer.
Weak axioms of determinacy and subsystems of analysis II (Σ⁰₂ games)
Abstract. In [10], we have shown that the statement that all Σ¹₁ partitions are Ramsey is deducible over ATR₀ from the axiom of Σ¹₁ monotone inductive definition, but the reversal needs Π¹₁-CA₀ rather than ATR₀. By contrast, we show in this paper that the statement that all Σ⁰₂ games are determinate is also deducible over ATR₀ from the axiom of Σ¹₁ monotone inductive definition, but the reversal is provable even in ACA₀. These results illuminate the substantial differences among lightface theorems which cannot be observed in the boldface case.
Anti-Foundational Categorical Structuralism
The aim of this dissertation is to outline and defend the view here dubbed “anti-foundational categorical structuralism” (henceforth AFCS). The program put forth is intended to provide an answer to the question “what is mathematics?”. The answer on offer here adopts the structuralist view of mathematics, in that mathematics is taken to be “the science of structure”, expressed in the language of category theory, which is argued to accurately capture the notion of a “structural property”. In characterizing mathematical theorems as both conditional and schematic in form, the program is forced to give up claims to securing the truth of its theorems, as well as to give up a semantics which involves reference to special, distinguished “mathematical objects”, or which involves quantification over a fixed domain of such objects. One who wishes, contrary to the AFCS view, to inject mathematics with a “standard” semantics and to provide a secure epistemic foundation for the theorems of mathematics, in short, one who wishes for a foundation for mathematics, will surely find this view lacking. However, I argue that a satisfactory development of the structuralist view, couched in the language of category theory, accurately represents our best understanding of the content of mathematical theorems and thereby obviates the need for any foundational program.
Truth, Semantic Closure, and Conditionals
Almost all theories of truth place limits on the expressive power of languages containing truth predicates. Such theories have been criticized as inadequate on the grounds that these limitations are illegitimate. These criticisms set up several requirements on theories of truth. My initial focus is on the criticisms and why their requirements should be accepted.
I argue that an adequate theory of truth should validate intuitive arguments involving truth and respect intuitive evaluations of the semantic statuses of sentences. From this starting point, I analyze the arguments in favor of several common requirements on theories of truth and formulate some minimal requirements on theories of truth. One is a logic neutrality requirement that says that a theory must be compatible with a range of logical resources, such as different negations. Another is the requirement that the theory validate certain laws governing truth, such as the T-sentences.
These two requirements rule out many theories of truth.
The main problem is that many theories lack an adequate conditional, the addition of which is, in fact, precluded by those theories.
I argue that the revision theory of truth can satisfy my criteria when augmented with a pair of conditionals, which are defined using a modification of the revision theory’s framework of circular definitions. I distinguish two roles for conditionals in theories of truth and argue that the conditionals of the proposed theory fill those roles well. The conditionals are interdefinable with a modal operator. I prove a completeness theorem for the calculus of “The Revision Theory of Truth” modified with rules for this operator. I examine the modal logic of this operator and prove a Solovay-type completeness theorem linking the modal logic and a certain class of circular definitions.
I conclude by examining Field's recent theory of truth with its new conditional.
I argue that Field's theory does not meet my requirements and that it fails to vindicate some of Field's own philosophical views. I close by proposing a framework for studying Field's conditional apart from his canonical models.