Formalizing Computability Theory via Partial Recursive Functions
We present an extension to the library of the Lean theorem
prover formalizing the foundations of computability theory. We use primitive
recursive functions and partial recursive functions as the main objects of
study, and we use a constructive encoding of partial functions such that they
are executable when the programs in question provably halt. Main theorems
include the construction of a universal partial recursive function and a proof
of the undecidability of the halting problem. Type class inference provides a
transparent way to supply Gödel numberings where needed and encapsulate the
encoding details.
Comment: 16 pages, accepted to ITP 201
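The undecidability of the halting problem proved in this development rests on the classical diagonalization argument. As a rough illustration only (the paper's actual development is in Lean; `halts` and `make_diagonal` are invented names for this sketch), the shape of the argument is:

```python
def make_diagonal(halts):
    """Given a purported total halting decider halts(prog, arg),
    build the diagonal program that defeats it."""
    def diagonal(prog):
        if halts(prog, prog):   # predicted to halt on itself?
            while True:         # ...then loop forever,
                pass
        return 0                # ...otherwise halt immediately.
    return diagonal

# Whatever a claimed decider answers on (diagonal, diagonal), it is wrong:
# "halts"  -> diagonal(diagonal) loops forever;
# "loops"  -> diagonal(diagonal) returns 0.
```

Running the diagonal program against any concrete (necessarily incorrect) candidate decider exhibits the contradiction directly.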
Perspectives for proof unwinding by programming languages techniques
In this chapter, we propose some future directions of work, potentially
beneficial to Mathematics and its foundations, based on the recent import of
methodology from the theory of programming languages into proof theory. This
scientific essay, written for the audience of proof theorists as well as the
working mathematician, is not a survey of the field, but rather a personal view
of the author, who hopes that it may inspire future and fellow researchers.
Variations on the Theme of Conning in Mathematical Economics
The mathematization of economics is almost exclusively in terms of the mathematics of real analysis which, in turn, is founded on set theory (and the axiom of choice) and orthodox mathematical logic. In this paper I try to point out that this kind of mathematization is replete with economic infelicities. The attempt to extract these infelicities is in terms of three main examples: dynamics, policy and rational expectations and learning. The focus is on the role of, and reliance on, standard fixed point theorems in orthodox mathematical economics.
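The fixed-point theorems at issue divide along constructive lines: Banach's contraction theorem comes with an algorithm, whereas Brouwer's theorem, the one standardly invoked in equilibrium existence proofs, does not in general. A minimal sketch of the constructive case (all names here are invented for illustration):

```python
import math

def banach_fixed_point(f, x0, tol=1e-12, max_iter=10_000):
    """Iterate x -> f(x); converges to the unique fixed point
    when f is a contraction (Banach's theorem)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx - x) < tol:
            return fx
        x = fx
    raise RuntimeError("no convergence: f may not be a contraction")

# f(x) = cos(x) is eventually contracting on [-1, 1]; the iteration
# converges to its unique fixed point (the Dottie number, ~0.739).
root = banach_fixed_point(math.cos, 0.0)
```

By contrast, a Brouwer fixed point of an arbitrary continuous self-map of the simplex need not be computable from the map, which is one source of the infelicities the paper discusses.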
Synthetic Undecidability and Incompleteness of First-Order Axiom Systems in Coq
We mechanise the undecidability of various first-order axiom systems in Coq, employing
the synthetic approach to computability underlying the growing Coq Library of Undecidability Proofs. Concretely, we cover both semantic and deductive entailment in fragments
of Peano arithmetic (PA) as well as ZF and related finitary set theories, with their undecidability established by many-one reductions from solvability of Diophantine equations, i.e.
Hilbert’s tenth problem (H10), and the Post correspondence problem (PCP), respectively.
In the synthetic setting based on the computability of all functions definable in a constructive foundation, such as Coq’s type theory, it suffices to define these reductions as meta-level functions with no need for further encoding in a formalised model of computation.
The concrete cases of PA and the considered set theories are supplemented by a general
synthetic theory of undecidable axiomatisations, focusing on well-known connections to
consistency and incompleteness. Specifically, our reductions rely on the existence of standard models, necessitating additional assumptions in the case of full ZF, and all axiomatic
extensions still justified by such standard models are shown incomplete. As a by-product of
the undecidability of set theories formulated using only membership and no equality symbol, we obtain the undecidability of first-order logic with a single binary relation.
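The many-one reductions on which these results rest transport decidability backwards: if A reduces to B via an encoding f (x ∈ A iff f(x) ∈ B) and B is decidable, then so is A; contrapositively, undecidability of A transfers to B. A toy illustration of this mechanism, entirely unrelated to the actual H10/PCP reductions and with all names invented:

```python
def reduces(decider_B, encode):
    """Given a decider for B and a many-one reduction A <=m B
    (encode with: x in A  iff  encode(x) in B),
    produce a decider for A."""
    return lambda x: decider_B(encode(x))

# Toy instance: A = even numbers, B = multiples of 10,
# reduction n -> 5*n (n is even iff 5n is divisible by 10).
in_B = lambda m: m % 10 == 0
in_A = reduces(in_B, lambda n: 5 * n)
```

In the synthetic setting described above, `encode` is simply a function of the ambient constructive foundation; no machine model is needed to certify its computability.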
Computability and analysis: the legacy of Alan Turing
We discuss the legacy of Alan Turing and his impact on computability and
analysis.
Comment: 49 pages
Revising Type-2 Computation and Degrees of Discontinuity
By the sometimes so-called MAIN THEOREM of Recursive Analysis, every
computable real function is necessarily continuous. Weihrauch and Zheng
(TCS'2000), Brattka (MLQ'2005), and Ziegler (ToCS'2006) have considered
different relaxed notions of computability to cover also discontinuous
functions. The present work compares and unifies these approaches. This is
based on the concept of the JUMP of a representation: both a TTE-counterpart to
the well-known recursion-theoretic jump on Kleene's Arithmetical Hierarchy of
hypercomputation; and a formalization of revising computation in the sense of
Shoenfield.
We also consider Markov and Banach/Mazur oracle-computation of discontinuous
functions and characterize the computational power of Type-2 nondeterminism to
coincide with the first level of the Analytical Hierarchy.
Comment: to appear in Proc. CCA'0
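The continuity theorem referenced above ("computable implies continuous") can be made concrete in the TTE style: a real is represented by a function returning dyadic approximations, and a computable operation may inspect only finitely many approximations of its input, so its output depends continuously on the input. A hypothetical sketch (names invented, not the paper's formal framework):

```python
from fractions import Fraction

# TTE-style representation: a real X is a function n -> dyadic rational
# within 2^-n of X.
def real_from(value):
    return lambda n: Fraction(round(value * 2**n), 2**n)

def add(x, y):
    """Computable addition on represented reals: query each input at
    precision n+1, so the sum is within 2^-n of the true value."""
    return lambda n: x(n + 1) + y(n + 1)
```

The sign function illustrates the failure mode for discontinuous maps: no finite stock of approximations of an input near 0 determines its sign, which is exactly why the relaxed notions (jumps of representations, nondeterminism, revising computation) compared in this work are needed.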
Uncomputability and Undecidability in Economic Theory
Economic theory, game theory and mathematical statistics have all increasingly become algorithmic sciences. Computable Economics, Algorithmic Game Theory ([28]) and Algorithmic Statistics ([13]) are frontier research subjects. All of them, each in its own way, are underpinned by (classical) recursion theory - and its applied branches, say computational complexity theory or algorithmic information theory - and, occasionally, proof theory. These research paradigms have posed new mathematical and metamathematical questions and, inadvertently, undermined the traditional mathematical foundations of economic theory. A concise, but partial, pathway into these new frontiers is the subject matter of this paper. Interpreting the core of mathematical economic theory to be defined by General Equilibrium Theory and Game Theory, a general - but concise - analysis of the computable and decidable content of the implications of these two areas is presented. Issues at the frontiers of macroeconomics, now dominated by Recursive Macroeconomic Theory, are also tackled, albeit ultra briefly. The point of view adopted is that of classical recursion theory and varieties of constructive mathematics.
General Equilibrium Theory, Game Theory, Recursive Macro-economics, (Un)computability, (Un)decidability, Constructivity
For Cybersecurity, Computer Science Must Rely on the Opposite of Gödel’s Results
This article shows how fundamental higher-order theories of mathematical structures of computer science (e.g. natural numbers [Dedekind 1888] and Actors [Hewitt et al. 1973]) are categorical, meaning that they can be axiomatized up to a unique isomorphism, thereby removing any ambiguity in the mathematical structures being axiomatized. Having these mathematical structures precisely defined can make systems more secure because there are fewer ambiguities and holes for cyberattackers to exploit. For example, there are no infinite elements in models for natural numbers to be exploited. On the other hand, the 1st-order theories of Gödel’s results necessarily leave the mathematical structures ill-defined, e.g., there are necessarily models with infinite integers.
Cyberattackers have severely damaged national, corporate, and individual security, as well as causing hundreds of billions of dollars of economic damage. A significant cause of the damage is that current engineering practices are not sufficiently grounded in theoretical principles. In the last two decades, little new theoretical work has been done that practically impacts large engineering projects, with the result that computer systems engineering education is insufficient in providing theoretical grounding. If the current cybersecurity situation is not quickly remedied, it will soon become much worse because of the projected development of Scalable Intelligent Systems by 2025 [Hewitt 2019].
Gödel strongly advocated that the Turing Machine is the preeminent universal model of computation. A Turing machine formalizes an algorithm in which computation proceeds without external interaction. However, computing is now highly interactive, which this article proves is beyond the capability of a Turing Machine. Instead of the Turing Machine model, this article presents an axiomatization of a universal model of digital computation (including implementation of Scalable Intelligent Systems) up to a unique isomorphism.