The descriptive theory of represented spaces
This is a survey on the ongoing development of a descriptive theory of
represented spaces, which is intended as an extension of both classical and
effective descriptive set theory to deal with both sets and functions between
represented spaces. Most material is from work-in-progress, and thus there may
be a stronger focus on projects involving the author than an objective survey
would merit.
Comment: survey of work-in-progress
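For orientation, the central notion can be stated as follows; this is the standard definition from computable analysis, consistent with but not quoted from the survey:

```latex
A \emph{represented space} is a pair $(X, \delta_X)$ where
$\delta_X :\subseteq \mathbb{N}^{\mathbb{N}} \to X$ is a partial surjection;
any $p \in \operatorname{dom}(\delta_X)$ is a \emph{name} of the point $\delta_X(p)$.
A function $f : (X, \delta_X) \to (Y, \delta_Y)$ is \emph{computable}
(resp.\ \emph{continuous}) iff there is a computable (resp.\ continuous)
$F :\subseteq \mathbb{N}^{\mathbb{N}} \to \mathbb{N}^{\mathbb{N}}$ such that
$\delta_Y(F(p)) = f(\delta_X(p))$ for every name $p$ of a point in the domain of $f$.
```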
Computational reverse mathematics and foundational analysis
Reverse mathematics studies which subsystems of second order arithmetic are
equivalent to key theorems of ordinary, non-set-theoretic mathematics. The main
philosophical application of reverse mathematics proposed thus far is
foundational analysis, which explores the limits of different foundations for
mathematics in a formally precise manner. This paper gives a detailed account
of the motivations and methodology of foundational analysis, which have
heretofore been largely left implicit in the practice. It then shows how this
account can be fruitfully applied in the evaluation of major foundational
approaches by a careful examination of two case studies: a partial realization
of Hilbert's program due to Simpson [1988], and predicativism in the extended
form due to Feferman and Schütte.
Shore [2010, 2013] proposes that equivalences in reverse mathematics be
proved in the same way as inequivalences, namely by considering only
$\omega$-models of the systems in question. Shore refers to this approach as
computational reverse mathematics. This paper shows that despite some
attractive features, computational reverse mathematics is inappropriate for
foundational analysis, for two major reasons. Firstly, the computable
entailment relation employed in computational reverse mathematics does not
preserve justification for the foundational programs above. Secondly,
computable entailment is a $\Pi^1_1$-complete relation, and hence employing it
commits one to theoretical resources which outstrip those available within any
foundational approach that is proof-theoretically weaker than $\Pi^1_1$-CA$_0$.
Comment: Submitted. 41 pages
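As a sketch of the notions at stake (standard definitions, stated here for reference rather than quoted from the paper):

```latex
An \emph{$\omega$-model} is a structure for the language of second-order
arithmetic whose first-order part is the standard natural numbers $\omega$,
so it is determined by its second-order part $S \subseteq \mathcal{P}(\omega)$.
A theory $\Gamma$ \emph{computably entails} a sentence $\varphi$, written
$\Gamma \models_\omega \varphi$, iff every $\omega$-model of $\Gamma$
satisfies $\varphi$.
```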
On the mathematical and foundational significance of the uncountable
We study the logical and computational properties of basic theorems of
uncountable mathematics, including the Cousin and Lindelöf lemmas published in
1895 and 1903. Historically, these lemmas were among the first formulations of
open-cover compactness and the Lindelöf property, respectively. These notions
are of great conceptual importance: the former is commonly viewed as a way of
treating uncountable sets like e.g. $[0,1]$ as 'almost finite', while the
latter allows one to treat uncountable sets like e.g. $\mathbb{R}$ as 'almost
countable'. This reduction of the uncountable to the finite/countable turns out
to have a considerable logical and computational cost: we show that the
aforementioned lemmas, and many related theorems, are extremely hard to prove,
while the associated sub-covers are extremely hard to compute. Indeed, in terms
of the standard scale (based on comprehension axioms), a proof of these lemmas
requires at least the full extent of second-order arithmetic, a system
originating from Hilbert-Bernays' Grundlagen der Mathematik. This observation
has far-reaching implications for the Grundlagen's spiritual successor, the
program of Reverse Mathematics, and the associated Gödel hierarchy. We also
show that the Cousin lemma is essential for the development of the gauge
integral, a generalisation of the Lebesgue and improper Riemann integrals that
also uniquely provides a direct formalisation of Feynman's path integral.
Comment: 35 pages with one figure. The content of this version extends the
published version in that Sections 3.3.4 and 3.4 below are new. Small
corrections/additions have also been made to reflect new developments.
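For reference, the Cousin lemma and the gauge (Henstock–Kurzweil) integral mentioned above can be stated as follows; these are the standard formulations, included as a sketch rather than quoted from the paper:

```latex
\textbf{Cousin lemma.} For every gauge $\delta : [0,1] \to \mathbb{R}^{+}$
there are $t_1, \dots, t_k \in [0,1]$ such that the intervals
$(t_i - \delta(t_i),\, t_i + \delta(t_i))$ cover $[0,1]$.

\textbf{Gauge integral.} A function $f : [0,1] \to \mathbb{R}$ has gauge
integral $I \in \mathbb{R}$ iff for every $\varepsilon > 0$ there is a gauge
$\delta$ such that every $\delta$-fine tagged partition
$(t_i, [x_{i-1}, x_i])_{i \le k}$, i.e.\ one with
$[x_{i-1}, x_i] \subseteq (t_i - \delta(t_i),\, t_i + \delta(t_i))$ for all
$i$, satisfies
$\bigl|\, I - \textstyle\sum_{i \le k} f(t_i)(x_i - x_{i-1}) \bigr| < \varepsilon$.
```

The Cousin lemma guarantees that $\delta$-fine tagged partitions exist for every gauge, which is why it underlies the development of this integral.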
Computability of Probability Distributions and Distribution Functions
We define the computability of probability distributions on the real line, as well as that of distribution functions. Mutual relationships between the computability notion for a probability distribution and that for the corresponding distribution function are discussed. This is carried out through attempts to effectivize some classical fundamental theorems concerning probability distributions. We then define the effective convergence of probability distributions as an effectivization of classical vague convergence. For distribution functions, computability and effective convergence are defined naturally, viewing them as real functions. A weaker effective convergence is also defined as an effectivization of pointwise convergence.
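One common way to make "computable distribution function" concrete is to represent the function $F$ by an algorithm that, given a rational argument and a precision parameter $n$, returns a rational within $2^{-n}$ of $F(q)$; effective pointwise convergence then asks for a computable modulus. The sketch below illustrates this representation on the uniform distribution; the names (`uniform_cdf`, `shifted_cdf`, `modulus`) are illustrative and not taken from the paper:

```python
from fractions import Fraction

def uniform_cdf(q: Fraction, n: int) -> Fraction:
    """Return a rational within 2**-n of F(q), where F is the CDF of the
    uniform distribution on [0, 1].  (This F is rational-valued at rational
    points, so the approximation happens to be exact.)"""
    return min(max(q, Fraction(0)), Fraction(1))

def shifted_cdf(k: int):
    """F_k: CDF of the uniform distribution on [1/k, 1 + 1/k] (illustrative
    sequence converging pointwise to the uniform CDF)."""
    def F(q: Fraction, n: int) -> Fraction:
        return min(max(q - Fraction(1, k), Fraction(0)), Fraction(1))
    return F

def modulus(n: int) -> int:
    """Computable modulus of pointwise convergence: |F_k(q) - F(q)| <= 1/k,
    so k >= 2**n guarantees error at most 2**-n at every rational q."""
    return 2 ** n

# Effective pointwise convergence at a sample point q = 1/2:
q = Fraction(1, 2)
for n in range(1, 5):
    k = modulus(n)
    err = abs(shifted_cdf(k)(q, n) - uniform_cdf(q, n))
    assert err <= Fraction(1, 2 ** n)
```

Exact rational arithmetic (`fractions.Fraction`) is used so that the error bounds are literal inequalities rather than floating-point approximations.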