
    On Hilbert's Tenth Problem

    Using an iterated Horner scheme for the evaluation of diophantine polynomials, we define a partial μ-recursive "decision" algorithm decis as a "race" between a first zero and a first (internal) proof of non-nullity for such a polynomial, within a given theory T extending Peano Arithmetic PA. If T is diophantine sound, i.e., if (internal) provability implies truth for diophantine formulae, then the T-map decis gives correct results when applied to the codes of polynomial inequalities D(x_1,...,x_m) ≠ 0. The additional hypothesis that T be diophantine complete (in the syntactical sense) would in addition guarantee termination of decis on these formulae, i.e., decis would constitute a decision algorithm for diophantine formulae in the sense of Hilbert's 10th problem. From Matiyasevich's impossibility result for such a decision it follows that a consistent theory T extending PA cannot be both diophantine sound and diophantine complete. We infer from this the existence of a diophantine formula which is undecidable by T. Diophantine correctness is inherited by the diophantine completion T~ of T, and within this extension decis terminates, correctly, on all externally given diophantine polynomials. Matiyasevich's theorem -- for the strengthening T~ of T -- then shows that T~, and hence T, cannot be diophantine sound. But since the internal consistency formula Con_T for T implies -- within PA -- the diophantine soundness of T, we get that PA derives ¬Con_T; in particular, PA must derive its own internal inconsistency formula.
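    The abstract's "race" can be pictured as a dovetailed search. The Python sketch below is not taken from the paper: the nested-list polynomial encoding, the search over integer tuples, and the proves_nonnullity oracle (standing in for "the n-th T-proof establishes that D has no zero") are illustrative assumptions. It shows an iterated Horner evaluation and the partial procedure that alternates between hunting for a zero and consulting a proof enumerator; like decis, it need not terminate.

```python
from itertools import count, product

def horner_eval(poly, point):
    """Iterated Horner evaluation of a multivariate integer polynomial.

    `poly` is an int when no variables remain; otherwise it is a list of
    coefficients (each itself such a polynomial in the remaining variables),
    ordered by ascending power of the first variable in `point`.
    """
    if not point:                        # constant polynomial
        return poly
    x, rest = point[0], point[1:]
    acc = 0
    for coeff in reversed(poly):         # Horner's rule in the outermost variable
        acc = acc * x + horner_eval(coeff, rest)
    return acc

def decis(poly, m, proves_nonnullity):
    """Race a search for a first zero of D(x_1,...,x_m) against a search for
    a first T-proof of its non-nullity.  `proves_nonnullity` is a hypothetical
    oracle; the loop is partial and need not terminate.
    """
    for n in count():                                    # dovetail both searches
        for point in product(range(-n, n + 1), repeat=m):
            if horner_eval(poly, list(point)) == 0:
                return ("zero found", point)             # D(point) = 0
        if proves_nonnullity(n):
            return ("non-nullity proved", n)             # T proves D has no zero
```

    For example, with D(x_1, x_2) = x_1^2 - x_2 encoded as [[0, -1], [0], [1]] and a proof oracle that never answers, the race is won immediately by the zero (0, 0).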

    Computational Processes and Incompleteness

    We introduce a formal definition of Wolfram's notion of a computational process, based on cellular automata, a physics-like model of computation. There is a natural classification of these processes into decidable, intermediate, and complete. It is shown that, in the context of standard finite injury priority arguments, one cannot establish the existence of an intermediate computational process.
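    The abstract does not reproduce the formal definition, but the underlying model is easy to sketch. The following Python fragment is only an illustration (array size, boundary condition, and the chosen rule are my assumptions): it steps an elementary one-dimensional cellular automaton, and a "computational process" can then be read as the orbit of an initial configuration under repeated updates; rule 110 is a standard example whose processes sit at the complete (Turing-universal) end of such a classification.

```python
def ca_step(cells, rule=110):
    """One synchronous update of an elementary (two-state, radius-1) cellular
    automaton with wrap-around boundary, using Wolfram's rule numbering."""
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# A computational process, informally: the sequence of configurations obtained
# by iterating the update map from some initial condition.
config = [0] * 40 + [1] + [0] * 40
history = [config]
for _ in range(20):
    config = ca_step(config)
    history.append(config)
```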

    A Primer on the Tools and Concepts of Computable Economics

    Computability theory came into being as a result of Hilbert's attempts to meet Brouwer's challenges, from an intuitionistic and constructive standpoint, to formalism as a foundation for mathematical practice. Viewed this way, constructive mathematics should be one vision of computability theory. However, there are fundamental differences between computability theory and constructive mathematics: the Church-Turing thesis is a disciplining criterion in the former and not in the latter; and classical logic -- particularly the law of the excluded middle -- is not accepted in the latter but freely invoked in the former, especially in proving universal negative propositions. In Computable Economics an eclectic approach is adopted, where the main criterion is numerical content for economic entities. In this sense both the computable and the constructive traditions are freely and indiscriminately invoked and utilised in the formalization of economic entities. Some of the mathematical methods and concepts of computable economics are surveyed in a pedagogical mode. The context is that of a digital economy embedded in an information society.

    Hilbert's Program Then and Now

    Hilbert's program was an ambitious and wide-ranging project in the philosophy and foundations of mathematics. In order to "dispose of the foundational questions in mathematics once and for all," Hilbert proposed a two-pronged approach in 1921: first, classical mathematics should be formalized in axiomatic systems; second, using only restricted, "finitary" means, one should give proofs of the consistency of these axiomatic systems. Although Gödel's incompleteness theorems show that the program as originally conceived cannot be carried out, it had many partial successes and generated important advances in logical theory and meta-theory, both at the time and since. The article discusses the historical background and development of Hilbert's program, its philosophical underpinnings and consequences, and its subsequent development and influences since the 1930s.

    Notes on the Mathematical Foundations of Analogue Computation

    Digital computing has its mathematical foundations in (classical) recursion theory and constructive mathematics. The implicit working assumption of those who practice the noble art of analog computing may well be that the mathematical foundations of their subject are as sound as the foundations of real analysis. That, in turn, implies a reliance on the soundness of set theory plus the axiom of choice. This is, surely, seriously disturbing from a computational point of view. Therefore, in this paper, I seek to locate a foundation for analog computing by exhibiting some tentative dualities with results that are analogous to those that are standard in computability theory. The main question, from the point of view of economics, is whether the Phillips Machine, as an analog computer, has universal computing properties. The conjectured answer is in the negative.

    On the mathematical and foundational significance of the uncountable

    We study the logical and computational properties of basic theorems of uncountable mathematics, including the Cousin and Lindelöf lemmas, published in 1895 and 1903. Historically, these lemmas were among the first formulations of open-cover compactness and the Lindelöf property, respectively. These notions are of great conceptual importance: the former is commonly viewed as a way of treating uncountable sets like e.g. [0,1] as 'almost finite', while the latter allows one to treat uncountable sets like e.g. ℝ as 'almost countable'. This reduction of the uncountable to the finite/countable turns out to have a considerable logical and computational cost: we show that the aforementioned lemmas, and many related theorems, are extremely hard to prove, while the associated sub-covers are extremely hard to compute. Indeed, in terms of the standard scale (based on comprehension axioms), a proof of these lemmas requires at least the full extent of second-order arithmetic, a system originating from Hilbert-Bernays' Grundlagen der Mathematik. This observation has far-reaching implications for the Grundlagen's spiritual successor, the program of Reverse Mathematics, and the associated Gödel hierarchy. We also show that the Cousin lemma is essential for the development of the gauge integral, a generalisation of the Lebesgue and improper Riemann integrals that also uniquely provides a direct formalisation of Feynman's path integral.
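    For orientation, the two notions named at the end of the abstract have standard textbook formulations; the LaTeX below states them in my notation (not the paper's), for the unit interval.

```latex
% Cousin's lemma: every gauge on [0,1] admits a fine tagged partition.
\textbf{Cousin's lemma.} For every gauge $\delta\colon[0,1]\to(0,\infty)$ there is a
tagged partition $0=t_0<t_1<\dots<t_n=1$ with tags $x_i\in[t_{i-1},t_i]$ such that
$[t_{i-1},t_i]\subseteq\bigl(x_i-\delta(x_i),\,x_i+\delta(x_i)\bigr)$ for each $i$.

% Gauge (Henstock--Kurzweil) integral: Riemann sums over delta-fine partitions.
\textbf{Gauge integral.} A function $f\colon[0,1]\to\mathbb{R}$ has gauge integral $I$
if for every $\varepsilon>0$ there is a gauge $\delta$ such that every $\delta$-fine
tagged partition satisfies
$\bigl|\sum_{i=1}^{n} f(x_i)\,(t_i-t_{i-1})-I\bigr|<\varepsilon$.
```

    Cousin's lemma guarantees that the δ-fine partitions quantified over in the integral's definition actually exist, which is why the abstract calls it essential for the development of the gauge integral.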