    Some Thoughts on Hypercomputation

    Hypercomputation is a relatively new branch of computer science that emerged from the idea that the Church--Turing Thesis, which is supposed to describe what is computable and what is noncomputable, cannot possibly be true. Because of its apparent validity, the Church--Turing Thesis has been used to investigate the possible limits of the intelligence of any imaginable life form and, consequently, the limits of information processing, since living beings are, among other things, information processors. However, in the light of hypercomputation, which seems to be feasible in our universe, one cannot impose arbitrary limits on what intelligence can achieve unless there are specific physical laws that prohibit the realization of something. In addition, hypercomputation allows us to ponder aspects of communication between intelligent beings that have not been considered before.
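
    For readers unfamiliar with why hypercomputation is defined against the Church--Turing boundary, the classical diagonal argument can be sketched in a few lines. The sketch below is an illustration, not anything from the paper; `halts` is a hypothetical oracle, not an implementable function.

```python
# Sketch of the classical diagonal argument behind the Church--Turing
# boundary. `halts` is a hypothetical oracle (an assumption for this
# sketch), not an implementable function: no Turing machine computes it.

def halts(program_source: str, argument: str) -> bool:
    """Hypothetical halting oracle: True iff the program halts on the argument."""
    raise NotImplementedError("uncomputable by any Turing machine")

def diagonal(program_source: str) -> None:
    # If `halts` were computable, this program would refute it:
    # it loops forever exactly when `halts` predicts it terminates.
    if halts(program_source, program_source):
        while True:
            pass

# Running `diagonal` on its own source contradicts either answer the
# oracle could give, so the halting function is Turing-uncomputable; a
# hypercomputer is, by definition, a physical system conjectured to
# realize it (or some other uncomputable function) anyway.
```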

    Specification, Testing and Verification of Unconventional Computations using Generalised X-Machines

    There are as yet no fully comprehensive techniques for specifying, verifying and testing unconventional computations. In this paper we propose a generally applicable and designer-friendly specification strategy based on a generalised variant of Eilenberg's X-machine model of computation. Our approach, which extends existing approaches to stream X-machine (SXM) test-based verification, is arguably capable of modelling very general unconventional computations, and would allow implementations to be verified fully against their specifications.
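
    To make the underlying model concrete: in an X-machine, transitions are labelled not by bare input symbols but by processing functions that transform an internal data set. The following is a minimal sketch of a stream X-machine, assuming a simple functional rendering; the class and method names are illustrative, not taken from the paper.

```python
# Minimal sketch of a stream X-machine: a finite-state control whose
# transitions are labelled by processing functions acting on a memory
# and an input value. All names here are illustrative assumptions.

from typing import Any, Callable, Dict, Tuple

Memory = Any
Phi = Callable[[Memory, Any], Tuple[Any, Memory]]  # (mem, input) -> (output, mem')

class StreamXMachine:
    def __init__(self, transitions: Dict[Tuple[str, str], str],
                 functions: Dict[str, Phi], start: str, memory: Memory):
        self.transitions = transitions   # (state, label) -> next state
        self.functions = functions       # label -> processing function
        self.state, self.memory = start, memory

    def step(self, label: str, value: Any) -> Any:
        # Apply the processing function attached to this transition,
        # update the memory, and move the finite control.
        out, self.memory = self.functions[label](self.memory, value)
        self.state = self.transitions[(self.state, label)]
        return out

# Example: a one-state accumulator whose single function adds its input
# to the memory and emits the running total.
m = StreamXMachine(
    transitions={("s0", "add"): "s0"},
    functions={"add": lambda mem, x: (mem + x, mem + x)},
    start="s0", memory=0,
)
print([m.step("add", x) for x in [1, 2, 3]])  # -> [1, 3, 6]
```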

    How is there a Physics of Information? On characterising physical evolution as information processing.

    We have a conundrum. The physical basis of information is clearly a highly active research area. Yet the power of information theory comes precisely from separating it from the detailed problems of building physical systems to perform information processing tasks. Developments in quantum information over the last two decades seem to have undermined this separation, leading to suggestions that information is itself a physical entity and must be part of our physical theories, with resource-cost implications. We will consider a variety of ways in which physics seems to affect computation, but will ultimately argue to the contrary: rejecting the claims that information is physical provides a better basis for understanding the fertile relationship between information theory and physics. Instead, we will argue that the physical resource costs of information processing are to be understood through the need to consider physically embodied agents for whom information processing tasks are performed. Doing so sheds light on what it takes for something to be implementing a computational or information processing task of a given kind.

    The development of computer science: a sociocultural perspective

    The Significance of Evidence-based Reasoning for Mathematics, Mathematics Education, Philosophy and the Natural Sciences

    In this multi-disciplinary investigation we show how an evidence-based perspective of quantification---in terms of algorithmic verifiability and algorithmic computability---admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of the first-order Peano Arithmetic PA---over the structure N of the natural numbers---that are complementary, not contradictory. The first yields the weak, standard, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis:

    * Hilbert's epsilon-calculus
    * Goedel's omega-consistency
    * The Law of the Excluded Middle
    * Hilbert's omega-Rule
    * An Algorithmic omega-Rule
    * Gentzen's Rule of Infinite Induction
    * Rosser's Rule C
    * Markov's Principle
    * The Church-Turing Thesis
    * Aristotle's particularisation
    * Wittgenstein's perspective of constructive mathematics
    * An evidence-based perspective of quantification

    By showing how these are formally inter-related, we highlight the fragility of both the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus; and the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences for mathematics, mathematics education, philosophy, and the natural sciences of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines.
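
    The verifiability/computability distinction at the heart of this abstract can be made concrete. The sketch below uses paraphrased definitions and illustrative examples of my own (the function names and the even-number predicate are not the paper's): a formula is algorithmically computable if one algorithm decides its truth for every numeral, and algorithmically verifiable if each finite initial segment of its truth values is decidable by some algorithm, possibly a different one per segment.

```python
# Sketch of the verifiable-vs-computable distinction under paraphrased
# definitions; all names and examples here are illustrative assumptions.

from typing import Callable, List

def computable_decider(n: int) -> bool:
    # Algorithmically computable: ONE algorithm decides the formula's
    # truth for every n (here, the formula "n is even").
    return n % 2 == 0

def verifier_for_prefix(truth_prefix: List[bool]) -> Callable[[int], bool]:
    # Algorithmically verifiable: for EACH n there is SOME algorithm
    # deciding the instances 1..n -- a finite lookup table suffices,
    # even when no single program works uniformly for all n.
    table = dict(enumerate(truth_prefix, start=1))
    return lambda k: table[k]

# Any finite prefix of truth values, even one drawn from an uncomputable
# sequence, has such a verifier, while the sequence as a whole need not
# be decidable by any single algorithm.
check_first_three = verifier_for_prefix([True, False, True])
print([check_first_three(k) for k in (1, 2, 3)])  # -> [True, False, True]
```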

    A Hypercomputational Alien

    Is there a physical constant with the value of the halting function? An answer to this question, as in other discussions of hypercomputation, assumes a fixed interpretation of nature by mathematical entities. Without agreeing on such an interpretation, the question is without context and meaningless. We discuss the subjectiveness of viewing the mathematical properties of nature, and the possibility of comparing computational models having alternate views of the world. For that purpose, we propose a conceptual framework for power comparison, by linking computational models to hypothetical physical devices. Accordingly, we deduce some mathematical notions of relative computational power, allowing for the comparison of arbitrary models over arbitrary domains. In addition, we demonstrate that the method commonly used in the literature for establishing that one model is strictly more powerful than another is problematic, as it can allow for a model to be “more powerful” than itself. On the positive side, we note that Turing machines and the recursive functions are not susceptible to this anomaly, justifying the standard means of showing that a model is more powerful than Turing machines.
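
    One common way the literature formalises such comparisons, paraphrased here as an assumed sketch rather than the authors' own definitions, is simulation up to an encoding of one model's domain into the other's:

```latex
% Hedged sketch of 'simulation up to encoding'; the paper's framework
% refines this, precisely because of the anomaly noted below.
\[
  \mathcal{A} \succeq \mathcal{B}
  \;\iff\;
  \exists\, \rho : \mathrm{dom}(\mathcal{B}) \hookrightarrow \mathrm{dom}(\mathcal{A})
  \ \text{ such that }\
  \forall f \in \mathcal{B}\ \ \exists g \in \mathcal{A}:\ g \circ \rho = \rho \circ f .
\]
% If the encoding rho may be an arbitrary injection, it can itself carry
% uncomputable information; "strictly more powerful" claims built on
% independently chosen encodings can then hold even with A = B -- the
% self-superiority anomaly the abstract describes.
```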