9 research outputs found

    On interpreting Chaitin's incompleteness theorem

    The aim of this paper is to comprehensively question the validity of the standard way of interpreting Chaitin’s famous incompleteness theorem, which says that for every formalized theory of arithmetic there is a finite constant c such that the theory in question cannot prove any particular number to have Kolmogorov complexity larger than c. The received interpretation of the theorem claims that the limiting constant is determined by the complexity of the theory itself, which is assumed to be a good measure of the strength of the theory. I exhibit certain strong counterexamples and establish conclusively that the received view is false. Moreover, I show that the limiting constants provided by the theorem do not in any way reflect the power of formalized theories, but that the values of these constants are actually determined by the chosen coding of Turing machines, and are thus quite accidental.
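    A minimal formal statement of the theorem the abstract summarizes (notation is my own, not the paper's): for a consistent, sufficiently strong formalized theory T, there is a constant c_T such that T cannot prove any specific string to have Kolmogorov complexity K exceeding c_T:

    ```latex
    % Chaitin's incompleteness theorem (schematic statement)
    \exists\, c_T \in \mathbb{N}\ \ \forall s \in \{0,1\}^* :
        \quad T \nvdash \; K(s) > c_T
    ```

    The abstract's claim is that c_T tracks the chosen coding of Turing machines (on which K depends) rather than the proof-theoretic strength of T.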

    Did Gödel prove that we are not machines? (On philosophical consequences of Gödel's theorem)

    Gödel's incompleteness theorem has been the most famous example of a mathematical theorem from which deep philosophical consequences follow. They are said to give an insight, first, into the nature of mathematics, and more generally of human knowledge, and second, into the nature of the mind. The limitations of logicist or formalist programmes of mathematics have had a clear significance against the background of the foundational schools of the early decades of this century. The limitations of mechanism, or of the vision underlying research in the field of Artificial Intelligence, gain significance only now. Yet, while the limitations imposed by Gödel's theorem upon the extent of formal methods seem unquestionable, they seem to have very little to say about the restrictions concerning mathematical or computer practice. And the alleged consequences concerning the non-mechanical character of the human mind are questionable. The standard reasoning, known as Lucas' argument, begs the question, and actually implies that Lucas is inconsistent.

    Information-Theoretic Incompleteness


    Complexity Theory as a Paradigm for the Dynamical Law-and-Society System: A Wake-up Call for Legal Reductionism and the Modern Administrative State

    This article is the first in my series of articles exploring the application of complex adaptive systems (CAS) theory to legal systems. It builds the basic model of CAS and maps it onto legal systems, offering some suggestions for what it means for the design of legal institutions and instruments.

    An Algorithmic Interpretation of Quantum Probability

    The Everett (or relative-state, or many-worlds) interpretation of quantum mechanics has come under fire for inadequately dealing with the Born rule (the formula for calculating quantum probabilities). Numerous attempts have been made to derive this rule from the perspective of observers within the quantum wavefunction. These are not really analytic proofs, but are rather attempts to derive the Born rule as a synthetic a priori necessity, given the nature of human observers (a fact not fully appreciated even by all of those who have attempted such proofs). I show why existing attempts are unsuccessful or only partly successful, and postulate that Solomonoff's algorithmic approach to the interpretation of probability theory could clarify the problems with these approaches. The Sleeping Beauty probability puzzle is used as a springboard from which to deduce an objectivist, yet synthetic a priori framework for quantum probabilities, that properly frames the role of self-location and self-selection (anthropic) principles in probability theory. I call this framework "algorithmic synthetic unity" (or ASU). I offer no new formal proof of the Born rule, largely because I feel that existing proofs (particularly that of Gleason) are already adequate, and as close to being a formal proof as one should expect or want. Gleason's one unjustified assumption--known as noncontextuality--is, I will argue, completely benign when considered within the algorithmic framework that I propose. I will also argue that, to the extent the Born rule can be derived within ASU, there is no reason to suppose that we could not also derive all the other fundamental postulates of quantum theory, as well. There is nothing special here about the Born rule, and I suggest that a completely successful Born rule proof might only be possible once all the other postulates become part of the derivation. 
As a start towards this end, I show how we can already derive the essential content of the fundamental postulates of quantum mechanics, at least in outline, and especially if we allow some educated and well-motivated guesswork along the way. The result is some steps towards a coherent and consistent algorithmic interpretation of quantum mechanics.
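    For reference, the Born rule at issue in the abstract, together with the form Gleason's theorem derives from noncontextuality, can be written as follows (standard textbook statements, not drawn from the paper itself):

    ```latex
    % Born rule: probability of outcome i when measuring |psi> in
    % an orthonormal basis {|i>}
    P(i) \;=\; \left| \langle i \,|\, \psi \rangle \right|^2

    % Gleason's theorem: any noncontextual probability measure on
    % projectors P of a Hilbert space (dim >= 3) has the form
    \mu(P) \;=\; \operatorname{Tr}\!\left( \rho\, P \right)
    \quad \text{for some density operator } \rho
    ```

    The abstract's argument is that the noncontextuality assumption in Gleason's derivation becomes benign once probabilities are read algorithmically.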

    The use of information concepts in the dialogue between science and theology

    We are living in the information age and this has had an effect on both science and theology. Our understanding of the fundamental role of information has increased significantly. One can even say that information has become an overarching metaphor in the world of science. This dissertation gives an overview of the impact of the information-based scientific world-view on the dialogue between science and theology. The study investigates the metaphorical use of information concepts to secure a better understanding of God's action in the world and the role that information plays in the processes of life. The focus is on the role of biological information, and its relation to divine action is investigated. The scientific importance of information and the possible impact of information concepts on the science and theology dialogue of the future are discussed.