298 research outputs found

    Naming the largest number: Exploring the boundary between mathematics and the philosophy of mathematics

    What is the largest number accessible to the human imagination? The question is neither entirely mathematical nor entirely philosophical. Mathematical formulations of the problem fall into two classes: those that fail to fully capture the spirit of the problem, and those that turn it back into a philosophical problem.

    Busy beaver machines and the observant otter heuristic

    The busy beaver problem is to find the maximum number of non-zero characters that can be printed by an n-state Turing machine of a particular type. A critical step in the solution of this problem is to determine whether or not a given n-state Turing machine halts on a blank input. Given the enormous output sizes that can be produced by some small machines, it becomes critical to have appropriate methods for dealing with the exponential behaviour of both terminating and nonterminating machines. In this paper, we investigate a heuristic which can be used to greatly accelerate execution of this class of machines. This heuristic, which we call the observant otter, is based on the detection of patterns earlier in the execution trace. We describe our implementation of this method and report various experimental results based on it, including showing how it can be used to evaluate all known 'monster' machines, including some whose naive execution would take around 10^36,534 steps.
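    The abstract's observant otter heuristic is not reproduced here, but the object it accelerates is easy to state in code: a minimal sketch, assuming a binary-alphabet machine with a blank (all-zero) tape, of direct step-by-step simulation and the "number of non-zero characters printed" measure, illustrated on the known 2-state champion machine.

```python
from collections import defaultdict

def run_turing_machine(rules, max_steps=10_000):
    """Directly simulate a binary Turing machine started on a blank tape.

    rules maps (state, symbol) -> (write, move, next_state), where move is
    +1 or -1 and next_state 'H' means halt.  Returns (non-zero cells on the
    tape, steps taken), or None if max_steps is exceeded -- the machine may
    never halt, and halting is undecidable in general.
    """
    tape = defaultdict(int)          # blank tape: every cell reads 0
    pos, state, steps = 0, 'A', 0
    while state != 'H':
        if steps >= max_steps:
            return None
        write, move, state = rules[(state, tape[pos])]
        tape[pos] = write
        pos += move
        steps += 1
    return sum(tape.values()), steps

# The 2-state, 2-symbol busy beaver champion: prints 4 ones, halting in 6 steps.
bb2 = {
    ('A', 0): (1, +1, 'B'), ('A', 1): (1, -1, 'B'),
    ('B', 0): (1, -1, 'A'), ('B', 1): (1, +1, 'H'),
}
```

    Naive simulation of this kind is precisely what becomes infeasible for the paper's 'monster' machines: no direct simulator can execute on the order of 10^36,534 steps, which is why trace-pattern heuristics are needed.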

    Capabilities and Limitations of Infinite-Time Computation

    The relatively new field of infinitary computability strives to characterize the capabilities and limitations of infinite-time computation; that is, computations of potentially transfinite length. Throughout our work, we focus on the prototypical model of infinitary computation: Hamkins and Lewis' infinite-time Turing machine (ITTM), which generalizes the classical Turing machine model in a natural way. This dissertation adopts a novel approach to this study: whereas most of the literature, starting with Hamkins and Lewis' debut of the ITTM model, pursues set-theoretic questions using a set-theoretic approach, we employ arguments that are truly computational in character. Indeed, we fully utilize analogues of classical results from finitary computability, such as the s-m-n Theorem and the existence of universal machines, and for the most part, judiciously restrict our attention to the classical setting of computations over the natural numbers. In Chapter 2 of this dissertation, we state, and derive as necessary, the aforementioned analogues of the classical results, as well as some useful constructs for ITTM programming. With this due paid, the subsequent work in Chapters 3 and 4 requires little in the way of programming, and the programming which is required in Chapter 5 is dramatically streamlined. In Chapter 3, we formulate two analogues of one of Rado's busy beaver functions from classical computability, and show, in analogy with Rado's results, that they grow faster than a wide class of infinite-time computable functions. Chapter 4 is tasked with developing a system of ordinal notations via a natural approach involving infinite-time computation, as well as an associated fast-growing hierarchy of functions over the natural numbers. We then demonstrate that the busy beaver functions from Chapter 3 grow faster than the functions which appear in a significant portion of this hierarchy. Finally, we debut, in Chapter 5, two enhancements of the ITTM model which can self-modify certain aspects of their underlying software and hardware mid-computation, and show the somewhat surprising fact that, under some reasonable assumptions, these new models of infinitary computation compute precisely the same functions as the original ITTM model.
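    The dissertation's fast-growing hierarchy is indexed by ordinal notations, but at finite levels the construction reduces to the classical (Wainer-style) scheme: F_0(n) = n + 1, and F_{k+1}(n) is F_k iterated n times starting from n. A minimal sketch of these finite levels, for intuition about how quickly the hierarchy outpaces ordinary functions:

```python
def fgh(k, n):
    """Finite levels of the fast-growing hierarchy over the naturals:
    F_0(n) = n + 1, and F_{k+1}(n) = F_k applied n times to n."""
    if k == 0:
        return n + 1
    result = n
    for _ in range(n):            # iterate the previous level n times
        result = fgh(k - 1, result)
    return result
```

    Already F_1(n) = 2n and F_2(n) = n * 2^n; F_3 is beyond iterated exponentiation. Extending the index k into the transfinite, as Chapter 4 does via infinite-time-computable ordinal notations, is what produces functions fast-growing enough to compare against infinitary busy beaver functions.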

    Kolmogorov Complexity Theory over the Reals

    Kolmogorov Complexity constitutes an integral part of computability theory, information theory, and computational complexity theory -- in the discrete setting of bits and Turing machines. Over real numbers, on the other hand, the BSS-machine (aka real-RAM) has been established as a major model of computation. This real realm has turned out to exhibit natural counterparts to many notions and results in classical complexity and recursion theory; although usually with considerably different proofs. The present work investigates similarities and differences between discrete and real Kolmogorov Complexity as introduced by Montana and Pardo (1998).
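    In the discrete setting the abstract starts from, Kolmogorov complexity is uncomputable, but any lossless compressor gives a computable upper-bound proxy, which is how the notion is usually made tangible. A minimal sketch (the real-number/BSS analogue studied in the paper has no such off-the-shelf tool):

```python
import hashlib
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a crude, computable upper-bound
    proxy for the (uncomputable) Kolmogorov complexity of a bit string."""
    return len(zlib.compress(data, level=9))

# A highly regular string compresses far below its length...
regular = b"ab" * 500
# ...while a pseudorandom one of comparable length, built deterministically
# by chaining SHA-256, stays close to incompressible.
chunks, seed = [], b"seed"
for _ in range(32):
    seed = hashlib.sha256(seed).digest()
    chunks.append(seed)
pseudorandom = b"".join(chunks)   # 1024 bytes
```

    The gap between the two compressed sizes mirrors the gap in Kolmogorov complexity: a short program ("print 'ab' 500 times") generates the first string, while no description much shorter than the string itself is expected for the second.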

    A decomposition method for global evaluation of Shannon entropy and local estimations of algorithmic complexity

    We investigate the properties of a Block Decomposition Method (BDM), which extends the power of a Coding Theorem Method (CTM) that approximates local estimations of algorithmic complexity based on Solomonoff–Levin's theory of algorithmic probability, providing a closer connection to algorithmic complexity than previous attempts based on statistical regularities such as popular lossless compression schemes. The strategy behind BDM is to find small computer programs that produce the components of a larger, decomposed object. The set of short computer programs can then be artfully arranged in sequence so as to produce the original object. We show that the method provides efficient estimations of algorithmic complexity but that it performs like Shannon entropy when it loses accuracy. We estimate errors and study the behaviour of BDM for different boundary conditions, all of which are compared and assessed in detail. The measure may be adapted for use with multi-dimensional objects other than strings, such as arrays and tensors. To test the measure we demonstrate the power of CTM on low algorithmic-randomness objects that are assigned maximal entropy (e.g., π) but whose numerical approximations are closer to the theoretical low algorithmic-randomness expectation. We also test the measure on larger objects including dual, isomorphic and cospectral graphs for which we know that algorithmic randomness is low. We also release implementations of the methods in most major programming languages—Wolfram Language (Mathematica), Matlab, R, Perl, Python, Pascal, C++, and Haskell—and an online algorithmic complexity calculator. Swedish Research Council (Vetenskapsrådet)
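    The decomposition strategy described above can be sketched compactly: split the object into blocks, look up each distinct block's CTM complexity, and charge repeated blocks only a logarithmic term for their multiplicity. In this sketch the `ctm` argument is a stand-in assumption: real CTM values come from large precomputed enumerations of small Turing machines, not from anything computed here.

```python
import math
from collections import Counter

def bdm(s: str, block_size: int, ctm) -> float:
    """Block Decomposition Method over a string:
    BDM(s) = sum over distinct blocks b of [ CTM(b) + log2(multiplicity of b) ].

    `ctm` is a caller-supplied function mapping a block to its precomputed
    CTM complexity estimate (a stand-in here, not the published tables)."""
    # Non-overlapping blocks; a trailing short block is kept as-is,
    # one of the boundary conditions the paper compares.
    blocks = [s[i:i + block_size] for i in range(0, len(s), block_size)]
    counts = Counter(blocks)
    return sum(ctm(b) + math.log2(m) for b, m in counts.items())
```

    The logarithmic multiplicity term is the key design choice: an object built by repeating one simple block grows in BDM only logarithmically with length, approximating its low algorithmic complexity, while an object whose blocks are all distinct degrades toward an entropy-like sum over block frequencies.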