
    Facticity as the amount of self-descriptive information in a data set

    Using the theory of Kolmogorov complexity, the facticity φ(x) of a string is defined as the amount of self-descriptive information it contains. It is proved that, under reasonable assumptions (the existence of an empty machine and the availability of a faithful index), facticity is definite: random strings have facticity 0, and for compressible strings 0 < φ(x) < 1/2 |x| + O(1). Consequently, facticity objectively measures the tension in a data set between structural and ad-hoc information. For binary strings there is a so-called facticity threshold that depends on their entropy. Strings with facticity above this threshold have no optimal stochastic model and are essentially computational. The shape of the facticity-versus-entropy plot coincides with the well-known sawtooth curves observed in complex systems. The notion of factic processes is discussed. This approach overcomes problems with earlier proposals that use two-part code to define the meaningfulness or usefulness of a data set. Comment: 10 pages, 2 figures.
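
    A hedged sketch of the two-part-code reading behind this notion, in generic Kolmogorov-complexity notation (U a universal prefix machine, p a program, M a model; these symbols are illustrative, not quoted from the paper):

    \[
      K(x) \;=\; \min\{\ell(p) : U(p) = x\}, \qquad
      K(x) \;\le\; K(M) + K(x \mid M) + O(1), \qquad
      \phi(x) \;\approx\; K(M^{*}),
    \]

    where M^{*} is the model part of an optimal two-part description of x. On this reading a random string is best described by the trivial ("empty") model, giving φ(x) = 0, while a compressible string carries a non-trivial structural part, which the abstract bounds by 1/2 |x| + O(1).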

    Quantum Kolmogorov Complexity Based on Classical Descriptions

    We develop a theory of the algorithmic information in bits contained in an individual pure quantum state. This extends classical Kolmogorov complexity to the quantum domain while retaining classical descriptions. Quantum Kolmogorov complexity coincides with classical Kolmogorov complexity on the classical domain. Quantum Kolmogorov complexity is upper bounded and can be effectively approximated from above under certain conditions. With high probability a quantum object is incompressible. Upper and lower bounds on the quantum complexity of multiple copies of individual pure quantum states are derived and may shed some light on the no-cloning properties of quantum states. In the quantum situation complexity is not sub-additive. We discuss some relations with "no-cloning" and "approximate cloning" properties. Comment: 17 pages, LaTeX; final and extended version of quant-ph/9907035, with corrections to the published journal version (the two displayed equations in the right-hand column on page 2466 had their left-hand sides erroneously interchanged).
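
    Roughly, the "classical descriptions" approach measures a pure state |x⟩ by the shortest classical program that computes a description of an approximating state, charging for the approximation error; modulo the paper's precise details, a definition of this flavour reads

    \[
      K(|x\rangle) \;=\; \min_{p,\,|z\rangle}
        \bigl\{\, \ell(p) + \lceil -\log |\langle z \mid x \rangle|^{2} \rceil
          \;:\; U(p) = |z\rangle \,\bigr\},
    \]

    so a program that produces |x⟩ exactly pays no fidelity penalty, and the incompressibility of most states then mirrors the classical counting argument.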

    Strong Turing Degrees for Additive BSS RAM's

    For additive real BSS machines that use only the constants 0 and 1 and order tests, we consider the corresponding Turing reducibility and characterize some semi-decidable decision problems over the reals. In order to refine, step by step, a linear hierarchy of Turing degrees with respect to this model, we define several halting problems for classes of additive machines with different abilities and construct further suitable decision problems. In the construction we use methods of classical recursion theory as well as techniques for proving bounds that result from algebraic properties. In this way we extend a known hierarchy of problems below the halting problem for the additive machines using only equality tests, and we present a further subhierarchy of semi-decidable problems between the halting problems for the additive machines using only equality tests and using order tests, respectively.
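
    The machine model is easiest to picture as a register machine over the reals whose only arithmetic is addition and subtraction, whose only constants are 0 and 1, and whose branching is an order test. A minimal Haskell sketch of such an interpreter follows (instruction names, the Double approximation of the reals, and the register/label encoding are illustrative, not the paper's formal model):

    import qualified Data.Map as M

    type Reg   = Int
    type Label = Int

    -- Instructions of an "additive" machine: constants 0 and 1, addition,
    -- subtraction, copying, and an order test used for branching.
    data Instr
      = SetZero Reg          -- r := 0
      | SetOne  Reg          -- r := 1
      | Add  Reg Reg Reg     -- r := a + b
      | Sub  Reg Reg Reg     -- r := a - b
      | Copy Reg Reg         -- r := a
      | IfGeq Reg Label      -- if r >= 0 then jump to the label
      | Halt
      deriving Show

    type Program = M.Map Label Instr
    type Regs    = M.Map Reg Double   -- real registers, approximated by Double

    run :: Program -> Regs -> Label -> Maybe Regs
    run prog regs pc = case M.lookup pc prog of
      Nothing          -> Nothing                 -- ran off the program: undefined
      Just Halt        -> Just regs
      Just (SetZero r) -> next (M.insert r 0 regs)
      Just (SetOne  r) -> next (M.insert r 1 regs)
      Just (Add r a b) -> next (M.insert r (get a + get b) regs)
      Just (Sub r a b) -> next (M.insert r (get a - get b) regs)
      Just (Copy r a)  -> next (M.insert r (get a) regs)
      Just (IfGeq r l)
        | get r >= 0   -> run prog regs l         -- order test succeeds: jump
        | otherwise    -> next regs
      where
        get r   = M.findWithDefault 0 r regs
        next rs = run prog rs (pc + 1)

    A halting problem for such machines then asks whether, for a given program and input registers, run ever reaches Halt; the hierarchy in the abstract comes from varying which tests (equality only versus order) the machines are allowed to use.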

    Be My Guest: Normalizing and Compiling Programs using a Host Language

    In programming language research, normalization is a process of fundamental importance to the theory of computing and reasoning about programs. In practice, on the other hand, compilation is a process that transforms programs in a language to machine code, and thus makes the programming language a usable one. In this thesis, we investigate means of normalizing and compiling programs in a language using another language as the "host". Leveraging a host to work with programs of a "guest" language enables reuse of the host's features that would otherwise be strenuous to develop. The specific tools of interest are Normalization by Evaluation and Embedded Domain-Specific Languages, both of which rely on a host language for their purposes. These tools are applied to solve problems in three different domains: to show that exponentials (or closures) can be eliminated from a categorical combinatory calculus, to propose a new proof technique based on normalization for showing noninterference, and to enable the programming of resource-constrained IoT devices from Haskell.
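
    Normalization by Evaluation is the clearest example of borrowing a host: binders of the guest language are interpreted as host functions, and normal forms are recovered by reading the resulting host values back into syntax. A minimal Haskell sketch for the untyped lambda calculus is below (the thesis itself targets richer settings such as categorical combinators and IoT-oriented DSLs; all names here are illustrative):

    -- Guest syntax, with de Bruijn indices.
    data Term
      = Var Int
      | Lam Term
      | App Term Term
      deriving Show

    -- Semantic domain: guest functions become host (Haskell) functions;
    -- stuck applications of free variables are kept as neutral terms.
    data Value
      = VLam (Value -> Value)
      | VNeutral Neutral

    data Neutral
      = NVar Int               -- de Bruijn *level* of a free variable
      | NApp Neutral Value

    eval :: [Value] -> Term -> Value
    eval env (Var i)   = env !! i
    eval env (Lam b)   = VLam (\v -> eval (v : env) b)
    eval env (App f a) = apply (eval env f) (eval env a)

    apply :: Value -> Value -> Value
    apply (VLam f)     v = f v                   -- beta reduction done by the host
    apply (VNeutral n) v = VNeutral (NApp n v)

    -- Read a value back into syntax, converting levels to indices.
    quote :: Int -> Value -> Term
    quote d (VLam f)     = Lam (quote (d + 1) (f (VNeutral (NVar d))))
    quote d (VNeutral n) = quoteN d n

    quoteN :: Int -> Neutral -> Term
    quoteN d (NVar lvl) = Var (d - lvl - 1)
    quoteN d (NApp n v) = App (quoteN d n) (quote d v)

    normalize :: Term -> Term
    normalize = quote 0 . eval []

    -- e.g. normalize (App (Lam (Var 0)) (Lam (Lam (Var 1)))) ==> Lam (Lam (Var 1))

    The point of the construction is that substitution never has to be written for the guest: Haskell's own closures perform it when apply calls f v, which is exactly the kind of host-feature reuse the thesis exploits.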