21 research outputs found

    Computing and evolving variants of computational depth

    The structure and organization of information in binary strings and (infinite) binary sequences are investigated using two computable measures of complexity related to computational depth. First, fundamental properties of recursive computational depth, a refinement of Bennett's original notion of computational depth, are developed, and it is shown that the recursively weakly (respectively, strongly) deep sequences form a proper subclass of the class of weakly (respectively, strongly) deep sequences. It is then shown that every weakly useful sequence is recursively strongly deep, strengthening a theorem by Juedes, Lathrop, and Lutz. It follows from these results that not every strongly deep sequence is weakly useful, thereby answering an open question posed by Juedes.

    Second, compression depth, a feasibly computable depth measurement, is developed based on the Lempel-Ziv compression algorithm. LZ compression depth is further formalized by introducing strongly (compression) deep sequences and showing that analogues of the main properties of computational depth hold for compression depth. Critical to these results, it is shown that a sequence that is not normal must be compressible by the Lempel-Ziv algorithm. This yields a new, simpler proof that the Champernowne sequence is normal.

    Compression depth is also used to measure the organization of genes in genetic algorithms. Using finite-state machines to control the actions of an automaton playing prisoner's dilemma, a genetic algorithm is used to evolve a population of finite-state machines (players) to play prisoner's dilemma against each other. Since the fitness function is based solely on how well a player performs against all other players in the population, any accumulation of compression depth (organization) in the genetic structure of the players can only be attributed to the fact that more fit players have a more highly organized genetic structure. It is shown experimentally that this is the case.
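    As a small, hedged illustration of the compression side of this work (a sketch, not code from the thesis), the following counts the phrases in an LZ78-style parse of a binary string; the phrase count per symbol is the kind of feasibly computable statistic on which an LZ-based compressibility or depth measure can be built. The function name and the test strings are illustrative choices.

        import random

        def lz78_phrase_count(s: str) -> int:
            """Parse s into LZ78 phrases (each phrase extends a previously seen
            phrase by one symbol) and return the number of phrases; fewer phrases
            per symbol means the string is more LZ-compressible."""
            seen = {""}        # phrases encountered so far (the empty phrase is free)
            phrase = ""
            count = 0
            for symbol in s:
                if phrase + symbol in seen:
                    phrase += symbol           # keep extending the current match
                else:
                    seen.add(phrase + symbol)  # close off a new phrase
                    count += 1
                    phrase = ""
            if phrase:                         # an unfinished trailing phrase
                count += 1
            return count

        # A highly regular string parses into far fewer phrases than a
        # pseudo-random string of the same length.
        n = 2048
        regular = "01" * (n // 2)
        noisy = "".join(random.choice("01") for _ in range(n))
        print(lz78_phrase_count(regular), lz78_phrase_count(noisy))

    Roughly speaking, the periodic string parses into on the order of the square root of n phrases, while a typical random string of length n needs about n / log2(n); it is this gap that an LZ-based compressibility test detects.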

    A General Notion of Useful Information

    In this paper we introduce a general framework for defining the depth of a sequence with respect to a class of observers. We show that our general framework captures all depth notions introduced in complexity theory so far. We review most such notions, show how they arise as particular cases of our general framework, and survey some classical results about the different depth notions.

    Depth, Highness and DNR degrees

    We study Bennett deep sequences in the context of recursion theory; in particular, we investigate the notions of O(1)-deep_K, O(1)-deep_C, order-deep_K and order-deep_C sequences. Our main results are that Martin-Löf random sets are not order-deep_C, that every many-one degree contains a set which is not O(1)-deep_C, that O(1)-deep_C sets and order-deep_K sets have high or DNR Turing degree, and that no K-trivial set is O(1)-deep_K.
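    For orientation, here is a hedged sketch of how these depth notions are typically set up; the notation, including the subscripts K (prefix-free) and C (plain Kolmogorov complexity), is a reconstruction rather than a quotation from the paper. A sequence S is f-deep_K when, for every recursive approximation \tilde K of K from above,
    \[
        \tilde K(S{\upharpoonright}n) - K(S{\upharpoonright}n) \;\ge\; f(n) \quad \text{for almost every } n .
    \]
    S is O(1)-deep_K when this holds for every constant function f, and order-deep_K when it holds for some recursive order f (an unbounded, nondecreasing recursive function); the C-versions are obtained by replacing the prefix-free complexity K with the plain complexity C.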

    Separations of Non-monotonic Randomness Notions

    In the theory of algorithmic randomness, several notions of random sequence are defined via a game-theoretic approach, and the notions that have received the most attention are perhaps Martin-Löf randomness and computable randomness. The latter notion was introduced by Schnorr and is rather natural: an infinite binary sequence is computably random if no total computable strategy succeeds on it by betting on bits in order. However, computably random sequences can have properties that one may consider incompatible with being random; in particular, there are computably random sequences that are highly compressible. The concept of Martin-Löf randomness is much better behaved in this and other respects; on the other hand, its definition in terms of martingales is considerably less natural. Muchnik, elaborating on ideas of Kolmogorov and Loveland, refined Schnorr's model by also allowing non-monotonic strategies, i.e. strategies that do not bet on bits in order. The resulting "non-monotonic" notion of randomness, now called Kolmogorov-Loveland randomness, has been shown to be quite close to Martin-Löf randomness, but whether these two classes coincide remains a fundamental open question. In order to get a better understanding of non-monotonic randomness notions, Miller and Nies introduced some interesting intermediate concepts, where one only allows non-adaptive strategies, i.e., strategies that can still bet non-monotonically, but such that the sequence of betting positions is known in advance (and computable). Recently, these notions were shown by Kastermans and Lempp to differ from Martin-Löf randomness. We continue the study of the non-monotonic randomness notions introduced by Miller and Nies and obtain results about the Kolmogorov complexities of initial segments that can and cannot occur for such sequences; these results then yield a complete classification of these randomness notions by order of strength.
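    As a hedged illustration of the betting model behind these notions (a toy sketch, not code from the paper), the following runs a simple martingale with a fair payoff; run_martingale, always_zero, and the example sequence are illustrative names and choices. The first call bets on positions 0, 1, 2, ... in order, as in Schnorr's monotone model; the second uses a fixed, precomputed schedule of positions, the non-adaptive kind of non-monotonic betting studied by Miller and Nies.

        from fractions import Fraction

        def run_martingale(bits, positions, predictor):
            """Bet on the bits of `bits` at the indices listed in `positions`, in
            that order.  `predictor(history)` returns (guess, stake), where
            0 <= stake <= 1 is the fraction of the current capital wagered on
            `guess`; the payoff is fair (the wagered part is doubled or lost).
            Starts with capital 1 and returns the final capital."""
            capital = Fraction(1)
            history = []                       # (position, revealed bit) pairs so far
            for pos in positions:
                guess, stake = predictor(history)
                wager = capital * Fraction(stake)
                capital += wager if bits[pos] == guess else -wager
                history.append((pos, bits[pos]))
            return capital

        def always_zero(history):
            """A very simple computable strategy: always guess 0, staking half."""
            return 0, Fraction(1, 2)

        # Toy target: 0 at even positions, 1 at odd positions.
        n = 64
        bits = [i % 2 for i in range(n)]

        # Monotone betting (Schnorr's model): visit positions 0, 1, 2, ... in order.
        print(run_martingale(bits, range(n), always_zero))        # loses most of its capital
        # Non-adaptive, non-monotonic betting: a fixed, precomputed schedule that
        # happens to visit only the even positions.
        print(run_martingale(bits, range(0, n, 2), always_zero))  # capital grows exponentially in the number of bets

    In the formal definitions, a strategy succeeds if its capital is unbounded along the sequence. A smarter monotone strategy would of course also succeed on a sequence this simple; the example only illustrates the mechanics of monotone versus non-adaptive, non-monotonic betting.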

    Depth as Randomness Deficiency

    Depth of an object concerns a tradeoff between computation time and the excess of program length over the shortest program length required to obtain the object. It gives an unconditional lower bound on the computation time from a given program in the absence of auxiliary information. Variants known as logical depth and computational depth are expressed in Kolmogorov complexity theory. We derive a quantitative relation between logical depth and computational depth and unify the different depth notions by relating them to A. Kolmogorov and L. Levin's fruitful notion of randomness deficiency. Subsequently, we revisit the computational depth of infinite strings, study the notion of super deep sequences, and relate it to other approaches.
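    For reference, here is a hedged sketch of the formulations these three quantities are commonly given; the paper's precise variants may differ. U denotes a fixed universal machine and K^t denotes prefix-free complexity restricted to programs that halt within time t.
    \[
        \mathrm{depth}^{t}(x) \;=\; K^{t}(x) - K(x) \qquad \text{(computational depth at time bound } t\text{)}
    \]
    \[
        \mathrm{ld}_{b}(x) \;=\; \min\{\, t : \exists p,\ |p| \le K(x) + b \text{ and } U(p) = x \text{ within } t \text{ steps} \,\} \qquad \text{(logical depth at significance level } b\text{)}
    \]
    \[
        \delta(x) \;=\; n - K(x \mid n) \quad \text{for } |x| = n \qquad \text{(randomness deficiency)}
    \]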

    Depth, Highness and DNR Degrees

    A sequence is Bennett deep [5] if every recursive approximation from above of the Kolmogorov complexity of its initial segments satisfies that the difference between the approximation and the actual value of the Kolmogorov complexity of the initial segments dominates every constant function. We study, for different lower bounds r on this difference between approximation and actual value of the initial-segment complexity, which properties the corresponding r(n)-deep sets have. We prove that for r(n) = εn, depth coincides with highness on the Turing degrees. For smaller choices of r, i.e., where r is any recursive order function, we show that depth implies either highness or diagonal non-recursiveness (DNR). In particular, for left-r.e. sets, order depth already implies highness. As a corollary, we obtain that weakly useful sets are either high or DNR. We prove that not all deep sets are high by constructing a low order-deep set. Bennett's depth is defined using prefix-free Kolmogorov complexity. We show that if one replaces prefix-free by plain Kolmogorov complexity in Bennett's depth definition, one obtains a notion which no longer satisfies the slow growth law (which stipulates that no shallow set truth-table computes a deep set); however, under this notion, random sets are not deep (at the unbounded recursive order magnitude). We improve Bennett's result that recursive sets are shallow by proving that all K-trivial sets are shallow; our result is close to optimal. For Bennett's depth, the magnitude of compression improvement has to be achieved almost everywhere on the set. Bennett observed that relaxing to infinitely often is meaningless, because every recursive set is infinitely often deep. We propose an alternative infinitely-often depth notion that does not suffer this limitation (called i.o. depth). We show that every hyperimmune degree contains an i.o. deep set of magnitude εn, and we construct a Π^0_1 class in which every member is an i.o. deep set of magnitude εn. We prove that every non-recursive, non-DNR hyperimmune-free set is i.o. deep of constant magnitude, and that every non-recursive many-one degree contains such a set.
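    Two of the statements above admit a short, hedged formalization (notation mine, not the paper's): with \tilde K ranging over recursive approximations of K from above, say S is r-deep when \tilde K(S↾n) − K(S↾n) ≥ r(n) for almost every n and every such \tilde K. The coincidence with highness and the slow growth law then read, roughly,
    \[
        \deg_T(S) \text{ contains an } \varepsilon n\text{-deep set} \iff \deg_T(S) \text{ is high},
    \]
    \[
        A \text{ deep and } A \le_{tt} B \;\Longrightarrow\; B \text{ deep} \qquad \text{(no shallow set truth-table computes a deep set).}
    \]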
