
    On Time-Bounded Incompressibility of Compressible Strings and Sequences

    For every total recursive time bound $t$, a constant fraction of all compressible (low Kolmogorov complexity) strings is $t$-bounded incompressible (high time-bounded Kolmogorov complexity); there are uncountably many infinite sequences of which every initial segment of length $n$ is compressible to $\log n$ yet $t$-bounded incompressible below $\frac{1}{4}n - \log n$; and there are countably infinitely many recursive infinite sequences of which every initial segment is similarly $t$-bounded incompressible. These results are related to, but different from, Barzdins's lemma.
    Comment: 9 pages, LaTeX, no figures, submitted to Information Processing Letters. Changed and added a Barzdins-like lemma for infinite sequences with a different quantification order, a fixed constant, and uncountably many sequences.
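    One natural reading of the second claim, as a hedged formalization (writing $C(x)$ for plain Kolmogorov complexity and $C^t(x)$ for its $t$-time-bounded variant; this notation is assumed here, not fixed by the abstract):

        % Hedged sketch: uncountably many sequences are compressible
        % yet t-bounded incompressible on every initial segment.
        \exists\ \text{uncountably many}\ \omega \in \{0,1\}^{\infty}\
        \forall n:\quad
        C(\omega_{1:n}) \le \log n
        \quad\text{while}\quad
        C^{t}(\omega_{1:n}) \ge \tfrac{1}{4}\,n - \log n .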

    On Resource-bounded versions of the van Lambalgen theorem

    The van Lambalgen theorem is a surprising result in algorithmic information theory concerning the symmetry of relative randomness. It establishes that for any pair of infinite sequences $A$ and $B$, $B$ is Martin-L\"of random and $A$ is Martin-L\"of random relative to $B$ if and only if the interleaved sequence $A \uplus B$ is Martin-L\"of random. This implies that $A$ is random relative to $B$ if and only if $B$ is random relative to $A$ \cite{vanLambalgen}, \cite{Nies09}, \cite{HirschfeldtBook}. This paper studies the validity of this phenomenon for different notions of time-bounded relative randomness. We prove the classical van Lambalgen theorem using martingales and Kolmogorov compressibility, and we establish that the theorem fails in the time-bounded setting, for both time-bounded martingales and time-bounded Kolmogorov complexity. We adapt our classical proofs to the time-bounded setting where applicable, and construct counterexamples where they fail. The mode of failure of the theorem may depend on the notion of time-bounded randomness.
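    In symbols, the theorem as stated above reads (writing $\mathrm{MLR}$ for the class of Martin-L\"of random sequences and $\mathrm{MLR}^{B}$ for those random relative to oracle $B$; notation assumed for illustration):

        % van Lambalgen's theorem:
        A \uplus B \in \mathrm{MLR}
        \;\Longleftrightarrow\;
        B \in \mathrm{MLR}\ \text{and}\ A \in \mathrm{MLR}^{B},
        % and hence, for Martin-L\"of random A and B:
        A \in \mathrm{MLR}^{B} \iff B \in \mathrm{MLR}^{A}.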

    Around Kolmogorov complexity: basic notions and results

    Algorithmic information theory studies description complexity and randomness and is now a well-known field of theoretical computer science and mathematical logic. There are several textbooks and monographs devoted to this theory in which one can find detailed expositions of many difficult results as well as historical references. However, a short survey of its basic notions and of the main results relating these notions to each other seems to be missing. This report attempts to fill that gap. It covers the basic notions of algorithmic information theory: Kolmogorov complexity (plain, conditional, prefix), Solomonoff's universal a priori probability, notions of randomness (Martin-L\"of randomness, Mises--Church randomness), and effective Hausdorff dimension. We prove their basic properties (symmetry of information, the connection between a priori probability and prefix complexity, the criterion of randomness in terms of complexity, the complexity characterization of effective dimension) and show some applications (the incompressibility method in computational complexity theory, incompleteness theorems). The report is based on the lecture notes of a course given by the author at Uppsala University.
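    For illustration, the first of the listed properties, symmetry of information, is the standard Kolmogorov--Levin statement (with $C$ denoting plain complexity):

        % Symmetry of information, up to logarithmic precision:
        C(x, y) \;=\; C(x) + C(y \mid x) + O(\log C(x, y)).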

    Anytime Algorithms for Non-Ending Computations

    A program which eventually stops but does not halt “too quickly” halts at a time which is algorithmically compressible. This result, originally proved in [4], is proved here in a more general setting. Following Manin [11], we convert the result into an anytime algorithm for the halting problem, and we show that the stopping time (cut-off temporal bound) cannot be significantly improved.
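    A minimal sketch of the anytime idea in Python (all names are illustrative; step stands for a hypothetical single-step simulator of the program under test, and the choice of cut-off is the subject of the paper, not reproduced here):

        # Anytime halting test: always answers within the time budget.
        # A False answer may be wrong, but the result above bounds how
        # often: halting times past the cut-off are algorithmically
        # compressible, hence rare.
        def anytime_halts(step, cutoff):
            for _ in range(cutoff):
                if step():        # one simulated step; True once the program halts
                    return True   # certainly halts
            return False          # "probably never halts" (may err)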

    Causality - Complexity - Consistency: Can Space-Time Be Based on Logic and Computation?

    The difficulty of explaining non-local correlations within a fixed causal structure sheds new light on the old debate on whether space and time are to be seen as fundamental. Refraining from assuming space-time as given a priori has a number of consequences. First, the usual definitions of randomness depend on a causal structure and become meaningless. So motivated, we propose an intrinsic, physically motivated measure for the randomness of a string of bits: its length minus its normalized work value, a quantity we closely relate to its Kolmogorov complexity (the length of the shortest program making a universal Turing machine output this string). We test this alternative concept of randomness on the example of non-local correlations, and we arrive at a line of reasoning that leads to conclusions similar to those of the probabilistic view but is conceptually more direct, since only the outcomes of measurements that can actually all be carried out together are put into relation with each other. In the same context-free spirit, we connect the logical reversibility of an evolution to the second law of thermodynamics and the arrow of time. Refining this, we end up with a speculation on the emergence of a space-time structure on bit strings in terms of data-compressibility relations. Finally, we show that logical consistency, by which we replace the abandoned causality, is a strictly weaker constraint than the latter in the multi-party case.
    Comment: 17 pages, 16 figures, small correction.
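    For reference, the parenthetical definition above in its standard form (with $U$ a fixed universal Turing machine), alongside the proposed randomness measure; the symbol $\mathrm{wv}$ for the normalized work value is assumed here for illustration:

        % Kolmogorov complexity: length of the shortest program for x on U.
        C(x) \;=\; \min\{\, |p| \;:\; U(p) = x \,\}
        % Proposed intrinsic randomness: length minus normalized work value.
        R(x) \;:=\; |x| - \mathrm{wv}(x)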
