    Algorithmic Randomness

    We consider algorithmic randomness in the Cantor space C of infinite binary sequences. An algorithmic randomness concept specifies a set of elements of C, each of which is assigned the property of being random. Miscellaneous notions from computability theory are used in the definitions of randomness concepts, which are essentially rooted in the following three intuitive randomness requirements: the initial segments of a random sequence should be effectively incompressible; no random sequence should be an element of an effective measure-null set containing sequences with an “exceptional property”; and finally, considering betting games in which the bits of a sequence are guessed successively, there should be no effective betting strategy that lets a player win an unbounded amount of capital on a random sequence. Formalizations of these requirements use versions of Kolmogorov complexity, of tests, and of martingales, respectively. Whenever one of these notions is used in the definition of a randomness concept, one may ask for equivalent definitions in terms of the other two notions. This was a long-standing open question with respect to computable randomness, a central concept introduced by Schnorr via martingales.

    In this thesis, we introduce bounded tests, which we use to give a characterization of computable randomness in terms of tests. Our result was obtained independently of the prior test characterization of computable randomness due to Downey, Griffiths, and LaForte, who defined graded tests for their result. Based on bounded tests, we define bounded machines, which give rise to a version of Kolmogorov complexity that we use to prove another characterization of computable randomness. This result, as in analogous situations, allows for the introduction of interesting lowness and triviality properties that are, roughly speaking, “anti-randomness” properties. We define and study the notions of lowness for bounded machines and bounded triviality. Using a theorem due to Nies, it can be shown that only the computable sequences are low for bounded machines. Further, we show some interesting properties of bounded machines, and we demonstrate that every boundedly trivial sequence is K-trivial. Furthermore, we define lowness for computable machines, a lowness notion in the setting of Schnorr randomness, and we prove that a sequence is low for computable machines if and only if it is computably traceable.

    Gács and, independently, Kučera proved a central theorem which states that every sequence is effectively decodable from a suitable Martin-Löf random sequence. We present a somewhat easier proof of this theorem, in which we construct a sequence with the required property by diagonalizing against appropriate martingales. By a variant of that construction we prove that there exists a computably random sequence that is weak truth-table autoreducible. Further, we show that a sequence is computably enumerable self-reducible if and only if its associated real is computably enumerable.

    Finally, we investigate interrelations between the Lebesgue measure and effective measures on C. We prove the following extension of a result due to Book, Lutz, and Wagner: a union of \Pi^0_1 classes that is closed under finite variations has Lebesgue measure zero if and only if it contains no Kurtz random real. However, we demonstrate that even a \Sigma^0_2 class with Lebesgue measure zero need not be a Kurtz null class. Turning to Almost classes, we show, among other things, that every Almost class with respect to a bounded reducibility has computable packing dimension zero.
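
    As an illustration of the third requirement above (a sketch of our own, not part of the thesis), the following Python fragment implements the fairness condition that a martingale must satisfy and exhibits a simple effective betting strategy that wins unbounded capital on the non-random alternating sequence 0101...; all function names are ours.

        # A martingale d maps finite binary strings to non-negative capital and
        # must satisfy the fairness condition d(w) = (d(w0) + d(w1)) / 2.
        # This toy strategy stakes the entire capital on the next bit alternating.

        def martingale(w: str) -> float:
            """Capital after betting along w that the bits alternate 0, 1, 0, 1, ..."""
            capital = 1.0
            for i, bit in enumerate(w):
                predicted = "0" if i % 2 == 0 else "1"
                # A fair wager: the stake is doubled if the prediction is
                # correct and lost otherwise.
                capital = 2 * capital if bit == predicted else 0.0
            return capital

        # Fairness check on a sample prefix, and unbounded success on 0101...:
        w = "0101"
        assert martingale(w) == (martingale(w + "0") + martingale(w + "1")) / 2
        print([martingale("01" * n) for n in range(5)])  # [1.0, 4.0, 16.0, 64.0, 256.0]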

    The Dimensions of Individual Strings and Sequences

    A constructive version of Hausdorff dimension is developed using constructive supergales, which are betting strategies that generalize the constructive supermartingales used in the theory of individual random sequences. This constructive dimension is used to assign every individual (infinite, binary) sequence S a dimension, which is a real number dim(S) in the interval [0,1]. Sequences that are random (in the sense of Martin-Löf) have dimension 1, while sequences that are decidable, \Sigma^0_1, or \Pi^0_1 have dimension 0. It is shown that for every \Delta^0_2-computable real number \alpha in [0,1] there is a \Delta^0_2 sequence S such that dim(S) = \alpha. A discrete version of constructive dimension is also developed using termgales, which are supergale-like functions that bet on the terminations of (finite, binary) strings as well as on their successive bits. This discrete dimension is used to assign each individual string w a dimension, which is a nonnegative real number dim(w). The dimension of a sequence is shown to be the limit inferior of the dimensions of its prefixes. The Kolmogorov complexity of a string is proven to be the product of its length and its dimension. This gives a new characterization of algorithmic information and a new proof of Mayordomo's recent theorem stating that the dimension of a sequence is the limit inferior of the average Kolmogorov complexity of its first n bits. Every sequence that is random relative to any computable sequence of coin-toss biases that converge to a real number \beta in (0,1) is shown to have dimension H(\beta), the binary entropy of \beta.
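
    The limit-inferior characterization invites a numerical illustration (ours, not the paper's). Since Kolmogorov complexity is uncomputable, the Python sketch below uses zlib's compressed length as a crude computable upper-bound stand-in for K, and compares the per-bit compressed size of a biased coin sequence with the binary entropy H(\beta) that the final theorem predicts as its dimension.

        import math
        import random
        import zlib

        def binary_entropy(beta: float) -> float:
            """H(beta) = -beta*log2(beta) - (1 - beta)*log2(1 - beta)."""
            if beta in (0.0, 1.0):
                return 0.0
            return -beta * math.log2(beta) - (1 - beta) * math.log2(1 - beta)

        def compressed_bits_per_bit(bits: str) -> float:
            """Compressed length per input bit -- a rough stand-in for K(w)/|w|."""
            packed = int(bits, 2).to_bytes((len(bits) + 7) // 8, "big")
            return 8 * len(zlib.compress(packed, 9)) / len(bits)

        random.seed(0)
        beta = 0.1  # coin bias; the predicted dimension H(0.1) is about 0.469
        w = "".join("1" if random.random() < beta else "0" for _ in range(100_000))
        print(binary_entropy(beta), compressed_bits_per_bit(w))  # zlib stays above H(beta)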

    Sub-computable Bounded Randomness

    This paper defines a new notion of bounded computable randomness for certain classes of sub-computable functions that lack a universal machine. In particular, we define such versions of randomness for primitive recursive functions and for PSPACE functions. These new notions are robust in that there are equivalent formulations in terms of (1) Martin-Löf tests, (2) Kolmogorov complexity, and (3) martingales. We show that these notions can be equivalently defined with prefix-free Kolmogorov complexity. We prove that one direction of van Lambalgen's theorem holds for relative computability, but the other direction fails. We also discuss statistical properties of these notions of randomness.
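
    For orientation on the van Lambalgen statement (our sketch, not the paper's): the theorem concerns the join of two sequences, which places the bits of A on the even positions and the bits of B on the odd ones, and asserts that the join is random exactly when A is random and B is random relative to A. A minimal Python rendering of the join and its inverse:

        from itertools import chain, islice

        def join(a, b):
            """The join of A and B: a0 b0 a1 b1 ..."""
            return chain.from_iterable(zip(a, b))

        def split(w):
            """Inverse of join: recover the pair (A, B)."""
            w = list(w)
            return w[0::2], w[1::2]

        a, b = [0, 0, 1, 1], [1, 0, 1, 0]
        joined = list(islice(join(a, b), 8))
        print(joined)                    # [0, 1, 0, 0, 1, 1, 1, 0]
        assert split(joined) == (a, b)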

    Minimum Description Length Induction, Bayesianism, and Kolmogorov Complexity

    The relationship between the Bayesian approach and the minimum description length approach is established. We sharpen and clarify the general modeling principles MDL and MML, abstracted as the ideal MDL principle and defined from Bayes's rule by means of Kolmogorov complexity. The basic condition under which the ideal principle should be applied is encapsulated as the Fundamental Inequality, which in broad terms states that the principle is valid when the data are random relative to every contemplated hypothesis, and these hypotheses are in turn random relative to the (universal) prior. Basically, the ideal principle states that the prior probability associated with the hypothesis should be given by the algorithmic universal probability, and that the sum of the log universal probability of the model plus the log of the probability of the data given the model should be minimized. If we restrict the model class to finite sets, then application of the ideal principle turns into Kolmogorov's minimal sufficient statistic. In general, we show that data compression is almost always the best strategy, both in hypothesis identification and in prediction.
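
    A toy instance of the two-part reading of the principle (our sketch; a uniform grid prior stands in for the universal prior, and Shannon-Fano code lengths stand in for the ideal, uncomputable codes): select the hypothesis minimizing the hypothesis cost plus the data cost, measured in bits.

        import math

        def mdl_select(data: str, models=None) -> float:
            """Return the Bernoulli parameter minimizing L(H) + L(D|H) in bits."""
            models = models or [i / 100 for i in range(1, 100)]
            # L(H): encoding one model out of the grid under a uniform prior.
            hypothesis_cost = math.log2(len(models))
            ones = data.count("1")
            zeros = len(data) - ones

            def data_cost(p: float) -> float:
                # L(D|H): Shannon-Fano code length of the data under model p.
                return -(ones * math.log2(p) + zeros * math.log2(1 - p))

            return min(models, key=lambda p: hypothesis_cost + data_cost(p))

        print(mdl_select("1101110111011111"))  # 0.81, nearest the empirical frequency 13/16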

    Fourier spectra of measures associated with algorithmically random Brownian motion

    In this paper we study the behaviour at infinity of the Fourier transforms of Radon measures supported by the images of fractal sets under an algorithmically random Brownian motion. We show that, under some computability conditions on these sets, the Fourier transforms of the associated measures have, relative to the Hausdorff dimensions of these sets, optimal asymptotic decay at infinity. The argument relies heavily on a direct characterisation, due to Asarin and Pokrovskii, of algorithmically random Brownian motion in terms of the prefix-free Kolmogorov complexity of finite binary sequences. The study also necessitates a closer look at potential theory over fractals from a computable point of view.
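
    For orientation (our gloss in the standard terminology of Salem sets, not text from the paper), the optimal decay in question measures the Fourier transform against the exponent permitted by the Hausdorff dimension s of the support:

        \hat{\mu}(\xi) = \int e^{-2\pi i \xi x} \, d\mu(x), \qquad
        |\hat{\mu}(\xi)| = O\bigl(|\xi|^{-s/2 + \varepsilon}\bigr)
        \quad \text{as } |\xi| \to \infty, \text{ for every } \varepsilon > 0;

    no measure supported on a set of Hausdorff dimension s can decay polynomially faster, and sets carrying measures attaining this rate are called Salem sets.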

    Shannon Information and Kolmogorov Complexity

    We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and the ways in which they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy versus Kolmogorov complexity, the relation of both to universal coding, Shannon mutual information versus Kolmogorov ('algorithmic') mutual information, probabilistic sufficient statistic versus algorithmic sufficient statistic (related to lossy compression in the Shannon theory and to meaningful information in the Kolmogorov theory), and rate-distortion theory versus Kolmogorov's structure function. Part of the material has appeared in print before, scattered through various publications, but this is the first comprehensive systematic comparison. The last-mentioned relations are new.
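
    One quantitative form of the relationship (our illustration of the standard bridging fact, not a quotation from the survey): for a computable distribution P, the expected prefix-free Kolmogorov complexity equals the Shannon entropy H(P) up to an additive constant depending only on P. The Python check below uses Shannon-Fano code lengths ceil(-log2 P(x)) as a computable stand-in for K.

        import math

        P = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

        # Shannon entropy: the irreducible average description length in bits.
        entropy = -sum(p * math.log2(p) for p in P.values())

        # Expected Shannon-Fano code length, a stand-in for sum_x P(x) * K(x).
        expected_code_length = sum(p * math.ceil(-math.log2(p)) for p in P.values())

        print(entropy, expected_code_length)  # both 1.75 for this dyadic P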

    Measuring sets in infinite groups

    We are now witnessing a rapid growth of a new part of group theory which has become known as "statistical group theory". A typical result in this area says something like "a random element (or a tuple of elements) of a group G has a property P with probability p". The validity of such a statement does, of course, depend heavily on how one defines probability on groups or, equivalently, how one measures sets in a group (in particular, in a free group). We hope that the new approaches to defining probabilities on groups outlined in this paper create, among other things, an appropriate framework for the study of the "average case" complexity of algorithms on groups.
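
    The quoted template lends itself to a Monte Carlo illustration (ours, not the paper's): with the uniform measure on reduced words of length n in the free group F_2, one can estimate the probability that a random element has a given property, here the property of being cyclically reduced.

        import random

        LETTERS = "abAB"  # generators a, b of F_2 and their inverses A, B
        INVERSE = {"a": "A", "A": "a", "b": "B", "B": "b"}

        def random_reduced_word(n: int) -> str:
            """Uniformly random reduced word of length n in F_2."""
            word = [random.choice(LETTERS)]
            while len(word) < n:
                # Reduced words never contain x followed by x^{-1}, so each
                # next letter is uniform over the 3 legal choices.
                word.append(random.choice([x for x in LETTERS if x != INVERSE[word[-1]]]))
            return "".join(word)

        def cyclically_reduced(w: str) -> bool:
            # First and last letters must not be mutually inverse.
            return INVERSE[w[0]] != w[-1]

        random.seed(0)
        sample = [random_reduced_word(20) for _ in range(10_000)]
        print(sum(map(cyclically_reduced, sample)) / len(sample))  # about 0.75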

    Effective Complexity and its Relation to Logical Depth

    Effective complexity measures the information content of the regularities of an object. It was introduced by M. Gell-Mann and S. Lloyd to avoid some of the disadvantages of Kolmogorov complexity, also known as algorithmic information content. In this paper, we give a precise formal definition of effective complexity and rigorous proofs of its basic properties. In particular, we show that incompressible binary strings are effectively simple, and we prove the existence of strings that have effective complexity close to their lengths. Furthermore, we show that effective complexity is related to Bennett's logical depth: if the effective complexity of a string x exceeds a certain explicit threshold, then that string must have astronomically large depth; otherwise, the depth can be arbitrarily small.
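
    As a pointer to the definition being made precise (our gloss, suppressing the exact typicality condition and constants): the effective complexity of a string x is, roughly, the shortest description of a computable ensemble for which x is a typical member and whose two-part description length does not exceed the algorithmic information content of x by more than a tolerance \Delta,

        \mathcal{E}_\Delta(x) = \min \{ K(\mathcal{E}) : x \text{ typical for } \mathcal{E}, \;
            K(\mathcal{E}) + H(\mathcal{E}) \le K(x) + \Delta \},

    where H(\mathcal{E}) is the Shannon entropy of the ensemble. An incompressible string is typical for the uniform ensemble, which has a short description; this is the sense in which such strings come out effectively simple.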