Around Kolmogorov complexity: basic notions and results
Algorithmic information theory studies description complexity and randomness
and is now a well known field of theoretical computer science and mathematical
logic. There are several textbooks and monographs devoted to this theory where
one can find a detailed exposition of many difficult results as well as
historical references. However, a short survey of its basic notions and main
results, relating these notions to each other, seems to be missing.
This report attempts to fill this gap and covers the basic notions of
algorithmic information theory: Kolmogorov complexity (plain, conditional,
prefix), Solomonoff universal a priori probability, notions of randomness
(Martin-L\"of randomness, Mises--Church randomness), effective Hausdorff
dimension. We prove their basic properties (symmetry of information, connection
between a priori probability and prefix complexity, criterion of randomness in
terms of complexity, complexity characterization for effective dimension) and
show some applications (incompressibility method in computational complexity
theory, incompleteness theorems). The survey is based on the lecture notes of
a course given by the author at Uppsala University.
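For reference, the basic results named above admit concise statements. The notation is the standard one (not fixed in this abstract): $C$ for plain and $K$ for prefix complexity, $m$ for the universal a priori probability, and $x \upharpoonright n$ for the first $n$ bits of an infinite sequence $x$.

```latex
% Symmetry of information (plain complexity, logarithmic precision):
C(x,y) = C(x) + C(y \mid x) + O(\log C(x,y))

% Coding theorem: prefix complexity vs. a priori probability:
K(x) = -\log m(x) + O(1)

% Levin--Schnorr criterion: x is Martin-L\"of random if and only if
\exists c \,\forall n \; K(x \upharpoonright n) \ge n - c

% Complexity characterization of effective Hausdorff dimension:
\dim(x) = \liminf_{n \to \infty} \frac{K(x \upharpoonright n)}{n}
```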
The interplay of classes of algorithmically random objects
We study algorithmically random closed subsets of $2^\omega$, algorithmically
random continuous functions from $2^\omega$ to $2^\omega$, and algorithmically
random Borel probability measures on $2^\omega$, especially the interplay
between these three classes of objects. Our main tools are preservation of
randomness and its converse, the no randomness ex nihilo principle, which say
together that given an almost-everywhere defined computable map between an
effectively compact probability space and an effective Polish space, a real is
Martin-L\"of random for the pushforward measure if and only if its preimage is
random with respect to the measure on the domain. These tools allow us to prove
new facts, some of which answer previously open questions, and reprove some
known results more simply.
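In symbols, the two principles can be paraphrased as follows (notation assumed here, not fixed in the abstract: $T$ is the almost-everywhere defined computable map, $T_*\mu$ the pushforward of $\mu$ under $T$, and $\mathrm{MLR}(\mu)$ the set of Martin-L\"of $\mu$-random points):

```latex
% Preservation of randomness: images of random points are random.
x \in \mathrm{MLR}(\mu) \implies T(x) \in \mathrm{MLR}(T_*\mu)

% No randomness ex nihilo: every random point for the pushforward
% measure has a random preimage.
y \in \mathrm{MLR}(T_*\mu) \implies
  \exists x \in \mathrm{MLR}(\mu) \;\; T(x) = y
```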
Our main results are the following. First, we answer an open question of
Barmpalias, Brodhead, Cenzer, Remmel, and Weber by showing that a closed
subset of $2^\omega$ is a random closed set if and only if it is the set of
zeros of a random continuous function on $2^\omega$. As a corollary we
obtain the result that the collection of random continuous functions on
$2^\omega$ is not closed under composition. Next, we construct a computable
measure on the space of probability measures on $2^\omega$ such that a closed
subset of $2^\omega$ is a random closed set if and only if it is the support
of a measure that is random with respect to this computable measure. We also
establish a
correspondence between random closed sets and the random measures studied by
Culver in previous work. Lastly, we study the ranges of random continuous
functions, showing that the Lebesgue measure of the range of a random
continuous function is always contained in
Randomness and differentiability in higher dimensions
We present two theorems concerned with algorithmic randomness and
differentiability of functions of several variables. Firstly, we prove an
effective form of Rademacher's Theorem: we show that computable randomness
implies differentiability of computable Lipschitz functions of several
variables. Secondly, we show that weak 2-randomness is equivalent to
differentiability of computable a.e. differentiable functions of several
variables.
Comment: 19 pages
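Stated schematically (with $z$ a point of the unit cube $[0,1]^d$; the precise hypotheses are as in the abstract above):

```latex
% Effective Rademacher theorem:
z \text{ computably random} \implies
  \text{every computable Lipschitz } f : [0,1]^d \to \mathbb{R}
  \text{ is differentiable at } z

% Characterization of weak 2-randomness:
z \text{ weakly 2-random} \iff
  \text{every computable a.e.\ differentiable }
  f : [0,1]^d \to \mathbb{R}
  \text{ is differentiable at } z
```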