Around Kolmogorov complexity: basic notions and results
Algorithmic information theory studies description complexity and randomness
and is now a well known field of theoretical computer science and mathematical
logic. There are several textbooks and monographs devoted to this theory where
one can find the detailed exposition of many difficult results as well as
historical references. However, a short survey of its basic notions and of the
main results relating these notions to each other seems to be missing.
This report attempts to fill this gap and covers the basic notions of
algorithmic information theory: Kolmogorov complexity (plain, conditional,
prefix), Solomonoff universal a priori probability, notions of randomness
(Martin-L\"of randomness, Mises--Church randomness), effective Hausdorff
dimension. We prove their basic properties (symmetry of information, connection
between a priori probability and prefix complexity, criterion of randomness in
terms of complexity, complexity characterization for effective dimension) and
show some applications (incompressibility method in computational complexity
theory, incompleteness theorems). It is based on the lecture notes of a course
at Uppsala University given by the author.
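The incompressibility method mentioned in this abstract rests on a counting argument: there are 2^n bit strings of length n but only 2^n - 1 candidate descriptions shorter than n bits, so for every n at least one string of length n is incompressible. A minimal Python sketch of the count (the description scheme itself is left abstract; function names are ours):

```python
# Counting argument behind the incompressibility method: any description
# scheme maps bit strings (descriptions) to bit strings, so strings of
# length n can have at most 2^0 + ... + 2^(n-1) = 2^n - 1 descriptions
# shorter than n bits -- fewer than the 2^n strings of length n.

def num_strings(n: int) -> int:
    """Number of bit strings of length exactly n."""
    return 2 ** n

def num_short_descriptions(n: int) -> int:
    """Number of bit strings of length strictly less than n."""
    return sum(2 ** k for k in range(n))   # = 2^n - 1

for n in range(1, 16):
    # Pigeonhole: strictly more strings than short descriptions, so at
    # least one string of length n has no description shorter than n.
    assert num_short_descriptions(n) < num_strings(n)
print("for every n, some string of length n is incompressible")
```

The shortfall is always exactly one, which is why the method only guarantees the existence of incompressible strings rather than exhibiting any particular one.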
On Resource-bounded versions of the van Lambalgen theorem
The van Lambalgen theorem is a surprising result in algorithmic information
theory concerning the symmetry of relative randomness. It establishes that for
any pair of infinite sequences $A$ and $B$, $A$ is Martin-L\"of random and $B$
is Martin-L\"of random relative to $A$ if and only if the interleaved sequence
$A \oplus B$ is Martin-L\"of random. This implies that $A$ is random relative
to $B$ if and only if $B$ is random relative to $A$ \cite{vanLambalgen},
\cite{Nies09}, \cite{HirschfeldtBook}. This paper studies the validity of this
phenomenon for different notions of time-bounded relative randomness.
We prove the classical van Lambalgen theorem using martingales and Kolmogorov
complexity. We then establish that this symmetry of relative randomness fails
in both time-bounded settings: for time-bounded martingales and for
time-bounded Kolmogorov complexity. We adapt our classical proofs to the
time-bounded setting when applicable, and construct counterexamples when they
fail. The mode of failure of the theorem may depend on the notion of
time-bounded randomness.
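The martingale notion used in this abstract can be illustrated with a toy betting game. The sketch below is illustrative Python (the strategy and sequence are invented for the example, not taken from the paper): a martingale starts with capital 1 and splits its capital between the two possible next bits, with the stake on the realized bit doubled; a sequence is non-random exactly when some effective martingale earns unbounded capital on it.

```python
# Toy martingale (betting strategy) of the kind used to characterize
# algorithmic randomness: before each bit, the gambler splits the
# current capital between the two possible outcomes; the stake on the
# realized bit is doubled (a fair payoff).

def run_martingale(bits, bet_fraction_on_zero):
    """Play the strategy on a finite bit list and return the final capital.

    bet_fraction_on_zero(history) -> fraction of capital staked on the
    next bit being 0; the remainder is staked on 1.
    """
    capital = 1.0
    history = []
    for b in bits:
        f = bet_fraction_on_zero(history)
        stake = capital * f if b == 0 else capital * (1 - f)
        capital = 2 * stake          # fair payoff: realized stake doubles
        history.append(b)
    return capital

# Hypothetical strategy for the example: always stake 3/4 on 0.
always_favor_zero = lambda history: 0.75

# On the all-zero sequence the capital grows like (3/2)^n, so this
# martingale succeeds on it, witnessing that the sequence is not random.
print(run_martingale([0] * 20, always_favor_zero))
```

The fairness condition is visible in the payoff rule: averaging the outcomes over both possible next bits leaves the expected capital unchanged.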
Kolmogorov Complexity in perspective. Part I: Information Theory and Randomness
We survey diverse approaches to the notion of information: from Shannon
entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov
complexity are presented: randomness and classification. The survey is divided
in two parts in the same volume. Part I is dedicated to information theory and
the mathematical formalization of randomness based on Kolmogorov complexity.
This last application goes back to the 1960s and 70s with the work of
Martin-L\"of, Schnorr, Chaitin, and Levin, and has gained new impetus in
recent years.
Comment: 40 pages
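The survey's starting point, Shannon entropy, has a one-line empirical form. A minimal Python sketch (the function name is ours) computing the entropy of the frequency distribution of a string's symbols:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Empirical Shannon entropy H = -sum_p p * log2(p), in bits per symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A string with equal symbol frequencies has the maximum entropy for
# a binary alphabet: exactly 1 bit per symbol.
print(shannon_entropy("0101010101"))   # -> 1.0
```

Kolmogorov complexity refines this picture: entropy measures the average information of a source, while complexity measures the information content of an individual string.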
Return times, recurrence densities and entropy for actions of some discrete amenable groups
Results of Wyner and Ziv and of Ornstein and Weiss show that if one observes
the first $k$ outputs of a finite-valued ergodic process, then the waiting time
until this block appears again is almost surely asymptotic to $2^{kh}$, where
$h$ is the entropy of the process. We examine this phenomenon when the allowed
return times are restricted to some subset of times, and generalize the results
to processes parameterized by other discrete amenable groups.
We also obtain a uniform density version of the waiting time results: For a
process on $s$ symbols, within a given realization, the density of the initial
$k$-block within larger $n$-blocks approaches $2^{-kh}$, uniformly in $k$,
as $n$ tends to infinity. Again, similar results hold for processes with other
indexing groups.
Comment: To appear in Journal d'Analyse Mathematique
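The return-time phenomenon is easy to simulate. The following Python sketch (an illustrative simulation of the classical one-dimensional case, not code from the paper) estimates the entropy of a fair-coin process from the waiting time until the initial $k$-block recurs; for a fair coin the entropy is $h = 1$ bit:

```python
import math
import random

def waiting_time(seq, k):
    """Steps until the initial k-block reappears (first start position >= 1)."""
    block = seq[:k]
    for t in range(1, len(seq) - k + 1):
        if seq[t:t + k] == block:
            return t
    return None                       # block did not recur in the sample

def entropy_estimate(k=10, trials=20, seed=0):
    """Average log2(return time) / k over independent fair-coin samples.

    By the Wyner-Ziv / Ornstein-Weiss results, this should be close to
    the process entropy, which is 1 bit for a fair coin.
    """
    rng = random.Random(seed)
    estimates = []
    for _ in range(trials):
        seq = [rng.randint(0, 1) for _ in range(50_000)]
        t = waiting_time(seq, k)
        if t is not None:
            estimates.append(math.log2(t) / k)
    return sum(estimates) / len(estimates)

print(entropy_estimate())             # close to h = 1 for a fair coin
```

Averaging over several independent samples tames the heavy fluctuations of individual return times, which are roughly geometrically distributed with mean $2^{kh}$.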
Quantum randomness and value indefiniteness
As computability implies value definiteness, certain sequences of quantum
outcomes cannot be computable.
Comment: 13 pages, revised
Pushdown Compression
The pressing need for efficient compression schemes for XML documents has
recently focused attention on stack computation [6, 9], in particular calling
for a formulation of information-lossless stack or pushdown compressors that
allows a formal analysis of their performance and a more ambitious use of the
stack in XML compression, where so far it is mainly connected to parsing
mechanisms. In
this paper we introduce the model of pushdown compressor, based on pushdown
transducers that compute a single injective function while keeping the widest
generality regarding stack computation. The celebrated Lempel-Ziv algorithm
LZ78 [10] was introduced as a general purpose compression algorithm that
outperforms finite-state compressors on all sequences. We compare the
performance of the Lempel-Ziv algorithm with that of the pushdown compressors,
or compression algorithms that can be implemented with a pushdown transducer.
This comparison is made without any a priori assumption on the data's source
and considering the asymptotic compression ratio for infinite sequences. We
prove that Lempel-Ziv is incomparable with pushdown compressors.
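For reference, the LZ78 parsing against which the comparison is made can be sketched in a few lines of Python (a minimal illustrative implementation, not the paper's formal transducer model): the input is split into phrases, each a previously seen phrase extended by one character, and each phrase is emitted as a (phrase index, character) pair.

```python
def lz78_compress(s):
    """LZ78 parsing: emit (phrase_index, char) pairs, where index 0 is
    the empty phrase and each new phrase extends a known one by a char."""
    dictionary = {"": 0}
    output = []
    phrase = ""
    for ch in s:
        if phrase + ch in dictionary:
            phrase += ch                  # keep extending a known phrase
        else:
            output.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:                            # flush a trailing known phrase
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

def lz78_decompress(pairs):
    """Rebuild the phrase list and concatenate it."""
    phrases = [""]
    for idx, ch in pairs:
        phrases.append(phrases[idx] + ch)
    return "".join(phrases[1:])

data = "abababababababab"
packed = lz78_compress(data)
assert lz78_decompress(packed) == data
print(len(packed), "phrases for", len(data), "symbols")
```

On highly repetitive inputs the number of phrases grows roughly like the square root of the input length, which is the source of LZ78's advantage over finite-state compressors on such sequences.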