70 research outputs found
The process complexity and effective random tests
We propose a variant of the Kolmogorov concept of complexity which yields a common theory of finite and infinite random sequences. The process complexity does not oscillate. We establish some concepts of effective tests which are proved to be equivalent.
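The kind of characterization this line of work supports can be sketched as follows. This is a standard Levin-Schnorr-style statement in modern notation, not a formula taken from the abstract; the symbol KP for process complexity and the constant c are our notational assumptions:

```latex
% x ranges over infinite binary sequences; x_{1:n} is its length-n prefix.
% Randomness of x corresponds to the process complexity of its prefixes
% never dropping more than a fixed constant below the prefix length:
x \text{ is random} \iff \exists c \,\forall n \;\; KP(x_{1:n}) \ge n - c
```

Because the right-hand side is monotone in n up to the constant c, this criterion avoids the oscillation exhibited by plain Kolmogorov complexity on prefixes of random sequences.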
The dimension of ergodic random sequences
Let μ be a computable ergodic shift-invariant measure over the Cantor space. By providing a constructive proof of the Shannon-McMillan-Breiman theorem, V'yugin proved that if a sequence x is Martin-Löf random w.r.t. μ, then the strong effective dimension Dim(x) of x equals the entropy of μ. Whether its effective dimension dim(x) also equals the entropy was left as an open question. In this paper we settle this problem, providing a positive answer. A key step in the proof consists in extending recent results on Birkhoff's ergodic theorem for Martin-Löf random sequences.
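In symbols, the combined picture the abstract describes can be summarized as follows. Writing h(μ) for the entropy of μ is our notational assumption; the equalities themselves restate V'yugin's result and the answer given in this paper:

```latex
% For every sequence x that is Martin-L\"of random w.r.t. a computable
% ergodic shift-invariant measure \mu on Cantor space:
\dim(x) = \operatorname{Dim}(x) = h(\mu)
% Dim(x) = h(\mu) is V'yugin's theorem; dim(x) = h(\mu) is the
% previously open equality settled here.
```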
Algorithmic Randomness as Foundation of Inductive Reasoning and Artificial Intelligence
This article is a brief personal account of the past, present, and future of
algorithmic randomness, emphasizing its role in inductive inference and
artificial intelligence. It is written for a general audience interested in
science and philosophy. Intuitively, randomness is a lack of order or
predictability. If randomness is the opposite of determinism, then algorithmic
randomness is the opposite of computability. Besides many other things, these
concepts have been used to quantify Ockham's razor, solve the induction
problem, and define intelligence.
Randomness on computable probability spaces - A dynamical point of view
We extend the notion of randomness (in the version introduced by Schnorr) to computable probability spaces and compare it to a dynamical notion of randomness: typicality. Roughly, a point is typical for a given dynamics if it follows the statistical behavior of the system (Birkhoff's pointwise ergodic theorem). We prove that a point is Schnorr random if and only if it is typical for every mixing computable dynamics. To prove the result we develop some tools for the theory of computable probability spaces (for example, morphisms) that are expected to have other applications.
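The typicality notion invoked above rests on Birkhoff's pointwise ergodic theorem. The abstract does not spell it out, so the following is the standard statement; the symbols T, f, and μ are our notational assumptions:

```latex
% For an ergodic measure-preserving transformation T of a probability
% space (X, \mu) and any integrable observable f, time averages along
% the orbit of \mu-almost every point x converge to the space average:
\lim_{n \to \infty} \frac{1}{n} \sum_{i=0}^{n-1} f(T^i x) = \int_X f \, d\mu
```

A typical point, in the sense of the abstract, is one whose orbit satisfies this convergence for the relevant observables; the paper's result identifies these points with the Schnorr random ones for mixing computable dynamics.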
A generalized characterization of algorithmic probability
An a priori semimeasure (also known as "algorithmic probability" or "the Solomonoff prior" in the context of inductive inference) is defined as the transformation, by a given universal monotone Turing machine, of the uniform measure on the infinite strings. It is shown in this paper that the class of a priori semimeasures can equivalently be defined as the class of transformations, by all compatible universal monotone Turing machines, of any continuous computable measure in place of the uniform measure. Some consideration is given to possible implications for the prevalent association of algorithmic probability with certain foundational statistical principles.
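The transformation mentioned in the abstract can be written out in the standard way. This is the usual textbook formulation, not a formula from the abstract itself; the symbols M, U, λ, and the prefix relation ⪯ are our notational assumptions:

```latex
% U is a universal monotone Turing machine, \lambda the uniform
% (Lebesgue) measure on infinite binary input strings. The a priori
% semimeasure M assigns to a finite string x the probability that U,
% fed uniformly random bits, produces an output extending x:
M(x) = \lambda\{\, p \in \{0,1\}^{\infty} : x \preceq U(p) \,\}
```

The paper's generalization replaces λ here by an arbitrary continuous computable measure, ranging over all compatible universal monotone machines, and shows the resulting class of semimeasures is the same.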
- …