Predictions and algorithmic statistics for infinite sequences
Consider the following prediction problem. Assume that there is a black box
that produces bits according to some unknown computable distribution $\mu$ on the
binary tree. We know the first $n$ bits $x_1 x_2 \ldots x_n$. We want to know the
probability that the next bit is equal to $1$. Solomonoff
suggested using the universal semimeasure $M$ for solving this task. He proved
that for every computable distribution $\mu$ and for every $b \in \{0,1\}$ the
following holds:
$$\sum_{n=1}^{\infty} \sum_{x \,:\, |x| = n} \mu(x)\,\bigl(M(b \mid x) - \mu(b \mid x)\bigr)^{2} \;<\; \infty,$$
so $M(b \mid x_1 \ldots x_n) - \mu(b \mid x_1 \ldots x_n) \to 0$ with $\mu$-probability $1$.
However, Solomonoff's method has a negative aspect: Hutter
and Muchnik proved that there are a universal semimeasure $M$, a computable
distribution $\mu$, and a Martin-Löf random sequence $x_1 x_2 \ldots$ such that
$M(x_{n+1} \mid x_1 \ldots x_n) - \mu(x_{n+1} \mid x_1 \ldots x_n) \not\to 0$. We suggest a new way of
prediction: for every finite string $x$ we predict the next bit according to the
best (in some sense) distribution for $x$. We prove an analogue of
Solomonoff's theorem for our way of prediction, and we also show that our method of
prediction does not share the negative aspect of Solomonoff's method.
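For reference, the universal semimeasure invoked here is standardly defined via a universal monotone machine $U$, or equivalently (up to a multiplicative constant) as a weighted mixture of all lower-semicomputable semimeasures $\nu_i$; the display below is a standard textbook formulation, not taken from the abstract itself:
$$M(x) \;=\; \sum_{p \,:\, U(p) = x\ast} 2^{-\ell(p)} \;\stackrel{\times}{=}\; \sum_{i} 2^{-K(i)}\,\nu_i(x), \qquad M(b \mid x) \;=\; \frac{M(xb)}{M(x)},$$
where the first sum ranges over minimal programs $p$ on which $U$ outputs a string beginning with $x$, $\ell(p)$ is the length of $p$, $K(i)$ is the prefix complexity of the index $i$, and $\stackrel{\times}{=}$ denotes equality up to a multiplicative constant. The conditional $M(b \mid x)$ is the quantity used for prediction above.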
On Martin-Löf convergence of Solomonoff’s mixture
We study the convergence of Solomonoff's universal mixture on individual Martin-Löf random sequences. A new result is presented, extending the work of Hutter and Muchnik (2004), by showing that there does not exist a universal mixture that converges on all Martin-Löf random sequences.
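Spelled out in symbols (the notation $\xi$ and the difference form of convergence are assumptions of this sketch): a universal mixture is $\xi(x) = \sum_{\nu \in \mathcal{M}} w_\nu\, \nu(x)$ with positive weights $w_\nu$, where $\mathcal{M}$ is the class of lower-semicomputable semimeasures, and "converges on $\omega$" means
$$\xi(\omega_t \mid \omega_{<t}) - \mu(\omega_t \mid \omega_{<t}) \;\longrightarrow\; 0 \qquad (t \to \infty)$$
for the true computable measure $\mu$ with respect to which randomness is defined. The result says that for every admissible choice of weights, some $\mu$-Martin-Löf random sequence escapes this convergence.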
Algorithmic Complexity Bounds on Future Prediction Errors
We bound the future loss when predicting any (computably) stochastic sequence
online. Solomonoff finitely bounded the total deviation of his universal
predictor $M$ from the true distribution $\mu$ by the algorithmic complexity of
$\mu$. Here we assume we are at a time $t$ and have already observed $x = x_1 \ldots x_t$.
We bound the future prediction performance on $x_{t+1} x_{t+2} \ldots$ by a new
variant of the algorithmic complexity of $\mu$ given $x$, plus the complexity of the
randomness deficiency of $x$. The new complexity is monotone in its condition
in the sense that this complexity can only decrease if the condition is
prolonged. We also briefly discuss potential generalizations to Bayesian model
classes and to classification problems.
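The total bound alluded to in the second sentence can be written in Solomonoff's classical form; the constant and the exact distance measure vary slightly between statements in the literature, so treat this display as one standard version rather than the paper's own notation:
$$\sum_{t=1}^{\infty} \mathbf{E}_{\mu}\,\bigl(M(0 \mid x_{<t}) - \mu(0 \mid x_{<t})\bigr)^{2} \;\le\; \tfrac{\ln 2}{2}\,K(\mu) \;<\; \infty,$$
where $K(\mu)$ is the length of a shortest program computing $\mu$. The abstract's contribution replaces the unconditional $K(\mu)$, once $x = x_{1:t}$ has been observed, by a conditional complexity of $\mu$ given $x$ that cannot increase as $x$ is extended.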
On Generalized Computable Universal Priors and their Convergence
Solomonoff unified Occam's razor and Epicurus' principle of multiple
explanations into one elegant, formal, universal theory of inductive inference,
which initiated the field of algorithmic information theory. His central result
is that the posterior of the universal semimeasure M converges rapidly to the
true sequence-generating posterior mu, if the latter is computable. Hence, M is
eligible as a universal predictor in case of unknown mu. The first part of the
paper investigates the existence and convergence of computable universal
(semi)measures for a hierarchy of computability classes: recursive, estimable,
enumerable, and approximable. For instance, M is known to be enumerable, but
not estimable, and to dominate all enumerable semimeasures. We present proofs
for discrete and continuous semimeasures. The second part investigates more
closely the types of convergence, possibly implied by universality: in
difference and in ratio, with probability 1, in mean sum, and for Martin-Löf
random sequences. We introduce a generalized concept of randomness for
individual sequences and use it to exhibit difficulties regarding these issues.
In particular, we show that convergence fails (holds) on generalized-random
sequences in gappy (dense) Bernoulli classes.
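The computability hierarchy used in the first part can be summarized as follows (paraphrasing standard definitions; the terminology is the paper's):
$$\text{recursive} \;\subset\; \text{estimable} \;\subset\; \text{enumerable} \;\subset\; \text{approximable},$$
where a real-valued function $f$ is estimable if some algorithm outputs $f(x)$ to any requested accuracy $\varepsilon > 0$, enumerable (lower semicomputable) if it is the limit of a nondecreasing computable sequence of rationals, and approximable if it is the limit of some computable sequence of rationals with no monotonicity requirement. M's position in this hierarchy (enumerable but not estimable) is what makes the existence question for computable universal (semi)measures nontrivial.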
On Universal Prediction and Bayesian Confirmation
The Bayesian framework is a well-studied and successful framework for
inductive reasoning, which includes hypothesis testing and confirmation,
parameter estimation, sequence prediction, classification, and regression. But
standard statistical guidelines for choosing the model class and prior are not
always available or fail, in particular in complex situations. Solomonoff
completed the Bayesian framework by providing a rigorous, unique, formal, and
universal choice for the model class and the prior. We discuss in breadth how
and in which sense universal (non-i.i.d.) sequence prediction solves various
(philosophical) problems of traditional Bayesian sequence prediction. We show
that Solomonoff's model possesses many desirable properties: it satisfies strong total and
weak instantaneous bounds; in contrast to most classical continuous prior
densities it has no zero p(oste)rior problem, i.e. it can confirm universal
hypotheses; it is reparametrization and regrouping invariant; and it avoids the
old-evidence and updating problems. It even performs well (actually better) in
non-computable environments.
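The "universal choice" of model class and prior mentioned above is, in one common formulation (the weighting shown is the standard one; treat the specifics as a sketch rather than the paper's exact statement): take the class of all lower-semicomputable semimeasures $\nu$ and the prior
$$w_\nu \;=\; 2^{-K(\nu)}, \qquad \xi(x) \;=\; \sum_\nu w_\nu\, \nu(x),$$
where $K(\nu)$ is the prefix complexity of (an index of) $\nu$. Simpler hypotheses thus receive exponentially larger prior weight, which is Occam's razor, while every hypothesis in the class receives nonzero weight, which is Epicurus' principle.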
Universal Convergence of Semimeasures on Individual Random Sequences
Solomonoff’s central result on induction is that the posterior of a universal semimeasure M converges rapidly and with probability 1 to the true sequence-generating posterior µ, if the latter is computable. Hence, M is eligible as a universal sequence predictor in case of unknown µ. Despite some nearby results and proofs in the literature, the stronger result of convergence for all (Martin-Löf) random sequences remained open. Such a convergence result would be particularly interesting and natural, since randomness can be defined in terms of M itself. We show that there are universal semimeasures M which do not converge for all random sequences, i.e. we give a partial negative answer to the open problem. We also provide a positive answer for some non-universal semimeasures. We define the incomputable measure D as a mixture over all computable measures and the enumerable semimeasure W as a mixture over all enumerable nearly-measures. We show that W converges to D and D to µ on all random sequences. The Hellinger distance, measuring the closeness of two distributions, plays a central role.
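The Hellinger distance mentioned at the end, in the squared per-step form typically used in these convergence analyses (the notation here is assumed, not the paper's), is
$$h_t(\omega_{<t}) \;=\; \sum_{b \in \{0,1\}} \Bigl(\sqrt{\nu(b \mid \omega_{<t})} - \sqrt{\mu(b \mid \omega_{<t})}\Bigr)^{2},$$
and convergence of a predictor $\nu$ to $\mu$ along $\omega$ in this metric, i.e. $h_t(\omega_{<t}) \to 0$, implies convergence of the predictive probabilities themselves.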