
    Stationary Algorithmic Probability

    Kolmogorov complexity and algorithmic probability are defined only up to an additive and a multiplicative constant, respectively, since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminating this machine-dependence. Our method is to assign algorithmic probabilities to the different computers themselves, based on the idea that "unnatural" computers should be hard to emulate. To this end, we study the Markov process of universal computers randomly emulating each other. The corresponding stationary distribution, if it existed, would give a natural and machine-independent probability measure on the computers, and also on the binary strings. Unfortunately, we show that no stationary distribution exists on the set of all computers; thus, this method cannot eliminate machine-dependence. Moreover, we show that the reason for this failure has a clear and interesting physical interpretation, suggesting that every other conceivable attempt to get rid of those additive constants must fail in principle, too. However, we show that restricting to some subclass of computers can remove some amount of machine-dependence in some situations, and that the resulting stationary computer and string probabilities have beautiful properties. Comment: 13 pages, 5 figures. Added an example of a positive recurrent computer set.
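
    A minimal formal sketch of the construction the abstract describes, in LaTeX (the emulation kernel P(V -> W) and the notation m_V are our own illustrative choices, not quoted from the paper):

        % If P(V -> W) denotes the probability that universal computer V
        % randomly emulates computer W, a stationary distribution \pi over
        % computers would have to satisfy
        \[ \pi(W) \;=\; \sum_{V} \pi(V)\, P(V \to W), \]
        % and would induce a machine-independent string probability
        \[ m(x) \;=\; \sum_{V} \pi(V)\, m_V(x), \]
        % where m_V(x) is the algorithmic probability of x on computer V.
        % The paper's negative result shows that no such \pi exists on the
        % set of all computers.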

    A generalized characterization of algorithmic probability

    An a priori semimeasure (also known as "algorithmic probability" or "the Solomonoff prior" in the context of inductive inference) is defined as the transformation, by a given universal monotone Turing machine, of the uniform measure on the infinite strings. It is shown in this paper that the class of a priori semimeasures can equivalently be defined as the class of transformations, by all compatible universal monotone Turing machines, of any continuous computable measure in place of the uniform measure. Some consideration is given to possible implications for the prevalent association of algorithmic probability with certain foundational statistical principles.
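
    For reference, a standard rendering (in conventional notation, not quoted from the paper) of the definition the abstract starts from: a universal monotone Turing machine U transforms the uniform measure \lambda on infinite binary input streams into the a priori semimeasure

        \[
          \lambda_U(x) \;=\; \lambda\{\, p : U(p) \succeq x \,\}
                       \;=\; \sum_{\text{minimal } p \,:\, U(p) \succeq x} 2^{-|p|},
        \]

    where U(p) \succeq x means that the output of U on input p extends the finite string x. The generalization shown in the paper replaces \lambda by an arbitrary continuous computable measure \mu.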

    Algorithmic Information Theory and Foundations of Probability

    The use of algorithmic information theory (Kolmogorov complexity theory) to explain the relation between mathematical probability theory and the 'real world' is discussed.

    Coding-theorem Like Behaviour and Emergence of the Universal Distribution from Resource-bounded Algorithmic Probability

    Previously referred to as 'miraculous' in the scientific literature because of its powerful properties and its wide application as an optimal solution to the problem of induction/inference, (approximations to) Algorithmic Probability (AP) and the associated Universal Distribution are (or should be) of the greatest importance in science. Here we investigate the emergence, the rates of emergence and convergence, and the Coding-theorem-like behaviour of AP in Turing-subuniversal models of computation. We investigate empirical distributions of computing models in the Chomsky hierarchy. We introduce measures of algorithmic probability and algorithmic complexity based upon resource-bounded computation, in contrast to the thoroughly investigated distributions produced from the output of Turing machines. This approach allows for numerical approximations to algorithmic (Kolmogorov-Chaitin) complexity at each level of the computational hierarchy. We demonstrate that all these estimates are correlated in rank and that they converge both in rank and in value as a function of computational power, despite fundamental differences between the computational models. In the context of natural processes that operate below the Turing universal level because of finite resources and physical degradation, the investigation of natural biases stemming from algorithmic rules may shed light on the distribution of outcomes. We show that up to 60% of the simplicity/complexity bias in distributions produced even by the weakest of the computational models can be accounted for by Algorithmic Probability in its approximation to the Universal Distribution. Comment: 27 pages main text, 39 pages including supplement. Online complexity calculator: http://complexitycalculator.com
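
    The classical Coding Theorem ties the two quantities together as K(x) = -log_2 m(x) + O(1), and the paper studies how far resource-bounded analogues of this relation survive below Turing universality. As a purely illustrative sketch of the general empirical method (the miniature "machine" below is an invented stand-in, not one of the paper's models), in Python: enumerate all programs up to a length bound under a step bound, tabulate output frequencies, and read off -log_2 of a frequency as a complexity estimate.

        # Hypothetical toy sketch of a resource-bounded empirical distribution.
        from collections import Counter
        from itertools import product
        from math import log2

        def toy_machine(program, max_steps=100):
            """Invented stand-in 'computer': '0' appends a 0 to the tape,
            '1' flips the last tape symbol (if any). This toy always halts;
            the step bound matters for models that need not."""
            tape = []
            for step, bit in enumerate(program):
                if step >= max_steps:
                    return None  # resource bound exceeded: no output
                if bit == '0':
                    tape.append('0')
                elif tape:
                    tape[-1] = '1' if tape[-1] == '0' else '0'
            return ''.join(tape)

        def empirical_distribution(max_len=8):
            """Frequency D(x) of each output x over all programs of length <= max_len."""
            counts, total = Counter(), 0
            for n in range(1, max_len + 1):
                for bits in product('01', repeat=n):
                    out = toy_machine(''.join(bits))
                    if out is not None:
                        counts[out] += 1
                        total += 1
            return {x: c / total for x, c in counts.items()}

        D = empirical_distribution()
        for x, p in sorted(D.items(), key=lambda kv: -kv[1])[:5]:
            # -log2 D(x) plays the role of a resource-bounded complexity estimate
            print(f'{x!r}  D(x)={p:.4f}  K_est={-log2(p):.2f} bits')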

    The Thermodynamics of Network Coding, and an Algorithmic Refinement of the Principle of Maximum Entropy

    The principle of maximum entropy (Maxent) is often used to obtain prior probability distributions, as a method to obtain a Gibbs measure under some restriction, giving the probability that a system will be in a certain state compared to the rest of the elements in the distribution. Classical entropy-based Maxent collapses cases by confounding all distinct degrees of randomness and pseudo-randomness. Here we take into consideration the generative mechanism of the systems in the ensemble in order to separate objects that may comply with the principle under some restriction and whose entropy is maximal, but which may be generated recursively, from those that are actually algorithmically random, offering a refinement of classical Maxent. We take advantage of a causal algorithmic calculus to derive a thermodynamic-like result based on how difficult it is to reprogram a computer code. Using the distinction between computable and algorithmic randomness, we quantify the cost in information loss associated with reprogramming. To illustrate this, we apply the algorithmic refinement of Maxent to graphs and introduce a Maximal Algorithmic Randomness Preferential Attachment (MARPA) algorithm, a generalisation over previous approaches. We discuss practical implications of the evaluation of network randomness. Our analysis provides insight into how the reprogrammability asymmetry appears to originate from a non-monotonic relationship to algorithmic probability. Our analysis motivates further study of the origin and consequences of these asymmetries, of reprogrammability, and of computation. Comment: 30 pages
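
    For orientation, classical Maxent under an expectation constraint yields the Gibbs form (a textbook fact, not quoted from the paper):

        \[
          P(x) \;=\; \frac{e^{-\beta E(x)}}{Z}, \qquad Z \;=\; \sum_{x} e^{-\beta E(x)},
        \]
        % the distribution maximizing H(P) = -\sum_x P(x) \log P(x) subject to
        % a fixed expectation \langle E \rangle, with \beta the Lagrange
        % multiplier.

    The refinement described in the abstract rests on the observation that two objects can have the same (maximal) entropy under such a measure while differing sharply in algorithmic complexity: one may be recursively generated, the other algorithmically random.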

    Quantum Kolmogorov Complexity and Quantum Key Distribution

    We discuss the Bennett-Brassard 1984 (BB84) quantum key distribution protocol in the light of quantum algorithmic information. While Shannon's information theory needs a probability to define a notion of information, algorithmic information theory does not need it and can assign a notion of information to an individual object. The program length necessary to describe an object, the Kolmogorov complexity, plays the most fundamental role in the theory. In the context of algorithmic information theory, we formulate a security criterion for quantum key distribution by using the quantum Kolmogorov complexity that was recently defined by Vitányi. We show that a simple BB84 protocol indeed distributes a binary sequence between Alice and Bob that looks almost random to Eve, with probability exponentially close to 1. Comment: typos corrected
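
    One way to formalize "looks almost random to Eve" (our illustrative reading; the paper's precise criterion may differ) is to require the quantum Kolmogorov complexity of the n-bit key k to be nearly maximal even given Eve's data:

        \[
          QC(k \mid \text{Eve's data}) \;\ge\; n - c
          \quad \text{except with probability at most } 2^{-\alpha n},
        \]
        % c a constant and \alpha > 0: the key admits no short description
        % relative to everything Eve holds, up to exponentially small
        % failure probability.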