29,174 research outputs found

    Stationary Algorithmic Probability

    Full text link
    Kolmogorov complexity and algorithmic probability are defined only up to an additive resp. multiplicative constant, since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different computers themselves, based on the idea that "unnatural" computers should be hard to emulate. Therefore, we study the Markov process of universal computers randomly emulating each other. The corresponding stationary distribution, if it existed, would give a natural and machine-independent probability measure on the computers, and also on the binary strings. Unfortunately, we show that no stationary distribution exists on the set of all computers; thus, this method cannot eliminate machine-dependence. Moreover, we show that the reason for failure has a clear and interesting physical interpretation, suggesting that every other conceivable attempt to get rid of those additive constants must fail in principle, too. However, we show that restricting to some subclass of computers might help to get rid of some amount of machine-dependence in some situations, and the resulting stationary computer and string probabilities have beautiful properties. Comment: 13 pages, 5 figures. Added an example of a positive recurrent computer se…
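
    As a toy illustration of the emulation idea above (not the paper's construction, which concerns the full, countably infinite set of universal computers), one can restrict to a small, hypothetical subclass of computers, let computer i emulate computer j with probability proportional to 2^(-L[i][j]) for assumed shortest-emulation-program lengths L[i][j], and compute the stationary distribution of the resulting finite Markov chain by power iteration:

        import numpy as np

        # Hypothetical shortest-emulation-program lengths L[i][j]: the number of bits
        # computer i needs to emulate computer j. These values are made up for the sketch.
        L = np.array([[1, 2, 3],
                      [2, 1, 3],
                      [3, 3, 1]])

        # "Random emulation" chain: transition probabilities proportional to 2^(-L[i][j]).
        P = 2.0 ** (-L)
        P = P / P.sum(axis=1, keepdims=True)

        # Power iteration: on a finite subclass like this a stationary distribution exists.
        pi = np.full(P.shape[0], 1.0 / P.shape[0])
        for _ in range(1000):
            pi = pi @ P
        print("stationary distribution over the toy computer set:", pi)

    The paper's negative result is precisely that this construction does not carry over to the set of all computers; the sketch only shows the restricted case in which a stationary distribution does exist.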

    Finite time ruin probabilities with one Laplace inversion.

    Get PDF
    In this work we present an explicit formula for the Laplace transform in time of the finite-time ruin probabilities of a classical Lévy model with phase-type claims. Our result generalizes the ultimate ruin probability formula of Asmussen and Rolski [IME 10 (1991) 259] (see also the analogous queueing formula for the stationary waiting time of the M/Ph/1 queue in Neuts [Matrix-Geometric Solutions in Stochastic Models: An Algorithmic Approach. Johns Hopkins University Press, Baltimore, MD, 1981]) and it considers the deficit at ruin as well. Keywords: finite-time ruin probability; phase-type distribution; deficit at ruin; Lundberg's equation; Laplace transform.
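
    The Laplace-transform formula from the abstract is not reproduced here. As a grounding sketch for the simplest phase-type case (exponential claims, i.e. a single phase), the classical Cramér-Lundberg model has the well-known closed-form ultimate ruin probability psi(u) = (lam/(c*beta)) * exp(-(beta - lam/c)*u), which upper-bounds every finite-time ruin probability; a small Monte Carlo check, with illustrative parameter values chosen here:

        import numpy as np

        rng = np.random.default_rng(0)
        lam, beta, c = 1.0, 1.0, 1.5      # claim arrival rate, claim-size rate, premium rate (c > lam/beta)
        u, T = 2.0, 50.0                  # initial capital, finite time horizon

        # Closed-form ultimate ruin probability for exponential claims.
        psi_ultimate = (lam / (c * beta)) * np.exp(-(beta - lam / c) * u)

        def ruined_before(T):
            """Simulate one surplus path u + c*t - (claims up to t) and check for ruin before T."""
            t, claims = 0.0, 0.0
            while True:
                t += rng.exponential(1.0 / lam)         # next claim arrival
                if t > T:
                    return False                        # survived the horizon
                claims += rng.exponential(1.0 / beta)   # exponential claim size
                if u + c * t - claims < 0:
                    return True                         # ruin

        psi_T = np.mean([ruined_before(T) for _ in range(20000)])
        print(f"Monte Carlo finite-time ruin psi(u, T) ~ {psi_T:.3f}")
        print(f"closed-form ultimate ruin    psi(u)    = {psi_ultimate:.3f}")

    The paper's phase-type generalization replaces the single exponential phase by a Markovian claim-size distribution; only the exponential special case is checked above.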

    Effective complexity of stationary process realizations

    Full text link
    The concept of the effective complexity of an object as the minimal description length of its regularities was initiated by Gell-Mann and Lloyd. The regularities are modeled by means of ensembles, that is, probability distributions on finite binary strings. In our previous paper we proposed a definition of effective complexity in precise terms of algorithmic information theory. Here we investigate the effective complexity of binary strings generated by stationary, in general not computable, processes. We show that under not too strong conditions long typical process realizations are effectively simple. Our results become most transparent in the context of coarse effective complexity, which is a modification of the original notion of effective complexity that uses fewer parameters in its definition. A similar modification of the related concept of sophistication has been suggested by Antunes and Fortnow. Comment: 14 pages, no figures
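
    Schematically (suppressing the typicality and precision parameters that the precise definition carries), the effective complexity referred to above picks out the simplest ensemble that is still a "good theory" for the string:

        \[
          \mathcal{E}_\Delta(x) \;=\; \min\bigl\{\, K(P) \;:\; P \text{ a computable ensemble over finite binary strings},\;
          x \text{ typical for } P,\; K(P) + H(P) \le K(x) + \Delta \,\bigr\},
        \]

    where K denotes prefix Kolmogorov complexity and H Shannon entropy. "Effectively simple" then means, roughly, that this minimum stays small for long typical realizations of the stationary process.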

    Causal inference using the algorithmic Markov condition

    Full text link
    Inferring the causal structure that links n observables is usually based upon detecting statistical dependences and choosing simple graphs that make the joint measure Markovian. Here we argue why causal inference is also possible when only single observations are present. We develop a theory of how to generate causal graphs explaining similarities between single objects. To this end, we replace the notion of conditional stochastic independence in the causal Markov condition with the vanishing of conditional algorithmic mutual information and describe the corresponding causal inference rules. We explain why a consistent reformulation of causal inference in terms of algorithmic complexity implies a new inference principle that also takes into account the complexity of conditional probability densities, making it possible to select among Markov-equivalent causal graphs. This insight provides a theoretical foundation for a heuristic principle proposed in earlier work. We also discuss how to replace Kolmogorov complexity with decidable complexity criteria. This can be seen as an algorithmic analog of replacing the empirically undecidable question of statistical independence with practical independence tests that are based on implicit or explicit assumptions on the underlying distribution. Comment: 16 figures
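
    As a toy nod to the abstract's closing point about decidable complexity criteria, algorithmic mutual information I(x:y) = K(x) + K(y) - K(x,y) is often approximated in practice by substituting the output length of a real compressor for K (in the spirit of Cilibrasi and Vitányi's compression-based distances); this is a crude proxy, not the inference method of the paper:

        import os
        import zlib

        def C(data: bytes) -> int:
            """Compressed length in bits, used as a stand-in for Kolmogorov complexity."""
            return 8 * len(zlib.compress(data, 9))

        def approx_mutual_information(x: bytes, y: bytes) -> int:
            # Proxy for I(x:y) = K(x) + K(y) - K(x, y); can be slightly off (even negative)
            # because of compressor overhead.
            return C(x) + C(y) - C(x + y)

        a = b"0123456789" * 200
        related = b"0123456789" * 200          # shares all of a's structure
        unrelated = os.urandom(2000)           # incompressible and independent of a

        print("I(a : related)   ~", approx_mutual_information(a, related), "bits")
        print("I(a : unrelated) ~", approx_mutual_information(a, unrelated), "bits")

    Conditional variants, which the algorithmic Markov condition actually needs, can be approximated in the same spirit, e.g. via C(x + y) - C(y) as a stand-in for K(x|y).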