
    A Computable Measure of Algorithmic Probability by Finite Approximations with an Application to Integer Sequences

    Given the widespread use of lossless compression algorithms to approximate algorithmic (Kolmogorov-Chaitin) complexity, and given that lossless compression algorithms fall short at characterizing patterns other than statistical ones, doing no better than entropy estimations, here we explore an alternative and complementary approach. We study formal properties of a Levin-inspired measure m calculated from the output distribution of small Turing machines. We introduce and justify finite approximations m_k that have been used in some applications as an alternative to lossless compression algorithms for approximating algorithmic (Kolmogorov-Chaitin) complexity. We provide proofs of the relevant properties of both m and m_k and compare them to Levin's Universal Distribution. We provide error estimations of m_k with respect to m. Finally, we present an application to integer sequences from the Online Encyclopedia of Integer Sequences which suggests that our AP-based measures may characterize non-statistical patterns, and we report interesting correlations with the textual, function and program description lengths of those sequences. Comment: As accepted by the journal Complexity (Wiley/Hindawi)
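
    The construction above lends itself to a small, self-contained simulation. The sketch below (Python; all parameters are illustrative choices rather than the paper's experimental setup) enumerates every 2-state, 2-symbol Turing machine in the busy-beaver formalism with a 100-step cutoff, tabulates the strings that halting machines leave on the tape, and reads off m_k(s) as an output frequency, approximating K(s) as -log2 m_k(s) through the coding theorem.

        import itertools
        from collections import Counter
        from math import log2

        STATES, SYMBOLS, CUTOFF = 2, 2, 100   # illustrative bounds
        HALT = STATES                         # one extra halting state

        def run(rule):
            """Run a machine on a blank tape; return its output, or None if no halt."""
            tape, pos, state = {}, 0, 0
            visited = {0}
            for _ in range(CUTOFF):
                write, move, state = rule[(state, tape.get(pos, 0))]
                tape[pos] = write
                pos += move
                visited.add(pos)
                if state == HALT:
                    lo, hi = min(visited), max(visited)
                    return ''.join(str(tape.get(i, 0)) for i in range(lo, hi + 1))
            return None

        # Enumerate all rule tables (state, symbol) -> (write, move, next state).
        keys = list(itertools.product(range(STATES), range(SYMBOLS)))
        actions = list(itertools.product(range(SYMBOLS), (-1, 1), range(STATES + 1)))
        counts = Counter()
        for choice in itertools.product(actions, repeat=len(keys)):
            out = run(dict(zip(keys, choice)))
            if out is not None:
                counts[out] += 1

        halting = sum(counts.values())
        for s, c in counts.most_common(5):
            print(f"{s!r}: m_k = {c / halting:.5f}, K ~ {-log2(c / halting):.2f} bits")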

    Coding-theorem Like Behaviour and Emergence of the Universal Distribution from Resource-bounded Algorithmic Probability

    Previously referred to as 'miraculous' in the scientific literature because of its powerful properties and its wide application as an optimal solution to the problem of induction/inference, (approximations to) Algorithmic Probability (AP) and the associated Universal Distribution are (or should be) of the greatest importance in science. Here we investigate the emergence, the rates of emergence and convergence, and the Coding-theorem-like behaviour of AP in Turing-subuniversal models of computation. We investigate empirical distributions of computing models in the Chomsky hierarchy. We introduce measures of algorithmic probability and algorithmic complexity based upon resource-bounded computation, in contrast to the previously thoroughly investigated distributions produced from the output distribution of Turing machines. This approach allows for numerical approximations to algorithmic (Kolmogorov-Chaitin) complexity-based estimations at each level of a computational hierarchy. We demonstrate that all these estimations are correlated in rank and that they converge both in rank and in values as a function of computational power, despite fundamental differences between computational models. In the context of natural processes that operate below the Turing-universal level because of finite resources and physical degradation, the investigation of natural biases stemming from algorithmic rules may shed light on the distribution of outcomes. We show that up to 60% of the simplicity/complexity bias in distributions produced even by the weakest of the computational models can be accounted for by Algorithmic Probability in its approximation to the Universal Distribution. Comment: 27 pages main text, 39 pages including supplement. Online complexity calculator: http://complexitycalculator.com
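
    The rank-convergence claim can be probed informally with far weaker machinery. The sketch below samples random finite-state transducers (a Type-3 stand-in for the sub-universal models; the 2-state and 4-state bounds, sample sizes and scipy-based Spearman test are assumptions for illustration, not the paper's actual models) and checks whether the output distributions obtained under two different resource bounds agree in rank.

        import random
        from collections import Counter
        from scipy.stats import spearmanr   # rank correlation

        def transducer_output(n_states, in_len, rng):
            """Drive a random Mealy machine with random input bits; return its output."""
            # table[state][input bit] = (next state, output bit)
            table = [[(rng.randrange(n_states), rng.randrange(2)) for _ in range(2)]
                     for _ in range(n_states)]
            state, out = 0, []
            for _ in range(in_len):
                state, bit = table[state][rng.randrange(2)]
                out.append(str(bit))
            return ''.join(out)

        def empirical_distribution(n_states, samples=100_000, in_len=8, seed=0):
            rng = random.Random(seed)
            counts = Counter(transducer_output(n_states, in_len, rng)
                             for _ in range(samples))
            return {s: c / samples for s, c in counts.items()}

        d_small, d_large = empirical_distribution(2), empirical_distribution(4)
        common = sorted(set(d_small) & set(d_large))
        rho, _ = spearmanr([d_small[s] for s in common], [d_large[s] for s in common])
        print(f"Spearman rank correlation across resource bounds: {rho:.3f}")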

    Algorithmic Complexity for Short Binary Strings Applied to Psychology: A Primer

    Since human randomness production has been studied and widely used to assess executive functions (especially inhibition), many measures have been suggested to assess the degree to which a sequence is random-like. However, each of them focuses on one feature of randomness, forcing authors to use multiple measures. Here we describe and advocate for the use of the accepted universal measure of randomness based on algorithmic complexity, by means of a previously presented technique that uses the definition of algorithmic probability. A re-analysis of the classical Radio Zenith data in the light of the proposed measure and methodology is provided as a case study of an application. Comment: To appear in Behavior Research Methods
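
    In practice, measures of this kind are applied through a precomputed coding-theorem (CTM) table of short-string frequencies, such as the one behind the authors' online calculator. The sketch below shows the shape of that computation; the ctm_frequency values are made-up placeholders, not published CTM data, and the 4-bit sliding-block scoring is one plausible aggregation, not the paper's prescribed protocol.

        from math import log2

        # Placeholder frequencies for a few 4-bit strings (illustrative only).
        ctm_frequency = {
            '0000': 0.030, '1111': 0.030, '0101': 0.012, '1010': 0.012,
            '0010': 0.014, '1101': 0.014, '0110': 0.009, '1001': 0.009,
        }

        def ctm_complexity(s):
            """Coding-theorem estimate: K(s) ~ -log2 m(s)."""
            return -log2(ctm_frequency[s])

        def randomness_score(response):
            """Mean block complexity of a participant's binary sequence."""
            blocks = (response[i:i + 4] for i in range(len(response) - 3))
            vals = [ctm_complexity(b) for b in blocks if b in ctm_frequency]
            return sum(vals) / len(vals) if vals else float('nan')

        print(randomness_score('010110100101'))   # higher = more random-like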

    A Stochastic Complexity Perspective of Induction in Economics and Inference in Dynamics

    Rissanen's fertile and pioneering minimum description length principle (MDL) has been viewed from the points of view of statistical estimation theory, information theory, stochastic complexity theory (i.e., a computable approximation to Kolmogorov complexity), Solomonoff's recursion-theoretic induction principle, and as analogous to Kolmogorov's sufficient statistics. All these interpretations, and many more, are valid, interesting and fertile. In this paper I view it from two points of view: those of an algorithmic economist and of a dynamical systems theorist. From these points of view I suggest, first, a recasting of Jevons's sceptical vision of induction in the light of MDL; and, second, a complexity interpretation of an undecidable question in dynamics.
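
    For readers coming to MDL cold, a toy two-part code makes the induction story concrete: the preferred hypothesis is the one minimizing the bits needed to state the model plus the bits needed to encode the data given the model. The sketch below selects a polynomial degree this way; the flat 16-bit parameter cost and the Gaussian residual code are crude simplifications for illustration, not Rissanen's refined (e.g. normalized maximum likelihood) codes.

        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 1.0, 50)
        y = 2.0 * x**2 - x + rng.normal(0.0, 0.05, x.size)   # quadratic + noise

        def description_length(degree, param_bits=16):
            """Two-part code: parameter cost + ideal Gaussian code for residuals."""
            coeffs = np.polyfit(x, y, degree)
            resid = y - np.polyval(coeffs, x)
            sigma2 = max(float(resid.var()), 1e-12)
            data_bits = 0.5 * len(y) * np.log2(2 * np.pi * np.e * sigma2)
            return data_bits + param_bits * (degree + 1)

        best = min(range(6), key=description_length)
        print("MDL-selected degree:", best)   # degree 2 should win here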

    A General Notion of Useful Information

    In this paper we introduce a general framework for defining the depth of a sequence with respect to a class of observers. We show that our general framework captures all depth notions introduced in complexity theory so far. We review most such notions, show how they are particular cases of our general depth framework, and survey some classical results about the different depth notions.
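
    A quick, informal way to see what observer-relative depth means: fix two observer classes of different power and measure how much shorter the stronger one's description of a sequence is. The sketch below uses zlib at two effort levels as stand-in observers; this is merely a compression-based proxy for intuition, not any of the formal depth notions treated in the paper.

        import zlib

        def observer_depth(data: bytes) -> int:
            """Gap between a weak observer's description and a stronger one's."""
            weak = len(zlib.compress(data, 1))     # fast, low-effort observer
            strong = len(zlib.compress(data, 9))   # slow, high-effort observer
            return weak - strong

        trivial = bytes(50_000)                    # all zeros: easy for both
        patterned = b"the quick brown fox " * 2_500
        print(observer_depth(trivial), observer_depth(patterned))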

    A Primer on the Tools and Concepts of Computable Economics

    Computability theory came into being as a result of Hilbert's attempts to meet Brouwer's challenges, from an intuitionistic and constructive standpoint, to formalism as a foundation for mathematical practice. Viewed this way, constructive mathematics should be one vision of computability theory. However, there are fundamental differences between computability theory and constructive mathematics: the Church-Turing thesis is a disciplining criterion in the former but not in the latter; and classical logic, particularly the law of the excluded middle, is not accepted in the latter but is freely invoked in the former, especially in proving universal negative propositions. In Computable Economics an eclectic approach is adopted, where the main criterion is numerical content for economic entities. In this sense both the computable and the constructive traditions are freely and indiscriminately invoked and utilised in the formalization of economic entities. Some of the mathematical methods and concepts of computable economics are surveyed in a pedagogical mode. The context is that of a digital economy embedded in an information society.