2 research outputs found

    Compression is Comprehension, and the Unreasonable Effectiveness of Digital Computation in the Natural World

    Chaitin's work, in its depth and breadth, encompasses many areas of scientific and philosophical interest. It helped establish the accepted mathematical concept of randomness, which in turn is the basis of tools that I have developed to justify and quantify what I think is clear evidence of the algorithmic nature of the world. To illustrate the concept I will establish novel upper bounds of algorithmic randomness for elementary cellular automata. I will discuss how the practice of science consists in conceiving a model that starts from certain initial values, running a computable instantiation, and awaiting a result in order to determine where the system may be in a future state, in a shorter time than the time taken by the actual unfolding of the phenomenon in question. If a model does not comply with all or some of these requirements, it is traditionally considered useless or even unscientific, so the more precise and faster the better. A model is thus better if it can explain more with less, which is at the core of Chaitin's "compression is comprehension". I will pursue these questions, concerning the random versus possibly algorithmic nature of the world, in two directions, drawing heavily on the work of Chaitin. I will also discuss how the algorithmic approach is related to the success of science at producing models of the world, allowing computer simulations to better understand it and to make more accurate predictions and interventions.
    Comment: 30 pages. Invited contribution to Chaitin's festschrift based on an invited talk delivered at the Workshop on 'Patterns in the World', Department of Philosophy, University of Barcelona on December 14, 201
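    The abstract above mentions compression-based upper bounds on the algorithmic randomness of elementary cellular automata. The sketch below is not the paper's method; it only illustrates the general idea under the assumption that the compressed length of an automaton's space-time diagram (here via zlib, a weak general-purpose compressor) serves as a crude upper bound on its algorithmic complexity, up to the constant size of the decompressor. Rule numbers, grid width and step count are arbitrary illustrative choices.

    import zlib

    def eca_step(row, rule):
        # One synchronous update of an elementary CA with periodic boundaries.
        n = len(row)
        return [
            (rule >> ((row[(i - 1) % n] << 2) | (row[i] << 1) | row[(i + 1) % n])) & 1
            for i in range(n)
        ]

    def eca_evolution(rule, width=64, steps=64):
        # Space-time diagram starting from a single centred 1.
        row = [0] * width
        row[width // 2] = 1
        history = [row]
        for _ in range(steps - 1):
            row = eca_step(row, rule)
            history.append(row)
        return history

    def compressed_size(history):
        # zlib-compressed length of the flattened diagram: a crude proxy for an
        # upper bound on algorithmic complexity, not the paper's estimator.
        data = bytes(cell for row in history for cell in row)
        return len(zlib.compress(data, 9))

    if __name__ == "__main__":
        for rule in (0, 90, 30):  # trivial, nested, and apparently random behaviour
            size = compressed_size(eca_evolution(rule))
            print(f"rule {rule:3d}: compressed size = {size} bytes")

    Running this, simple rules such as 0 compress to almost nothing while rules with apparently random behaviour such as 30 yield much larger compressed sizes, which is the kind of ordering a randomness upper bound is meant to capture.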

    Estimations of Integrated Information Based on Algorithmic Complexity and Dynamic Querying

    The concept of information has emerged as a language in its own right, bridging several disciplines that analyze natural phenomena and man-made systems. Integrated information has been introduced as a metric to quantify the amount of information generated by a system beyond the information generated by its elements. Yet this intriguing notion comes at the price of being prohibitively expensive to calculate, since the calculations require an exponential number of sub-divisions of a system. Here we introduce a novel framework connecting algorithmic randomness and integrated information, together with a numerical method for estimating integrated information using a perturbation test rooted in algorithmic information dynamics. This method quantifies the change in program size of a system when subjected to a perturbation. The intuition is that if an object is random, random perturbations have little to no effect on its program size, whereas an object with the ability to move in both directions (towards or away from randomness) will be shown to be better integrated, a measure of sophistication that tells randomness and simplicity apart from structure. We show that an object with a high integrated information value is also more compressible, and is, therefore, more sensitive to perturbations. We find that such a perturbation test quantifying compression sensitivity provides a system with a means to extract explanations (causal accounts) of its own behaviour. Our technique can reduce the number of calculations needed to arrive at bounds or estimates, as the algorithmic perturbation test guides an efficient search for estimating integrated information. Our work sets the stage for a systematic exploration of connections between algorithmic complexity and integrated information at the level of both theory and practice.
    Comment: 33 pages + Appendix = 44 pages
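    A minimal sketch of the kind of perturbation test described above, under the assumption that zlib-compressed length can stand in for program size (the paper's framework uses algorithmic-complexity estimators from algorithmic information dynamics, not a general-purpose compressor). Each bit of a binary object is flipped in turn and the change in the proxy is recorded; all function names and parameters here are illustrative.

    import random
    import zlib

    def complexity_proxy(bits):
        # Compressed length of the bit sequence, standing in for program size.
        return len(zlib.compress(bytes(bits), 9))

    def perturbation_profile(bits):
        # Change in the proxy when each single bit is flipped in turn.
        base = complexity_proxy(bits)
        deltas = []
        for i in range(len(bits)):
            flipped = list(bits)
            flipped[i] ^= 1
            deltas.append(complexity_proxy(flipped) - base)
        return deltas

    if __name__ == "__main__":
        random.seed(0)
        structured = [i % 2 for i in range(256)]            # highly regular object
        noisy = [random.randint(0, 1) for _ in range(256)]  # random-looking object
        for name, obj in (("structured", structured), ("random", noisy)):
            deltas = perturbation_profile(obj)
            print(f"{name}: min delta = {min(deltas)}, max delta = {max(deltas)}")

    In this toy setting the regular object's compressed size is far more sensitive to single-bit flips than the random-looking object's, mirroring the abstract's point that perturbation sensitivity separates structured, compressible objects from random ones.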