
    From Heisenberg to Goedel via Chaitin

    In 1927 Heisenberg discovered that ``the more precisely the position is determined, the less precisely the momentum is known in this instant, and vice versa''. Four years later G\"odel showed that a finitely specified, consistent formal system which is large enough to include arithmetic is incomplete. As both results express some kind of impossibility, it is natural to ask whether there is any relation between them; indeed, this question has been asked repeatedly for a long time. The main interest seems to have been in possible implications of incompleteness for physics. In this note we take up the {\it converse} implication and offer a positive answer to the question: does uncertainty imply incompleteness? We show that algorithmic randomness is equivalent to a ``formal uncertainty principle'' which implies Chaitin's information-theoretic incompleteness. We also show that the derived uncertainty relation is, for many computers, physical. In fact, the formal uncertainty principle applies to {\it all} systems governed by the wave equation, not just quantum waves. This fact supports the conjecture that uncertainty implies randomness not only in mathematics, but also in physics.
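
    For reference, the Chaitin incompleteness result invoked above has the following textbook form (supplied here for orientation, not quoted from the paper): for every finitely specified, consistent formal system $F$ that includes arithmetic there is a constant $c_F$ such that

        % K denotes prefix-free Kolmogorov (program-size) complexity.
        % Although K(x) > c_F holds for all but finitely many strings x,
        % no single instance of it is provable in F:
        \exists\, c_F \in \mathbb{N} \;\; \forall x : \quad F \;\nvdash\; K(x) > c_F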

    Does the universe in fact contain almost no information?

    At first sight, an accurate description of the state of the universe appears to require a mind-bogglingly large and perhaps even infinite amount of information, even if we restrict our attention to a small subsystem such as a rabbit. In this paper, it is suggested that most of this information is merely apparent, as seen from our subjective viewpoints, and that the algorithmic information content of the universe as a whole is close to zero. It is argued that if the Schr\"odinger equation is universally valid, then decoherence together with the standard chaotic behavior of certain non-linear systems will make the universe appear extremely complex to any self-aware subsets that happen to inhabit it now, even if it was in a quite simple state shortly after the big bang. For instance, gravitational instability would amplify the microscopic primordial density fluctuations that are required by the Heisenberg uncertainty principle into quite macroscopic inhomogeneities, forcing the current wavefunction of the universe to contain such Byzantine superpositions as our planet being in many macroscopically different places at once. Since decoherence bars us from experiencing more than one macroscopic reality, we would see seemingly complex constellations of stars etc., even if the initial wavefunction of the universe was perfectly homogeneous and isotropic.
    Comment: 17 pages, LaTeX, no figures. Online with references at http://astro.berkeley.edu/~max/nihilo.html (faster from the US), http://www.mpa-garching.mpg.de/~max/nihilo.html (faster from Europe), or from [email protected]
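
    The counting intuition behind ``almost no information'' is standard in algorithmic information theory (the sketch below is ours; the paper's own argument proceeds via decoherence):

        % The full ensemble of n-bit strings is generated by a tiny
        % program ("print every n-bit string"), so
        K(\{0,1\}^n) = O(\log n),
        % while fewer than a 2^{-c} fraction of its members satisfy
        K(x) < n - c,
        % i.e. a typical single member -- a single decohered branch --
        % is essentially incompressible.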

    On Quantum Effects in a Theory of Biological Evolution

    We construct a descriptive toy model that considers quantum effects on biological evolution, starting from Chaitin's classical framework. There are smart evolution scenarios in which a quantum world is as favorable as a classical world for evolution to take place. However, in more natural scenarios, the rate of evolution depends on the degree of entanglement present in quantum organisms relative to classical organisms. If the entanglement is maximal, classical evolution turns out to be more favorable.

    Pseudorandom Number Generators and the Square Site Percolation Threshold

    A select collection of pseudorandom number generators is applied to a Monte Carlo study of the two-dimensional square site percolation model. A generator suitable for high-precision calculations is identified from an application-specific test of randomness. After extended computation and analysis, an ostensibly reliable value of p_c = 0.59274598(4) is obtained for the percolation threshold.
    Comment: 11 pages, 6 figures
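
    The elementary Monte Carlo step behind such a study is easy to sketch. Below is a minimal illustrative version in Python (plain flood fill, the standard-library PRNG, a small lattice); it is not the authors' high-precision method, and all names and parameters are ours:

        import random

        def spans(L, p, rng):
            """Occupy each site of an L x L grid with probability p and
            report whether occupied sites connect top row to bottom row."""
            occupied = [[rng.random() < p for _ in range(L)] for _ in range(L)]
            stack = [(0, j) for j in range(L) if occupied[0][j]]  # seeds: top row
            seen = set(stack)
            while stack:
                i, j = stack.pop()
                if i == L - 1:
                    return True  # reached the bottom row: spanning cluster
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < L and 0 <= nj < L and occupied[ni][nj] \
                            and (ni, nj) not in seen:
                        seen.add((ni, nj))
                        stack.append((ni, nj))
            return False

        if __name__ == "__main__":
            rng = random.Random(1)  # swap in the PRNG under test here
            L, trials, p = 64, 2000, 0.592746  # p near the reported threshold
            hits = sum(spans(L, p, rng) for _ in range(trials))
            print(f"spanning probability at p = {p}: {hits / trials:.3f}")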

    Statistical auditing and randomness test of lotto k/N-type games

    One of the most popular lottery games worldwide is the so-called ``lotto k/N''. It considers N numbers 1, 2, ..., N, from which k are drawn randomly, without replacement. A player selects k or more numbers, and the first prize is shared amongst those players whose selected numbers match all k randomly drawn numbers. Exact rules may vary between countries. In this paper, mean values and covariances for the random variables representing the numbers drawn in this kind of game are presented, with the aim of using them to statistically audit the consistency of a given sample of historical results with the theoretical values coming from a hypergeometric statistical model. The method can be adapted to test pseudorandom number generators.
    Comment: 10 pages, no figures
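
    For orientation, the first and second moments for k values X_1, ..., X_k drawn uniformly without replacement from {1, ..., N} follow from elementary finite-population sampling theory (the paper's own expressions may be organized differently):

        E[X_i] = \frac{N+1}{2}, \qquad
        \operatorname{Var}(X_i) = \frac{N^2 - 1}{12}, \qquad
        \operatorname{Cov}(X_i, X_j) = -\frac{N+1}{12} \quad (i \neq j)
        % Cov(X_i, X_j) = -Var(X_i)/(N-1): the usual negative
        % correlation induced by sampling without replacement.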

    Computational and Biological Analogies for Understanding Fine-Tuned Parameters in Physics

    In this philosophical paper, we explore computational and biological analogies to address the fine-tuning problem in cosmology. We first clarify what it means for physical constants or initial conditions to be fine-tuned. We review important distinctions, such as that between dimensionless and dimensional physical constants, and the classification of constants proposed by L\'evy-Leblond. We then explore how two analogies, computational and biological, can give new insight into the problem; this paper is a preliminary study of the two. Analogies are both useful and fundamental cognitive tools, but they can also be misused or misinterpreted. The idea that our universe might be modelled as a computational entity is analysed, and we discuss the distinction between physical laws and initial conditions using algorithmic information theory. Smolin introduced the theory of "Cosmological Natural Selection" with a biological analogy in mind. We examine an extension of this analogy involving intelligent life, and discuss if and how this extension could be legitimated.
    Keywords: origin of the universe, fine-tuning, physical constants, initial conditions, computational universe, biological universe, role of intelligent life, cosmological natural selection, cosmological artificial selection, artificial cosmogenesis
    Comment: 25 pages, Foundations of Science, in press

    Chaos and quantum-nondemolition measurements

    The problem of chaotic behavior in quantum mechanics is investigated against the background of the theory of quantum-nondemolition (QND) measurements. The analysis is based on two relevant features: the outcomes of a sequence of QND measurements are unambiguously predictable, and these measurements can actually be performed on a single system without perturbing its time evolution. Consequently, QND measurements represent an appropriate framework in which to analyze the conditions for the occurrence of ``deterministic randomness'' in quantum systems. The general arguments are illustrated by a discussion of a quantum system whose time evolution possesses nonvanishing algorithmic complexity.

    Universal fluctuations in subdiffusive transport

    Subdiffusive transport in tilted washboard potentials is studied within the fractional Fokker-Planck equation approach, using the associated continuous-time random walk (CTRW) framework. The scaled subvelocity is shown to obey a universal law, assuming the form of a stationary L\'evy-stable distribution. The latter is defined by the index of subdiffusion alpha and the mean subvelocity only, but interestingly depends neither on the bias strength nor on the specific form of the potential. These scaled, universal subvelocity fluctuations emerge due to weak ergodicity breaking and vanish in the limit of normal diffusion. The results of the heuristic analytical theory are corroborated by Monte Carlo simulations of the underlying CTRW.
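
    The microscopic picture is easy to simulate. The sketch below is an illustrative biased CTRW with heavy-tailed waiting times in Python; the waiting-time law, bias mechanism and parameters are our choices, not the paper's:

        import random

        def ctrw_position(T, alpha, bias, rng):
            """Run a biased walk until time T with Pareto waiting times,
            psi(tau) ~ tau^(-1-alpha); return the final position."""
            t, x = 0.0, 0
            while True:
                tau = (1.0 - rng.random()) ** (-1.0 / alpha)  # Pareto, tau >= 1
                if t + tau > T:
                    return x  # the walker is stuck in its last long wait
                t += tau
                x += 1 if rng.random() < 0.5 + bias else -1

        if __name__ == "__main__":
            rng = random.Random(42)
            T, alpha, bias, n = 1e4, 0.6, 0.1, 2000
            # Scaled subvelocity x(T)/T^alpha over an ensemble of walkers.
            subv = [ctrw_position(T, alpha, bias, rng) / T**alpha for _ in range(n)]
            mean = sum(subv) / n
            var = sum((v - mean) ** 2 for v in subv) / n
            # For alpha < 1 the fluctuations stay broad however large T is,
            # the signature of weak ergodicity breaking.
            print(f"mean scaled subvelocity {mean:.3f}, variance {var:.3f}")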

    Chaos for Liouville probability densities

    Using the method of symbolic dynamics, we show that a large class of classical chaotic maps exhibits exponential hypersensitivity to perturbation, i.e., a rapid increase with time of the information needed to describe the perturbed time evolution of the Liouville density, with the information attaining values exponentially larger than the entropy increase that results from averaging over the perturbation. The exponential growth rate of the ratio of information to entropy is given by the Kolmogorov-Sinai entropy of the map. These findings generalize and extend results obtained for the baker's map [R. Schack and C. M. Caves, Phys. Rev. Lett. 69, 3413 (1992)].
    Comment: 26 pages in REVTeX, no figures, submitted to Phys. Rev.
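
    For reference, the baker's map cited above is the standard area-preserving transformation of the unit square (quoted here for orientation):

        B(x, y) =
        \begin{cases}
          (2x,\; y/2)          & 0 \le x < 1/2, \\
          (2x - 1,\; (y+1)/2)  & 1/2 \le x \le 1.
        \end{cases}
        % Its Kolmogorov-Sinai entropy is ln 2 (one bit of symbolic
        % dynamics per iteration), which by the result above sets the
        % exponential growth rate of the information-to-entropy ratio.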

    Artificial Sequences and Complexity Measures

    In this paper we exploit concepts of information theory to address the fundamental problem of identifying and defining the most suitable tools to extract, in an automatic and agnostic way, information from a generic string of characters. In particular, we introduce a class of methods which crucially use data compression techniques to define a measure of remoteness and distance between pairs of sequences of characters (e.g. texts) based on their relative information content. We also discuss in detail how specific features of data compression techniques could be used to introduce the notions of the dictionary of a given sequence and of an Artificial Text, and we show how these new tools can be used for information-extraction purposes. We point out the versatility and generality of our method, which applies to any kind of corpus of character strings independently of the type of coding behind it. As a case study we consider linguistically motivated problems, and we present results for automatic language recognition, authorship attribution and self-consistent classification.
    Comment: Revised version, with major changes, of the previous "Data Compression approach to Information Extraction and Classification" by A. Baronchelli and V. Loreto. 15 pages; 5 figures
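
    A minimal compression-based remoteness measure can be sketched in a few lines of Python. Here zlib stands in for the generic compressor, and the measure (extra compressed bytes needed to encode a probe text after a context text) is in the spirit of the abstract rather than the paper's exact dictionary-based definition:

        import zlib

        def clen(data: bytes) -> int:
            """Compressed length, used as a crude information estimate."""
            return len(zlib.compress(data, 9))

        def remoteness(context: bytes, probe: bytes) -> float:
            """Extra compressed bytes per character needed to append
            probe after context: small when probe resembles context."""
            return (clen(context + probe) - clen(context)) / max(len(probe), 1)

        if __name__ == "__main__":
            english = b"the quick brown fox jumps over the lazy dog " * 40
            italian = b"la volpe veloce salta sopra il cane pigro " * 40
            probe = b"a lazy dog sleeps while the quick fox runs " * 5
            # The probe should be cheaper to encode after same-language context.
            print("given English:", remoteness(english, probe))
            print("given Italian:", remoteness(italian, probe))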