
    Effective Complexity and its Relation to Logical Depth

    Effective complexity measures the information content of the regularities of an object. It was introduced by M. Gell-Mann and S. Lloyd to avoid some of the disadvantages of Kolmogorov complexity, also known as algorithmic information content. In this paper, we give a precise formal definition of effective complexity and rigorous proofs of its basic properties. In particular, we show that incompressible binary strings are effectively simple, and we prove the existence of strings that have effective complexity close to their lengths. Furthermore, we show that effective complexity is related to Bennett's logical depth: if the effective complexity of a string x exceeds a certain explicit threshold, then that string must have astronomically large depth; otherwise, the depth can be arbitrarily small. Comment: 14 pages, 2 figures
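    A rough sketch of the standard informal definitions behind these terms may help orient the reader; it is not the paper's precise formalization. Here U denotes a universal prefix machine, E an ensemble of which x is a typical member, and s a significance parameter, all standing in for the formal choices made in the paper:

        K(x) = \min\{\, |p| \;:\; U(p) = x \,\}
            % algorithmic information content (Kolmogorov complexity)
        \mathcal{E}(x) \approx \min\{\, K(E) \;:\; x \text{ is a typical member of } E \,\}
            % effective complexity: description length of the regularities only
        \mathrm{Depth}_s(x) = \min\{\, \mathrm{time}(p) \;:\; U(p) = x,\ |p| \le K(x) + s \,\}
            % Bennett's logical depth at significance level s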

    Gauge Theory for Finite-Dimensional Dynamical Systems

    Full text link
    Gauge theory is a well-established concept in quantum physics, electrodynamics, and cosmology. This theory has recently proliferated into new areas, such as mechanics and astrodynamics. In this paper, we discuss a few applications of gauge theory to finite-dimensional dynamical systems, with implications for the numerical integration of differential equations. We distinguish between rescriptive and descriptive gauge symmetry. Rescriptive gauge symmetry is, in essence, a re-scaling of the independent variable, while descriptive gauge symmetry is a Yang-Mills-like transformation of the velocity vector field, adapted to finite-dimensional systems. We show that a simple gauge transformation of multiple harmonic oscillators driven by chaotic processes can turn an apparently "disordered" flow into a regular dynamical process, and that there exists a remarkable connection between gauge transformations and the reduction theory of ordinary differential equations. Throughout the discussion, we demonstrate the main ideas with examples from diverse engineering and scientific fields, including quantum mechanics, chemistry, rigid-body dynamics, and information theory.
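    As a minimal illustration of re-scaling the independent variable, in the spirit of the rescriptive symmetry described above but not the paper's construction, the sketch below integrates a harmonic oscillator once in physical time and once in a re-scaled variable tau. The positive gauge function g is a hypothetical choice made purely for illustration; the phase-space orbit, and hence the conserved energy, is unchanged by the re-scaling:

        import numpy as np
        from scipy.integrate import solve_ivp

        OMEGA = 2.0  # oscillator frequency (illustrative value)

        def oscillator(t, y):
            # Harmonic oscillator in physical time t: x' = v, v' = -omega^2 x.
            x, v = y
            return [v, -OMEGA**2 * x]

        def g(x, v):
            # Hypothetical positive gauge function; dt = g(x, v) dtau re-scales the
            # independent variable without changing the phase-space orbit.
            return 1.0 / (1.0 + v**2)

        def oscillator_rescaled(tau, y):
            # The same vector field expressed in the re-scaled variable tau,
            # with the physical time t carried along as an extra state.
            x, v, t = y
            s = g(x, v)
            return [s * v, -s * OMEGA**2 * x, s]

        energy = lambda x, v: 0.5 * v**2 + 0.5 * OMEGA**2 * x**2

        sol_t = solve_ivp(oscillator, (0.0, 10.0), [1.0, 0.0], rtol=1e-9, atol=1e-12)
        sol_tau = solve_ivp(oscillator_rescaled, (0.0, 40.0), [1.0, 0.0, 0.0], rtol=1e-9, atol=1e-12)

        # Both parameterizations conserve the same energy: the orbit is gauge-invariant.
        print(energy(sol_t.y[0, -1], sol_t.y[1, -1]))
        print(energy(sol_tau.y[0, -1], sol_tau.y[1, -1]))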

    Computational and Biological Analogies for Understanding Fine-Tuned Parameters in Physics

    In this philosophical paper, we explore computational and biological analogies to address the fine-tuning problem in cosmology. We first clarify what it means for physical constants or initial conditions to be fine-tuned. We review important distinctions, such as that between dimensionless and dimensional physical constants, and the classification of constants proposed by Levy-Leblond. Then we explore how two great analogies, computational and biological, can give new insights into our problem. This paper includes a preliminary study to examine the two analogies. Importantly, analogies are both useful and fundamental cognitive tools, but they can also be misused or misinterpreted. The idea that our universe might be modelled as a computational entity is analysed, and we discuss the distinction between physical laws and initial conditions using algorithmic information theory. Smolin introduced the theory of "Cosmological Natural Selection" with a biological analogy in mind. We examine an extension of this analogy involving intelligent life, and discuss if and how this extension could be legitimated. Keywords: origin of the universe, fine-tuning, physical constants, initial conditions, computational universe, biological universe, role of intelligent life, cosmological natural selection, cosmological artificial selection, artificial cosmogenesis. Comment: 25 pages, Foundations of Science, in press

    Calculation of Weibull strength parameters and Batdorf flaw-density constants for volume- and surface-flaw-induced fracture in ceramics

    The calculation of the shape and scale parameters of the two-parameter Weibull distribution is described using least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics, with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for the maximum likelihood estimates of the shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. The report also shows how to calculate the Batdorf flaw-density constants using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
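    The core estimation and goodness-of-fit steps can be sketched with standard statistical tools; the sample strengths below and the use of SciPy in place of the SCARE program are illustrative assumptions, not the report's procedure:

        import numpy as np
        from scipy import stats
        from scipy.special import gamma as gamma_fn

        # Hypothetical fracture-strength sample in MPa (illustrative only, not from the report).
        strengths = np.array([307.0, 325.0, 342.0, 351.0, 366.0,
                              378.0, 389.0, 401.0, 417.0, 438.0])

        # Maximum likelihood fit of the two-parameter Weibull distribution:
        # 'shape' is the Weibull modulus m, 'scale' the characteristic strength;
        # the location parameter is fixed at zero for the two-parameter form.
        shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)

        # Weibull mean value implied by the fitted parameters.
        mean_strength = scale * gamma_fn(1.0 + 1.0 / shape)

        # Kolmogorov-Smirnov goodness-of-fit statistic against the fitted distribution.
        # (The report's unbiasing factors and 90 percent confidence bounds rely on
        # tabulated factors and are not reproduced here.)
        ks_stat, ks_p = stats.kstest(strengths, "weibull_min", args=(shape, loc, scale))

        print(f"Weibull modulus m = {shape:.2f}, characteristic strength = {scale:.1f} MPa")
        print(f"mean strength = {mean_strength:.1f} MPa, K-S statistic = {ks_stat:.3f} (p = {ks_p:.3f})")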