42 research outputs found

    Phase Transition and Strong Predictability

    The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed in our earlier work [K. Tadaki, Local Proceedings of CiE 2008, pp.425-434, 2008], where we introduced the notion of thermodynamic quantities into AIT. These quantities are real functions of temperature T>0. The values of all the thermodynamic quantities diverge when T exceeds 1. This phenomenon corresponds to a phase transition in statistical mechanics. In this paper we introduce the notion of strong predictability for an infinite binary sequence and then apply it to the partition function Z(T), which is one of the thermodynamic quantities in AIT. We then reveal a new computational aspect of the phase transition in AIT by showing the critical difference in the behavior of Z(T) between T=1 and T<1 in terms of the strong predictability of the base-two expansion of Z(T). (Comment: 5 pages, LaTeX2e, no figures)
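    As background, the partition function mentioned above is standardly defined as follows (a sketch of Tadaki's definition, assuming an optimal prefix-free universal machine U; the paper's exact normalization may differ):

        Z(T) = \sum_{p \in \mathrm{dom}(U)} 2^{-|p|/T},   T > 0.

    For 0 < T <= 1 the sum converges to a real number (at T = 1 it equals Chaitin's Omega for U), while for T > 1 it diverges; this is the phase transition referred to above, and the strong-predictability result concerns the base-two expansion of Z(T) in the convergent regime.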

    On the necessity of complexity

    Wolfram's Principle of Computational Equivalence (PCE) implies that universal complexity abounds in nature. This paper comprises three sections. In the first section we consider the question of why there are so many universal phenomena around; in a sense, we seek a driving force behind PCE, if any. We postulate a principle, GNS, which we call the Generalized Natural Selection Principle and which, together with the Church-Turing Thesis, is seen to be equivalent to a weak version of PCE. In the second section we ask why we do not observe any phenomena that are complex but not universal. We choose a cognitive setting to address this question and draw some analogies with formal logic. In the third and final section we report on a case study in which we see rich structures arise everywhere. (Comment: 17 pages, 3 figures)

    Turing machines can be efficiently simulated by the General Purpose Analog Computer

    The Church-Turing thesis states that any sufficiently powerful computational model that captures the notion of algorithm is computationally equivalent to the Turing machine. This equivalence usually holds both at the computability level and at the computational complexity level, modulo polynomial reductions. However, the situation is less clear for models of computation over the real numbers, and no analog of the Church-Turing thesis exists for this case. Recently it was shown that some models of computation with real numbers are equivalent from a computability perspective. In particular, Shannon's General Purpose Analog Computer (GPAC) was shown to be equivalent to Computable Analysis. However, little is known about what happens at the computational complexity level. In this paper we shed some light on the connections between these two models at the complexity level by showing that, modulo polynomial reductions, computations of Turing machines can be simulated by GPACs without using more (space) resources than the original Turing computation, as long as we restrict attention to bounded computations. In other words, computations done by the GPAC are as space-efficient as computations done in the context of Computable Analysis.
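    For orientation, GPAC-generable functions are commonly characterized via polynomial initial value problems (a standard characterization stated here as background, not the paper's own formulation):

        y'(t) = p(y(t)),   y(0) = y_0,

    where p is a vector of polynomials and the generated function is a component of the solution y; for instance, y_1' = y_2, y_2' = -y_1 with y(0) = (0, 1) generates sin and cos. In this line of work the analogue of space is, roughly, a bound on the magnitude of the variables y over the time interval of the simulation.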

    Topologies Refining the Cantor Topology on X^ω

    The space of one-sided infinite words plays a crucial rôle in several parts of Theoretical Computer Science. Usually, it is convenient to regard this space as a metric space, the Cantor space. It has turned out that for several purposes topologies other than that of the Cantor space are useful, e.g. for studying fragments of first-order logic over infinite words or for a topological characterisation of random infinite words. It is shown that both of these topologies refine the topology of the Cantor space. Moreover, from common features of these topologies we extract properties which characterise a large class of topologies. It turns out that, for this general class of topologies, the corresponding closure and interior operators respect the shift operations and also, to some extent, the definability of sets of infinite words by finite automata.
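    For reference, the Cantor topology being refined is the one induced by the usual metric on X^ω (one standard convention, given as background):

        d(ξ, η) = 2^{-|w|} for ξ ≠ η, where w is the longest common prefix of ξ and η, and d(ξ, ξ) = 0.

    Its basic open sets are the cylinders w·X^ω for finite words w; since the topologies discussed above refine this one, every such cylinder remains open in them.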

    Constructive Dimension and Turing Degrees

    This paper examines the constructive Hausdorff and packing dimensions of Turing degrees. The main result is that every infinite sequence S with constructive Hausdorff dimension dim_H(S) and constructive packing dimension dim_P(S) is Turing equivalent to a sequence R with dim_H(R) >= (dim_H(S) / dim_P(S)) - epsilon, for arbitrary epsilon > 0. Furthermore, if dim_P(S) > 0, then dim_P(R) >= 1 - epsilon. The reduction thus serves as a *randomness extractor* that increases the algorithmic randomness of S, as measured by constructive dimension. A number of applications of this result shed new light on the constructive dimensions of Turing degrees. A lower bound of dim_H(S) / dim_P(S) is shown to hold for the constructive Hausdorff dimension of the Turing degree of any sequence S. A new proof is given of a previously known zero-one law for the constructive packing dimension of Turing degrees. It is also shown that, for any regular sequence S (that is, dim_H(S) = dim_P(S)) such that dim_H(S) > 0, the Turing degree of S has constructive Hausdorff and packing dimension equal to 1. Finally, it is shown that no single Turing reduction can be a universal constructive Hausdorff dimension extractor, and that bounded Turing reductions cannot extract constructive Hausdorff dimension. We also exhibit sequences on which weak truth-table and bounded Turing reductions differ in their ability to extract dimension. (Comment: The version of this paper appearing in Theory of Computing Systems, 45(4):740-755, 2009, had an error in the proof of Theorem 2.4, due to insufficient care with the choice of delta. This version modifies that proof to fix the error.)
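    As background, both quantities admit a characterization in terms of prefix-free Kolmogorov complexity K (stated here for orientation, not as the paper's working definition):

        dim_H(S) = liminf_{n→∞} K(S↾n)/n,   dim_P(S) = limsup_{n→∞} K(S↾n)/n,

    where S↾n denotes the first n bits of S. Read this way, the main result says that any S is Turing equivalent to a sequence whose liminf complexity rate is, up to epsilon, at least the ratio of S's own liminf and limsup rates.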

    Membrane Systems and Hypercomputation

    We present a brief analysis of hypercomputation and its relationship to membrane systems theory, including a re-evaluation of Turing's analysis of computation and the importance of timing structure, and we suggest a ‘cosmological’ variant of tissue P systems that is capable of super-Turing behaviour. No prior technical background in hypercomputation theory is assumed.

    G\"odel Incompleteness and the Black Hole Information Paradox

    Semiclassical reasoning suggests that the process by which an object collapses into a black hole and then evaporates by emitting Hawking radiation may destroy information, a problem often referred to as the black hole information paradox. Further, there seems to be no unique prediction of where the information about the collapsing body is localized. We propose that the latter aspect of the paradox may be a manifestation of an inconsistent self-reference in the semiclassical theory of black hole evolution. This suggests the inadequacy of the semiclassical approach or, at worst, that standard quantum mechanics and general relativity are fundamentally incompatible. One option for resolving the localization aspect of the paradox is to identify the Gödel-like incompleteness that corresponds to an imposition of consistency, and to introduce possibly new physics that supplies this incompleteness. Another option is to modify the theory in such a way as to prohibit self-reference. We discuss various possible scenarios to implement these options, including eternally collapsing objects, black hole remnants, black hole final states, and simple variants of semiclassical quantum gravity. (Comment: 14 pages, 2 figures; revised according to journal requirements)

    A proof of the Geroch-Horowitz-Penrose formulation of the strong cosmic censor conjecture motivated by computability theory

    In this paper we present a proof of a mathematical version of the strong cosmic censor conjecture attributed to Geroch-Horowitz and Penrose but formulated explicitly by Wald. The proof is based on the existence of future-inextendible causal curves in the causal pasts of events on the future Cauchy horizon in a non-globally hyperbolic space-time. By examining explicit non-globally hyperbolic space-times we find that, in the case of several physically relevant solutions, these future-inextendible curves have in fact infinite length. In this way we recognize a close relationship between asymptotically flat or anti-de Sitter, physically relevant extendible space-times and the so-called Malament-Hogarth space-times, which play a central role in recent investigations in the theory of "gravitational computers". This motivates us to exhibit a sharper, more geometric formulation of the strong cosmic censor conjecture, namely "all physically relevant, asymptotically flat or anti-de Sitter but non-globally hyperbolic space-times are Malament-Hogarth ones". Our observations may indicate a natural but hidden connection between the strong cosmic censorship scenario and the Church-Turing thesis, revealing an unexpected conceptual depth beneath both conjectures. (Comment: 16 pages, LaTeX, no figures. Final published version)
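    For context, the Malament-Hogarth property mentioned above has a standard definition in the general relativity literature (quoted here as background, not from this paper):

        A space-time (M, g) is Malament-Hogarth if it contains a future-directed timelike curve γ of infinite proper length, ∫_γ dτ = ∞, and an event p ∈ M such that γ ⊂ I^-(p).

    An observer reaching p thus has the entire infinite history of γ in its causal past, which is the feature exploited in "gravitational computer" (supertask) arguments; the sharpened conjecture above asserts that every physically relevant, asymptotically flat or anti-de Sitter but non-globally hyperbolic space-time has this property.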