678 research outputs found

    First Order Theories of Some Lattices of Open Sets

    Full text link
    We show that the first order theory of the lattice of open sets in some natural topological spaces is m-equivalent to second order arithmetic. We also show that for many natural computable metric spaces and computable domains the first order theory of the lattice of effectively open sets is undecidable. Moreover, for several important spaces (e.g., $\mathbb{R}^n$, $n \geq 1$, and the domain $P\omega$) this theory is m-equivalent to first order arithmetic.
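
    For reference (our gloss, not part of the abstract), "m-equivalent" here is standard many-one equivalence of the theories viewed as sets of Gödel codes of sentences:

```latex
% Many-one (m-) reducibility and equivalence (standard definitions):
% A reduces to B when a total computable f translates membership
% questions about A into membership questions about B.
\[
A \le_m B \;\iff\; \exists f \text{ total computable } \forall x \,\bigl(x \in A \leftrightarrow f(x) \in B\bigr),
\qquad
A \equiv_m B \;\iff\; A \le_m B \ \text{and}\ B \le_m A.
\]
```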

    Parameterized Uniform Complexity in Numerics: from Smooth to Analytic, from NP-hard to Polytime

    Full text link
    The synthesis of classical Computational Complexity Theory with Recursive Analysis provides a quantitative foundation to reliable numerics. Here the operators of maximization, integration, and solving ordinary differential equations are known to map (even high-order differentiable) polynomial-time computable functions to instances that are 'hard' for the classical complexity classes NP, #P, and CH; yet, restricted to analytic functions, they map polynomial-time computable functions to polynomial-time computable ones -- non-uniformly! We investigate the uniform parameterized complexity of the above operators in the setting of Weihrauch's TTE and its second-order extension due to Kawamura & Cook (2010). That is, we explore which (both continuous and discrete, first and second order) information and parameters on some given f are sufficient to obtain similar data on Max(f) and int(f), and within what running time, in terms of these parameters and the guaranteed output precision $2^{-n}$. It turns out that Gevrey's hierarchy of functions, climbing from analytic to smooth, corresponds to the computational complexity of maximization growing from polytime to NP-hard. Proof techniques involve mainly the Theory of (discrete) Computation, Hard Analysis, and Information-Based Complexity.
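
    The Gevrey hierarchy invoked in the abstract can be stated as follows (a standard formulation; the constants are generic, not the paper's parameters): a smooth $f$ on a compact interval is Gevrey of level $s \geq 1$ if

```latex
% Gevrey class of level s >= 1; s = 1 recovers exactly the analytic
% functions, and growing s interpolates towards arbitrary smooth ones.
\[
\exists C, R > 0 \;\forall k \in \mathbb{N} \;\forall x:\quad
\bigl| f^{(k)}(x) \bigr| \;\le\; C \cdot R^{k} \cdot (k!)^{s}.
\]
```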

    Global semantic typing for inductive and coinductive computing

    Get PDF
    Inductive and coinductive types are commonly construed as ontological (Church-style) types, denoting canonical data-sets such as natural numbers, lists, and streams. For various purposes, notably the study of programs in the context of global semantics, it is preferable to think of types as semantical properties (Curry-style). Intrinsic theories were introduced in the late 1990s to provide a purely logical framework for reasoning about programs and their semantic types. We extend them here to data given by any combination of inductive and coinductive definitions. This approach is of interest because it fits tightly with the syntactic, semantic, and proof-theoretic fundamentals of formal logic, with potential applications in implicit computational complexity as well as extraction of programs from proofs. We prove a Canonicity Theorem, showing that the global definition of program typing, via the usual (Tarskian) semantics of first-order logic, agrees with the operational semantics of programs in the intended model. Finally, we show that every intrinsic theory is interpretable in a conservative extension of first-order arithmetic. This means that quantification over infinite data objects does not lead, on its own, to proof-theoretic strength beyond that of Peano Arithmetic. Intrinsic theories are perfectly amenable to formulas-as-types Curry-Howard morphisms, and were used to characterize major computational complexity classes. Their extensions described here have similar potential, which has already been applied.
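
    A minimal illustration (our sketch over untyped data, not the paper's intrinsic theories) of the Curry-style reading, where a type is a semantic property that data may or may not satisfy, with one inductive and one coinductive type:

```python
# Illustrative sketch (ours): Curry-style "types as semantic properties"
# over untyped data, with an inductive and a coinductive type.

ZERO = ("zero",)

def succ(n):
    return ("succ", n)

def is_nat(x):
    """Semantic type N: the least property containing ZERO, closed under succ."""
    while True:
        if x == ZERO:
            return True
        if isinstance(x, tuple) and len(x) == 2 and x[0] == "succ":
            x = x[1]          # peel one successor and keep checking
        else:
            return False

def ones():
    """Coinductive stream 1,1,1,...: a thunk returning (head, tail-thunk)."""
    return (1, ones)

def take(n, stream):
    """Coinductive data is specified by observation: read n heads."""
    out = []
    for _ in range(n):
        head, stream = stream()
        out.append(head)
    return out

assert is_nat(succ(succ(ZERO)))    # the numeral 2 satisfies the property N
assert take(3, ones) == [1, 1, 1]  # finite observations of an infinite object
```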

    On the necessity of complexity

    Full text link
    Wolfram's Principle of Computational Equivalence (PCE) implies that universal complexity abounds in nature. This paper comprises three sections. In the first section we consider the question of why there are so many universal phenomena around; in a sense, we seek a driving force behind the PCE, if any. We postulate a principle, GNS, which we call the Generalized Natural Selection Principle, and which together with the Church-Turing Thesis is seen to be equivalent to a weak version of PCE. In the second section we ask why we do not observe any phenomena that are complex but not universal. We choose a cognitive setting to embark on this question and make some analogies with formal logic. In the third and final section we report on a case study where we see rich structures arise everywhere.
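
    The abstract does not say which system its case study uses; elementary cellular automata are the canonical setting for PCE-style observations, with Rule 110 proved universal by Cook. A minimal simulator of such a rule (our sketch):

```python
# Minimal elementary cellular automaton on a ring; Rule 110 is the standard
# example of a very simple rule exhibiting universal computation.
def step(cells, rule=110):
    n = len(cells)
    # Each cell's next state is the bit of `rule` indexed by the 3-bit
    # neighborhood (left, center, right).
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

cells = [0] * 40 + [1]  # a single live cell on a ring of 41 sites
for _ in range(20):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```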

    Feedback computability on Cantor space

    Full text link
    We introduce the notion of feedback computable functions from $2^\omega$ to $2^\omega$, extending feedback Turing computation in analogy with the standard notion of computability for functions from $2^\omega$ to $2^\omega$. We then show that the feedback computable functions are precisely the effectively Borel functions. With this as motivation we define the notion of a feedback computable function on a structure, independent of any coding of the structure as a real. We show that this notion is absolute, and as an example characterize those functions that are computable from a Gandy ordinal with some finite subset distinguished.
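
    For orientation (our sketch, not the paper's construction): in the ordinary, non-feedback notion, a function $2^\omega \to 2^\omega$ is computable when each output bit is determined after querying only finitely many input bits; feedback computation extends this by additionally letting a machine query the halting behaviour of machines of its own kind.

```python
# Ordinary type-two computation on Cantor space, in miniature: a point of
# 2^ω is a function from bit positions to {0, 1}, and a computable operator
# produces each output bit from finitely many oracle queries to the input.

def shift(x):
    """Computable F: 2^ω → 2^ω; output bit n queries only input bit n+1."""
    return lambda n: x(n + 1)

x = lambda n: n % 2              # the sequence 0, 1, 0, 1, ...
y = shift(x)
print([y(n) for n in range(6)])  # [1, 0, 1, 0, 1, 0], one query per bit
```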

    Basic notions of universal algebra for language theory and graph grammars

    Get PDF
    This paper reviews the basic properties of the equational and recognizable subsets of general algebras; these sets can be seen as generalizations of the context-free and regular languages, respectively. This approach, based on Universal Algebra, facilitates the development of the theory of formal languages so as to include the description of sets of finite trees, finite graphs, finite hypergraphs, tuples of words, partially commutative words (also called traces), and other similar finite objects.
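
    As a toy instance of recognizability in this algebraic sense (our example, not the paper's): a word language is recognizable when it is the inverse image, under a homomorphism into a finite algebra, of a subset of that algebra.

```python
# Toy algebraic recognizability: words over {a, b} with an even number of
# a's, recognized by a homomorphism onto the finite monoid Z/2 (addition
# mod 2) with accepting subset {0}.

def h(word):
    """Monoid homomorphism (A*, concatenation) -> (Z/2, +): count a's mod 2."""
    v = 0
    for letter in word:
        v = (v + (1 if letter == "a" else 0)) % 2
    return v

def recognized(word):
    return h(word) in {0}  # L = h^{-1}({0}), hence recognizable

print(recognized("abba"), recognized("ab"))  # True False
```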