32 research outputs found

    Effectiveness of Hindman's theorem for bounded sums

    Full text link
    We consider the strength and effective content of restricted versions of Hindman's Theorem in which the number of colors is specified and the length of the sums has a specified finite bound. Let $\mathsf{HT}^{\leq n}_k$ denote the assertion that for each $k$-coloring $c$ of $\mathbb{N}$ there is an infinite set $X \subseteq \mathbb{N}$ such that all sums $\sum_{x \in F} x$ for $F \subseteq X$ and $0 < |F| \leq n$ have the same color. We prove that there is a computable $2$-coloring $c$ of $\mathbb{N}$ such that there is no infinite computable set $X$ such that all nonempty sums of at most $2$ elements of $X$ have the same color. It follows that $\mathsf{HT}^{\leq 2}_2$ is not provable in $\mathsf{RCA}_0$, and in fact we show that it implies $\mathsf{SRT}^2_2$ in $\mathsf{RCA}_0$. We also show that there is a computable instance of $\mathsf{HT}^{\leq 3}_3$ with all solutions computing $0'$. The proof of this result shows that $\mathsf{HT}^{\leq 3}_3$ implies $\mathsf{ACA}_0$ in $\mathsf{RCA}_0$.
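
    To make the combinatorial content of $\mathsf{HT}^{\leq 2}_2$ concrete, here is a minimal Python sketch that checks the monochromatic-sums condition for a finite set $X$ under a given coloring. The coloring below (parity of the 2-adic valuation) and the sets tested are toy choices for illustration, not the computable coloring constructed in the paper, and a finite check can only illustrate, not witness, the infinite statement.

    ```python
    from itertools import combinations

    def sums_monochromatic(coloring, X, n=2):
        """True iff all sums over nonempty F subset of X with |F| <= n get one color."""
        colors = {coloring(sum(F))
                  for size in range(1, n + 1)
                  for F in combinations(X, size)}
        return len(colors) == 1

    def c(m):
        """Toy 2-coloring of the positive integers: parity of the 2-adic valuation."""
        v = 0
        while m % 2 == 0:
            m //= 2
            v += 1
        return v % 2

    print(sums_monochromatic(c, {1, 5, 6}))     # False: c(1) = c(5) = 0 but c(6) = 1
    print(sums_monochromatic(c, {4, 16, 64}))   # True: every sum has even 2-adic valuation
    ```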

    Degree spectra for transcendence in fields

    Full text link
    We show that for both the unary relation of transcendence and the finitary relation of algebraic independence on a field, the degree spectra of these relations may consist of any single computably enumerable Turing degree, or of those c.e. degrees above an arbitrary fixed $\Delta^0_2$ degree. In other cases, these spectra may be characterized by the ability to enumerate an arbitrary $\Sigma^0_2$ set. This is the first proof that a computable field can fail to have a computable copy with a computable transcendence basis.
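
    For orientation, the degree spectrum here is the standard notion from computable structure theory; under the usual definition (the paper may use a variant),

    ```latex
    \mathrm{DgSp}_{\mathcal{A}}(R) \;=\;
      \bigl\{\, \deg\!\bigl(f(R)\bigr) \;:\;
        f \text{ is an isomorphism from } \mathcal{A}
        \text{ onto a computable structure } \mathcal{B} \,\bigr\},
    ```

    so the results describe exactly which sets of Turing degrees can arise as the image of the transcendence (or independence) relation across computable copies of the field.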

    A Hierarchy of Polynomial Kernels

    Full text link
    In parameterized algorithmics, kernelization is a polynomial-time algorithm that transforms an instance of a given problem into an equivalent instance whose size is bounded by a function of the parameter. Since this smaller instance can then be solved to answer the original question, kernelization is often presented as a form of preprocessing. A natural generalization allows a number of smaller instances to be produced to answer the original problem, possibly also using negation; this generalization is called Turing kernelization. Questions of equivalence immediately arise: when is one form possible and not the other? These have been long-standing open problems in parameterized complexity. In the present paper, we answer many of them. In particular, we show that Turing kernelizations differ not only from regular kernelizations but also from intermediate forms such as truth-table kernelizations. We obtain absolute results by diagonalization, as well as results on natural problems that depend on widely accepted complexity-theoretic assumptions. In particular, we improve on known lower bounds for the kernel size of compositional problems under these assumptions.
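
    As a concrete point of reference for the classical (many-one) notion, the sketch below implements Buss's textbook kernelization for Vertex Cover; it is a standard example, not taken from the paper. A Turing kernelization would instead be allowed to generate many such parameter-bounded instances, query an oracle on each, and combine the answers arbitrarily, including with negation.

    ```python
    def buss_kernel(edges, k):
        """Buss's classical many-one kernelization for Vertex Cover.

        Returns ("NO", None, None) if no cover of size <= k exists, or
        ("REDUCED", kernel_edges, k2) with at most k2*k2 kernel edges.
        """
        edges = {frozenset(e) for e in edges}
        forced = set()                      # vertices every small cover must use
        while True:
            deg = {}
            for e in edges:
                for v in e:
                    deg[v] = deg.get(v, 0) + 1
            budget = k - len(forced)
            high = [v for v, d in deg.items() if d > budget]
            if not high:
                break
            v = high[0]                     # degree > budget: v is in every cover
            forced.add(v)
            edges = {e for e in edges if v not in e}
            if len(forced) > k:
                return ("NO", None, None)
        k2 = k - len(forced)
        if len(edges) > k2 * k2:            # max degree <= k2, so a size-k2 cover
            return ("NO", None, None)       # touches at most k2*k2 edges
        return ("REDUCED", edges, k2)       # kernel with O(k^2) edges

    # Star with five leaves plus a disjoint triangle, k = 3: vertex 0 is forced,
    # leaving the triangle as a kernel with parameter 2.
    E = [(0, 1), (0, 2), (0, 3), (0, 4), (0, 5), (6, 7), (7, 8), (6, 8)]
    print(buss_kernel(E, 3))
    ```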

    Construire les fonctions récursives totales en Coq (Constructing total recursive functions in Coq)

    Get PDF
    We present a (relatively) short mechanized proof that Coq can type any recursive function that is provably total in Coq. The well-founded (and terminating) induction scheme, which is the foundation of recursion in Coq, is maximal. We implement an unbounded minimization scheme for decidable predicates; it can also be used to reify a whole category of undecidable predicates. The development is purely constructive and requires no axioms. Hence it can be integrated into any project, even one that assumes additional axioms.
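
    The core scheme is easy to state outside Coq. Here is a Python rendering of unbounded minimization (the µ operator) for a decidable predicate; it illustrates the idea only and is not the paper's Coq development, where the real work is proving that the search terminates whenever the predicate is satisfiable.

    ```python
    def mu(p):
        """Least n with p(n) true; terminates whenever some n satisfies p.

        In Coq the analogous function must come with a totality proof;
        here the loop simply diverges if p is never satisfied.
        """
        n = 0
        while not p(n):
            n += 1
        return n

    print(mu(lambda n: n * n >= 1000))   # -> 32, since 31**2 = 961 < 1000 <= 32**2
    ```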

    On Realization of Index Sets in $\Pi^0_1$-Classes

    No full text

    On Splits of Computably Enumerable Sets

    No full text
