Effectiveness of Hindman's theorem for bounded sums
We consider the strength and effective content of restricted versions of
Hindman's Theorem in which the number of colors is specified and the length of
the sums has a specified finite bound. Let $\mathrm{HT}^{\le n}_k$ denote the
assertion that for each $k$-coloring $c$ of $\mathbb{N}$ there is an infinite
set $X \subseteq \mathbb{N}$ such that all sums $\sum_{x \in F} x$ for
$F \subseteq X$ and $0 < |F| \le n$ have the same color. We prove that there is a
computable $2$-coloring $c$ of $\mathbb{N}$ such that there is no infinite
computable set $X$ such that all nonempty sums of at most $2$ elements of $X$
have the same color. It follows that $\mathrm{HT}^{\le 2}_2$ is not provable
in $\mathrm{RCA}_0$, and in fact we show that it implies $\mathrm{SRT}^2_2$ in
$\mathrm{RCA}_0$. We also show that there is a computable instance of
$\mathrm{HT}^{\le 3}_3$ with all solutions computing $0'$. The proof of this
result shows that $\mathrm{HT}^{\le 3}_3$ implies $\mathrm{ACA}_0$ in
$\mathrm{RCA}_0$.
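The bounded-sums condition in these restricted Hindman principles can be illustrated concretely. The following Python sketch (the function name and the example coloring are ours, not from the paper) checks whether a finite set is monochromatic for sums of at most n distinct elements under a given coloring:

```python
from itertools import combinations

def monochromatic_bounded_sums(coloring, xs, n):
    """Check whether every sum of at most n distinct elements of xs
    receives the same color under `coloring` (a function on positive
    integers)."""
    sums = {sum(f) for size in range(1, n + 1)
                   for f in combinations(xs, size)}
    return len({coloring(s) for s in sums}) == 1

# Example: coloring by parity; for powers of 4, all singletons and
# all pairwise sums are even, so the check succeeds for n = 2.
parity = lambda m: m % 2
print(monochromatic_bounded_sums(parity, [4, 16, 64, 256], 2))  # True
```

The theorem concerns infinite sets, so a finite check like this can only refute monochromaticity or confirm it for an initial segment.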
Degree spectra for transcendence in fields
We show that for both the unary relation of transcendence and the finitary
relation of algebraic independence on a field, the degree spectra of these
relations may consist of any single computably enumerable Turing degree, or of
those c.e. degrees above an arbitrary fixed degree. In other
cases, these spectra may be characterized by the ability to enumerate an
arbitrary set. This is the first proof that a computable field can
fail to have a computable copy with a computable transcendence basis.
A Hierarchy of Polynomial Kernels
In parameterized algorithmics, kernelization is defined as a polynomial-time
algorithm that transforms an instance of a given problem into an equivalent
instance whose size is bounded by a function of the parameter. Since this
smaller instance can afterwards be solved to answer the original question,
kernelization is often presented as a form of preprocessing. A natural
generalization of kernelization allows a number of smaller instances to be
produced in order to answer the original problem, possibly also using negation.
This generalization is called Turing kernelization. Questions of equivalence
immediately arise, as does the question of when one form is possible and the
other is not. These have been long-standing open problems in parameterized
complexity. In the present paper, we answer many of them. In particular, we
show that Turing kernelizations differ not only from regular kernelizations,
but also from intermediate forms such as truth-table kernelizations. We obtain
absolute results by diagonalization, as well as results on natural problems
that depend on widely accepted complexity-theoretic assumptions. In particular,
we improve on known lower bounds for the kernel size of compositional problems
using these assumptions.
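As a concrete illustration of what a kernelization is (this example is ours; the paper concerns the general theory rather than any particular problem), the classical Buss kernel for Vertex Cover shrinks an instance to at most k^2 edges in polynomial time:

```python
def buss_kernel(edges, k):
    """Classical Buss kernelization for Vertex Cover: return an
    equivalent instance (edges', k') with at most k'^2 edges, or None
    if (edges, k) is already recognized as a no-instance."""
    edges = {frozenset(e) for e in edges}
    while True:
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        # A vertex of degree > k must belong to any cover of size <= k,
        # so take it into the cover and delete its edges.
        high = next((v for v, d in degree.items() if d > k), None)
        if high is None:
            break
        edges = {e for e in edges if high not in e}
        k -= 1
        if k < 0:
            return None
    # With maximum degree <= k, any yes-instance has at most k^2 edges.
    if len(edges) > k * k:
        return None
    return edges, k

# A star with 5 leaves and k = 1: the center is forced into the cover,
# leaving the trivial yes-instance (no edges, budget 0).
print(buss_kernel([(0, i) for i in range(1, 6)], 1))
```

A Turing kernelization, by contrast, may query an oracle on many such small instances and combine the answers, which is exactly the distinction the paper studies.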
Construire les fonctions récursives totales en Coq (Constructing total recursive functions in Coq)
We present a (relatively) short mechanized proof that Coq can type any recursive function that is provably total in Coq. The well-founded (and terminating) induction scheme, which is the foundation of Coq recursion, is maximal. We implement an unbounded minimization scheme for decidable predicates; it can also be used to reify a whole category of undecidable predicates. The development is purely constructive and requires no axiom, so it can be integrated into any project, including those that assume additional axioms.
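The unbounded minimization scheme mentioned above corresponds to the classical mu-operator. As a minimal sketch (in Python rather than Coq, with names of our choosing), minimization over a decidable predicate searches for the least witness and terminates exactly when one exists; the cited development handles this partiality constructively inside Coq:

```python
def mu(p):
    """Unbounded minimization (the mu-operator): return the least n
    with p(n) true.  Loops forever if no n satisfies p, which is why
    a total formalization needs a termination certificate."""
    n = 0
    while not p(n):
        n += 1
    return n

print(mu(lambda n: n * n >= 50))  # 8
```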