
    The Speedup Theorem in a Primitive Recursive Framework

    Blum’s speedup theorem is a major theorem in computational complexity, showing the existence of computable functions for which no optimal program can exist: for any speedup function r there exists a function f_r such that for any program computing f_r we can find an alternative program computing it with the desired speedup r. The main corollary is that algorithmic problems do not, in general, have an inherent complexity. Traditional proofs of the speedup theorem make essential use of Kleene’s fixed-point theorem to close a suitable diagonal argument. As a consequence, very little is known about its validity in subrecursive settings, where there is no universal machine and no fixed points. In this article we discuss an alternative, formal proof of the speedup theorem that allows us to dispense with the fixed-point theorem and sheds more light on the actual complexity of the function f_r.
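    For orientation, the classical statement of the theorem can be sketched in LaTeX as follows, in Blum’s standard notation; this is the textbook formulation, not the paper’s own primitive recursive rendering:

        % Blum's speedup theorem (classical statement).
        % \varphi_i = function computed by program i;
        % \Phi_i    = its Blum complexity measure (e.g. running time);
        % "a.e." abbreviates "for almost all n".
        \forall r \; \exists f_r \; \forall i \,
          \bigl( \varphi_i = f_r \;\Rightarrow\;
            \exists j \, ( \varphi_j = f_r \;\wedge\;
              r(n, \Phi_j(n)) \le \Phi_i(n) \;\text{a.e.} ) \bigr)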

    On primitive recursive algorithms and the greatest common divisor function

    We establish linear lower bounds for the complexity of non-trivial, primitive recursive algorithms from piecewise linear given functions. The main corollary is that logtime algorithms for the greatest common divisor from such givens (such as Stein’s) cannot be matched in efficiency by primitive recursive algorithms from the same given functions. The question is left open for the Euclidean algorithm, which assumes the remainder function.

    In 1991, Colson [3] proved a remarkable theorem about the limitations of primitive recursive algorithms, which has the following consequence:

    Colson’s Corollary. If a primitive recursive derivation of min(x, y) is expressed faithfully in a programming language, then one of the two computations min(1, 1000) and min(1000, 1) will take at least 1000 steps.

    The point is that the natural algorithm, which computes min(x, y) in O(min(x, y)) steps, cannot be matched in efficiency by a primitive recursive program, even though min(x, y) is a primitive recursive function; and so, as a practical and (especially) a foundational matter, we need to consider “recursive schemes” more general than primitive recursion, even if, ultimately, we are only interested in primitive recursive functions.

    In this paper we consider extensions of Colson’s Theorem which allow conditional definitions and, especially, calls to a rich variety of “given” functions, whose values are produced on demand in constant time. A sample, easy-to-state result:

    Corollary 20. Consider primitive-recursive-like derivations which, in addition to composition and primitive recursion, allow definition by cases and calls to the following functions and (characteristic functions of) relations: x + y, x ∸ y (truncated subtraction), x ÷ 2, Parity(x), x = y, x < y. For each such derivation of the greatest common divisor function gcd(x, y), there is a sequence of pairs {(x_t, y_t)} and a rational constant r > 0 such that lim_t (x_t + y_t) = ∞ and, for all t, c*(x_t, y_t) ≥ r·(x_t + y_t).
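    To make Colson’s Corollary concrete, here is a minimal sketch in Haskell (our illustration; the paper works with formal derivations, not a programming language) contrasting a primitive recursive derivation of min with the natural algorithm:

        -- Unary natural numbers, as in the setting of primitive
        -- recursive derivations.
        data Nat = Z | S Nat

        pred' :: Nat -> Nat
        pred' Z     = Z
        pred' (S n) = n

        -- Truncated subtraction (monus), by primitive recursion on the
        -- second argument: monus x y takes about y unfolding steps.
        monus :: Nat -> Nat -> Nat
        monus x Z     = x
        monus x (S n) = pred' (monus x n)

        -- A primitive recursive derivation of min, via the identity
        -- min(x, y) = x ∸ (x ∸ y). Evaluating it at (1, 1000) unfolds
        -- monus 1000 times, as Colson’s Corollary predicts.
        minPR :: Nat -> Nat -> Nat
        minPR x y = monus x (monus x y)

        -- The natural algorithm: simultaneous recursion on both
        -- arguments, O(min(x, y)) steps, but not expressible as a
        -- primitive recursive derivation.
        minNat :: Nat -> Nat -> Nat
        minNat Z     _     = Z
        minNat _     Z     = Z
        minNat (S x) (S y) = S (minNat x y)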
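    For contrast with Corollary 20, here is a sketch of Stein’s binary gcd algorithm, which runs in O(log x + log y) steps and uses only operations among the listed givens (parity, halving, truncated subtraction, comparison; doubling is x + x). By the corollary, no primitive-recursive-like derivation from those givens can match it:

        -- Stein's binary gcd: every step halves at least one argument
        -- (possibly after a subtraction), so the step count is
        -- logarithmic in the inputs.
        gcdStein :: Integer -> Integer -> Integer
        gcdStein x 0 = x
        gcdStein 0 y = y
        gcdStein x y
          | even x && even y = 2 * gcdStein (x `div` 2) (y `div` 2)
          | even x           = gcdStein (x `div` 2) y
          | even y           = gcdStein x (y `div` 2)
          | x >= y           = gcdStein ((x - y) `div` 2) y  -- both odd
          | otherwise        = gcdStein ((y - x) `div` 2) x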