
    On tiered small jump operators

    Predicative analysis of recursion schemas is a method for characterizing complexity classes such as FPTIME, the class of polynomial-time computable functions. This analysis originates in the work of Bellantoni and Cook, and of Leivant, on data tiering. Here, we refine predicative analysis by means of a ramified version of Ackermann's construction of a non-primitive-recursive function. We obtain a hierarchy of functions which characterizes exactly the functions computable in O(n^k) time on the register machine model of computation. For this, we introduce a strict ramification principle. We then show how to diagonalize in order to obtain an exponential function and thus jump outside deterministic polynomial time. Lastly, we suggest a dependently typed lambda-calculus to represent this construction.
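    To give a flavor of the data tiering at work here, the following is a minimal Haskell sketch of Bellantoni-Cook-style safe recursion (our illustration, not the paper's construction; for readability it uses unary numerals, whereas Bellantoni and Cook work over binary notation):

        -- Tiering convention (enforced by discipline, not by the types):
        -- only the first, "normal" argument may drive a recursion; the
        -- second, "safe" argument may be used but never recursed on.
        addS :: Integer -> Integer -> Integer
        addS 0 y = y
        addS x y = 1 + addS (x - 1) y

        -- Multiplication recurses on a normal argument and feeds earlier
        -- results only into safe positions, which keeps growth polynomial.
        -- Recursing on a previously computed result is exactly what the
        -- strict ramification principle forbids, and what the paper's
        -- diagonalization step violates to reach exponential growth.
        mulS :: Integer -> Integer -> Integer
        mulS 0 _ = 0
        mulS x y = addS y (mulS (x - 1) y)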

    Context Semantics, Linear Logic and Computational Complexity

    We show that context semantics can be fruitfully applied to the quantitative analysis of proof normalization in linear logic. In particular, context semantics lets us define the weight of a proof-net as a measure of its inherent complexity: it is both an upper bound on normalization time (modulo a polynomial overhead, independently of the reduction strategy) and a lower bound on the number of steps to normal form (for certain reduction strategies). Weights are then exploited in proving strong soundness theorems for various subsystems of linear logic, namely elementary linear logic, soft linear logic and light linear logic.
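    Schematically, the two bounds above can be written as follows (notation ours: W(\pi) is the weight of a proof-net \pi, T_\sigma(\pi) the number of reduction steps to normal form under strategy \sigma, and p a fixed polynomial):

        \[
          T_\sigma(\pi) \le p\bigl(W(\pi)\bigr) \quad \text{for every strategy } \sigma,
          \qquad
          W(\pi) \le T_\sigma(\pi) \quad \text{for certain strategies } \sigma.
        \]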

    An Invariant Cost Model for the Lambda Calculus

    We define a new cost model for the call-by-value lambda-calculus satisfying the invariance thesis. That is, under the proposed cost model, Turing machines and the call-by-value lambda-calculus can simulate each other within a polynomial time overhead. The model relies only on combinatorial properties of usual beta-reduction, without any reference to a specific machine or evaluator. In particular, the cost of a single beta reduction is proportional to the difference between the size of the redex and the size of the reduct. In this way, the total cost of normalizing a lambda term takes into account the size of all intermediate results (as well as the number of steps to normal form).
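    A worked instance of the size-difference cost (size conventions ours: |x| = 1, |\lambda x.M| = |M| + 1, |M N| = |M| + |N| + 1): applying a duplicating redex to a term N of size n gives

        \[
          (\lambda x.\, x\, x)\, N \;\to_\beta\; N\, N,
          \qquad
          |(\lambda x.\, x\, x)\, N| = n + 5,
          \qquad
          |N\, N| = 2n + 1,
        \]

    so the step costs \Theta(n): it pays for the copy of N it creates. Symmetrically, an erasing step (\lambda x.\, y)\, N \to_\beta y costs \Theta(n) for the material it discards.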

    From Normal Functors to Logarithmic Space Queries

    We introduce a new approach to implicit complexity in linear logic, inspired by functional database query languages and using recent developments in effective denotational semantics of polymorphism. We give the first sub-polynomial upper bound in a type system with impredicative polymorphism; adding restrictions on quantifiers yields a characterization of logarithmic space, for which extensional completeness is established via descriptive complexity

    Elementary linear logic revisited for polynomial time and an exponential time hierarchy (extended version)

    A short version of this work is to appear in the proceedings of the Asian Symposium on Programming Languages and Systems (APLAS 2011). Elementary linear logic is a simple variant of linear logic, introduced by Girard, which characterizes in the proofs-as-programs approach the class of elementary functions (those computable in time bounded by a tower of exponentials of fixed height). Our goal here is to show that despite its simplicity, elementary linear logic can nevertheless be used as a common framework to characterize the different levels of a hierarchy of deterministic time complexity classes within elementary time. We consider a variant of this logic with type fixpoints and weakening. By selecting specific types we then characterize the class P of polynomial time predicates and, more generally, the hierarchy of classes k-EXP for k >= 0, where k-EXP is the union of the classes DTIME(2_k^{n^i}) for i >= 1.
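    Here 2_k^{n^i} denotes the standard iterated exponential, spelled out for convenience (standard definitions, not specific to this paper):

        \[
          2_0^x = x, \qquad 2_{k+1}^x = 2^{2_k^x},
          \qquad
          k\text{-EXP} = \bigcup_{i \ge 1} \mathrm{DTIME}\bigl(2_k^{n^i}\bigr),
        \]

    so that 0-EXP = P and 1-EXP = EXP.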

    Combining Linear Logic and Size Types for Implicit Complexity

    Several type systems have been proposed to statically control the time complexity of lambda-calculus programs and to characterize complexity classes such as FPTIME or FEXPTIME. A first line of research stems from linear logic and restricted versions of its !-modality controlling duplication. A second approach relies on tracking the size increase between input and output and, together with a restricted recursion scheme, deducing time complexity bounds. However, both approaches suffer from limitations: either a limited intensional expressivity or linearity restrictions. In the present work we incorporate both approaches into a common type system, in order to overcome their respective constraints. Our system, called sEAL, is based on elementary linear logic combined with linear size types, and leads to characterizations of the complexity classes FPTIME and 2k-FEXPTIME, for k >= 0.
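    As a caricature of the size-type half of this combination (notation ours, not sEAL's actual syntax), a type can carry a bound on output size in terms of input size, so that composition accumulates bounds; a recursion scheme restricted to such size-bounded step functions can then only produce polynomially growing results:

        \[
          \mathsf{double} : \mathrm{Nat}[n] \multimap \mathrm{Nat}[2n],
          \qquad
          \mathsf{double} \circ \mathsf{double} : \mathrm{Nat}[n] \multimap \mathrm{Nat}[4n].
        \]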

    Mathematische Logik

    [no abstract available]