
    Strong normalization of lambda-bar-mu-mu-tilde-calculus with explicit substitutions

    The lambda-bar-mu-mu-tilde-calculus, defined by Curien and Herbelin, is a variant of the lambda-mu-calculus that exhibits symmetries such as terms/contexts and call-by-name/call-by-value. Since it is a symmetric, and hence non-deterministic, calculus, the usual normalization proof techniques need some adjustment to work in this setting. Here we prove strong normalization (SN) of the simply typed lambda-bar-mu-mu-tilde-calculus with explicit substitutions. For that purpose, we first prove SN of the simply typed lambda-bar-mu-mu-tilde-calculus (by a variant of the reducibility technique of Barbanera and Berardi), then we formalize a proof technique of SN via PSN (preservation of strong normalization), and we prove PSN by the perpetuality technique, as formalized by Bonelli.
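
    To make the terms/contexts symmetry concrete, the following is a minimal Haskell sketch of the syntax of the lambda-bar-mu-mu-tilde-calculus with an explicit-substitution constructor on each syntactic category; the constructor names are illustrative and the sketch is not the paper's formalization.

        -- A minimal sketch of lambda-bar-mu-mu-tilde syntax, extended with an
        -- explicit-substitution constructor on terms and on contexts.
        type Var   = String   -- term variables (x, y, ...)
        type CoVar = String   -- context variables (alpha, beta, ...)

        -- A command cuts a term (producer) against a context (consumer).
        data Command = Cut Term Context
          deriving Show

        data Term                          -- producers
          = TVar Var
          | Lam Var Term                   -- lambda x. v
          | Mu CoVar Command               -- mu alpha. c
          | TSub Term Var Term             -- explicit substitution v[x := v']
          deriving Show

        data Context                       -- consumers, symmetric to Term
          = CVar CoVar
          | Push Term Context              -- v . e  (push an argument onto the context)
          | MuTilde Var Command            -- mu~ x. c
          | CSub Context CoVar Context     -- explicit substitution e[alpha := e']
          deriving Show

        -- Source of non-determinism: a command  Cut (Mu a c) (MuTilde x c')  can be
        -- reduced either by the mu rule (substituting the context into c) or by the
        -- mu~ rule (substituting the term into c'), which is why the standard
        -- normalization arguments need adjusting.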

    Strong Normalization for HA + EM1 by Non-Deterministic Choice

    We study the strong normalization of a new Curry-Howard correspondence for HA + EM1, constructive Heyting Arithmetic with the excluded middle on Σ⁰₁-formulas. The proof-term language of HA + EM1 consists of the lambda calculus plus an operator ||_a which represents, from the viewpoint of programming, an exception operator with delimited scope and, from the viewpoint of logic, a restricted version of the excluded middle. We give a strong normalization proof for the system based on a technique of "non-deterministic immersion".
    Comment: In Proceedings COS 2013, arXiv:1309.092
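
    As a rough illustration of reading ||_a as an exception operator with delimited scope, here is a hedged Haskell sketch; the type EM1 and the handler handleEM1 are invented names, not the paper's proof-term language.

        -- A hedged sketch: running the branch that assumes the universal (negative)
        -- case either completes, or raises a concrete witness through hypothesis a.
        data EM1 w r
          = Done r          -- the universal branch finished without needing a witness
          | Raise w         -- hypothesis a was applied to a concrete witness w

        -- The delimited handler: keep the result, or switch to the existential
        -- branch, resuming with the raised witness.
        handleEM1 :: EM1 w r -> (w -> r) -> r
        handleEM1 (Done r)  _           = r
        handleEM1 (Raise w) fromWitness = fromWitness w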

    Proving termination of evaluation for System F with control operators

    We present new proofs of termination of evaluation in reduction semantics (i.e., a small-step operational semantics with explicit representation of evaluation contexts) for System F with control operators. We introduce a modified version of Girard's proof method based on reducibility candidates, where the reducibility predicates are defined on values and on evaluation contexts as prescribed by the reduction semantics format. We address both abortive control operators (callcc) and delimited-control operators (shift and reset) for which we introduce novel polymorphic type systems, and we consider both the call-by-value and call-by-name evaluation strategies.
    Comment: In Proceedings COS 2013, arXiv:1309.092
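
    To make the reduction-semantics format concrete, here is a minimal untyped Haskell sketch of terms with shift and reset, together with an explicit, first-order representation of evaluation contexts and the function that plugs a term back into a context; all names are illustrative, and the sketch says nothing about the paper's polymorphic type systems or reducibility predicates.

        -- A minimal untyped sketch of the reduction-semantics setup: terms with
        -- shift/reset, explicit evaluation contexts, and the plug function.  The
        -- decomposition and contraction steps are omitted.
        data Term
          = Var String
          | Lam String Term
          | App Term Term
          | Shift String Term        -- shift k. t   (captures the delimited context)
          | Reset Term               -- <t>          (delimits the context)
          deriving Show

        -- Call-by-value evaluation contexts up to the nearest enclosing Reset.
        data Ctx
          = Hole
          | AppL Ctx Term            -- E t
          | AppR Term Ctx            -- v E  (the left component is already a value)
          deriving Show

        -- plug c t rebuilds the term c[t]; in proofs of this kind the reducibility
        -- predicates are defined on values and on such contexts.
        plug :: Ctx -> Term -> Term
        plug Hole       t = t
        plug (AppL c u) t = App (plug c t) u
        plug (AppR v c) t = App v (plug c t)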

    Normalizing the Taylor expansion of non-deterministic λ-terms, via parallel reduction of resource vectors

    It has been known since Ehrhard and Regnier's seminal work on the Taylor expansion of λ-terms that this operation commutes with normalization: the expansion of a λ-term is always normalizable, and its normal form is the expansion of the Böhm tree of the term. We generalize this result to the non-uniform setting of the algebraic λ-calculus, i.e. the λ-calculus extended with linear combinations of terms. This requires us to tackle two difficulties: foremost is the fact that Ehrhard and Regnier's techniques rely heavily on the uniform, deterministic nature of the ordinary λ-calculus, and thus cannot be adapted; second is the absence of any satisfactory generic extension of the notion of Böhm tree in the presence of quantitative non-determinism, which is reflected by the fact that the Taylor expansion of an algebraic λ-term is not always normalizable. Our solution is to provide a fine-grained study of the dynamics of β-reduction under Taylor expansion, by introducing a notion of reduction on resource vectors, i.e. infinite linear combinations of resource λ-terms. The latter form the multilinear fragment of the differential λ-calculus, and resource vectors are the target of the Taylor expansion of λ-terms. We show that the reduction of resource vectors contains the image of any β-reduction step, from which we deduce that Taylor expansion and normalization commute on the nose. We moreover identify a class of algebraic λ-terms, encompassing both normalizable algebraic λ-terms and arbitrary ordinary λ-terms: the expansion of these is always normalizable, which guides the definition of a generalization of Böhm trees to this setting.
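
    The following Haskell sketch shows the shape of resource λ-terms (applications carry finite bags of arguments) and enumerates the coefficient-free support of the Taylor expansion truncated at a given bag size; coefficients and the reduction on resource vectors are omitted, and the definitions are illustrative rather than the paper's.

        -- A sketch of resource lambda-terms and of the (coefficient-free) support of
        -- the Taylor expansion, truncated so that every application bag has size at
        -- most n.  This only illustrates what resource vectors range over.
        data Lam = V String | L String Lam | A Lam Lam
          deriving Show

        data Res = RV String | RL String Res | RA Res [Res]   -- bag encoded as a list
          deriving Show

        taylor :: Int -> Lam -> [Res]
        taylor _ (V x)   = [RV x]
        taylor n (L x t) = [RL x s | s <- taylor n t]
        taylor n (A t u) =
          [ RA s bag
          | s   <- taylor n t
          , k   <- [0 .. n]                              -- choose a bag size
          , bag <- sequence (replicate k (taylor n u)) ] -- k resource copies of u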

    QPCF: higher order languages and quantum circuits

    qPCF is a paradigmatic quantum programming language that extends PCF with quantum circuits and a quantum co-processor. Quantum circuits are treated as classical data that can be duplicated and manipulated in flexible ways by means of a dependent type system. The co-processor is essentially a standard QRAM device, although we avoid permanently storing quantum states between two co-processor calls. Despite its quantum features, qPCF retains the classic programming approach of PCF. We introduce qPCF's syntax, typing rules, and operational semantics. We prove fundamental properties of the system, such as the Preservation and Progress Theorems. Moreover, we provide some higher-order examples of circuit encoding.
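
    As a hedged illustration of the slogan that circuits are classical data, here is a small Haskell sketch of a freely duplicable circuit description type; the constructors and the example are invented and do not reflect qPCF's dependent type system or its co-processor interface.

        -- A sketch of circuits as ordinary, freely duplicable classical data.
        -- Only an (omitted) co-processor call would ever touch quantum state.
        data Circ
          = Id Int                 -- identity on n wires
          | Gate String [Int]      -- a named gate acting on the listed wires
          | Seq Circ Circ          -- sequential composition
          | Par Circ Circ          -- parallel composition
          deriving Show

        -- Number of wires a circuit description spans (a rough structural measure).
        width :: Circ -> Int
        width (Id n)      = n
        width (Gate _ ws) = length ws
        width (Seq c _)   = width c
        width (Par c d)   = width c + width d

        -- Building and copying a circuit description is pure, classical computation.
        bell :: Circ
        bell = Seq (Par (Gate "H" [0]) (Id 1)) (Gate "CNOT" [0, 1])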