7 research outputs found

    Quantitative Models and Implicit Complexity

    We give new proofs of soundness (all representable functions on base types lie in certain complexity classes) for Elementary Affine Logic, LFPL (a language for polytime computation close to realistic functional programming, introduced by one of us), Light Affine Logic and Soft Affine Logic. The proofs are based on a common semantical framework which is merely instantiated in four different ways. The framework consists of an innovative modification of realizability which allows us to use resource-bounded computations as realisers, as opposed to including all Turing-computable functions, as is usually the case in realizability constructions. For example, all realisers in the model for LFPL are polynomially bounded computations, whence soundness holds by construction of the model. The work then lies in being able to interpret all the required constructs in the model. Besides being the first entirely semantical proof of polytime soundness for light logics, our proof also provides a notable simplification of the original (already semantical) proof of polytime soundness for LFPL. A new result made possible by the semantic framework is the addition of polymorphism and a modality to LFPL, thus allowing for an internal definition of inductive datatypes. (Comment: 29 pages)
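
    The soundness statements all share one shape; sketched here for LFPL's polytime case (a paraphrase in the obvious notation, not the paper's exact theorem; FP denotes the class of polytime-computable functions):

        \vdash_{\mathrm{LFPL}} t : \mathbf{N} \multimap \mathbf{N}
          \quad\Longrightarrow\quad
          \llbracket t \rrbracket \in \mathrm{FP}

    The point of the realizability framework is that the right-hand side holds by construction: every realiser is already a resource-bounded computation.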

    An Invariant Cost Model for the Lambda Calculus

    We define a new cost model for the call-by-value lambda-calculus satisfying the invariance thesis. That is, under the proposed cost model, Turing machines and the call-by-value lambda-calculus can simulate each other within a polynomial time overhead. The model relies only on combinatorial properties of usual beta-reduction, without any reference to a specific machine or evaluator. In particular, the cost of a single beta reduction is proportional to the difference between the size of the redex and the size of the reduct. In this way, the total cost of normalizing a lambda term takes into account the size of all intermediate results (as well as the number of steps to normal form). (Comment: 19 pages)
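
    One natural way to make this precise (a sketch of the stated reading; the paper's definition may differ in constants and in how size-decreasing steps are charged) is:

        \mathrm{cost}(t \to_{\beta} u) \;=\; \max\{1,\ |u| - |t|\}
        \qquad
        \mathrm{cost}(t_0 \to \cdots \to t_n) \;=\; \sum_{i < n} \mathrm{cost}(t_i \to t_{i+1})

    Summing per-step costs in this way charges for every size increase along the reduction, which is how the total cost accounts for the size of intermediate results.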

    A quantitative model for simply typed λ-calculus

    We use a simplified version of the framework of resource monoids, introduced by Dal Lago and Hofmann (2005, 2011), to interpret simply typed λ-calculus with constants zero and successor. We then use this model to prove a simple quantitative result about bounding the size of the normal form of λ-terms. While the bound itself is already known, this is to our knowledge the first semantic proof of this fact. Our use of resource monoids differs from the other instances found in the literature, in that it measures the size of λ-terms rather than time complexity.
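
    A hedged sketch of the interface such a model is built on (the names and the minimal signature below are ours, for illustration; they are not the authors' definitions): a commutative monoid of resources with a compatible order and a numeric distance, read in this paper as a size budget rather than a time bound.

        -- Illustrative Haskell sketch of a resource-monoid interface in the
        -- style of Dal Lago and Hofmann; 'dist a b' is intended to be
        -- meaningful when 'leq a b' holds, and here bounds sizes of terms.
        class Monoid m => ResourceMonoid m where
          leq  :: m -> m -> Bool
          dist :: m -> m -> Integer

        -- The simplest instance: natural-number budgets under addition,
        -- with the arithmetic difference as distance.
        newtype Size = Size Integer deriving (Eq, Show)
        instance Semigroup Size where Size a <> Size b = Size (a + b)
        instance Monoid Size where mempty = Size 0
        instance ResourceMonoid Size where
          leq  (Size a) (Size b) = a <= b
          dist (Size a) (Size b) = max 0 (b - a)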

    Homotopy Type-Theoretic Interpretations of Constructive Set Theories

    This thesis deals primarily with type-theoretic interpretations of constructive set theories using notions and ideas from homotopy type theory. We introduce a family of interpretations [.]_k,h for 2 ≤ k ≤ ∞ and 1 ≤ h ≤ ∞ of the set theory BCS into the type theory H, in which sets and formulas are interpreted respectively as types of homotopy level k and h. Depending on the values of the parameters k and h we are able to interpret different theories, such as Aczel's CZF and Myhill's CST. We relate the family [.]_k,h to the other interpretations of CST into homotopy type theory already studied in the literature in [UFP13] and [Gy16a]. We characterise a class of sentences valid in the interpretations [.]_k,∞ in terms of the ΠΣ axiom of choice, generalising the characterisation of [RT06] for Aczel's interpretation. We also define a propositions-as-hpropositions interpretation in the context of logic-enriched type theories. The formulas valid in this interpretation are then characterised in terms of the axiom of unique choice. We extend the analysis of Aczel's interpretation provided in [GA06] to the interpretations of CST into homotopy type theory, providing a comparative analysis. This is done by formulating in the logic-enriched type theory the key principles used in the proofs of the two interpretations. We also investigate the notion of feasible ordinal, formalised in the context of a linear type theory equipped with a type of resources. This type theory was originally introduced by Hofmann in [Hof03]. We disprove Hofmann's conjecture on the definable ordinals, by showing that for any given k ∈ N the ordinal ω^k is definable.
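
    For reference, homotopy levels are standardly defined by recursion, under the indexing where level 0 means contractible and level 2 is the level of h-sets (standard HoTT notation, not specific to the thesis):

        \mathsf{hlevel}(0, X) :\equiv \mathsf{isContr}(X)
        \qquad
        \mathsf{hlevel}(n+1, X) :\equiv \prod_{x,y:X} \mathsf{hlevel}(n,\ x =_X y)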

    Lambda calculi and logics for quantum computing

    In this thesis we propose several original results about lambda calculi and logics for quantum computing. The work is divided into three parts. The first recalls the main notions of linear algebra, logic and quantum computing. The second and main part focuses on quantum lambda calculi. We start with Q, a quantum lambda calculus with classical control. We study its classical properties, such as confluence and Subject Reduction, then an important quantum property of Q called standardization, and finally the expressive power of the proposed calculus, by proving its equivalence with the computational model of quantum circuit families. From the calculus Q we then define and study a sublanguage called SQ, inspired by Soft Linear Logic and intrinsically polytime. Since neither Q nor SQ has an explicit measurement operator in its syntax, an implicit measurement at the end of computations is assumed. Measurement is studied explicitly in a third quantum lambda calculus called Q*, which extends Q with a measurement operator. Starting from the observation that an explicit measurement operator breaks the otherwise deterministic evolution of the computation by introducing probabilistic behaviour, new technical instruments are defined, namely probabilistic computations and mixed states. We prove a strong confluence result for the calculus, valid also in the relevant case of infinite computations.
    In the last part of the thesis, we propose two labeled modal deduction systems, called MSQS and MSpQS respectively, which describe quantum computations from a qualitative point of view. The two systems represent a starting point toward a new model for qualitative reasoning about computational quantum structures, seen as Kripke models.
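
    The kind of object a probabilistic reduction acts on can be pictured with a minimal sketch (our illustration, not the thesis's formalism): a mixed state as a finite weighted family of pure configurations.

        -- Illustrative Haskell sketch: a mixed state is a finite probability
        -- distribution over pure configurations (weights should sum to 1).
        newtype Mixed s = Mixed [(Rational, s)]

        -- One reduction step lifted to mixed states: each pure configuration
        -- reduces to a distribution over configurations, and weights multiply.
        stepMixed :: (s -> [(Rational, s)]) -> Mixed s -> Mixed s
        stepMixed step (Mixed xs) =
          Mixed [ (p * q, s') | (p, s) <- xs, (q, s') <- step s ]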

    Polynomial Time Calculi

    This dissertation deals with type systems which guarantee polynomial time complexity of typed programs. Such algorithms are commonly regarded as feasible for practical applications, because their runtime grows reasonably fast for bigger inputs. The implicit complexity community has proposed several type systems for polynomial time in recent years, each with strong, but different, structural restrictions on the permissible algorithms which are necessary to control complexity. Comparisons between the various approaches are hard, and this has led to a literature of isolated islands of algorithms expressible in each calculus, without many known links between them. This work chooses Light Affine Logic (LAL) and Hofmann's LFPL, both linearly typed, and studies the connections between them. It is shown that the light iteration in the fixed-point variant of LAL is expressive enough to allow a (non-trivial) compositional embedding of LFPL. The pull-out trick of LAL is identified as a technique to type certain non-size-increasing algorithms in such a way that they can be iterated. A System T sibling of LAL is developed which seamlessly integrates this technique as a central feature of its iteration scheme, and which is again proved correct and complete for polynomial time. Because iterations of the same level cannot be nested in this system, it is further generalised to a calculus which, surprisingly, can express the impredicative iteration of LFPL and the light iteration at the same time. It therefore subsumes both systems in one, while still being polynomial-time normalisable. Hence, this result gives the first bridge between these two islands of implicit computational complexity.
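
    The non-size-increasing discipline of LFPL, which the pull-out trick targets, can be pictured in plain Haskell (an illustration of the idea only: LFPL's resource type is linear, which Haskell's types do not enforce):

        -- Each list constructor spends one abstract resource token
        -- (Hofmann's diamond type), so output size is bounded by input size.
        data Diamond = Diamond
        data DList a = Nil | Cons Diamond a (DList a)

        -- Sorted insertion reuses the token freed by pattern matching, so the
        -- result is never larger than the input plus the one token supplied.
        insert :: Ord a => Diamond -> a -> DList a -> DList a
        insert d x Nil = Cons d x Nil
        insert d x (Cons d' y ys)
          | x <= y    = Cons d x (Cons d' y ys)
          | otherwise = Cons d' y (insert d x ys)

    Because such functions cannot increase the size of their input, they can be iterated without blowing up complexity, which is what makes their iteration compatible with polynomial time.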

    The monitoring power of forcing program transformations

    In this thesis, we are interested in semantical proofs of correctness results for complex programming languages. In particular, we advocate the need for a theoretical framework that allows one to:
    - design realizability semantics from basic building blocks
    - combine those blocks using algebraic constructions
    - prove general theorems that can be reused in future correctness proofs for programming languages
    As a step towards this goal, we propose a new semantical framework based on the composition of linear variants of Krivine realizability and Cohen forcing. The first ingredient of this framework is the Monitoring Abstract Machine: a computing environment that possesses special memory cells used to monitor the execution of programs, in the style of Miquel's KFAM. We show how this new machine emerges from a linear forcing program transformation. We then introduce the central notion of Monitoring Algebra and the associated realizability interpretation. Different monitoring algebras induce sound semantics for different programming languages. We then present an algebraic construction, based on the technique of iterated forcing, to combine different Monitoring Algebras (and the associated programming languages). We present various results and first applications of our theory. We show that the forcing structure can be used to represent the consumption of resources, in particular time, but also step-indexing or the use of higher-order references. We finally apply our results to obtain three complex soundness results:
    - we give the first semantical proof of the consistency of a contraction-free naive set theory, originally introduced by Grishin in the 1970s
    - we use our framework to obtain a polynomial-time termination result for a light-logic-based programming language featuring recursive types
    - we prove the soundness of a language with higher-order references that supports strong updates, based on a linear type system inspired by a work of Ahmed, Fluet and Morrisett.
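
    The simplest monitoring instance is a memory cell holding a time budget; here is a minimal sketch of that idea (our names and representation, not the thesis's machine):

        -- Illustrative Haskell sketch: drive a small-step evaluator under a
        -- fuel budget. The counter plays the role of the monitoring cell,
        -- aborting the run when the time resource is exhausted.
        data Result a = Done a | OutOfFuel

        stepWithFuel :: (s -> Either a s) -> Int -> s -> Result a
        stepWithFuel _    0 _ = OutOfFuel
        stepWithFuel step n s = case step s of
          Left v   -> Done v
          Right s' -> stepWithFuel step (n - 1) s'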
