415 research outputs found
Relational parametricity for higher kinds
Reynolds' notion of relational parametricity has been extremely influential and well studied for polymorphic programming languages and type theories based on System F. The extension of relational parametricity to higher-kinded polymorphism, which allows quantification over type operators as well as types, has not received as much attention. We present a model of relational parametricity for System Fω, within the impredicative Calculus of Inductive Constructions, and show how it forms an instance of a general class of models defined by Hasegawa. We investigate some of the consequences of our model and show that it supports the definition of inductive types, indexed by an arbitrary kind, and with reasoning principles provided by initiality.
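The abstract's notion of inductive types with reasoning principles given by initiality can be illustrated with a small Haskell sketch (my own illustration, not the paper's Coq development): a Church-style encoding of an initial algebra using quantification over a type operator `f`.

```haskell
{-# LANGUAGE RankNTypes #-}

-- Church encoding of the initial algebra of a functor f:
-- an inductive type is "the thing you can fold over any f-algebra".
newtype Mu f = Mu { fold :: forall x. (f x -> x) -> x }

-- Base functor for natural numbers: NatF x ~ 1 + x.
data NatF x = ZeroF | SucF x

type Nat = Mu NatF

zero :: Nat
zero = Mu (\alg -> alg ZeroF)

suc :: Nat -> Nat
suc n = Mu (\alg -> alg (SucF (fold n alg)))

-- The fold is the unique algebra morphism promised by initiality;
-- relational parametricity is what justifies that uniqueness.
toInt :: Nat -> Int
toInt n = fold n alg
  where
    alg ZeroF    = 0
    alg (SucF k) = k + 1
```

The quantification over the type operator `f` in `Mu` is exactly the higher-kinded polymorphism the paper's model accounts for.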
On Irrelevance and Algorithmic Equality in Predicative Type Theory
Dependently typed programs contain an excessive amount of static terms which
are necessary to please the type checker but irrelevant for computation. To
separate static and dynamic code, several static analyses and type systems have
been put forward. We consider Pfenning's type theory with irrelevant
quantification which is compatible with a type-based notion of equality that
respects eta-laws. We extend Pfenning's theory to universes and large
eliminations and develop its meta-theory. Subject reduction, normalization and
consistency are obtained by a Kripke model over the typed equality judgement.
Finally, a type-directed equality algorithm is described whose completeness is
proven by a second Kripke model.
Comment: 36 pages, supersedes the FoSSaCS 2011 paper of the first author, titled "Irrelevance in Type Theory with a Heterogeneous Equality Judgement".
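The separation of static, computationally irrelevant data from dynamic code has a loose analogue in Haskell's type-level indices (a hypothetical illustration only; Pfenning's irrelevant quantification lives in full dependent type theory):

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- A length-indexed vector: the index n is "static" data that exists
-- only to satisfy the type checker and is erased at run time,
-- loosely analogous to irrelevantly quantified arguments.
data Nat = Z | S Nat

data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

-- A total head function: the static index rules out the empty
-- case, so no runtime check is needed.
vhead :: Vec ('S n) a -> a
vhead (VCons x _) = x
```

Here the length index never influences the computed value, which is the intuition behind treating such arguments as irrelevant for equality and evaluation.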
Type-Preserving CPS Translation of Σ and Π Types is Not Not Possible
Dependently typed languages like Coq are used to specify and prove functional correctness of source programs, but what we ultimately need are guarantees about correctness of compiled code. By preserving dependent types through each compiler pass, we could preserve source-level specifications and correctness proofs into the generated target-language programs. Unfortunately, type-preserving compilation of dependent types is nontrivial. In 2002, Barthe and Uustalu showed that type-preserving CPS is not possible for languages like Coq. Specifically, they showed that for strong dependent pairs (Σ types), the standard typed call-by-name CPS is not type preserving. They further proved that for dependent case analysis on sums, a class of typed CPS translations, including the standard translation, is not possible. In 2016, Morrisett noticed a similar problem with the standard call-by-value CPS translation for dependent functions (Π types). In essence, the problem is that the standard typed CPS translation by double negation, in which computations are assigned types of the form (A → ⊥) → ⊥, disrupts the term/type equivalence that is used during type checking in a dependently typed language.

In this paper, we prove that type-preserving CPS translation for dependently typed languages is not not possible. We develop both call-by-name and call-by-value CPS translations from the Calculus of Constructions with both Π and Σ types (CC) to a dependently typed target language, and prove type preservation and compiler correctness of each translation. Our target language is CC extended with an additional equivalence rule and an additional typing rule, which we prove consistent by giving a model in the extensional Calculus of Constructions. Our key observation is that we can use a CPS translation that employs answer-type polymorphism, where CPS-translated computations have type ∀α.(A → α) → α. This type justifies, by a free theorem, the new equality rule in our target language and allows us to recover the term/type equivalences that CPS translation disrupts. Finally, we conjecture that our translation extends to dependent case analysis on sums, despite the impossibility result, and provide a proof sketch.
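The contrast between the fixed-answer-type translation and the answer-type-polymorphic one can be sketched at the level of Haskell types (a minimal illustration of the type discipline; the paper itself works in CC, not Haskell):

```haskell
{-# LANGUAGE RankNTypes #-}

-- Standard double-negation CPS: the answer type r is fixed in advance.
type CPSFixed r a = (a -> r) -> r

-- Answer-type-polymorphic CPS: a computation must behave uniformly
-- for every answer type, as in the paper's target language.
type CPS a = forall r. (a -> r) -> r

-- By a free theorem, any m :: CPS a is determined by one value:
-- run it with the identity continuation to recover it.
run :: CPS a -> a
run m = m id

ret :: a -> CPS a
ret x = \k -> k x

bind :: CPS a -> (a -> CPS b) -> CPS b
bind m f = \k -> m (\x -> f x k)
```

The `run` function is exactly the term/type-equivalence-recovering move that a fixed answer type ⊥ does not permit.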
Functionality, Polymorphism, and Concurrency: A Mathematical Investigation of Programming Paradigms
The search for mathematical models of computational phenomena often leads to problems that are of independent mathematical interest. Selected problems of this kind are investigated in this thesis.

First, we study models of the untyped lambda calculus. Although many familiar models are constructed by order-theoretic methods, it is also known that there are some models of the lambda calculus that cannot be non-trivially ordered. We show that the standard open and closed term algebras are unorderable. We characterize the absolutely unorderable T-algebras in any algebraic variety T. Here an algebra is called absolutely unorderable if it cannot be embedded in an orderable algebra. We then introduce a notion of finite models for the lambda calculus, contrasting the known fact that models of the lambda calculus, in the traditional sense, are always non-recursive. Our finite models are based on Plotkin's syntactical models of reduction. We give a method for constructing such models, and some examples that show how finite models can yield useful information about terms.

Next, we study models of typed lambda calculi. Models of the polymorphic lambda calculus can be divided into environment-style models, such as Bruce and Meyer's non-strict set-theoretic models, and categorical models, such as Seely's interpretation in PL-categories. Reynolds has shown that there are no set-theoretic strict models. Following a different approach, we investigate a notion of non-strict categorical models. These provide a uniform framework in which one can describe various classes of non-strict models, including set-theoretic models with or without empty types, and Kripke-style models. We show that completeness theorems correspond to categorical representation theorems, and we reprove a completeness result by Meyer et al. on set-theoretic models of the simply-typed lambda calculus with possibly empty types.
Finally, we study properties of asynchronous communication in networks of communicating processes. We formalize several notions of asynchrony independently of any particular concurrent process paradigm. A process is asynchronous if its input and/or output is filtered through a communication medium, such as a buffer or a queue, possibly with feedback. We prove that the behavior of asynchronous processes can be equivalently characterized by first-order axioms.
Predicativity and parametric polymorphism of Brouwerian implication
A common objection to the definition of intuitionistic implication in the
Proof Interpretation is that it is impredicative. I discuss the history of that
objection, argue that in Brouwer's writings predicativity of implication is
ensured through parametric polymorphism of functions on species, and compare
this construal with the alternative approaches to predicative implication of
Goodman, Dummett, Prawitz, and Martin-Löf.
Comment: Added further references (Pistone, Poincaré, Tabatabai, Van Atten).
Relational semantics of linear logic and higher-order model-checking
In this article, we develop a new and somewhat unexpected connection between
higher-order model-checking and linear logic. Our starting point is the
observation that once embedded in the relational semantics of linear logic, the
Church encoding of any higher-order recursion scheme (HORS) comes together with
a dual Church encoding of an alternating tree automata (ATA) of the same
signature. Moreover, the interaction between the relational interpretations of
the HORS and of the ATA identifies the set of accepting states of the tree
automaton against the infinite tree generated by the recursion scheme. We show
how to extend this result to alternating parity automata (APT) by introducing a
parametric version of the exponential modality of linear logic, capturing the
formal properties of colors (or priorities) in higher-order model-checking. We
show in particular how to reunderstand in this way the type-theoretic approach
to higher-order model-checking developed by Kobayashi and Ong. We briefly
explain at the end of the paper how this analysis driven by linear logic results
in a new and purely semantic proof of decidability of the formulas of
monadic second-order logic for higher-order recursion schemes.
Comment: 24 pages. Submitted.
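The duality the abstract starts from, that a Church-encoded tree and a tree automaton over the same signature meet by plain application, can be shown in miniature (my own finite, deterministic illustration; actual HORS generate infinite trees and the paper uses alternating automata):

```haskell
{-# LANGUAGE RankNTypes #-}

-- Church encoding of finite binary trees over the signature
-- {node/2, leaf/0}: a tree is its own fold.
type Tree = forall x. (x -> x -> x) -> x -> x

example :: Tree
example = \node leaf -> node (node leaf leaf) leaf

-- A deterministic bottom-up tree automaton over the same signature
-- is exactly an algebra: a state for leaf and a transition for node.
-- "Running" the tree on the automaton is function application.
countLeaves :: Tree -> Int
countLeaves t = t (+) 1
```

Interpreting the encoded tree in a semantics of states rather than trees is the relational-semantics move the abstract describes.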
Type Abstraction for Relaxed Noninterference
Information-flow security typing statically prevents confidential information from leaking to public channels. The fundamental information-flow property, known as noninterference, states that a public observer cannot learn anything from private data. As attractive as it is from a theoretical viewpoint, noninterference is impractical: real systems need to intentionally and selectively declassify some information. Among the different information-flow approaches to declassification, a particularly expressive approach was proposed by Li and Zdancewic, enforcing a notion of relaxed noninterference by allowing programmers to specify declassification policies that capture the intended manner in which public information can be computed from private data.
This paper shows how we can exploit the familiar notion of type abstraction to support expressive declassification policies in a simpler, yet more expressive manner. In particular, the type-based approach to declassification, which we develop in an object-oriented setting, addresses several issues and challenges with respect to prior work, including a simple notion of label ordering based on subtyping, support for recursive declassification policies, and a local, modular reasoning principle for relaxed noninterference. This work paves the way for integrating declassification policies in practical security-typed languages.
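The core idea, that a declassification policy is just an abstract type whose only observers are the permitted views of the secret, can be sketched as follows (a hypothetical Haskell analogue of the paper's object-oriented setting; the names and the policy are illustrative):

```haskell
-- Declassification via type abstraction: the secret is wrapped in a
-- type whose constructor would, in a real program, be hidden behind a
-- module boundary, so public code can only call exported observers.
newtype Secret = Secret String

-- Policy: a public observer may learn only the last four characters
-- (e.g. of a card number). Anything else stays unobservable.
declassifyLast4 :: Secret -> String
declassifyLast4 (Secret s) = drop (max 0 (length s - 4)) s
```

Since public code sees `Secret` only through `declassifyLast4`, noninterference is relaxed exactly up to the declared policy, which mirrors the paper's subtyping-as-label-ordering reading: exposing fewer observers means a more confidential type.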
Logical ambiguity
The thesis presents research in the field of model-theoretic semantics on the problem of ambiguity, especially as it arises for sentences that contain junctions (and, or) and quantifiers (every man, a woman). A number of techniques that have been proposed are surveyed, and I conclude that these ought to be rejected because they do not make ambiguity 'emergent': they all have the feature that subtheories would be able to explain all syntactic facts yet would predict no ambiguity. In other words, these accounts have a special-purpose mechanism for generating ambiguities.

It is argued that categorial grammars show promise for giving an 'emergent' account. This is because the only way to take a subtheory of a particular categorial grammar is by changing one of the small number of clauses by which the categorial grammar axiomatises an infinite set of syntactic rules, and such a change is likely to have a wider range of effects on the coverage of the grammar than simply the subtraction of ambiguity.

Of categorial grammars proposed to date, the most powerful is Lambek Categorial Grammar, which defines the set of syntactic rules by a notational variant of Gentzen's sequent calculus for implicational propositional logic, and which defines meaning assignment by using the Curry-Howard isomorphism between Natural Deduction proofs in implicational propositional logic and terms of the typed lambda calculus. It is shown that no satisfactory account of the junctions and quantifiers is possible in Lambek Categorial Grammar.

I then introduce a framework that I call Polymorphic Lambek Categorial Grammar, which adds variables and their universal quantification to the language of categorisation. The set of syntactic rules is specified by a notational variant of Gentzen's sequent calculus for quantified propositional logic, and meaning assignment is defined by using Girard's extended Curry-Howard isomorphism between Natural Deduction proofs in quantified implicational propositional logic and terms of the second-order polymorphic lambda calculus. It is shown that this allows an account of the junctions and quantifiers, and one which is 'emergent'.
\Sigma\Pi-polycategories, additive linear logic, and process semantics
We present a process semantics for the purely additive fragment of linear
logic in which formulas denote protocols and (equivalence classes of) proofs
denote multi-channel concurrent processes. The polycategorical model induced by
this process semantics is shown to be equivalent to the free polycategory based
on the syntax (i.e., it is full and faithfully complete). This establishes that
the additive fragment of linear logic provides a semantics of concurrent
processes. Another property of this semantics is that it gives a canonical
representation of proofs in additive linear logic.
This arXived version omits Section 1.7.1: "Circuit diagrams for
polycategories" as the Xy-pic diagrams would not compile due to lack of memory.
For a complete version see "http://www.cpsc.ucalgary.ca/~pastroc/".
Comment: 175 pages, University of Calgary Master's thesis.
Staged Compilation with Two-Level Type Theory
The aim of staged compilation is to enable metaprogramming in a way such that
we have guarantees about the well-formedness of code output, and we can also
mix together object-level and meta-level code in a concise and convenient
manner. In this work, we observe that two-level type theory (2LTT), a system
originally devised for the purpose of developing synthetic homotopy theory,
also serves as a system for staged compilation with dependent types. 2LTT has
numerous good properties for this use case: it has a concise specification,
well-behaved model theory, and it supports a wide range of language features
both at the object and the meta level. First, we give an overview of 2LTT's
features and applications in staging. Then, we present a staging algorithm and
prove its correctness. Our algorithm is "staging-by-evaluation", analogously to
the technique of normalization-by-evaluation, in that staging is given by the
evaluation of 2LTT syntax in a semantic domain. The staging algorithm together
with its correctness constitutes a proof of strong conservativity of 2LTT over
the object theory. To our knowledge, this is the first description of staged
compilation which supports full dependent types and unrestricted staging for
types.
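The flavor of staged compilation that 2LTT formalizes can be conveyed by a deliberately simple two-level sketch (my own illustration, far weaker than 2LTT: no dependent types, and object-level code is a toy first-order AST): meta-level functions compute over object-level code, so all recursion on the static argument disappears before "run time".

```haskell
-- Object-level code: a tiny expression language.
data Expr
  = Var String
  | Lit Int
  | Mul Expr Expr
  deriving Show

-- Meta-level staging: power n unfolds entirely at staging time,
-- emitting straight-line object code with no loop left in it.
power :: Int -> Expr -> Expr
power 0 _ = Lit 1
power n x = Mul x (power (n - 1) x)

-- A toy evaluator standing in for running the staged output.
eval :: [(String, Int)] -> Expr -> Int
eval _   (Lit n)   = n
eval env (Var v)   = maybe (error "unbound variable") id (lookup v env)
eval env (Mul a b) = eval env a * eval env b
```

In 2LTT the two levels are both full dependent type theories and the staging algorithm is evaluation of the two-level syntax in a semantic domain; here the split is only between Haskell and `Expr`.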
- …