46 research outputs found

    Realising Intensional S4 and GL Modalities


    Constraining Montague Grammar for computational applications

    This work develops efficient methods for implementing Montague Grammar on a computer, covering both the syntactic and the semantic aspects of that task. Using a simplified but adequate version of Montague Grammar, it shows how to translate from an English fragment into a purely extensional first-order language, which can then be handled by standard automatic theorem-proving techniques. Translating a sentence of Montague English into the first-order predicate calculus usually proceeds via an intermediate translation in the typed lambda calculus, which is then simplified by lambda-reduction to obtain a first-order equivalent. If sufficient sortal structure underlies the type theory for the reduced translation always to be first-order, then perhaps it should be constructed directly during the syntactic analysis of the sentence, so that the lambda-expressions never come into existence and no further processing is necessary. A method is proposed to achieve this, involving the unification of meta-logical expressions which flesh out the type symbols of Montague's type theory with first-order schemas. It is then shown how to implement Montague Semantics without a theorem prover for type theory: nothing more than a theorem prover for the first-order predicate calculus is required. The first-order system can be used directly without encoding the whole of type theory; it is only necessary to encode a part of second-order logic, and this can be done in an efficient, succinct, and readable manner. Furthermore, the pseudo-second-order terms need never appear in any translations provided by the parser; they are needed only when higher-order reasoning must be simulated. The foundation of this approach is a five-sorted theory of Montague Semantics whose objects are entities, indices, propositions, properties, and quantities. The theory can be expressed in the language of first-order logic by means of axiom schemas, and it has a finite second-order axiomatisation which forms the basis of the theorem-proving arrangement. It can be viewed as a very constrained set theory.
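
    A minimal Haskell sketch (not the thesis's actual machinery) of the standard pipeline this abstract describes: a Montague-style translation first lands in the typed lambda calculus and is then beta-reduced to a first-order formula. The term representation and the names `every`, `man'` and `walk'` are illustrative assumptions.

```haskell
module MontagueSketch where

-- Terms of a small higher-order logic: variables, constants, application,
-- lambda abstraction, implication and universal quantification.
data Term
  = Var String
  | Con String
  | App Term Term
  | Lam String Term
  | Imp Term Term
  | Forall String Term
  deriving (Eq, Show)

-- Naive (not capture-avoiding) substitution; safe here because the example
-- below uses distinct bound-variable names.
subst :: String -> Term -> Term -> Term
subst x s t = case t of
  Var y      -> if x == y then s else t
  Con _      -> t
  App f a    -> App (subst x s f) (subst x s a)
  Lam y b    -> if x == y then t else Lam y (subst x s b)
  Imp p q    -> Imp (subst x s p) (subst x s q)
  Forall y b -> if x == y then t else Forall y (subst x s b)

-- Reduce until no beta-redex remains (adequate for the simple example below).
betaReduce :: Term -> Term
betaReduce t = case t of
  App (Lam x b) a -> betaReduce (subst x a b)
  App f a         -> let f' = betaReduce f; a' = betaReduce a
                     in if App f' a' == t then t else betaReduce (App f' a')
  Lam x b         -> Lam x (betaReduce b)
  Imp p q         -> Imp (betaReduce p) (betaReduce q)
  Forall x b      -> Forall x (betaReduce b)
  _               -> t

-- Montague's determiner "every": \P. \Q. forall x. P x -> Q x
every :: Term
every = Lam "P" (Lam "Q" (Forall "x" (Imp (App (Var "P") (Var "x"))
                                          (App (Var "Q") (Var "x")))))

-- "every man walks" translates to (every man') walk', and beta-reduces to the
-- purely first-order formula  forall x. man'(x) -> walk'(x).
everyManWalks :: Term
everyManWalks = betaReduce (App (App every (Con "man'")) (Con "walk'"))
```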

    On Induction, Coinduction and Equality in Martin-Löf and Homotopy Type Theory

    Martin-Löf Type Theory, having put computation at the center of logical reasoning, has been shown to be an effective foundation for proof assistants, with applications both in computer science and constructive mathematics. One ambition though is for MLTT to also double as a practical general-purpose programming language. Datatypes in type theory come with an induction or coinduction principle which gives a precise and concise specification of their interface. However, such principles can interfere with how we would like to express our programs. In this thesis, we investigate more flexible alternatives to direct uses of the (co)induction principles.

    As a first contribution, we consider the n-truncation of a type in Homotopy Type Theory. We derive in HoTT an eliminator into (n+1)-truncated types instead of n-truncated ones, assuming extra conditions on the underlying function.

    As a second contribution, we improve on type-based criteria for termination and productivity. By augmenting the types with well-foundedness information, such criteria allow function definitions in a style closer to general recursion. We consider two criteria: guarded types, and sized types.

    Guarded types introduce a modality "later" to guard the availability of recursive calls provided by a general fixed-point combinator. In Guarded Cubical Type Theory we equip the fixed-point combinator with a propositional equality to its one-step unfolding, instead of a definitional equality that would break normalization. The notion of path from Cubical Type Theory allows us to do so without losing canonicity or decidability of conversion.

    Sized types, on the other hand, explicitly index datatypes with size bounds on the height or depth of their elements. The sizes however can get in the way of the reasoning principles we expect. Our approach is to introduce new quantifiers for "irrelevant" size quantification. We present a type theory with parametric quantifiers where irrelevance arises as a "free theorem". We also develop a conversion checking algorithm for a more specific theory where the new quantifiers are restricted to sizes.

    Finally, our third contribution is about the operational semantics of type theory. For the extensions above we would like to devise a practical conversion checking algorithm suitable for integration into a proof assistant. We formalized the correctness of such an algorithm for a small but challenging core calculus, proving that conversion is decidable. We expect this development to form a good basis to verify more complex theories.

    The ideas discussed in this thesis are already influencing the development of Agda, a proof assistant based on type theory.
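
    A small Haskell sketch, under loose assumptions, of the "later" modality and guarded fixed point mentioned above. In Guarded (Cubical) Type Theory the guardedness is enforced by the type system; here Haskell's laziness merely simulates it, and the names `Later`, `gfix` and `Stream` are illustrative, not the thesis's API.

```haskell
module GuardedSketch where

-- 'Later a' marks data that only becomes available one step in the future.
newtype Later a = Later a

-- The guarded fixed-point combinator: the recursive occurrence is only
-- offered behind 'Later', which is what makes definitions productive.
gfix :: (Later a -> a) -> a
gfix f = f (Later (gfix f))

-- Streams whose tail is guarded: an element now, the rest of the stream later.
data Stream a = Cons a (Later (Stream a))

-- A productive definition via the guarded fixed point: the constant stream.
repeatS :: a -> Stream a
repeatS x = gfix (\later -> Cons x later)

-- Observe finitely many elements of a guarded stream,
-- e.g. takeS 3 (repeatS 'a') == "aaa".
takeS :: Int -> Stream a -> [a]
takeS n (Cons x (Later xs))
  | n <= 0    = []
  | otherwise = x : takeS (n - 1) xs
```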

    Partial functions and recursion in univalent type theory

    We investigate partial functions and computability theory from within a constructive, univalent type theory. The focus is on placing computability into a larger mathematical context, rather than on a complete development of computability theory. We begin with a treatment of partial functions, using the notion of dominance, which is used in synthetic domain theory to discuss classes of partial maps. We relate this and other ideas from synthetic domain theory to other approaches to partiality in type theory. We show that the notion of dominance is difficult to apply in our setting: the set of Σ⁰₁ propositions investigated by Rosolini forms a dominance precisely if a weak, but nevertheless unprovable, choice principle holds. To get around this problem, we suggest an alternative notion of partial function we call disciplined maps. In the presence of countable choice, this notion coincides with Rosolini's. Using a general notion of partial function, we take the first steps in constructive computability theory. We do this both with computability as structure, where we have direct access to programs, and with computability as property, where we must work in a program-invariant way. We demonstrate the difference between these two approaches by showing how they relate to facts about computability theory arising from topos-theoretic and type-theoretic concerns. Finally, we tie the two threads together: assuming countable choice and that all total functions N → N are computable (both of which hold in the effective topos), the Rosolini partial functions, the disciplined maps, and the computable partial functions all coincide. We observe, however, that the class of all partial functions includes non-computable partial functions.
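
    A minimal Haskell sketch of one standard way to represent partial functions in type theory, loosely corresponding to the step-counting (Rosolini-style) view of partiality discussed above; it illustrates the general idea only, not the thesis's dominances or disciplined maps. The names `Delay`, `runFor` and `mu` are hypothetical.

```haskell
module PartialSketch where

-- The delay construction: a computation that yields its value now, or after
-- one more step; an infinite chain of 'Later's represents divergence.
data Delay a = Now a | Later (Delay a)

-- A partial function from a to b is modelled as a total function into 'Delay b'.
type Partial a b = a -> Delay b

-- Run a delayed computation for at most 'fuel' steps.
runFor :: Int -> Delay a -> Maybe a
runFor _    (Now x)   = Just x
runFor fuel (Later d)
  | fuel <= 0 = Nothing
  | otherwise = runFor (fuel - 1) d

-- Unbounded search: the least n with p n == True, diverging if none exists.
-- This is the kind of partial function computability theory is built from,
-- e.g. runFor 10 (mu (\n -> n * n > 10)) == Just 4.
mu :: (Integer -> Bool) -> Delay Integer
mu p = go 0
  where
    go n | p n       = Now n
         | otherwise = Later (go (n + 1))
```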

    A Dependently Typed Language with Nontermination

    We propose a full-spectrum dependently typed programming language, Zombie, which supports general recursion natively. The Zombie implementation is an elaborating typechecker. We prove type safety for a large subset of the Zombie core language, including features such as computational irrelevance, CBV-reduction, and propositional equality with a heterogeneous, completely erased elimination form. Zombie does not automatically beta-reduce expressions, but instead uses congruence closure for proof and type inference. We give a specification of a subset of the surface language via a bidirectional type system, which works up to congruence, and an algorithm for elaborating expressions in this language to an explicitly typed core language. We prove that our elaboration algorithm is complete with respect to the source type system. Zombie also features an optional termination checker, allowing nonterminating programs that return proofs, as well as external proofs about programs.
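
    A minimal Haskell sketch of the bidirectional (checking vs. inference) discipline the abstract refers to, shown here for the simply typed lambda calculus rather than Zombie's dependent core language and without the congruence-closure machinery; all names are illustrative assumptions.

```haskell
module BidirSketch where

data Ty = TBool | TArr Ty Ty
  deriving (Eq, Show)

data Term
  = Var String
  | Lam String Term          -- checked against an arrow type
  | App Term Term            -- inferred from the function's type
  | BoolLit Bool
  | Ann Term Ty              -- annotation switches from checking to inference
  deriving Show

type Ctx = [(String, Ty)]

-- Inference ("synthesis") mode: the term determines its type.
infer :: Ctx -> Term -> Maybe Ty
infer ctx (Var x)     = lookup x ctx
infer _   (BoolLit _) = Just TBool
infer ctx (Ann t ty)  = if check ctx t ty then Just ty else Nothing
infer ctx (App f a)   = case infer ctx f of
  Just (TArr dom cod) | check ctx a dom -> Just cod
  _                                     -> Nothing
infer _   (Lam _ _)   = Nothing  -- unannotated lambdas check but never infer

-- Checking mode: the expected type guides the term.
check :: Ctx -> Term -> Ty -> Bool
check ctx (Lam x body) (TArr dom cod) = check ((x, dom) : ctx) body cod
check ctx t ty                        = infer ctx t == Just ty

-- Example: (\x. x) checks against Bool -> Bool, but its type cannot be
-- inferred without an annotation.
example :: (Bool, Maybe Ty)
example = ( check [] (Lam "x" (Var "x")) (TArr TBool TBool)
          , infer [] (Lam "x" (Var "x")) )
```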