
    Dialectica Categories for the Lambek Calculus

    We revisit de Paiva's earlier work on dialectica models of the Lambek Calculus, completing and verifying the syntactic details that were only sketched in the first version. We extend the Lambek Calculus with a κ modality, inspired by Yetter's work, which makes the calculus commutative. We then add the of-course modality !, as Girard did, to reintroduce weakening and contraction for all formulas and recover the full power of intuitionistic and classical logic. We also present the categorical semantics, proved sound and complete. Finally, we show the traditional properties of type systems, such as subject reduction, the Church-Rosser theorem and normalization, for the calculi with the extended modalities, properties which were not available before.
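
    As a point of reference for the modalities discussed above, the rules below are the standard Girard-style sequent rules that an of-course modality ! is expected to validate, restoring weakening and contraction for !-marked formulas; this is a generic sketch of the usual rules, not the paper's exact presentation.

        \frac{\Gamma \vdash B}{\Gamma,\, !A \vdash B}\ (\text{weakening})
        \qquad
        \frac{\Gamma,\, !A,\, !A \vdash B}{\Gamma,\, !A \vdash B}\ (\text{contraction})
        \qquad
        \frac{\Gamma,\, A \vdash B}{\Gamma,\, !A \vdash B}\ (\text{dereliction})
        \qquad
        \frac{!\Gamma \vdash A}{!\Gamma \vdash\, !A}\ (\text{promotion})

    On the same pattern (and again only as a sketch), the Yetter-style κ modality is the one that reintroduces exchange, which is how the extended calculus becomes commutative where the plain Lambek Calculus is not.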

    A Typed Model for Linked Data

    The term Linked Data is used to describe ubiquitous and emerging semi-structured data formats on the Web. URIs in Linked Data allow diverse data sources to link to each other, forming a Web of Data. A calculus which models concurrent queries and updates over Linked Data is presented. The calculus exhibits operations essential for declaring rich atomic actions. The operations recover emergent structure in the loosely structured Web of Data. The calculus is executable due to its operational semantics. A light type system ensures that URIs with a distinguished role are used consistently. The main theorem verifies that the light type system and operational semantics work at the same level of granularity, and so are compatible. Examples show that a range of existing and emerging standards are captured. Data formats include RDF, named graphs and feeds. The primitives of the calculus model SPARQL Query and the Atom Publishing Protocol. The subtype system is based on RDFS, which improves interoperability. Examples focus on the SPARQL Update proposal, for which a fine-grained operational semantics is developed. Further potential high-level languages for exploiting Linked Data are outlined.
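
    To make the kind of data and atomic updates discussed above concrete, here is a minimal Haskell sketch of RDF-style triples grouped into named graphs, with graph names kept apart as a distinguished URI role and a delete-and-insert update applied in one indivisible step. All names and types here are hypothetical illustrations in the spirit of SPARQL Update, not the paper's calculus or type system.

        module LinkedDataSketch where

        import           Data.Map.Strict (Map)
        import qualified Data.Map.Strict as Map

        -- URIs, with graph names kept apart as a distinguished role.
        newtype Uri       = Uri String    deriving (Eq, Ord, Show)
        newtype GraphName = GraphName Uri deriving (Eq, Ord, Show)

        -- An RDF-style triple: subject, predicate, object.
        data Triple = Triple Uri Uri Uri deriving (Eq, Show)

        -- A fragment of the Web of Data: named graphs and their triples.
        type Store = Map GraphName [Triple]

        -- A DELETE/INSERT-style atomic update on one named graph: matching
        -- triples are removed and the new triples added in a single step.
        atomicUpdate :: GraphName -> (Triple -> Bool) -> [Triple] -> Store -> Store
        atomicUpdate g toDelete toInsert =
          Map.alter (Just . (toInsert ++) . filter (not . toDelete) . maybe [] id) g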

    Unifying type systems for mobile processes

    We present a unifying framework for type systems for process calculi. The core of the system provides an accurate correspondence between essentially functional processes and linear logic proofs; fragments of this system correspond to previously known connections between proofs and processes. We show how the addition of extra logical axioms can widen the class of typeable processes in exchange for the loss of some computational properties like lock-freeness or termination, allowing us to see various well-studied systems (like i/o types, linearity, control) as instances of a general pattern. This suggests unified methods for extending existing type systems with new features while staying in a well-structured environment, and constitutes a step towards the study of denotational semantics of processes using proof-theoretical methods.
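
    As an illustration of the proofs-as-processes correspondence the abstract refers to (the standard picture in this line of work, not necessarily this paper's exact system), logical rules are read as process constructors; in particular, the cut rule corresponds to parallel composition under a name restriction, and cut elimination to communication on the restricted name:

        \frac{P \vdash \Gamma,\, x{:}A \qquad Q \vdash \Delta,\, x{:}A^{\perp}}
             {(\nu x)\,(P \mid Q) \vdash \Gamma,\, \Delta}\ (\mathrm{cut})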

    Combinatorial structure of type dependency

    We give an account of the basic combinatorial structure underlying the notion of type dependency. We do so by considering the category of all dependent sequent calculi, and exhibiting it as the category of algebras for a monad on a presheaf category. The objects of the presheaf category encode the basic judgements of a dependent sequent calculus, while the action of the monad encodes the deduction rules; so by giving an explicit description of the monad, we obtain an explicit account of the combinatorics of type dependency. We find that this combinatorics is controlled by a particular kind of decorated ordered tree, familiar from computer science and from innocent game semantics. Furthermore, we find that the monad at issue is of a particularly well-behaved kind: it is local right adjoint in the sense of Street–Weber. In future work, we will use this fact to describe nerves for dependent type theories, and to study the coherence problem for dependent type theory using the tools of two-dimensional monad theory. (Comment: 35 pages)
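
    For orientation, the basic judgements mentioned above are, in their usual presentation, context, type and term judgements in which each type may depend on all the variables declared before it; it is the tree-like shape of this dependency that the decorated ordered trees capture. A generic sketch of the judgement forms (not the paper's presheaf encoding):

        x_1 : A_1,\; x_2 : A_2(x_1),\; \dots,\; x_n : A_n(x_1,\dots,x_{n-1}) \vdash A\ \mathsf{type}
        \qquad
        \Gamma \vdash a : A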

    Infinitary λ-Calculi from a Linear Perspective (Long Version)

    We introduce a linear infinitary λ-calculus, called ℓΛ∞, in which two exponential modalities are available, the first being the usual, finitary one, the other being the only construct interpreted coinductively. The resulting calculus embeds the infinitary applicative λ-calculus and is universal for computations over infinite strings. What is particularly interesting about ℓΛ∞ is that the refinement induced by linear logic allows both modalities to be restricted so as to obtain calculi which are terminating inductively and productive coinductively. We exemplify this idea by analysing a fragment of ℓΛ built around the principles of SLL and 4LL. Interestingly, it enjoys confluence, contrary to what happens in ordinary infinitary λ-calculi.
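
    The "terminating inductively, productive coinductively" discipline described above can be illustrated outside the calculus itself. Below is a minimal Haskell sketch (my example, not drawn from the paper) of a productive computation over infinite streams: every finite prefix of the output is computed in finite time, even though the output as a whole is infinite.

        module ProductiveStreams where

        -- An infinite stream: the coinductive counterpart of finite lists.
        data Stream a = Cons a (Stream a)

        -- The stream of natural numbers, defined by guarded corecursion.
        nats :: Stream Integer
        nats = go 0 where go n = Cons n (go (n + 1))

        -- Pointwise map: productive, since each output element needs only
        -- a finite amount of work on the input.
        mapS :: (a -> b) -> Stream a -> Stream b
        mapS f (Cons x xs) = Cons (f x) (mapS f xs)

        -- Taking a finite prefix terminates for any productive stream.
        takeS :: Int -> Stream a -> [a]
        takeS n (Cons x xs)
          | n <= 0    = []
          | otherwise = x : takeS (n - 1) xs

        -- Example: takeS 5 (mapS (* 2) nats) == [0,2,4,6,8]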

    A type system for PSPACE derived from light linear logic

    We present a polymorphic type system for lambda calculus ensuring that well-typed programs can be executed in polynomial space: dual light affine logic with booleans (DLALB). To build DLALB we start from DLAL (which has a simple type language with a linear and an intuitionistic type arrow, as well as one modality), which characterizes the FPTIME functions. In order to extend its expressiveness we add two boolean constants and a conditional constructor, in the same way as in the system STAB. We show that the value of a well-typed term can be computed by an alternating machine in polynomial time, so such a term represents a PSPACE program (given that PSPACE = APTIME). We also prove that all polynomial space decision functions can be represented in DLALB. Therefore DLALB characterizes the PSPACE predicates. (Comment: In Proceedings DICE 2011, arXiv:1201.034)
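
    The crucial extra ingredient over DLAL is a conditional whose branches are typed additively, i.e. under the same context; schematically (a sketch of the general shape of such a rule, in the spirit of STAB, not the paper's exact formulation):

        \frac{\Gamma \vdash t : \mathsf{B} \qquad \Gamma \vdash u : A \qquad \Gamma \vdash v : A}
             {\Gamma \vdash \mathsf{if}\ t\ \mathsf{then}\ u\ \mathsf{else}\ v : A}

    Sharing the context between the branches is what goes beyond the purely linear discipline of DLAL and, combined with evaluation on an alternating machine, is the source of the APTIME (hence PSPACE) bound.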

    Light types for polynomial time computation in lambda-calculus

    We propose a new type system for lambda-calculus ensuring that well-typed programs can be executed in polynomial time: Dual light affine logic (DLAL). DLAL has a simple type language with a linear and an intuitionistic type arrow, and one modality. It corresponds to a fragment of Light affine logic (LAL). We show that, contrary to LAL, DLAL ensures good properties of lambda-terms: subject reduction is satisfied, and a well-typed term admits a polynomial bound on reduction by any strategy. We establish that, like LAL, DLAL allows all polytime functions to be represented. Finally, we give a type inference procedure for propositional DLAL. (Comment: 20 pages, including 10 pages of appendix; revised version, in particular Section 5 has been modified. A short version is to appear in the proceedings of LICS 2004, IEEE Computer Society Press.)
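
    The relationship to LAL can be made concrete through the usual reading of DLAL's two arrows in LAL: the intuitionistic arrow stands for an exponential on the left of a linear arrow, while the linear arrow and the paragraph modality § are kept as they are. A sketch of this standard translation (my summary, not quoted from the paper):

        (A \Rightarrow B)^{*} = {!}A^{*} \multimap B^{*}
        \qquad
        (A \multimap B)^{*} = A^{*} \multimap B^{*}
        \qquad
        (\S A)^{*} = \S A^{*}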

    Non-normal modalities in variants of Linear Logic

    This article presents modal versions of resource-conscious logics. We concentrate on extensions of variants of Linear Logic with one minimal non-normal modality. In earlier work, where we investigated agency in multi-agent systems, we showed that the results scale up to logics with multiple non-minimal modalities. Here, we start with the language of propositional intuitionistic Linear Logic without the additive disjunction, to which we add a modality. We provide an interpretation of this language on a class of Kripke resource models extended with a neighbourhood function: modal Kripke resource models. We propose a Hilbert-style axiomatization and a Gentzen-style sequent calculus. We show that the proof theories are sound and complete with respect to the class of modal Kripke resource models. We show that the sequent calculus admits cut elimination and that proof search is in PSPACE. We then show how to extend the results when non-commutative connectives are added to the language. Finally, we put the logical framework to use by instantiating it as logics of agency. In particular, we propose a logic to reason about the resource-sensitive use of artefacts and illustrate it with a variety of examples.
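
    For orientation, the neighbourhood function mentioned above is the standard device for interpreting a non-normal modality: each point of the model is assigned a collection of propositions counting as necessary at it. A generic sketch of the resulting modal clause (not the paper's exact definition over Kripke resource models):

        \mathcal{M}, w \models \Box A
        \quad\text{iff}\quad
        \{\, v \mid \mathcal{M}, v \models A \,\} \in N(w)

    With no closure conditions imposed on N, the modality need not validate the normal axiom K or the monotonicity rule, which is what "non-normal" and "minimal" refer to.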

    The Structure of Differential Invariants and Differential Cut Elimination

    The biggest challenge in hybrid systems verification is the handling of differential equations. Because computable closed-form solutions only exist for very simple differential equations, proof certificates have been proposed for more scalable verification. Search procedures for these proof certificates are still rather ad hoc, though, because the problem structure is poorly understood. We investigate differential invariants, which define an induction principle for differential equations and which can be checked for invariance along a differential equation just by using their differential structure, without having to solve the equation. We study the structural properties of differential invariants. To analyze trade-offs for proof search complexity, we identify more than a dozen relations between several classes of differential invariants and compare their deductive power. As our main results, we analyze the deductive power of differential cuts and the deductive power of differential invariants with auxiliary differential variables. We refute the differential cut elimination hypothesis and show that, unlike standard cuts, differential cuts are fundamental proof principles that strictly increase the deductive power. We also prove that the deductive power increases further when adding auxiliary differential variables to the dynamics.
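
    A small worked instance of the induction principle described above (a standard textbook-style example, not taken from the paper): for the rotational dynamics v' = w, w' = -v, the formula v² + w² = r² is a differential invariant, because its derivative along the dynamics vanishes identically:

        \frac{d}{dt}\left(v^2 + w^2\right) = 2v\,v' + 2w\,w' = 2vw - 2wv = 0

    So the invariant can be certified from the differential structure alone, without solving the differential equation, which is exactly the appeal of differential invariants for proof search.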