
    Principal Typings in a Restricted Intersection Type System for Beta Normal Forms with De Bruijn Indices

    The lambda-calculus with de Bruijn indices assembles each alpha-class of lambda-terms into a unique term, using indices instead of variable names. Intersection types provide finitary type polymorphism and can characterise normalisable lambda-terms through the property that a term is normalisable if and only if it is typeable. To be closer to computations and to simplify the formalisation of the atomic operations involved in beta-contractions, several calculi of explicit substitutions were developed, mostly with de Bruijn indices. Untyped and simply typed versions of explicit substitution calculi are well investigated, in contrast to versions with more elaborate type systems such as intersection types. In previous work, we introduced a de Bruijn version of the lambda-calculus with an intersection type system and proved that it satisfies subject reduction, a basic property of type systems. In this paper we present a version with de Bruijn indices of an intersection type system originally introduced to characterise principal typings for beta-normal forms. We present the characterisation in this new system, together with the corresponding versions of the algorithms for type inference and for reconstructing normal forms from principal typings. We briefly discuss the failure of the subject reduction property and some possible solutions for it.
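
    The de Bruijn representation is easy to make concrete. The sketch below is illustrative only (it does not reproduce the paper's intersection type system); it shows how replacing variable names by the distance to the binder collapses every alpha-class to a single term. The index convention (counting from 1; some presentations start at 0) and all names are assumptions for the example.

```haskell
-- Illustrative sketch: untyped lambda-terms with de Bruijn indices.
-- An index n refers to the binder n lambdas above the occurrence,
-- so alpha-equivalent named terms share one representation.
data Term
  = Var Int          -- de Bruijn index (counting from 1 here)
  | Lam Term         -- a binder carries no variable name
  | App Term Term
  deriving (Eq, Show)

-- \x. x and \y. y are the same de Bruijn term:
identity :: Term
identity = Lam (Var 1)

-- \x. \y. x (the K combinator):
k :: Term
k = Lam (Lam (Var 2))
```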

    Safe Compositional Specification of Networking Systems: A Compositional Analysis Approach

    We present a type inference algorithm, in the style of compositional analysis, for the language TRAFFIC, a specification language for flow composition applications proposed in [2], and prove that this algorithm is correct: the typings it infers are principal typings, and they agree with syntax-directed type checking on closed flow specifications. The algorithm is capable of verifying partial flow specifications, a significant improvement over the syntax-directed type checking algorithm presented in [3]. We also show that the algorithm runs efficiently, i.e., in low-degree polynomial time. National Science Foundation (ITR ANI-0205294, ANI-0095988, ANI-9986397, EIA-0202067).
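
    The compositional flavour of such an algorithm can be sketched abstractly. The toy code below is not the TRAFFIC type language or the paper's algorithm; it only illustrates the idea that a typing records both the requirements a fragment places on its environment and the type it delivers, so partial specifications can be analysed in isolation and later composed.

```haskell
import qualified Data.Map as Map

-- Toy illustration (not TRAFFIC's actual types): a typing pairs the
-- requirements a fragment places on its free names with the type the
-- fragment itself produces.
data Ty = In | Out | Flow Ty Ty deriving (Eq, Show)

data Typing = Typing
  { requires :: Map.Map String Ty   -- assumptions about unknown parts
  , provides :: Ty                  -- the fragment's own type
  } deriving Show

-- Two independently analysed fragments compose only if their
-- requirements agree on shared names.
compose :: Typing -> Typing -> Maybe Typing
compose (Typing r1 t1) (Typing r2 t2)
  | compatible = Just (Typing (Map.union r1 r2) (Flow t1 t2))
  | otherwise  = Nothing
  where
    compatible = and (Map.elems (Map.intersectionWith (==) r1 r2))
```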

    Proceedings of International Workshop "Global Computing: Programming Environments, Languages, Security and Analysis of Systems"

    According to the IST/FET proactive initiative on GLOBAL COMPUTING, the goal is to obtain techniques (models, frameworks, methods, algorithms) for constructing systems that are flexible, dependable, secure, robust and efficient. The dominant concerns are not those of representing and manipulating data efficiently but rather those of handling the co-ordination and interaction, security, reliability, robustness, failure modes, and control of risk of the entities in the system and the overall design, description and performance of the system itself. Completely different paradigms of computer science may have to be developed to tackle these issues effectively. The research should concentrate on systems having the following characteristics:
    • The systems are composed of autonomous computational entities where activity is not centrally controlled, either because global control is impossible or impractical, or because the entities are created or controlled by different owners.
    • The computational entities are mobile, due to the movement of the physical platforms or by movement of the entity from one platform to another.
    • The configuration varies over time. For instance, the system is open to the introduction of new computational entities and likewise their deletion. The behaviour of the entities may vary over time.
    • The systems operate with incomplete information about the environment. For instance, information becomes rapidly out of date and mobility requires information about the environment to be discovered.
    The ultimate goal of the research action is to provide a solid scientific foundation for the design of such systems, and to lay the groundwork for achieving effective principles for building and analysing such systems. This workshop covers the aspects related to languages and programming environments as well as analysis of systems and resources, involving 9 projects (AGILE, DART, DEGAS, MIKADO, MRG, MYTHS, PEPITO, PROFUNDIS, SECURE) out of the 13 funded under the initiative. After a year from the start of the projects, the goal of the workshop is to fix the state of the art on the topics covered by the two clusters related to programming environments and analysis of systems, as well as to devise strategies and new ideas to profitably continue the research effort towards the overall objective of the initiative. We acknowledge the Dipartimento di Informatica and Tlc of the University of Trento, the Comune di Rovereto, and the project DEGAS for partially funding the event, and the Events and Meetings Office of the University of Trento for the valuable collaboration.

    Expressiveness of Generic Process Shape Types

    Shape types are a general concept of process types that works for many process calculi. We extend the previously published Poly* system of shape types to support name restriction. We evaluate the expressiveness of the extended system by showing that shape types are more expressive than an implicitly typed pi-calculus and an explicitly typed Mobile Ambients. We demonstrate that the extended system makes it easier to enjoy the advantages of shape types, which include polymorphism, principal typings, and a type inference implementation. Comment: Submitted to Trustworthy Global Computing (TGC) 2010.

    Class Hierarchy Complementation: Soundly Completing a Partial Type Graph

    We present the problem of class hierarchy complementation: given a partially known hierarchy of classes together with subtyping constraints ("A has to be a transitive subtype of B"), complete the hierarchy so that it satisfies all constraints. The problem has immediate practical application to the analysis of partial programs; for example, it arises in the process of providing a sound handling of "phantom classes" in the Soot program analysis framework. We provide algorithms to solve the hierarchy complementation problem in the single inheritance and multiple inheritance settings. We also show that the problem in a language such as Java, with single inheritance but multiple subtyping and distinguished class vs. interface types, can be decomposed into separate single- and multiple-subtyping instances. We implement our algorithms in a tool, JPhantom, which complements partial Java bytecode programs so that the result is guaranteed to satisfy the Java verifier requirements. JPhantom is highly scalable and runs in mere seconds even for large input applications and complex constraints (with a maximum of 14s for a 19MB binary).
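
    As a small concrete illustration (not JPhantom's actual algorithm), the sketch below only checks whether a candidate completion of a single-inheritance hierarchy satisfies a set of transitive-subtype constraints; solving the complementation problem means producing such a hierarchy rather than merely checking one. The representation and names are assumptions made for the example.

```haskell
import qualified Data.Map as Map

-- Illustrative sketch only. A single-inheritance hierarchy maps each class
-- to its direct superclass (the root class is absent from the map), and a
-- constraint (a, b) demands that a be a transitive subtype of b.
type Hierarchy  = Map.Map String String
type Constraint = (String, String)

-- Walk the superclass chain; hierarchies are assumed acyclic.
isTransitiveSubtype :: Hierarchy -> String -> String -> Bool
isTransitiveSubtype h a b
  | a == b    = True
  | otherwise = maybe False (\p -> isTransitiveSubtype h p b) (Map.lookup a h)

satisfiesAll :: Hierarchy -> [Constraint] -> Bool
satisfiesAll h = all (uncurry (isTransitiveSubtype h))
```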

    Principal typings for interactive ruby programming

    A novel and promising method of software development is the interactive style of development, where code is written and incrementally tested simultaneously. Interpreted dynamic languages such as Ruby, Python, and Lua support this interactive development style. However, because they lack semantic analysis as part of a compilation phase, they do not provide type checking. The programmer is only informed of type errors when they are encountered in the execution of the program, far too late and often at a less informative location in the code. We introduce a typing system for Ruby, where types are determined before execution by inferring principal typings. This system overcomes the obstacles that interactive and dynamic program development imposes on type checking, yielding an effective type-checking facility for dynamic programming languages. Our development is embodied as an extension to irb, the Ruby interactive mode, allowing us to evaluate principal typings during interactive development.

    A Simple Semantics for ML Polymorphism

    We give a framework for denotational semantics for the polymorphic core of the programming language ML. This framework requires no more semantic material than what is needed for modeling the simple type discipline. In our view, the terms of ML are pairs consisting of a raw (untyped) lambda term and a type-scheme that ML's type inference system can derive for the raw term. We interpret type-schemes as sets of simple types. Then, given any model M of the simply typed lambda calculus, the meaning of an ML term will be a set of pairs, each consisting of a simple type τ and an element of M of type τ. Hence, there is no need to interpret all raw terms, as was done in Milner's original semantic framework. In comparison to Mitchell and Harper's analysis, we avoid having to provide a very large type universe in which generic type-schemes are interpreted. Also, we show how to give meaning to ML terms rather than to derivations in the ML type inference system (of which there can be several for the same term). We give an axiomatization for the equational theory that corresponds to our semantic framework and prove the analogs of the completeness theorems that Friedman proved for the simply typed lambda calculus. The framework can be extended to languages with constants, type constructors and recursive types (via regular trees). For the extended language, we prove a theorem that allows the transfer of certain full abstraction results from languages based on the typed lambda calculus to ML-like languages.
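
    The reading of type-schemes as sets of simple types can be mimicked executably. The sketch below is an illustration under simplifications that are our own (type variables are represented as named base types, and instances are enumerated over a chosen finite universe of simple types, whereas the semantics ranges over all of them); it is not the paper's model construction.

```haskell
-- Illustrative sketch: a type-scheme denotes the set of its simple-type
-- instances. Instances are enumerated over a finite universe here.
data Simple = Base String | Arrow Simple Simple deriving (Eq, Show)
data Scheme = Mono Simple | Forall String Scheme deriving Show

-- Substitute a simple type for a quantified variable (variables are
-- just Base names in this toy representation).
subst :: String -> Simple -> Simple -> Simple
subst v t (Base b)    = if b == v then t else Base b
subst v t (Arrow a r) = Arrow (subst v t a) (subst v t r)

instances :: [Simple] -> Scheme -> [Simple]
instances _    (Mono t)     = [t]
instances univ (Forall v s) = [ subst v u t | u <- univ, t <- instances univ s ]

-- instances [Base "int", Base "bool"]
--           (Forall "a" (Mono (Arrow (Base "a") (Base "a"))))
--   == [ Arrow (Base "int")  (Base "int")
--      , Arrow (Base "bool") (Base "bool") ]
```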

    Implementing Compositional Analysis Using Intersection Types With Expansion Variables

    A program analysis is compositional when the analysis result for a particular program fragment is obtained solely from the results for its immediate subfragments via some composition operator. This means the subfragments can be analyzed independently in any order. Many commonly used program analysis techniques (in particular, most abstract interpretations and most uses of the Hindley/Milner type system) are not compositional and require the entire text of a program for sound and complete analysis.
    System I is a recent type system for the pure λ-calculus with intersection types and the new technology of expansion variables. System I supports compositional analysis because it has the principal typings property, and an algorithm based on the new technology of β-unification has been developed that finds these principal typings. In addition, for each natural number k, typability in the rank-k restriction of System I is decidable, so a complete and terminating analysis algorithm exists for the rank-k restriction.
    This paper presents new understanding that has been gained from working with multiple implementations of System I and β-unification-based analysis algorithms. The previous literature on System I presented the type system in a way that helped in proving its more important theoretical properties, but was not as easy to follow as it could be. This paper provides a presentation of many aspects of System I that should be clearer, as well as a discussion of important implementation issues.
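
    To make the objects under discussion concrete, here is a small data-type sketch. It is illustrative only (the constructor names and the shape of typings are assumptions, not System I's actual definitions or its implementation): types built from type variables, arrows, intersections and applications of expansion variables, and typings that pair assumptions about a fragment's free variables with a result type, which is what lets subfragments be analysed independently.

```haskell
-- Illustrative sketch of intersection types with expansion variables;
-- names and representation are assumptions, not System I's definitions.
data IType
  = TVar String            -- type variable
  | Arrow IType IType      -- function type
  | Inter IType IType      -- intersection type
  | EVar String IType      -- expansion variable applied to a type
  deriving (Eq, Show)

-- A typing pairs the assumptions needed about the term's free variables
-- with a result type; principal typings make composition of independently
-- obtained typings possible.
data Typing = Typing [(String, IType)] IType
  deriving (Eq, Show)

-- For example, the self-application \x. x x can be given the result type
-- ((a -> b) /\ a) -> b in intersection type systems:
selfApp :: IType
selfApp = Arrow (Inter (Arrow (TVar "a") (TVar "b")) (TVar "a")) (TVar "b")
```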