Implementing Compositional Analysis Using Intersection Types With Expansion Variables
A program analysis is compositional when the analysis result for a particular program fragment is obtained solely from the results for its immediate subfragments via some composition operator. This means the subfragments can be analyzed independently in any order. Many commonly used program analysis techniques (in particular, most abstract interpretations and most uses of the Hindley/Milner type system) are not compositional and require the entire text of a program for sound and complete analysis.

System I is a recent type system for the pure λ-calculus with intersection types and the new technology of expansion variables. System I supports compositional analysis because it has the principal typings property, and an algorithm based on the new technology of β-unification has been developed that finds these principal typings. In addition, for each natural number k, typability in the rank-k restriction of System I is decidable, so a complete and terminating analysis algorithm exists for the rank-k restriction.

This paper presents new understanding that has been gained from working with multiple implementations of System I and β-unification-based analysis algorithms. The previous literature on System I presented the type system in a way that helped in proving its more important theoretical properties, but was not as easy to follow as it could be. This paper provides a presentation of many aspects of System I that should be clearer, as well as a discussion of important implementation issues.
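The compositionality property the abstract describes can be illustrated with a toy analysis (this is not System I itself, only a sketch of the general idea): a free-variable analysis over λ-terms in which the result for every term is computed solely from the results for its immediate subterms, so fragments can be analyzed independently and then composed.

```python
from dataclasses import dataclass

# Toy λ-term syntax; term types and names here are illustrative, not
# taken from the System I implementations the paper discusses.
@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:
    fun: object
    arg: object

def free_vars(term):
    """Compositional analysis: each case only combines subterm results."""
    if isinstance(term, Var):
        return {term.name}
    if isinstance(term, Lam):
        # compose: take the body's result and subtract the binder
        return free_vars(term.body) - {term.param}
    if isinstance(term, App):
        # compose: union of the two subfragments' results
        return free_vars(term.fun) | free_vars(term.arg)
    raise TypeError(f"unknown term: {term!r}")

# (\x. x y) z  has free variables {y, z}
term = App(Lam("x", App(Var("x"), Var("y"))), Var("z"))
print(free_vars(term))  # {'y', 'z'}
```

A Hindley/Milner-style analysis lacks this shape because typing a fragment needs an environment derived from the whole program; principal typings restore it by making each fragment's result self-contained.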
Refinement Types for Logical Frameworks and Their Interpretation as Proof Irrelevance
Refinement types sharpen systems of simple and dependent types by offering
expressive means to more precisely classify well-typed terms. We present a
system of refinement types for LF in the style of recent formulations where
only canonical forms are well-typed. Both the usual LF rules and the rules for
type refinements are bidirectional, leading to a straightforward proof of
decidability of typechecking even in the presence of intersection types.
Because we insist on canonical forms, structural rules for subtyping can now be
derived rather than being assumed as primitive. We illustrate the expressive
power of our system with examples and validate its design by demonstrating a
precise correspondence with traditional presentations of subtyping. Proof
irrelevance provides a mechanism for selectively hiding the identities of terms
in type theories. We show that LF refinement types can be interpreted as
predicates using proof irrelevance, establishing a uniform relationship between
two previously studied concepts in type theory. The interpretation and its
correctness proof are surprisingly complex, lending support to the claim that
refinement types are a fundamental construct rather than just a convenient
surface syntax for certain uses of proof irrelevance.
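The bidirectional discipline the abstract relies on can be sketched for a much simpler system than LF (this is an illustration of the check/infer split, not the paper's refinement system): terms are either checked against an expected type or have a type synthesized, and the two modes meet at a single comparison point, which is what makes typechecking syntax-directed and decidable.

```python
# Bidirectional typechecker for simply typed lambda terms.
# Types: the string "b" (a base type) or ("->", A, B).
# Terms: ("var", x), ("app", f, a), ("lam", x, body), ("ann", t, ty).

def infer(ctx, term):
    """Synthesis mode: compute a type from the term's shape."""
    tag = term[0]
    if tag == "var":
        return ctx[term[1]]
    if tag == "app":
        # infer the function's type, then check the argument against it
        fun_ty = infer(ctx, term[1])
        assert fun_ty[0] == "->", "applying a non-function"
        check(ctx, term[2], fun_ty[1])
        return fun_ty[2]
    if tag == "ann":
        # a type annotation switches from checking back to synthesis
        check(ctx, term[1], term[2])
        return term[2]
    raise ValueError(f"cannot infer a type for {tag}")

def check(ctx, term, ty):
    """Checking mode: verify the term against an expected type."""
    if term[0] == "lam":
        # lambdas are only checked, never inferred
        assert ty[0] == "->", "lambda needs an arrow type"
        check({**ctx, term[1]: ty[1]}, term[2], ty[2])
    else:
        # mode switch: synthesize, then compare with the expectation
        assert infer(ctx, term) == ty, "type mismatch"

# identity at base type: (\x. x) checks against b -> b
check({}, ("lam", "x", ("var", "x")), ("->", "b", "b"))
print("ok")
```

With intersection types the same structure survives: a term checked against an intersection is checked against each component, which is why the paper's decidability proof stays straightforward.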
A Computational Model of Creative Design as a Sociocultural Process Involving the Evolution of Language
The aim of this research is to investigate the mechanisms of creative design within the context of an evolving language through computational modelling. Computational Creativity is a subfield of Artificial Intelligence that focuses on modelling creative behaviours. Typically, research in Computational Creativity has treated language as a medium, e.g., poetry, rather than an active component of the creative process. Previous research studying the role of language in creative design has relied on interviewing human participants, limiting opportunities for computational modelling. This thesis explores the potential for language to play an active role in computational creativity by connecting computational models of the evolution of artificial languages and creative design processes. Multi-agent simulations based on the Domain-Individual-Field-Interaction framework are employed to evolve artificial languages with features that may support creative designing, including ambiguity, incongruity, exaggeration and elaboration. The simulation process consists of three steps: (1) constructing representations associating topics, meanings and utterances; (2) structured communication of utterances and meanings through the playing of "language games"; and (3) evaluation of design briefs and works. The use of individual agents with different evaluation criteria, preferences and roles enriches the scope and diversity of the simulations. The results of the experiments conducted with artificial creative language systems demonstrate the expansion of design spaces by generating compositional utterances representing novel concepts among design agents using language features and weighted context free grammars. They can be used to computationally explore the roles of language in creative design, and possibly point to computational applications. Understanding the evolution of artificial languages may provide insights into human languages, especially those features that support creativity.
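One ingredient the abstract names, generating compositional utterances from a weighted context-free grammar, can be sketched as follows. The grammar, its symbols and its weights are invented for illustration; the thesis's actual grammars, agents and game protocol are not reproduced here.

```python
import random

# A tiny weighted context-free grammar: each nonterminal maps to a list
# of (right-hand side, weight) pairs. All symbols here are made up.
GRAMMAR = {
    "UTTERANCE": [(("SHAPE", "COLOUR"), 3.0), (("SHAPE",), 1.0)],
    "SHAPE":     [(("arch",), 1.0), (("tower",), 2.0)],
    "COLOUR":    [(("red",), 1.0), (("blue",), 1.0)],
}

def generate(symbol, rng):
    """Expand a symbol, sampling productions proportionally to weight."""
    if symbol not in GRAMMAR:
        return [symbol]          # terminal: emit the word itself
    productions = GRAMMAR[symbol]
    weights = [w for _, w in productions]
    rhs, _weight = rng.choices(productions, weights=weights)[0]
    words = []
    for sym in rhs:
        words.extend(generate(sym, rng))  # compositional: expand each part
    return words

rng = random.Random(0)
for _ in range(3):
    print(" ".join(generate("UTTERANCE", rng)))
```

In a language-game setting, a speaker agent would sample an utterance like this for a meaning, a hearer would guess the meaning, and the production weights would be adjusted on success or failure, which is how novel compositional concepts can spread through the population.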
Reasoning & Querying – State of the Art
Various query languages for Web and Semantic Web data, both for practical use and as an area of research in the scientific community, have emerged in recent years. At the same time, the broad adoption of the internet, where keyword search is used in many applications, e.g. search engines, has familiarized casual users with using keyword queries to retrieve information on the internet. Unlike this easy-to-use querying, traditional query languages require knowledge of the language itself as well as of the data to be queried. Keyword-based query languages for XML and RDF bridge the gap between the two, aiming at enabling simple querying of semi-structured data, which is relevant e.g. in the context of the emerging Semantic Web. This article presents an overview of the field of keyword querying for XML and RDF.
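The gap the survey describes can be made concrete with a naive keyword query over XML: the user supplies bare keywords with no knowledge of the schema or of a query language, and the system locates matching elements. The document and matching policy below are invented for illustration; real keyword-query systems add ranking and result-tree construction on top of this idea.

```python
import xml.etree.ElementTree as ET

# A made-up XML fragment standing in for semi-structured Web data.
DOC = """
<library>
  <book><title>Semantic Web Primer</title><author>Antoniou</author></book>
  <book><title>Learning XML</title><author>Ray</author></book>
</library>
"""

def keyword_query(xml_text, keywords):
    """Return (tag, text) of elements whose text contains every keyword."""
    root = ET.fromstring(xml_text)
    hits = []
    for elem in root.iter():
        text = (elem.text or "").lower()
        if text.strip() and all(k.lower() in text for k in keywords):
            hits.append((elem.tag, elem.text))
    return hits

print(keyword_query(DOC, ["semantic", "web"]))
# -> [('title', 'Semantic Web Primer')]
```

A structured language such as XQuery or SPARQL would instead require the user to know that titles live at `library/book/title`; keyword systems trade that precision for accessibility.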
A Theory of Explicit Substitutions with Safe and Full Composition
Many different systems with explicit substitutions have been proposed to
implement a large class of higher-order languages. Motivations and challenges
that guided the development of such calculi in functional frameworks are
surveyed in the first part of this paper. Then, very simple technology in named
variable-style notation is used to establish a theory of explicit substitutions
for the lambda-calculus which enjoys a whole set of useful properties such as
full composition, simulation of one-step beta-reduction, preservation of
beta-strong normalisation, strong normalisation of typed terms and confluence
on metaterms. Normalisation of related calculi is also discussed.

Comment: 29 pages. Special Issue: Selected Papers of the Conference "International Colloquium on Automata, Languages and Programming 2008", edited by Giuseppe Castagna and Igor Walukiewicz.
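The central idea of an explicit-substitutions calculus can be sketched in a few lines (this is a generic named-variable illustration, not the paper's calculus): instead of carrying out the substitution created by beta reduction in one step, the reduction builds an explicit substitution node that is then propagated inward one constructor at a time.

```python
# Terms: ("var", x), ("lam", x, body), ("app", f, a),
# plus the explicit substitution node ("sub", t, x, u) meaning t[x := u].

def beta(term):
    """(\\x. t) u  ->  t[x := u]: beta creates a substitution node."""
    if term[0] == "app" and term[1][0] == "lam":
        _, (_, x, body), arg = term
        return ("sub", body, x, arg)
    return term

def push(term):
    """Propagate an explicit substitution one constructor inward."""
    if term[0] != "sub":
        return term
    _, t, x, u = term
    if t[0] == "var":
        return u if t[1] == x else t
    if t[0] == "app":
        return ("app", ("sub", t[1], x, u), ("sub", t[2], x, u))
    if t[0] == "lam" and t[1] != x:
        # assumes the binder avoids capture; real calculi handle renaming
        return ("lam", t[1], ("sub", t[2], x, u))
    return t

def normalise_subs(term):
    """Push substitutions until none remain."""
    if term[0] == "sub":
        return normalise_subs(push(term))
    if term[0] == "app":
        return ("app", normalise_subs(term[1]), normalise_subs(term[2]))
    if term[0] == "lam":
        return ("lam", term[1], normalise_subs(term[2]))
    return term

# (\x. x x) y  ->  (x x)[x := y]  ->*  y y
redex = ("app", ("lam", "x", ("app", ("var", "x"), ("var", "x"))), ("var", "y"))
print(normalise_subs(beta(redex)))  # ('app', ('var', 'y'), ('var', 'y'))
```

Properties such as full composition and preservation of strong normalisation constrain exactly how rules like `push` may interact with pending substitutions, which is what makes designing such calculi delicate.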
The Parma Polyhedra Library: Toward a Complete Set of Numerical Abstractions for the Analysis and Verification of Hardware and Software Systems
Since its inception as a student project in 2001, initially just for the
handling (as the name implies) of convex polyhedra, the Parma Polyhedra Library
has been continuously improved and extended by joining scrupulous research on
the theoretical foundations of (possibly non-convex) numerical abstractions to
a total adherence to the best available practices in software development. Even
though it is still not fully mature and functionally complete, the Parma
Polyhedra Library already offers a combination of functionality, reliability,
usability and performance that is not matched by similar, freely available
libraries. In this paper, we present the main features of the current version
of the library, emphasizing those that distinguish it from other similar
libraries and those that are important for applications in the field of
analysis and verification of hardware and software systems.

Comment: 38 pages, 2 figures, 3 listings, 3 tables.
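The kind of numerical abstraction the library provides can be illustrated with the simplest member of the family, an interval ("box") domain. The PPL itself is a C++ library with far richer domains (convex polyhedra, grids, and more); the class below is only a toy illustration of the lattice operations such domains share, not the library's API.

```python
import math

class Interval:
    """A 1-D interval abstraction: the set of reals in [lo, hi]."""

    def __init__(self, lo=-math.inf, hi=math.inf):
        self.lo, self.hi = lo, hi

    def join(self, other):
        """Least upper bound: smallest interval containing both."""
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

    def meet(self, other):
        """Greatest lower bound: intersection of the two intervals."""
        return Interval(max(self.lo, other.lo), min(self.hi, other.hi))

    def add(self, other):
        """Abstract addition: a sound over-approximation of x + y."""
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(0, 10)
y = Interval(-5, 5)
print(x.join(y))  # [-5, 10]
print(x.meet(y))  # [0, 5]
print(x.add(y))   # [-5, 15]
```

A static analyzer iterates operations like these over a program's control-flow graph; domains such as convex polyhedra keep relational information between variables (e.g. x ≤ y + 1) that intervals lose, which is the precision the PPL is built to supply.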