A Transformation-Based Foundation for Semantics-Directed Code Generation
Interpreters and compilers are two different ways of implementing
programming languages. An interpreter directly executes its program
input. It is a concise definition of the semantics of a programming
language and is easily implemented. A compiler translates its program
input into another language. It is more difficult to construct, but
the code that it generates runs faster than interpreted code.
In this dissertation, we propose a transformation-based foundation for
deriving compilers from semantic specifications in the form of four
rules. These rules give a priori advice for staging, and allow
explicit compiler derivation that would be less succinct with partial
evaluation. When applied, these rules turn an interpreter that
directly executes its program input into a compiler that emits the
code that the interpreter would have executed.
We formalize the language syntax and semantics to be used for the
interpreter and the compiler, and also specify a notion of equality.
It is then possible to precisely state the transformation rules and to
prove both local and global correctness theorems. Although the
transformation rules were developed to apply to an interpreter
written in a denotational style, we consider how to modify
non-denotational interpreters so that the rules apply. Finally, we
illustrate these ideas by considering a larger example: a Prolog
implementation.
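As a toy illustration of the staging idea (a minimal sketch, not the dissertation's four rules): a tiny arithmetic language with a direct interpreter, and a compiler obtained by making each interpreter case emit the code it would have executed. All names here are hypothetical.

```python
# Hypothetical sketch: interpreter vs. staged compiler for a toy language.
# Terms: ("lit", n), ("var", name), ("add", e1, e2).

def interpret(expr, env):
    """Directly execute the program input."""
    tag = expr[0]
    if tag == "lit":
        return expr[1]
    if tag == "var":
        return env[expr[1]]
    if tag == "add":
        return interpret(expr[1], env) + interpret(expr[2], env)
    raise ValueError(tag)

def compile_expr(expr):
    """Staged version: emit the code the interpreter would have executed."""
    tag = expr[0]
    if tag == "lit":
        return str(expr[1])
    if tag == "var":
        return f"env[{expr[1]!r}]"
    if tag == "add":
        return f"({compile_expr(expr[1])} + {compile_expr(expr[2])})"
    raise ValueError(tag)

prog = ("add", ("lit", 1), ("add", ("var", "x"), ("lit", 2)))
env = {"x": 39}
compiled = eval("lambda env: " + compile_expr(prog))
assert interpret(prog, env) == compiled(env) == 42
```

The compiled function contains no dispatch on term tags at run time; that dispatch happened once, at compile time, which is the essence of the interpreter-to-compiler transformation.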
A Study of Syntactic and Semantic Artifacts and its Application to Lambda Definability, Strong Normalization, and Weak Normalization in the Presence of...
Church's lambda-calculus underlies the syntax (i.e., the form) and the semantics (i.e., the meaning) of functional programs. This thesis is dedicated to studying man-made constructs (i.e., artifacts) in the lambda calculus. For example, one puts the expressive power of the lambda calculus to the test in the area of lambda definability. In this area, we present a course-of-value representation bridging Church numerals and Scott numerals. We then turn to weak and strong normalization using Danvy et al.'s syntactic and functional correspondences. We give a new account of Felleisen and Hieb's syntactic theory of state, and of abstract machines for strong normalization due to Curien, Crégut, Lescanne, and Kluge.
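For orientation (a minimal sketch, not the thesis's course-of-value construction), the two numeral systems bridged above can be written as plain lambdas: a Church numeral encodes iteration, while a Scott numeral encodes case analysis carrying its predecessor.

```python
# Church numerals: n applied to f and x computes f(f(...f(x))), n times.
church_zero = lambda f: lambda x: x
church_succ = lambda n: (lambda f: lambda x: f(n(f)(x)))

def church_to_int(n):
    return n(lambda k: k + 1)(0)

# Scott numerals: n is a case split; a successor carries its predecessor.
scott_zero = lambda z: lambda s: z
scott_succ = lambda n: (lambda z: lambda s: s(n))

def scott_to_int(n):
    return n(0)(lambda pred: 1 + scott_to_int(pred))

three_church = church_succ(church_succ(church_succ(church_zero)))
three_scott = scott_succ(scott_succ(scott_succ(scott_zero)))
assert church_to_int(three_church) == scott_to_int(three_scott) == 3
```

Note the trade-off: Church numerals give iteration for free but make the predecessor awkward, whereas Scott numerals give the predecessor in one step but must recurse to iterate.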
On Applicative Similarity, Sequentiality, and Full Abstraction
We study how applicative bisimilarity behaves when instantiated on a call-by-value probabilistic λ-calculus, endowed with Plotkin's parallel disjunction operator. We prove that congruence and coincidence with the corresponding context relation hold for both bisimilarity and similarity, the latter being known to be impossible in sequential languages.
On Probabilistic Applicative Bisimulation and Call-by-Value λ-Calculi (Long Version)
Probabilistic applicative bisimulation is a recently introduced coinductive
methodology for program equivalence in a probabilistic, higher-order, setting.
In this paper, the technique is applied to a typed, call-by-value,
lambda-calculus. Surprisingly, the obtained relation coincides with context
equivalence, contrary to what happens when call-by-name evaluation is
considered. Even more surprisingly, full abstraction holds only in a symmetric
setting.
A Strong Distillery
Abstract machines for the strong evaluation of lambda-terms (that is, under
abstractions) are a mostly neglected topic, despite their use in the
implementation of proof assistants and higher-order logic programming
languages. This paper introduces a machine for the simplest form of strong
evaluation, leftmost-outermost (call-by-name) evaluation to normal form,
proving it correct and complete, and bounding its overhead. Such a machine, dubbed
the Strong Milner Abstract Machine, is a variant of the KAM computing normal forms
and using just one global environment. Its properties are studied via a special
form of decoding, called a distillation, into the Linear Substitution Calculus,
neatly reformulating the machine as a standard micro-step strategy for explicit
substitutions, namely linear leftmost-outermost reduction, i.e., the extension
to normal form of linear head reduction. Additionally, the overhead of the
machine is shown to be linear both in the number of steps and in the size of
the initial term, validating its design. The study highlights two distinguishing
features of strong machines, namely backtracking phases and their interactions
with abstractions and environments.
Comment: Accepted at APLAS 201
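For reference, a hedged sketch of the ordinary KAM for weak head call-by-name evaluation, using de Bruijn indices and local environments; the Strong MAM described above extends such a machine to full normal forms under abstractions, with a single global environment instead. The encoding below is an assumption for illustration only.

```python
# A standard Krivine Abstract Machine (KAM) for weak head call-by-name
# evaluation. Terms use de Bruijn indices:
#   ("var", n), ("lam", body), ("app", fun, arg).
# A closure is a (term, environment) pair.

def kam(term):
    env, stack = [], []            # env: list of closures; stack: pending arguments
    while True:
        tag = term[0]
        if tag == "app":           # push the argument as a closure, enter the function
            stack.append((term[2], env))
            term = term[1]
        elif tag == "lam" and stack:   # pop one argument into the environment
            term, env = term[1], [stack.pop()] + env
        elif tag == "var":         # look up the closure and enter it
            term, env = env[term[1]]
        else:                      # lambda with empty stack: weak head normal form
            return term, env

# (λx. x) (λy. y)  reduces to  λy. y
identity = ("lam", ("var", 0))
result, _ = kam(("app", identity, identity))
assert result == identity
```

This machine stops at the first abstraction with no pending arguments; a strong machine must additionally go under that abstraction and normalize its body, which is exactly where the backtracking phases mentioned above arise.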