Optimizing Abstract Abstract Machines
The technique of abstracting abstract machines (AAM) provides a systematic
approach for deriving computable approximations of evaluators that are easily
proved sound. This article contributes a complementary step-by-step process for
subsequently going from a naive analyzer derived under the AAM approach, to an
efficient and correct implementation. The end result of the process is a two to
three order-of-magnitude improvement over the systematically derived analyzer,
making it competitive with hand-optimized implementations that compute
fundamentally less precise results.
Comment: Proceedings of the International Conference on Functional Programming 2013 (ICFP 2013). Boston, Massachusetts. September 2013.
An intensional implementation technique for functional languages
The potential of functional programming languages has not yet been
widely recognized. The reason lies in the difficulties associated with
their implementation. In this dissertation we propose a new
implementation technique for functional languages by compiling them
into 'Intensional Logic' of R. Montague and R. Carnap. Our technique is
not limited to a particular hardware or to a particular evaluation
strategy; nevertheless it lends itself directly to demand-driven tagged
dataflow architecture. Even though our technique can handle
conventional languages as well, our main interest is exclusively with
functional languages in general and with Lucid-like dataflow languages
in particular.
We give a brief general account of intensional logic and then
introduce the concept of intensional algebras as structures (models) for
intensional logic. We formally show the computability requirements for
such algebras.
The target language of our compilation is the family of languages
DE (definitional equations over intensional expressions). A program in
DE is a linear (not structured) set of unambiguous equations defining
nullary variable symbols. One of these variable symbols should be the
symbol result.
We introduce the compilation of Iswim (a first-order variant of
Landin's ISWIM) as an example of compiling functions into intensional
expressions. A compilation algorithm is given. Iswim(A), for any algebra
of data types A, is compiled into DE(Flo(A)) where Flo(A) is a uniquely
defined intensional algebra over the tree of function calls. The approach
is extended to compiling Luswim and Lucid.
We describe the demand-driven tagged dataflow (the eduction)
approach to evaluating the intensional family of target languages DE.
Furthermore, for each intensional algebra, we introduce a collection of
rewrite rules. A justification of correctness is given. These rules are the
basis for evaluating programs in the target DE by reduction.
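The flavour of demand-driven (eduction) evaluation of definitional equations can be sketched as follows. This is a hypothetical mini-DE in Python, with made-up equations; the real DE language and its tagging discipline are considerably richer than this sketch:

```python
# A toy "DE-style" program: a set of equations defining nullary
# variable symbols, one of which must be named "result".
# (These particular equations are illustrative, not from the thesis.)
program = {
    "result": lambda env: env("x") + env("y"),
    "x": lambda env: 40,
    "y": lambda env: env("x") // 20,
}

cache = {}

def demand(name):
    # Eduction: a value is computed only when demanded,
    # then remembered so each equation is evaluated at most once.
    if name not in cache:
        cache[name] = program[name](demand)
    return cache[name]

print(demand("result"))  # 42
```

Asking for `result` transitively demands `x` and `y`; nothing is computed that the demand for `result` does not require.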
Finally, we discuss possible refinements and extensions to our
approach.
Integrating Lucid's Declarative Dataflow Paradigm into Object-Orientation
The dataflow language Lucid applies concepts from intensional logic to declarative ISWIM expressions which are intensionalised relative to the dimension of time, thus introducing the notion of an expression’s history. Lucian, a language derived from Lucid, embeds dataflow into object-orientation, allowing the intensionalisation of objects. Lucian introduces the notion of a declarative intensional object as the history of an object’s transformations. This paper discusses the embedding relationships and semantics of conjoining the dataflow and object-oriented paradigms to provide the language Lucian for defining intensional objects.
Mathematics Subject Classification (2000): 68N15, 68N19, 68Q5
Some History of Functional Programming Languages
We study a series of milestones leading to the emergence of lazy, higher-order, polymorphically typed, purely functional programming languages. An invited lecture given at TFP12, St Andrews University, 12 June 2012.
A Study of Syntactic and Semantic Artifacts and its Application to Lambda Definability, Strong Normalization, and Weak Normalization in the Presence of...
Church's lambda-calculus underlies the syntax (i.e., the form) and the semantics (i.e., the meaning) of functional programs. This thesis is dedicated to studying man-made constructs (i.e., artifacts) in the lambda calculus. For example, one puts the expressive power of the lambda calculus to the test in the area of lambda definability. In this area, we present a course-of-value representation bridging Church numerals and Scott numerals. We then turn to weak and strong normalization using Danvy et al.'s syntactic and functional correspondences. We give a new account of Felleisen and Hieb's syntactic theory of state, and of abstract machines for strong normalization due to Curien, Crégut, Lescanne, and Kluge.
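As background for the numeral representations the abstract mentions, the two standard encodings can be sketched with Python lambdas. This is illustrative only; the thesis's course-of-value representation bridging the two is not reproduced here:

```python
# Church numerals: the number n is "apply f n times to x".
c_zero = lambda f: lambda x: x
c_succ = lambda n: lambda f: lambda x: f(n(f)(x))

def church_to_int(n):
    # Instantiate f as integer successor and x as 0.
    return n(lambda k: k + 1)(0)

# Scott numerals: the number is its own case analysis; zero selects
# the z branch, succ n passes its predecessor n to the s branch.
s_zero = lambda s: lambda z: z
s_succ = lambda n: lambda s: lambda z: s(n)

def scott_to_int(n):
    # Recurse on the predecessor exposed by the pattern match.
    return n(lambda pred: 1 + scott_to_int(pred))(0)

c_two = c_succ(c_succ(c_zero))
s_two = s_succ(s_succ(s_zero))
print(church_to_int(c_two), scott_to_int(s_two))  # 2 2
```

Church numerals give iteration for free but make predecessor awkward; Scott numerals give constant-time predecessor but no built-in iteration, which is why a representation bridging the two is interesting.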
Proofs are Programs: 19th Century Logic and 21st Century Computing
As the 19th century drew to a close, logicians formalized an ideal notion of proof. They were driven by nothing other than an abiding interest in truth, and their proofs were as ethereal as the mind of God. Yet within decades these mathematical abstractions were realized by the hand of man, in the digital stored-program computer. How it came to be recognized that proofs and programs are the same thing is a story that spans a century, a chase with as many twists and turns as a thriller. At the end of the story is a new principle for designing programming languages that will guide computers into the 21st century.
For my money, Gentzen’s natural deduction and Church’s lambda calculus are on a par with Einstein’s relativity and Dirac’s quantum physics for elegance and insight. And the maths are a lot simpler. I want to show you the essence of these ideas. I’ll need a few symbols, but not too many, and I’ll explain as I go along.
To simplify, I’ll present the story as we understand it now, with some asides to fill in the history. First, I’ll introduce Gentzen’s natural deduction, a formalism for proofs. Next, I’ll introduce Church’s lambda calculus, a formalism for programs. Then I’ll explain why proofs and programs are really the same thing, and how simplifying a proof corresponds to executing a program. Finally, I’ll conclude with a look at how these principles are being applied to design a new generation of programming languages, particularly mobile code for the Internet.
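The proofs-as-programs correspondence the abstract describes can be made concrete in a few lines. This is a minimal sketch in Python with type hints (the names `modus_ponens` and `swap` are ours, not from the lecture): an implication is a function type, a conjunction is a pair, and running the program is simplifying the proof.

```python
from typing import Callable, Tuple, TypeVar

A = TypeVar("A")
B = TypeVar("B")

def modus_ponens(proof_ab: Callable[[A], B], proof_a: A) -> B:
    # From a proof of A -> B and a proof of A, conclude B:
    # in programming terms, function application.
    return proof_ab(proof_a)

def swap(p: Tuple[A, B]) -> Tuple[B, A]:
    # A proof of (A and B) -> (B and A): swap the components of a pair.
    a, b = p
    return (b, a)

# Simplifying the proof is executing the program:
print(swap((1, "one")))  # ('one', 1)
print(modus_ponens(lambda x: x + 1, 41))  # 42
```

The types play the role of propositions; any total program of the right type is a proof of the corresponding formula.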
A Rational Deconstruction of Landin's SECD Machine with the J Operator
Landin's SECD machine was the first abstract machine for applicative
expressions, i.e., functional programs. Landin's J operator was the first
control operator for functional languages, and was specified by an extension of
the SECD machine. We present a family of evaluation functions corresponding to
this extension of the SECD machine, using a series of elementary
transformations (transformation into continuation-passing style (CPS) and
defunctionalization, chiefly) and their left inverses (transformation into
direct style and refunctionalization). To this end, we modernize the SECD
machine into a bisimilar one that operates in lockstep with the original one
but that (1) does not use a data stack and (2) uses the caller-save rather than
the callee-save convention for environments. We also identify that the dump
component of the SECD machine is managed in a callee-save way. The caller-save
counterpart of the modernized SECD machine precisely corresponds to Thielecke's
double-barrelled continuations and to Felleisen's encoding of J in terms of
call/cc. We then variously characterize the J operator in terms of CPS and in
terms of delimited-control operators in the CPS hierarchy. As a byproduct, we
also present several reduction semantics for applicative expressions with the J
operator, based on Curien's original calculus of explicit substitutions. These
reduction semantics mechanically correspond to the modernized versions of the
SECD machine and, to the best of our knowledge, they provide the first
syntactic theories of applicative expressions with the J operator.
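The elementary transformations named in the abstract, CPS transformation and defunctionalization, can be illustrated on a toy example. This is only a sketch: factorial stands in for the evaluators treated in the paper, and the constructor names are ours.

```python
# Direct style.
def fact(n):
    return 1 if n == 0 else n * fact(n - 1)

# CPS: the continuation k says what to do with the result,
# so every call is a tail call.
def fact_cps(n, k):
    if n == 0:
        return k(1)
    return fact_cps(n - 1, lambda r: k(n * r))

# Defunctionalization: each lambda abstraction that appears in
# continuation position becomes a first-order data constructor
# ("ID" or ("MUL", n, k)), interpreted by apply_k.
def fact_def(n, k):
    if n == 0:
        return apply_k(k, 1)
    return fact_def(n - 1, ("MUL", n, k))

def apply_k(k, r):
    if k == "ID":
        return r
    _, n, rest = k
    return apply_k(rest, n * r)

print(fact(5), fact_cps(5, lambda r: r), fact_def(5, "ID"))  # 120 120 120
```

Applied to an evaluator rather than to factorial, the defunctionalized continuations become the stack components of an abstract machine, which is the sense in which the paper's transformations connect evaluation functions to the SECD machine.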