283 research outputs found
Cyclic Datatypes modulo Bisimulation based on Second-Order Algebraic Theories
Cyclic data structures, such as cyclic lists, in functional programming are
tricky to handle because of their cyclicity. This paper presents an
investigation of categorical, algebraic, and computational foundations of
cyclic datatypes. Our framework of cyclic datatypes is based on second-order
algebraic theories of Fiore et al., which give a uniform setting for syntax,
types, and computation rules for describing and reasoning about cyclic
datatypes. We extract the "fold" computation rules from the categorical
semantics based on iteration categories of Bloom and Ésik. Thereby, the rules
are correct by construction. We prove strong normalisation using the General
Schema criterion for second-order computation rules. Rather than the fixed
point law, we specifically choose Bekić's law for computation, which is a key to
obtaining strong normalisation. We also prove the property of "Church-Rosser
modulo bisimulation" for the computation rules. Combining these results, we
obtain a remarkable decidability result for the equational theory of cyclic data
and fold.
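As a rough illustration of the idea (this sketch is ours, not the paper's second-order calculus), cyclic lists can be represented finitely with an explicit cycle binder, and a fold-style traversal then computes on the finite representation rather than on its infinite unfolding:

```haskell
-- A minimal sketch, not the paper's calculus: cyclic lists with one
-- de Bruijn-style binder. `Cycle t` binds the `Back` occurrences in t,
-- so `Cycle (Cons 1 (Cons 2 Back))` denotes the cyclic list 1,2,1,2,...
data CList a = Nil | Cons a (CList a) | Cycle (CList a) | Back
  deriving (Eq, Show)

-- A fold-style map computing on the finite cyclic representation.
-- The recursion is structural, hence strongly normalising, even
-- though the represented list is infinite.
cmap :: (a -> b) -> CList a -> CList b
cmap _ Nil        = Nil
cmap f (Cons x t) = Cons (f x) (cmap f t)
cmap f (Cycle t)  = Cycle (cmap f t)
cmap _ Back       = Back
```

For example, `cmap (+1) (Cycle (Cons 1 (Cons 2 Back)))` yields the finite representation of the mapped cyclic list, with no unfolding involved.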
Strongly Normalising Cyclic Data Computation by Iteration Categories of Second-Order Algebraic Theories
Cyclic data structures, such as cyclic lists, in functional
programming are tricky to handle because of their cyclicity. This
paper presents an investigation of categorical, algebraic, and
computational foundations of cyclic datatypes. Our framework of
cyclic datatypes is based on second-order algebraic theories of Fiore
et al., which give a uniform setting for syntax, types, and
computation rules for describing and reasoning about cyclic datatypes.
We extract the "fold" computation rules from the categorical
semantics based on iteration categories of Bloom and Ésik. Thereby,
the rules are correct by construction. Finally, we prove strong
normalisation using the General Schema criterion for second-order
computation rules. Rather than the fixed point law, we specifically
choose Bekić's law for computation, which is a key to obtaining strong
normalisation.
On the Relation of Interaction Semantics to Continuations and Defunctionalization
In game semantics and related approaches to programming language semantics,
programs are modelled by interaction dialogues. Such models have recently been
used in the design of new compilation methods, e.g. for hardware synthesis or
for programming with sublinear space. This paper relates such semantically
motivated non-standard compilation methods to more standard techniques in the
compilation of functional programming languages, namely continuation passing
and defunctionalization. We first show for the linear λ-calculus that
interpretation in a model of computation by interaction can be described as a
call-by-name CPS-translation followed by a defunctionalization procedure that
takes into account control-flow information. We then establish a relation
between these two compilation methods for the simply-typed λ-calculus
and end by considering recursion.
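The two standard techniques the paper relates can be seen on a toy example. This sketch is ours (and uses an ordinary, not specifically call-by-name, CPS translation): a direct-style sum, its CPS form, and the defunctionalized form in which every continuation lambda becomes a first-order constructor interpreted by an `apply` function:

```haskell
-- Direct style.
suml :: [Int] -> Int
suml []       = 0
suml (x : xs) = x + suml xs

-- CPS: the rest of the computation is an explicit continuation.
sumk :: [Int] -> (Int -> r) -> r
sumk []       k = k 0
sumk (x : xs) k = sumk xs (\n -> k (x + n))

-- Defunctionalization: each lambda above becomes a first-order
-- constructor, and `apply` interprets them.
data Kont = KId | KAdd Int Kont

apply :: Kont -> Int -> Int
apply KId        n = n
apply (KAdd x k) n = apply k (x + n)

sumd :: [Int] -> Kont -> Int
sumd []       k = apply k 0
sumd (x : xs) k = sumd xs (KAdd x k)
```

All three compute the same function; the defunctionalized version is entirely first-order, which is the shape shared with interaction-style models.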
A type system with usage aspects
Linear typing schemes can be used to guarantee non-interference and so the soundness of in-place update with respect to a functional semantics. But linear schemes are restrictive in practice, and more restrictive than necessary to guarantee soundness of in-place update. This limitation has prompted research into static analysis and more sophisticated typing disciplines to determine when in-place update may be safely used, or to combine linear and non-linear schemes. Here we contribute to this direction by defining a new typing scheme that better approximates the semantic property of soundness of in-place update for a functional semantics. We begin from the observation that some data are used only in a read-only context, after which they may be safely re-used before being destroyed. Formalising the in-place update interpretation in a machine model semantics allows us to refine this observation, motivating three usage aspects apparent from the semantics that are used to annotate function argument types. The aspects are (1) used destructively, (2) used read-only but shared with the result, and (3) used read-only and not shared with the result. The main novelty is aspect (2), which allows a linear value to be safely read and even aliased with a result of a function without being consumed. This novelty makes our type system more expressive than previous systems for functional languages in the literature. The system remains simple and intuitive, but it enjoys a strong soundness property whose proof is non-trivial. Moreover, our analysis features principal types and feasible type reconstruction, as shown in M. Konečný (In TYPES 2002 workshop, Nijmegen, Proceedings, Springer-Verlag, 2003).
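The three aspects can be caricatured in a few lines (a sketch of ours; the constructor and function names are not from the paper). The point of aspect (2) is that an argument may outlive the call through the result, so the caller must not destructively reuse it even though the callee only read it:

```haskell
-- Hypothetical encoding of the three usage aspects from the abstract;
-- names are ours, not the paper's.
data Aspect
  = Destroyed   -- (1) used destructively by the callee
  | SharedRO    -- (2) read-only, but may be aliased with the result
  | UnsharedRO  -- (3) read-only, not aliased with the result
  deriving (Eq, Show)

-- A caller may safely destroy/reuse an argument after the call only
-- under aspect (3): a Destroyed argument is already consumed, and a
-- SharedRO argument is still live through the result, so destroying
-- it would corrupt the result.
mayReuseAfterCall :: Aspect -> Bool
mayReuseAfterCall UnsharedRO = True
mayReuseAfterCall _          = False
```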
The computability path ordering
This paper aims at carrying out termination proofs for simply typed
higher-order calculi automatically by using ordering comparisons. To this end,
we introduce the computability path ordering (CPO), a recursive relation on
terms obtained by lifting a precedence on function symbols. A first version,
core CPO, is essentially obtained from the higher-order recursive path ordering
(HORPO) by eliminating type checks from some recursive calls and by
incorporating the treatment of bound variables as in the computability
closure. The well-foundedness proof shows that core CPO captures the essence of
computability arguments à la Tait and Girard, thereby explaining its name.
We further show that no type check can be eliminated from its recursive
calls without losing well-foundedness, except for one for which we have found
no counterexample yet. Two extensions of core CPO are then introduced: the
first allows higher-order inductive types; the second allows a precedence in
which some function symbols are smaller than application and abstraction.
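For intuition about orderings obtained by lifting a precedence on function symbols, here is a sketch of the classical first-order recursive path ordering with lexicographic status, a much-simplified ancestor of CPO (the simplifications and all names are ours; CPO itself works on typed higher-order terms):

```haskell
-- First-order RPO with lexicographic status: a simplified ancestor of
-- CPO. This sketch is ours and omits everything higher-order.
data Term = V String | F String [Term] deriving (Eq, Show)

-- A strict precedence on function symbols.
type Prec = String -> String -> Bool

geq :: Prec -> Term -> Term -> Bool
geq p s t = s == t || gtRpo p s t

gtRpo :: Prec -> Term -> Term -> Bool
gtRpo _ (V _) _ = False
gtRpo p s@(F f ss) t =
  any (\si -> geq p si t) ss                -- subterm case
    || case t of
         V x -> x `elem` varsOf s           -- s is not a variable here
         F g ts
           | p f g     -> all (gtRpo p s) ts              -- precedence case
           | f == g    -> lexGt p ss ts && all (gtRpo p s) ts
           | otherwise -> False

-- Lexicographic extension of the ordering.
lexGt :: Prec -> [Term] -> [Term] -> Bool
lexGt p (a : as) (b : bs) | a == b    = lexGt p as bs
                          | otherwise = gtRpo p a b
lexGt _ (_ : _) [] = True
lexGt _ []      _  = False

varsOf :: Term -> [String]
varsOf (V x)    = [x]
varsOf (F _ ts) = concatMap varsOf ts
```

With the precedence `mul > add`, this ordering orients distributivity, `mul(x, add(y, z)) > add(mul(x, y), mul(x, z))`, a standard test case for path orderings.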
A Semantic Information Management Approach for Improving Bridge Maintenance based on Advanced Constraint Management
Bridge rehabilitation projects are important for transportation infrastructure. This research proposes a novel information management approach based on state-of-the-art deep learning models and ontologies. The approach can automatically extract, integrate, complete, and search for project knowledge buried in unstructured text documents. On the one hand, the approach facilitates the implementation of modern management approaches, i.e., advanced work packaging, to successfully deliver bridge rehabilitation projects; on the other hand, it improves information management practices in the construction industry.
Modular Termination for Second-Order Computation Rules and Application to Algebraic Effect Handlers
We present a new modular proof method of termination for second-order
computation, and report its implementation SOL. The proof method is useful for
proving termination of higher-order foundational calculi. To establish the
method, we use a variation of semantic labelling translation and Blanqui's
General Schema: a syntactic criterion of strong normalisation. As an
application, we apply this method to show termination of a variant of
call-by-push-value calculus with algebraic effects and effect handlers. We also
show that our tool SOL is effective in solving higher-order termination problems.
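A toy instance of the target application may help (our own sketch, not the paper's calculus): a free-monad-style program with one algebraic operation and a handler that eliminates it. The handler recurses through a bound continuation, which is exactly the second-order recursion pattern whose termination criteria like the General Schema certify:

```haskell
-- A toy algebraic effect: programs that may ask for an Int state.
data Prog a = Ret a | Get (Int -> Prog a)

-- A handler interpreting Get by supplying a fixed state. The recursive
-- call `handle s (k s)` goes through the bound continuation k: it is
-- not first-order structural recursion, and certifying that such
-- second-order recursion terminates is what criteria like the General
-- Schema are designed for.
handle :: Int -> Prog a -> a
handle _ (Ret a) = a
handle s (Get k) = handle s (k s)
```

For example, `handle 7 (Get (\n -> Ret (n + 1)))` supplies 7 to the single `Get` and returns 8.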