62 research outputs found
Erasure in dependently typed programming
It is important to reduce the cost of correctness in programming. Dependent types
and related techniques, such as type-driven programming, offer ways to do so.
Some parts of dependently typed programs constitute evidence of their type-correctness
and, once checked, are unnecessary for execution. These parts can easily
become asymptotically larger than the remaining runtime-useful computation, which
can cause linear-time algorithms to run in exponential time, or worse. It would be
unacceptable, and would contradict our goal of reducing the cost of correctness, to make
programs run slower merely by describing them more precisely.
Current systems cannot erase such computation satisfactorily. By modelling
erasure indirectly through type universes or irrelevance, they carry the limitations
of those mechanisms over to erasure. Some useless computation then cannot be erased,
and idiomatic programs remain asymptotically sub-optimal.
This dissertation explains why we need erasure and how it differs from related
concepts such as irrelevance, and it proposes two ways of erasing non-computational data.
One is an untyped flow-based useless variable elimination, adapted for dependently
typed languages, currently implemented in the Idris 1 compiler.
The other is the main contribution of the dissertation: a dependently typed core
calculus with erasure annotations, full dependent pattern matching, and an algorithm
that infers erasure annotations from unannotated (or partially annotated) programs.
I show that erasure in well-typed programs is sound in that it commutes with
single-step reduction. Assuming the Church-Rosser property of reduction, I show
that properties such as Subject Reduction hold, which extends the soundness result
to multi-step reduction. I also show that the presented erasure inference is sound
and complete with respect to the typing rules; that this approach can be extended
with various forms of erasure polymorphism; that it works well with monadic I/O
and foreign functions; and that it is effective in that it not only removes the runtime
overhead caused by dependent typing in the presented examples, but can also shorten
compilation times."This work was supported by the University of St Andrews (School of Computer
Science)." -- Acknowledgement
Targeted Static Analysis for OCaml C Stubs: eliminating gremlins from the code
Migration to OCaml 5 requires updating a lot of C bindings due to the removal
of naked pointer support. Writing OCaml user-defined primitives in C is a
necessity, but is unsafe and error-prone. It does not benefit from either
OCaml's or C's type checking, and existing C static analysers are not aware of
the OCaml GC safety rules and cannot infer them from existing macros alone. The
alternative is automatically generating C stubs, which requires correctly
managing value lifetimes. Having a static analyser for OCaml to C interfaces is
useful outside the OCaml 5 porting effort too.
After some motivating examples of real bugs in C bindings, a static analyser
is presented that finds these known classes of bugs. The tool works on the
OCaml abstract parse and typed trees, and generates a header file and a caller
model. Together with a simplified model of the OCaml runtime, this is used as
input to a static analysis framework, Goblint. An analysis is developed that
tracks dereferences of OCaml values and, together with the existing framework,
reports incorrect dereferences. An example shows how to extend the analysis
to cover more safety properties.
The tools and runtime models are generic and could be reused with other
static analysis tools.
Comment: submitted to the OCaml 2023 workshop; added references about OCaml/Rust interop and XenServer origin.
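As a rough illustration of the boundary the analyser targets (hypothetical names, not taken from the paper): an OCaml external declaration binds a primitive implemented in C, and the GC safety contract on the C side is checked by neither compiler.

    (* Hypothetical binding, for illustration only.  The C function named on
       the right-hand side must follow the OCaml runtime's GC rules, e.g.
       registering its arguments and locals as roots before any allocation;
       neither the OCaml compiler nor a plain C compiler checks that contract,
       which is the gap the analyser is built to cover. *)
    external pair_of_strings : string -> string -> string * string
      = "caml_pair_of_strings_stub"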
LIPIcs, Volume 261, ICALP 2023, Complete Volume
flap: A Deterministic Parser with Fused Lexing
Lexers and parsers are typically defined separately and connected by a token
stream. This separate definition is important for modularity and reduces the
potential for parsing ambiguity. However, materializing tokens as data
structures and case-switching on tokens comes with a cost. We show how to fuse
separately-defined lexers and parsers, drastically improving performance
without compromising modularity or increasing ambiguity. We propose a
deterministic variant of Greibach Normal Form that ensures deterministic
parsing with a single token of lookahead and makes fusion strikingly simple,
and prove that normalizing context-free expressions into the deterministic
normal form is semantics-preserving. Our staged parser combinator library,
flap, provides a standard interface, but generates specialized token-free code
that runs two to six times faster than ocamlyacc on a range of benchmarks.
Comment: PLDI 2023, with appendix.
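To make the cost mentioned above concrete (a toy sketch, not flap's actual interface): a conventional pipeline materialises tokens as data and the parser case-switches on them, which is exactly the intermediate structure that fusion eliminates.

    (* Toy OCaml sketch of the conventional two-stage pipeline: the lexer
       materialises a token list and the parser case-switches on it.  Fused
       lexing and parsing generates code that avoids this intermediate data. *)
    type token = INT of int | PLUS | EOF

    let lex (s : string) : token list =
      let rec go i acc =
        if i >= String.length s then List.rev (EOF :: acc)
        else match s.[i] with
          | '+' -> go (i + 1) (PLUS :: acc)
          | '0' .. '9' ->
            let j = ref (i + 1) in
            while !j < String.length s && s.[!j] >= '0' && s.[!j] <= '9' do incr j done;
            go !j (INT (int_of_string (String.sub s i (!j - i))) :: acc)
          | _ -> go (i + 1) acc
      in
      go 0 []

    (* expr ::= INT ('+' INT)* , evaluated as a running sum *)
    let rec sum_rest acc = function
      | PLUS :: INT n :: rest -> sum_rest (acc + n) rest
      | _ -> acc

    let eval = function INT n :: rest -> sum_rest n rest | _ -> 0

    let () = assert (eval (lex "1+2+3") = 6)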
Affine Disjunctive Invariant Generation with Farkas' Lemma
Invariant generation is the classical problem of automatically generating
assertions that over-approximate the set of reachable states of a
program. We consider the problem of generating affine invariants
over affine while loops (i.e., loops with affine loop guards, conditional
branches and assignment statements), and explore the automated generation of
disjunctive affine invariants. Disjunctive invariants are an important class of
invariants that capture disjunctive features in programs such as multiple
phases, transitions between different modes, etc., and are typically more
precise than conjunctive invariants over programs with these features. To
generate tight affine invariants, existing constraint-solving approaches have
investigated the application of Farkas' Lemma to conjunctive affine invariant
generation, but none of them considers disjunctive affine invariants.
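For reference, the standard way Farkas' Lemma enters this style of constraint solving (the general formulation, not a statement specific to this paper) can be written as follows.

    % Farkas' Lemma in its affine form, as commonly used for constraint-based
    % affine invariant generation: for a satisfiable system A x <= b,
    \[
      \bigl(\forall x.\ A x \le b \;\Rightarrow\; c^{\mathsf{T}} x \le d\bigr)
      \;\Longleftrightarrow\;
      \exists \lambda \ge 0.\;\; \lambda^{\mathsf{T}} A = c^{\mathsf{T}}
      \;\wedge\; \lambda^{\mathsf{T}} b \le d.
    \]
    % Treating c and d as unknown coefficients of a candidate affine invariant
    % turns the universally quantified inclusion into existential constraints
    % over lambda, c and d that a constraint solver can discharge.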
33èmes Journées Francophones des Langages Applicatifs
The 33rd Journées Francophones des Langages Applicatifs (JFLA) were held in Saint-Médard-d'Excideuil, more precisely at the Domaine d'Essendiéras (Périgord), from Tuesday 28 June 2022 to Friday 1 July 2022. The JFLA bring together designers, users and theoreticians; their ambition is to cover the fields of applicative languages, formal proof, program verification, and the mathematical objects that underpin these tools. These fields should be understood broadly: we wish to encourage bridges between the different themes.
- Functional and applicative languages: semantics, compilation, optimisation, typing, metrics, extensions with other paradigms.
- Proof assistants: implementation, new tactics, developments of technical or methodological interest.
- Logic, the Curry-Howard correspondence, realisability, program extraction, models.
- Specification, prototyping, formal developments of algorithms.
- Verification of programs or models, deductive methods, abstract interpretation, refinement.
- Industrial use of functional and applicative languages, or of methods stemming from formal proofs; tools for the web.
Papers submitted to the JFLA are reviewed by at least two people if accepted, and by three if rejected. The reviewers' comments are always benevolent and, most of the time, encouraging and constructive, even in case of rejection.
Tools and Algorithms for the Construction and Analysis of Systems
This open access book constitutes the proceedings of the 28th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2022, which was held during April 2-7, 2022, in Munich, Germany, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2022. The 46 full papers and 4 short papers presented in this volume were carefully reviewed and selected from 159 submissions. The proceedings also contain 16 tool papers of the affiliated competition SV-COMP and 1 paper consisting of the competition report. TACAS is a forum for researchers, developers, and users interested in rigorously based tools and algorithms for the construction and analysis of systems. The conference aims to bridge the gaps between different communities with this common interest and to support them in their quest to improve the utility, reliability, flexibility, and efficiency of tools and algorithms for building computer-controlled systems.
Proceedings of the 22nd Conference on Formal Methods in Computer-Aided Design – FMCAD 2022
The Conference on Formal Methods in Computer-Aided Design (FMCAD) is an annual conference on the theory and applications of formal methods in hardware and system verification. FMCAD provides a leading forum to researchers in academia and industry for presenting and discussing groundbreaking methods, technologies, theoretical results, and tools for reasoning formally about computing systems. FMCAD covers formal aspects of computer-aided system design including verification, specification, synthesis, and testing
Mechanized Reasoning About "How" Using Functional Programs and Embeddings
Embedding describes the process of encoding a program's syntax and/or semantics in another language, typically a theorem prover in the context of mechanized reasoning. Among different embedding styles, deep embeddings are generally preferred as they enable the most faithful modeling of the original language. However, deep embeddings are also the most complex, and working with them requires additional effort. In light of that, this dissertation aims to draw more attention to alternative styles, namely shallow and mixed embeddings, by studying their use in mechanized reasoning about programs' properties that are related to "how". More specifically, I present a simple shallow embedding for reasoning about computation costs of lazy programs, and a class of mixed embeddings that are useful for reasoning about properties of general computation patterns in effectful programs. I show the usefulness of these embedding styles with examples based on real-world applications.
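As a rough illustration of what a shallow embedding of cost can look like (an OCaml sketch under my own assumptions; the dissertation itself works inside a proof assistant): each result is paired with a step count, and binding adds the counts, so cost claims become ordinary statements about the embedded programs.

    (* Illustrative shallow embedding of cost: a value together with the number
       of abstract computation steps taken to produce it. *)
    type 'a cost = { value : 'a; steps : int }

    let return x = { value = x; steps = 0 }
    let tick x = { value = x; steps = 1 }            (* one observable step *)
    let ( >>= ) m f =
      let r = f m.value in
      { value = r.value; steps = m.steps + r.steps } (* costs accumulate *)

    (* Length of a list, charging one step per element. *)
    let rec length = function
      | [] -> return 0
      | _ :: xs -> length xs >>= fun n -> tick (n + 1)

    let () = assert ((length [1; 2; 3]).steps = 3)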
- …