
    A survey of program transformation with special reference to unfold/fold style program development

    No full text
    This paper surveys current and past work on *program transformation* for the purpose of optimization. We first discuss some general methodological frameworks for program modification, such as *analogy*, *explanation-based learning*, *partial evaluation*, *proof-theoretic optimization*, and the *unfold/fold* technique. These frameworks are not mutually exclusive, and the last of them, unfold/fold, is certainly the most widely used technique, in various guises, for program transformation. We therefore often compare both the relative merits of systems that employ the technique in some form, and the unfold/fold systems against those that employ alternative techniques. We also include (and compare with unfold/fold) a brief survey of recent work on the use of *formal methods* for program transformation.
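
    The survey describes these frameworks abstractly; as a concrete flavour of the unfold/fold style (an illustrative sketch of my own, not an example taken from the paper), the following Haskell fragment derives a single-pass program from a two-pass one via a "eureka" definition, unfolding, simplification, and folding back.

```haskell
-- Minimal unfold/fold (Burstall-Darlington style) sketch; the example program
-- and names are illustrative assumptions, not drawn from the survey.

-- Original program: two traversals of the list.
average0 :: [Double] -> Double
average0 xs = sum xs / fromIntegral (length xs)

-- Eureka step: define a tuple of the two quantities we need.
sumLen :: [Double] -> (Double, Int)
sumLen xs = (sum xs, length xs)

-- Unfolding sum and length, simplifying, and folding back against the
-- definition of sumLen yields a single-pass recursion:
sumLen' :: [Double] -> (Double, Int)
sumLen' []       = (0, 0)
sumLen' (x : xs) = let (s, n) = sumLen' xs in (x + s, n + 1)

-- The transformed program traverses the list only once.
average1 :: [Double] -> Double
average1 xs = let (s, n) = sumLen' xs in s / fromIntegral n

main :: IO ()
main = print (average0 [1, 2, 3, 4], average1 [1, 2, 3, 4])  -- (2.5, 2.5)
```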

    Functional Big-step Semantics

    Get PDF
    When doing an interactive proof about a piece of software, it is important that the underlying programming language’s semantics does not make the proof unnecessarily difficult or unwieldy. Both small-step and big-step semantics are commonly used, and the latter is typically given by an inductively defined relation. In this paper, we consider an alternative: using a recursive function akin to an interpreter for the language. The advantages include a better induction theorem, less duplication, accessibility to ordinary functional programmers, and the ease of doing symbolic simulation in proofs via rewriting. We believe that this style of semantics is well suited for compiler verification, including proofs of divergence preservation. We do not claim the invention of this style of semantics: our contribution here is to clarify its value, and to explain how it supports several language features that might appear to require a relational or small-step approach. We illustrate the technique on a simple imperative language with C-like for-loops and a break statement, and compare it to a variety of other approaches. We also provide ML and lambda-calculus-based examples to illustrate its generality.
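
    As a rough sketch of the idea (a minimal example of my own, not the paper's definitions), the Haskell evaluator below gives a functional big-step semantics for a tiny imperative language with a loop and break: the semantics is a total recursive function, and a clock ("fuel") argument that decreases on loop iterations turns would-be divergence into an observable timeout result.

```haskell
-- Sketch of a functional big-step semantics with a clock; the language and
-- names are assumptions of mine, not the paper's exact definitions.

import qualified Data.Map as M

type Env = M.Map String Int

data Exp  = Lit Int | Var String | Add Exp Exp | Lt Exp Exp
data Stmt = Skip | Assign String Exp | Seq Stmt Stmt
          | While Exp Stmt | Break

-- Possible outcomes: normal termination, a pending break, or out of fuel.
data Res = RNormal Env | RBreak Env | RTimeout

evalExp :: Env -> Exp -> Int
evalExp _   (Lit n)   = n
evalExp env (Var x)   = M.findWithDefault 0 x env
evalExp env (Add a b) = evalExp env a + evalExp env b
evalExp env (Lt a b)  = if evalExp env a < evalExp env b then 1 else 0

evalStmt :: Int -> Env -> Stmt -> Res
evalStmt _    env Skip         = RNormal env
evalStmt _    env (Assign x e) = RNormal (M.insert x (evalExp env e) env)
evalStmt _    env Break        = RBreak env
evalStmt fuel env (Seq s1 s2)  =
  case evalStmt fuel env s1 of
    RNormal env' -> evalStmt fuel env' s2
    other        -> other                         -- break / timeout propagates
evalStmt fuel env w@(While c body)
  | evalExp env c == 0 = RNormal env
  | fuel <= 0          = RTimeout                 -- clock ran out
  | otherwise          =
      case evalStmt fuel env body of
        RNormal env' -> evalStmt (fuel - 1) env' w  -- clock ticks per iteration
        RBreak env'  -> RNormal env'                -- break exits the loop
        RTimeout     -> RTimeout

-- Example: sum the integers 0..9 with a while loop.
example :: Stmt
example = Seq (Assign "i" (Lit 0)) (Seq (Assign "acc" (Lit 0))
          (While (Lt (Var "i") (Lit 10))
                 (Seq (Assign "acc" (Add (Var "acc") (Var "i")))
                      (Assign "i" (Add (Var "i") (Lit 1))))))

main :: IO ()
main = case evalStmt 1000 M.empty example of
         RNormal env -> print (M.lookup "acc" env)   -- Just 45
         _           -> putStrLn "break or timeout"
```

    In a proof assistant one would then quantify over the clock: a program diverges exactly when the evaluator times out for every clock value, which is what makes proofs about divergence tractable in this style.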

    Applying Formal Methods to Networking: Theory, Techniques and Applications

    Full text link
    Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, so that every new need required a new protocol built from scratch. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture where expediency and pragmatism are favored over formal correctness. Fortunately, recent work on clean-slate Internet design---especially the software-defined networking (SDN) paradigm---offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and present a survey of its applications to networking.
    Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials
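
    As a toy illustration of the kind of property such methods target (a drastic simplification of my own, not an example from the survey), the Haskell sketch below models per-switch forwarding for a single packet class and checks that the packet is delivered, flagging forwarding loops and black holes along the way.

```haskell
-- Toy reachability check over static forwarding state; the topology and
-- rules are invented for illustration.

import qualified Data.Map as M
import qualified Data.Set as S

type Node = String

-- Forwarding behaviour for one packet class: each node forwards to at most
-- one next hop.
forwarding :: M.Map Node Node
forwarding = M.fromList [("h1", "s1"), ("s1", "s2"), ("s2", "h2")]

data Outcome = Delivered | BlackHole Node | Loop Node deriving Show

-- Follow the forwarding function from a source towards a destination,
-- detecting loops (revisited node) and black holes (no rule).
deliver :: Node -> Node -> Outcome
deliver dst = go S.empty
  where
    go seen n
      | n == dst          = Delivered
      | n `S.member` seen = Loop n
      | otherwise         = case M.lookup n forwarding of
                              Nothing -> BlackHole n
                              Just n' -> go (S.insert n seen) n'

main :: IO ()
main = print (deliver "h2" "h1")   -- expected: Delivered
```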

    Fair Testing

    Get PDF
    In this paper we present a solution to the long-standing problem of characterising the coarsest liveness-preserving pre-congruence with respect to a full (TCSP-inspired) process algebra. In fact, we present two distinct characterisations, which give rise to the same relation: an operational one based on a De Nicola-Hennessy-like testing modality which we call should-testing, and a denotational one based on a refined notion of failures. One of the distinguishing characteristics of the should-testing pre-congruence is that it abstracts from divergences in the same way as Milner's observation congruence, and as a consequence is strictly coarser than observation congruence. In other words, should-testing has a built-in fairness assumption. This is in itself a property long sought after; it is in notable contrast to the well-known must-testing of De Nicola and Hennessy (denotationally characterised by a combination of failures and divergences), which treats divergence as catastrophic and is hence incompatible with observation congruence. Due to these characteristics, should-testing supports modular reasoning and allows the proof techniques of observation congruence to be used, while also supporting additional laws and techniques. Moreover, we show decidability of should-testing (on the basis of the denotational characterisation). Finally, we demonstrate its advantages by applying it to a number of examples, including a scheduling problem, a version of the Alternating Bit protocol, and a fair lossy communication channel.
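
    To convey only the fairness intuition (a finite-state toy of my own, not the paper's formal definitions), the Haskell sketch below checks the should-testing-style condition that success remains reachable from every reachable state of a process already composed with its test. A process with an internal loop that always keeps success reachable passes this check, whereas must-testing rejects it because of the possible divergence.

```haskell
-- Finite-state sketch of the should-testing intuition; states and the step
-- relation are invented, and this is not the formal definition from the paper.

import qualified Data.Set as S

type State = Int

-- A system with an internal loop at state 0 that always keeps the option of
-- moving to the successful state 1.
step :: State -> [State]
step 0 = [0, 1]   -- may loop in 0, may move to success
step _ = []

success :: State -> Bool
success = (== 1)

reachableFrom :: State -> S.Set State
reachableFrom s0 = go S.empty [s0]
  where
    go seen []       = seen
    go seen (s : ss)
      | s `S.member` seen = go seen ss
      | otherwise         = go (S.insert s seen) (step s ++ ss)

-- Should-testing intuition: from every reachable state, success is still
-- reachable (the fairness assumption: the loop cannot withhold success forever).
shouldPass :: State -> Bool
shouldPass s0 = all (any success . S.toList . reachableFrom)
                    (S.toList (reachableFrom s0))

-- Must-testing intuition: every maximal run hits success; the infinite run
-- 0 -> 0 -> 0 -> ... never does, so must-testing rejects this system even
-- though shouldPass 0 == True.
main :: IO ()
main = print (shouldPass 0)   -- True
```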

    Emerging trends proceedings of the 17th International Conference on Theorem Proving in Higher Order Logics: TPHOLs 2004

    Get PDF
    This volume constitutes the proceedings of the Emerging Trends track of the 17th International Conference on Theorem Proving in Higher Order Logics (TPHOLs 2004), held September 14-17, 2004 in Park City, Utah, USA. The TPHOLs conference covers all aspects of theorem proving in higher order logics as well as related topics in theorem proving and verification. There were 42 papers submitted to TPHOLs 2004 in the full research category, each of which was refereed by at least 3 reviewers selected by the program committee. Of these submissions, 21 were accepted for presentation at the conference and publication in volume 3223 of Springer's Lecture Notes in Computer Science series. In keeping with longstanding tradition, TPHOLs 2004 also offered a venue for the presentation of work in progress, where researchers invite discussion by means of a brief introductory talk and then discuss their work at a poster session. The work-in-progress papers are collected in this volume, which is published as a 2004 technical report of the School of Computing at the University of Utah.