    Guppy: Process-Oriented Programming on Embedded Devices

    Guppy is a new and experimental process-oriented programming language, taking much inspiration (and some of its code-base) from the existing occam-pi language. This paper reports on the language, compiler, and run-time system development that enables Guppy programs to run on desktop and embedded systems. A native code-generation approach is taken, using C as the intermediate language, with stack-space requirements determined at compile-time.
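
    As a hypothetical illustration of the code-generation strategy described above (all names are invented; this is not Guppy's actual output), a compiler emitting C with stack-space requirements fixed at compile time might give each process a statically sized stack rather than relying on the C runtime stack or heap allocation:

        /* Illustrative sketch only: a process compiled to C with a
           statically sized stack, suitable for small embedded targets. */
        #include <stdint.h>
        #include <stddef.h>

        #define P_BLINK_STACK_WORDS 32  /* bound from compile-time analysis */

        typedef struct process {
            uintptr_t stack[P_BLINK_STACK_WORDS]; /* fixed memory footprint */
            size_t    sp;                         /* saved stack index      */
            void    (*step)(struct process *);    /* scheduler resume point */
        } process;

        /* Emitted body of a 'blink' process; a cooperative scheduler
           would call step() repeatedly until the process terminates. */
        static void blink_step(process *self) {
            (void)self; /* ... generated process code would go here ... */
        }

        process p_blink = { .sp = 0, .step = blink_step };

    Because every process's footprint is known statically, total memory use can be checked before deployment, which is what makes this style attractive on embedded devices.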

    A Swiss Pocket Knife for Computability

    This research is about operational- and complexity-oriented aspects of the classical foundations of computability theory. The approach is to re-examine some classical theorems and constructions, but with new criteria for success that are natural from a programming language perspective. Three cornerstones of computability theory are the S-m-n theorem, Turing's "universal machine", and Kleene's second recursion theorem. In today's programming language parlance these are, respectively, partial evaluation, self-interpretation, and reflection. In retrospect it is fascinating that Kleene's 1938 proof is constructive, and in essence builds a self-reproducing program. Computability theory originated in the 1930s, long before the invention of computers and programs. Its emphasis was on delimiting the boundaries of computability. Some milestones include 1936 (Turing), 1938 (Kleene), 1967 (isomorphism of programming languages), 1985 (partial evaluation), 1989 (theory implementation), 1993 (efficient self-interpretation) and 2006 (term register machines). The "Swiss pocket knife" of the title is a programming language that allows efficient computer implementation of all three computability cornerstones, emphasising the third: Kleene's second recursion theorem. We describe experiments with a tree-based computational model aiming for both fast program generation and fast execution of the generated programs.
    Comment: In Proceedings Festschrift for Dave Schmidt, arXiv:1309.455
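
    The constructive heart of Kleene's second recursion theorem is a program with access to its own text; its classical proof in effect builds a quine. A minimal sketch in C (an illustration of the idea only, not the paper's tree-based model; it is comment-free because a quine must reproduce its source byte for byte):

        #include <stdio.h>
        char*s="#include <stdio.h>%cchar*s=%c%s%c;%cint main(){printf(s,10,34,s,34,10);return 0;}%c";
        int main(){printf(s,10,34,s,34,10);return 0;}

    Here 10 and 34 are the ASCII codes for newline and the double quote, so printing the string through itself reconstructs the entire source, quotes included.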

    From MinX to MinC: Semantics-Driven Decompilation of Recursive Datatypes

    Reconstructing the meaning of a program from its binary executable is known as reverse engineering; it has a wide range of applications in software security, exposing piracy, legacy systems, etc. Since reversing is ultimately a search for meaning, there is much interest in inferring a type (a meaning) for the elements of a binary in a consistent way. Unfortunately, existing approaches do not guarantee any semantic relevance for their reconstructed types. This paper presents a new and semantically-founded approach that provides strong guarantees for the reconstructed types. Key to our approach is the derivation of a witness program in a high-level language alongside the reconstructed types. This witness has the same semantics as the binary, is type-correct by construction, and induces a (justifiable) type assignment on the binary. Moreover, the approach effectively yields a type-directed decompiler. We formalise and implement the approach for reversing MinX, an abstraction of x86, to MinC, a type-safe dialect of C with recursive datatypes. Our evaluation compiles a range of textbook C algorithms to MinX and then recovers the original structures.
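
    To make the idea concrete, here is a hypothetical example of what such a witness might look like, written in plain C standing in for MinC (the names and offsets are invented, not taken from the paper). A binary loop that repeatedly loads a word at offset 0 and a pointer at offset 4 admits a witness that is type-correct by construction and induces a recursive type on the binary:

        /* Recovered recursive datatype: the two loads at offsets 0 and 4
           become the two fields of a linked-list node. */
        struct node {
            int          value;  /* 4-byte load at offset 0  */
            struct node *next;   /* pointer load at offset 4 */
        };

        /* Witness with the same semantics as the binary loop. */
        int sum(struct node *p) {
            int acc = 0;
            while (p != 0) {     /* the binary's null test      */
                acc += p->value; /* load at offset 0            */
                p = p->next;     /* pointer-chasing load at 4   */
            }
            return acc;
        }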

    Complexity Hierarchies Beyond Elementary

    We introduce a hierarchy of fast-growing complexity classes and show its suitability for completeness statements of many non-elementary problems. This hierarchy allows the classification of many decision problems with a non-elementary complexity, which occur naturally in logic, combinatorics, formal languages, verification, etc., with complexities ranging from simple towers of exponentials to Ackermannian and beyond.
    Comment: Version 3 is the published version in TOCT 8(1:3), 2016. I will keep updating the catalogue of problems from Section 6 in future revision
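
    For a sense of scale, the growth rates the hierarchy is built from can be written down directly (an illustrative sketch; the values overflow machine words almost immediately, which is rather the point):

        #include <stdio.h>

        /* tower(n) = 2^2^...^2 with n twos: a simple tower of exponentials.
           Already at n = 5 the result exceeds any 64-bit word. */
        unsigned long tower(unsigned n) {
            unsigned long r = 1;
            while (n-- > 0)
                r = 1UL << r;   /* r = 2^r */
            return r;
        }

        /* The Ackermann function eventually dominates every such tower. */
        unsigned long ackermann(unsigned long m, unsigned long n) {
            if (m == 0) return n + 1;
            if (n == 0) return ackermann(m - 1, 1);
            return ackermann(m - 1, ackermann(m, n - 1));
        }

        int main(void) {
            printf("tower(4) = %lu\n", tower(4));               /* 65536 */
            printf("ackermann(3, 3) = %lu\n", ackermann(3, 3)); /* 61    */
            return 0;
        }

    These are the bounding functions behind the classes the abstract mentions, from towers of exponentials up to Ackermannian complexity.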

    A study of systems implementation languages for the POCCNET system

    This paper presents the results of a study of systems implementation languages for the Payload Operations Control Center Network (POCCNET). Criteria are developed for evaluating the languages, and fifteen existing languages are evaluated on the basis of these criteria.

    Survey on Combinatorial Register Allocation and Instruction Scheduling

    Register allocation (mapping variables to processor registers or memory) and instruction scheduling (reordering instructions to increase instruction-level parallelism) are essential tasks for generating efficient assembly code in a compiler. In the last three decades, combinatorial optimization has emerged as an alternative to traditional, heuristic algorithms for these two tasks. Combinatorial optimization approaches can deliver optimal solutions according to a model, can precisely capture trade-offs between conflicting decisions, and are more flexible, at the expense of increased compilation time. This paper provides an exhaustive literature review and a classification of combinatorial optimization approaches to register allocation and instruction scheduling, with a focus on the techniques most commonly applied in this context: integer programming, constraint programming, partitioned Boolean quadratic programming, and enumeration. Researchers in compilers and combinatorial optimization can benefit from identifying developments, trends, and challenges in the area; compiler practitioners may discern opportunities and grasp the potential benefit of applying combinatorial optimization.
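
    As a small illustration of the enumeration technique mentioned above (an invented example, not taken from the survey), register allocation over an interference graph can be solved exactly by backtracking over all assignments of variables to registers:

        #include <stdio.h>

        #define NVARS 4
        #define NREGS 2

        /* interf[i][j] = 1 if variables i and j are live simultaneously
           and therefore must not share a register. */
        static const int interf[NVARS][NVARS] = {
            {0, 1, 1, 0},
            {1, 0, 1, 0},
            {1, 1, 0, 1},
            {0, 0, 1, 0},
        };

        /* Check that the first n assigned variables respect interference. */
        static int ok(const int *reg, int n) {
            for (int i = 0; i < n; i++)
                for (int j = 0; j < i; j++)
                    if (interf[i][j] && reg[i] == reg[j])
                        return 0;
            return 1;
        }

        /* Enumerate register choices for variables v..NVARS-1, pruning
           partial assignments that already violate a constraint. */
        static int solve(int *reg, int v) {
            if (v == NVARS)
                return 1;
            for (int r = 0; r < NREGS; r++) {
                reg[v] = r;
                if (ok(reg, v + 1) && solve(reg, v + 1))
                    return 1;
            }
            return 0;
        }

        int main(void) {
            int reg[NVARS];
            if (solve(reg, 0))
                for (int v = 0; v < NVARS; v++)
                    printf("v%d -> r%d\n", v, reg[v]);
            else
                printf("no %d-register assignment: a spill is required\n", NREGS);
            return 0;
        }

    With the triangle among v0, v1, and v2, two registers do not suffice, so the exhaustive search proves that a spill is unavoidable; real combinatorial allocators extend such models with spill costs, coalescing, and scheduling decisions.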