
    The Definition of Programming Languages

    Get PDF
    There is no need to argue in favor of concise, clear, complete, and consistent descriptions of programming languages, nor to recite the cost in time, energy, money, and effectiveness which is incurred when a description falls short of these standards. Reliable, high-quality computer programming is impossible without a clear and precise understanding of the language in which the programs are written, and this is true quite independently of the merits of a language as a language. In this study we tried to discover the current state of the methodology of definition of programming languages. We sought to separate the question of definition from that of design, as far as this can be done. Our goal was to get at ways of specifying programming languages rather than ways of inventing them or of implementing them on machines. We realize, however, that these questions are not completely separable. Many programming languages are poorly defined. Nowadays, and indeed ever since the influential and pioneering example of the formal definition of ALGOL-60, one usually finds that the syntax of a programming language is defined very clearly and compactly, but that its semantics is explained badly.
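
    To make the syntax/semantics asymmetry concrete, here is a minimal sketch (ours, not from the study): the grammar of a toy expression language can be stated compactly in BNF, in the style popularized by the ALGOL-60 report, while its semantics must still be spelled out separately, here as a small evaluator over the abstract syntax.

```python
# A toy illustration of "syntax is easy to state, semantics needs its own definition".
# Grammar (BNF, ALGOL-60 style), stated compactly:
#
#   <expr> ::= <expr> "+" <term> | <term>
#   <term> ::= <number> | "(" <expr> ")"
#
# The semantics, however, must be given explicitly, e.g. as an evaluator
# over the abstract syntax:

from dataclasses import dataclass

@dataclass
class Num:
    value: int

@dataclass
class Add:
    left: object
    right: object

def evaluate(expr) -> int:
    """Denotation of an expression: map a syntax tree to an integer."""
    if isinstance(expr, Num):
        return expr.value
    if isinstance(expr, Add):
        return evaluate(expr.left) + evaluate(expr.right)
    raise TypeError(f"unknown syntactic form: {expr!r}")

# (1 + 2) + 3  ==>  6
print(evaluate(Add(Add(Num(1), Num(2)), Num(3))))
```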

    Characterizing traits of coordination

    Full text link
    How can one recognize coordination languages and technologies? As this report shows, the common approach that contrasts coordination with computation is intellectually unsound: depending on the selected understanding of the word "computation", it either captures too many or too few programming languages. Instead, we argue for objective criteria that can be used to evaluate how well programming technologies offer coordination services. Of the various criteria commonly used in this community, we are able to isolate three that are strongly characterizing: black-box componentization, which we had identified previously, but also interface extensibility and customizability of run-time optimization goals. These criteria are well matched by Intel's Concurrent Collections and AstraKahn, and also by OpenCL, POSIX and VMware ESX. Comment: 11 pages, 3 tables.
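
    A hypothetical sketch of how the three criteria might look in code (this is not an API from the report, and all names are invented): two black-box components wired only through an explicit queue interface, with the run-time optimization goal exposed as a coordinator parameter.

```python
# Hypothetical illustration of the three criteria named above: components are
# black boxes wired through explicit queues, the coordination interface is
# separate from component internals, and the scheduling goal is a run-time
# parameter rather than a fixed policy.

from queue import Queue
from threading import Thread

def producer(out_q: Queue) -> None:
    # Black box: only the queue interface is visible to the coordinator.
    for i in range(5):
        out_q.put(i)
    out_q.put(None)  # end-of-stream marker

def consumer(in_q: Queue, results: list) -> None:
    while (item := in_q.get()) is not None:
        results.append(item * item)

def coordinate(goal: str = "throughput") -> list:
    """Wire the components; 'goal' stands in for a customizable optimization target."""
    q: Queue = Queue(maxsize=1 if goal == "memory" else 0)  # bounded queue trades throughput for memory
    results: list = []
    threads = [Thread(target=producer, args=(q,)),
               Thread(target=consumer, args=(q, results))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

print(coordinate(goal="memory"))  # [0, 1, 4, 9, 16]
```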

    Issues about the Adoption of Formal Methods for Dependable Composition of Web Services

    Full text link
    Web Services provide interoperable mechanisms for describing, locating and invoking services over the Internet; composition further makes it possible to build complex services out of simpler ones for complex B2B applications. While current studies on these topics are mostly focused, from the technical viewpoint, on standards and protocols, this paper investigates the adoption of formal methods, especially for composition. We logically classify and analyze three different (but interconnected) kinds of important issues towards this goal, namely foundations, verification and extensions. The aim of this work is to identify the proper questions on the adoption of formal methods for dependable composition of Web Services, not necessarily to find the optimal answers. Nevertheless, we still try to propose some tentative answers based on our proposal for a composition calculus, which we hope can stimulate a proper discussion.
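
    As a loose illustration only, and not the composition calculus proposed in the paper, the sketch below shows the basic idea of building complex services from simpler ones with sequential and parallel composition operators; all service names are invented.

```python
# Illustrative sketch of a toy "composition calculus": services are functions on
# request dictionaries, composed sequentially or in parallel.

from concurrent.futures import ThreadPoolExecutor
from typing import Callable

Service = Callable[[dict], dict]

def seq(first: Service, second: Service) -> Service:
    """Sequential composition: the output of 'first' feeds 'second'."""
    return lambda request: second(first(request))

def par(left: Service, right: Service) -> Service:
    """Parallel composition: both services see the same request; results are merged."""
    def composed(request: dict) -> dict:
        with ThreadPoolExecutor(max_workers=2) as pool:
            a = pool.submit(left, dict(request))
            b = pool.submit(right, dict(request))
            return {**a.result(), **b.result()}
    return composed

# Two trivial services composed into a more complex one.
quote   = lambda req: {**req, "price": 100}
stock   = lambda req: {**req, "in_stock": True}
confirm = lambda req: {**req, "confirmed": req.get("in_stock", False)}

order = seq(par(quote, stock), confirm)
print(order({"item": "book"}))
```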

    A conceptual model for megaprogramming

    Get PDF
    Megaprogramming is component-based software engineering and life-cycle management. Megaprogramming and its relationship to other research initiatives (common prototyping system/common prototyping language, domain-specific software architectures, and software understanding) are analyzed. The desirable attributes of megaprogramming software components are identified, and a software development model and resulting prototype megaprogramming system (a library interconnection language extended by annotated Ada) are described.
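
    The following is a hypothetical illustration of the library interconnection idea, not the prototype system described above: each component declares the interfaces it provides and requires, and a simple check reports any requirement left unsatisfied by the interconnection.

```python
# Hypothetical sketch of a library interconnection description: components list
# provided and required interfaces, and a checker finds unsatisfied requirements.

from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    provides: set = field(default_factory=set)
    requires: set = field(default_factory=set)

def check_interconnection(components: list) -> set:
    """Return the set of required interfaces not provided by any component."""
    provided = set().union(*(c.provides for c in components))
    required = set().union(*(c.requires for c in components))
    return required - provided

system = [
    Component("Tracker", provides={"track_report"}, requires={"sensor_feed"}),
    Component("Sensors", provides={"sensor_feed"}),
    Component("Display", requires={"track_report", "map_tiles"}),
]
print(check_interconnection(system))  # {'map_tiles'} is unsatisfied
```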

    Formal change impact analyses for emulated control software

    Get PDF
    Processor emulators are a software tool for allowing legacy computer programs to be executed on a modern processor. In the past, emulators have been used in trivial applications such as the maintenance of video games. Now, however, processor emulation is being applied to safety-critical control systems, including military avionics. These applications demand the utmost guarantees of correctness, but no verification techniques exist for proving that an emulated system preserves the original system's functional and timing properties. Here we show how this can be done by combining concepts previously used for reasoning about real-time program compilation with an understanding of the new and old software architectures. In particular, we show how both the old and new systems can be given a common semantics, allowing their behaviours to be compared directly.
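
    An illustrative sketch of the common-semantics idea, under our own simplifying assumptions rather than the paper's formalism: both the original and the emulated instruction are modelled as state transformers that also report a cycle count, so functional equivalence and a timing budget can be checked in the same framework.

```python
# Illustrative only: give the original and emulated instruction sets a common
# semantics as state-transformer functions, then check that the emulated version
# preserves the functional result and respects a timing budget.

from typing import Callable, Dict, Tuple

State = Dict[str, int]
# Common semantic domain: a step maps a state to (new state, cycles consumed).
Step = Callable[[State], Tuple[State, int]]

def original_inc(state: State) -> Tuple[State, int]:
    """Legacy INC instruction: one cycle on the old processor."""
    return {**state, "r0": state["r0"] + 1}, 1

def emulated_inc(state: State) -> Tuple[State, int]:
    """Emulated INC: several host instructions, so a larger cycle count."""
    return {**state, "r0": state["r0"] + 1}, 4

def refines(old: Step, new: Step, state: State, cycle_budget: int) -> bool:
    """Acceptable if the functional effect is identical and timing fits the budget."""
    old_state, _ = old(state)
    new_state, new_cycles = new(state)
    return old_state == new_state and new_cycles <= cycle_budget

print(refines(original_inc, emulated_inc, {"r0": 41}, cycle_budget=5))  # True
```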

    Ada and the rapid development lifecycle

    Get PDF
    JPL is under contract, through NASA, with the US Army to develop a state-of-the-art Command Center System for the US European Command (USEUCOM). The Command Center System will receive, process, and integrate force status information from various sources and provide this integrated information to staff officers and decision makers in a format designed to enhance user comprehension and utility. The system is based on distributed workstation-class microcomputers, VAX- and SUN-based data servers, and interfaces to existing military mainframe systems and communication networks. JPL is developing the Command Center System using an incremental delivery methodology called the Rapid Development Methodology, with adherence to government and industry standards including the UNIX operating system, X Windows, OSF/Motif, and the Ada programming language. Through a combination of software engineering techniques specific to the Ada programming language and the Rapid Development Approach, JPL was able to deliver capability to the military user incrementally, with quality comparable to, and economics improved over, projects developed under more traditional software-intensive system implementation methodologies.

    FoCaLiZe: Inside an F-IDE

    Full text link
    For years, Integrated Development Environments have demonstrated their usefulness in easing the development of software. High-level security or safety systems require proofs of compliance with standards, based on analyses such as code review and, increasingly nowadays, formal proofs of conformance to specifications. This implies mixing computational and logical aspects throughout the development, which naturally raises the need for a notion of Formal IDE. This paper examines the FoCaLiZe environment and explores the implementation issues raised by the decision to provide a single language to express specification properties, source code and machine-checked proofs while allowing incremental development and code reusability. Such features create strong dependencies between functions, properties and proofs, and impose a particular compilation scheme, which is described here. The compilation results are runnable OCaml code and a checkable Coq term. All these points are illustrated through a running example. Comment: In Proceedings F-IDE 2014, arXiv:1404.578.
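
    By way of loose analogy only, and not FoCaLiZe syntax, the sketch below bundles executable code with a stated property in a single development unit; in FoCaLiZe the compiler would separate these into runnable OCaml code and a Coq proof obligation, whereas here the obligation is merely sample-tested.

```python
# Loose analogy of the compilation scheme sketched above: one development unit
# carries the executable code, the stated property, and a stand-in for the
# proof obligation that a real F-IDE would hand to a proof checker.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Unit:
    name: str
    code: Callable[[int], int]             # would become runnable OCaml code
    property_text: str                      # specification, stated alongside the code
    property_check: Callable[[int], bool]   # stands in for the Coq proof obligation

double = Unit(
    name="double",
    code=lambda n: n + n,
    property_text="forall n, double(n) is even",
    property_check=lambda n: (n + n) % 2 == 0,
)

# "Extract" the executable part and sample-test the obligation (a machine-checked
# proof would discharge it once and for all instead).
print(double.code(21))                                    # 42
print(all(double.property_check(n) for n in range(100)))  # True
```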