
    The Task Distribution Preprocessor (TDP)

    This document describes a software processor used to generate portable and efficient multiprocess programs that run on UNIX operating systems. The processor is designed to be a convenient method for converting single-process C programs into distributed multiprocess C programs. Another goal is to have the processor serve as a back end or platform for multitasking languages and code generators. Efficiency is targeted toward multiprocessor systems that provide facilities for sharing physical memory between processes running on separate CPUs. Portability is achieved with the use of the highly portable UNIX operating system and its companion C language. C is used as both the input and output language for the processor. Use of C as an object language gives portability to the generated multitask program. Using C as an input language simplifies the interface with multitasking languages and code generators, and minimizes the changes necessary when converting single-task C programs to multitask programs. The initial implementation of the Task Distribution Preprocessor will generate C code that can be compiled and run on any UNIX system that provides message passing facilities.
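    The abstract shows no generated code, so the following is only a minimal sketch, assuming System V message queues as the "message passing facilities" the text mentions. It illustrates the kind of target idiom such a preprocessor could emit: one C program split into two cooperating tasks with fork(), exchanging data through a queue. The names here (task_msg, the queue setup) are invented for illustration, not actual TDP output.

        /* Minimal sketch, assuming System V message queues: one program,
           split into two cooperating tasks with fork(), exchanging a
           message through a queue. Illustrative only; not TDP output. */
        #include <stdio.h>
        #include <sys/types.h>
        #include <sys/ipc.h>
        #include <sys/msg.h>
        #include <sys/wait.h>
        #include <unistd.h>

        struct task_msg {
            long mtype;      /* message-type field required by msgsnd/msgrcv */
            int  payload;    /* data passed between the two tasks */
        };

        int main(void)
        {
            /* Create a private queue; it stays shared across fork(). */
            int qid = msgget(IPC_PRIVATE, IPC_CREAT | 0600);
            if (qid < 0) { perror("msgget"); return 1; }

            if (fork() == 0) {                       /* child: producer task */
                struct task_msg m = { 1, 42 };
                msgsnd(qid, &m, sizeof m.payload, 0);
                return 0;
            }

            struct task_msg m;                       /* parent: consumer task */
            msgrcv(qid, &m, sizeof m.payload, 1, 0);
            printf("received %d\n", m.payload);

            wait(NULL);                              /* reap the child task */
            msgctl(qid, IPC_RMID, NULL);             /* remove the queue */
            return 0;
        }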

    VXA: A Virtual Architecture for Durable Compressed Archives

    Data compression algorithms change frequently, and obsolete decoders do not always run on new hardware and operating systems, threatening the long-term usability of content archived using those algorithms. Re-encoding content into new formats is cumbersome, and highly undesirable when lossy compression is involved. Processor architectures, in contrast, have remained comparatively stable over recent decades. VXA, an archival storage system designed around this observation, archives executable decoders along with the encoded content it stores. VXA decoders run in a specialized virtual machine that implements an OS-independent execution environment based on the standard x86 architecture. The VXA virtual machine strictly limits access to host system services, making decoders safe to run even if an archive contains malicious code. VXA's adoption of a "native" processor architecture instead of type-safe language technology allows reuse of existing "hand-optimized" decoders in C and assembly language, and permits decoders access to performance-enhancing architecture features such as vector processing instructions. The performance cost of VXA's virtualization is typically less than 15% compared with the same decoders running natively. The storage cost of archived decoders, typically 30-130KB each, can be amortized across many archived files sharing the same compression method.
    Comment: 14 pages, 7 figures, 2 tables
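    The abstract does not describe VXA's actual archive format, so the C sketch below is purely hypothetical; it only illustrates the amortization point at the end of the abstract: entries reference a shared decoder image by index instead of embedding a decoder per file, so one 30-130KB decoder serves every archived file using the same compression method. The names vxa_entry and vxa_decoder are invented for this illustration.

        /* Hypothetical layout, invented for illustration; the abstract does
           not describe VXA's actual on-disk format. */
        #include <stdint.h>

        struct vxa_entry {
            char     name[256];      /* archived file name */
            uint64_t encoded_size;   /* bytes of compressed payload */
            uint64_t decoded_size;   /* bytes after decoding */
            uint32_t decoder_id;     /* index of a shared decoder image: one
                                        ~30-130KB decoder is amortized across
                                        all entries using the same method */
        };

        struct vxa_decoder {
            char     method[32];     /* e.g. "zlib" -- a label only */
            uint64_t image_size;     /* size of the archived x86 decoder */
            /* decoder image bytes follow; at extraction time they execute
               inside the sandboxed VM, never directly on the host OS */
        };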

    A study of System Interface Sets (SIS) for the host, target and integration environments of the Space Station Program (SSP)

    System interface sets (SIS) for large, complex, non-stop, distributed systems are examined. The SIS of the Space Station Program (SSP) was selected as the focus of this study because an appropriate virtual interface specification of the SIS is believed to have the most potential to free the project from four life-cycle tyrannies, each rooted in a dependence on either a proprietary or a particular instance of: operating systems, data management systems, communications systems, and instruction set architectures. The static perspectives of the common Ada programming support environment interface set (CAIS) and the portable common execution environment (PCEE) activities are discussed. Also, the dynamic perspective of the PCEE is addressed.
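    To make the "virtual interface specification" idea concrete, here is a minimal C sketch, not drawn from any SSP document: application code calls services only through a table of function pointers, so a proprietary or platform-specific binding (operating system, data management, communications, instruction set architecture) can be swapped without touching the application. The names sis_comm and publish, and the stub binding, are invented for this example.

        /* Hypothetical sketch of a virtual interface set; the names are
           invented and not taken from any SSP specification. */
        #include <stdio.h>
        #include <string.h>

        /* Application code sees only this table of operations. */
        typedef struct sis_comm {
            int (*open_channel)(const char *endpoint);
            int (*send)(int channel, const void *buf, size_t len);
        } sis_comm;

        /* The application depends on the interface, not on any platform. */
        static int publish(const sis_comm *comm, const char *msg)
        {
            int ch = comm->open_channel("telemetry");
            return comm->send(ch, msg, strlen(msg));
        }

        /* One possible binding; a real system supplies one per platform
           (POSIX sockets, a proprietary bus, ...) behind the same table. */
        static int stub_open(const char *endpoint) { (void)endpoint; return 0; }
        static int stub_send(int ch, const void *buf, size_t len)
        {
            (void)ch;
            return (int)fwrite(buf, 1, len, stdout);
        }

        int main(void)
        {
            const sis_comm binding = { stub_open, stub_send };
            publish(&binding, "hello\n");   /* swap the binding, not the app */
            return 0;
        }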

    Study of Tools Interoperability

    Interoperability of tools usually refers to a combination of methods and techniques that address the problem of making a collection of tools work together. In this study we survey different notions that are used in this context: interoperability, interaction and integration. We point out the relations between these notions and how they map to the interoperability problem. We narrow the problem area to tool development in academia. Tools developed in such an environment have a small basis for development, documentation and maintenance. We scrutinise some of the problems and potential solutions related to tool interoperability in such an environment. Moreover, we look at two tools developed in the Formal Methods and Tools group, and analyse the use of different integration techniques.
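    As a concrete instance of one integration technique such a study might cover, the sketch below composes two tools through a UNIX pipe and a plain-text exchange format, a classic loose-coupling approach; the command line and the "integrated:" labelling are illustrative only and not taken from the paper.

        /* Minimal sketch of pipe-based tool composition; the command and
           labelling are illustrative, not from the study. */
        #define _POSIX_C_SOURCE 200809L
        #include <stdio.h>

        int main(void)
        {
            /* Drive an external tool and consume its text output;
               "sort" stands in for any tool with a text interface. */
            FILE *tool = popen("printf 'b\\na\\n' | sort", "r");
            if (!tool) { perror("popen"); return 1; }

            char line[256];
            while (fgets(line, sizeof line, tool))
                printf("integrated: %s", line);

            return pclose(tool) == 0 ? 0 : 1;
        }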