
    Phobos: A front-end approach to extensible compilers (long version)

    This paper describes a practical approach for implementing certain types of domain-specific languages with extensible compilers. Given a compiler with one or more front-end languages, we introduce the idea of a "generic" front-end that allows the syntactic and semantic specification of domain-specific languages. Phobos, our generic front-end, offers modular language specification, allowing the programmer to define new syntax and semantics incrementally.
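
    To make the idea of incremental syntactic and semantic specification concrete, the sketch below shows a toy "generic front-end" in Python where each module registers new grammar rules together with their translations. The names (FrontEnd, rule, translate) and the regex-based parsing are illustrative assumptions, not Phobos's actual specification language or compiler integration.

        # Illustrative sketch only: a toy generic front-end where a DSL is
        # specified as syntax rules plus semantic actions (translations),
        # added incrementally by separate modules. Hypothetical API, not Phobos.
        import re

        class FrontEnd:
            def __init__(self):
                self.rules = []  # (compiled pattern, semantic action) pairs

            def rule(self, pattern):
                """Register a syntax rule together with its semantic action."""
                def register(action):
                    self.rules.append((re.compile(pattern), action))
                    return action
                return register

            def translate(self, line):
                for pattern, action in self.rules:
                    m = pattern.fullmatch(line.strip())
                    if m:
                        return action(*m.groups())
                raise SyntaxError(f"no rule matches: {line!r}")

        fe = FrontEnd()

        # Base module: a minimal binding form.
        @fe.rule(r"let (\w+) = (\d+)")
        def let_binding(name, value):
            return f"{name} = {value}"

        # A later module extends the same front-end with new syntax.
        @fe.rule(r"repeat (\d+): (.+)")
        def repeat_stmt(count, body):
            return f"for _ in range({count}): {fe.translate(body)}"

        print(fe.translate("let x = 3"))            # x = 3
        print(fe.translate("repeat 2: let y = 5"))  # for _ in range(2): y = 5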

    Meta-Packages: Painless Domain Specific Languages

    Domain Specific Languages are used to provide a tailored modelling notation for a specific application domain. There are currently two main approaches to DSLs: standard notations that are tailored by adding simple properties, and new notations that are designed from scratch. Both approaches have problems that can be addressed by providing access to a small meta-language based on packages and classes. A meta-modelling approach based on meta-packages allows a wide range of DSLs to be defined in a standard way. The DSLs can be processed using standard object-based extension at the meta-level, and existing tooling can easily be adapted to the new languages. This paper introduces the concept of meta-packages and provides a simple example.
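
    The sketch below illustrates, in plain Python, the general shape of the approach: a meta-package is a package of meta-classes describing a DSL's concepts, new DSLs are obtained by extension at the meta-level, and one generic checker plays the role of reusable tooling. The class and method names are hypothetical and do not reproduce the paper's meta-package notation.

        # Illustrative sketch only: a meta-package (a package of meta-classes
        # that defines a DSL's concepts) with extension at the meta-level.
        # All names here are assumptions, not the paper's actual notation.
        from dataclasses import dataclass, field

        @dataclass
        class MetaClass:
            name: str
            attributes: dict  # attribute name -> expected Python type

        @dataclass
        class MetaPackage:
            name: str
            classes: dict = field(default_factory=dict)

            def define(self, metaclass):
                self.classes[metaclass.name] = metaclass
                return metaclass

            def extend(self, name):
                """Create a new DSL by extending this meta-package at the meta-level."""
                return MetaPackage(name, dict(self.classes))

            def check(self, kind, **values):
                """Generic 'tooling': validate a model element against its meta-class."""
                mc = self.classes[kind]
                for attr, typ in mc.attributes.items():
                    if not isinstance(values.get(attr), typ):
                        raise TypeError(f"{kind}.{attr} must be {typ.__name__}")
                return values

        # A core meta-package shared by many DSLs ...
        core = MetaPackage("Core")
        core.define(MetaClass("NamedElement", {"name": str}))

        # ... extended into a small state-machine DSL without new tooling.
        statecharts = core.extend("StateCharts")
        statecharts.define(MetaClass("State", {"name": str, "initial": bool}))

        print(statecharts.check("State", name="Idle", initial=True))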

    Towards structured sharing of raw and derived neuroimaging data across existing resources

    Data sharing efforts increasingly contribute to the acceleration of scientific discovery. Neuroimaging data is accumulating in distributed domain-specific databases, and there is currently neither an integrated access mechanism nor an accepted format for the critically important meta-data that is necessary for making use of the combined, available neuroimaging data. In this manuscript, we present work from the Derived Data Working Group, an open-access group sponsored by the Biomedical Informatics Research Network (BIRN) and the International Neuroinformatics Coordinating Facility (INCF) focused on practical tools for distributed access to neuroimaging data. The working group develops models and tools facilitating the structured interchange of neuroimaging meta-data and is making progress towards a unified set of tools for such data and meta-data exchange. We report on the key components required for integrated access to raw and derived neuroimaging data as well as associated meta-data and provenance across neuroimaging resources. The components include (1) a structured terminology that provides semantic context to data, (2) a formal data model for neuroimaging with robust tracking of data provenance, (3) a web service-based application programming interface (API) that provides a consistent mechanism to access and query the data model, and (4) a provenance library that can be used for the extraction of provenance data by image analysts and imaging software developers. We believe that the framework and set of tools outlined in this manuscript have great potential for solving many of the issues the neuroimaging community faces when sharing raw and derived neuroimaging data across the various existing database systems for the purpose of accelerating scientific discovery.
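
    As a rough illustration of how these four components fit together, the sketch below models a derived image with attached provenance steps and serializes it as a JSON payload standing in for a web-service response. The class names, fields, and terminology values are hypothetical assumptions, not the working group's actual schema, terminology, or API.

        # Illustrative sketch only: a toy neuroimaging data model with
        # provenance tracking and a web-service-style serialization,
        # using hypothetical names rather than the working group's tools.
        import json
        from dataclasses import dataclass, field, asdict

        @dataclass
        class ProvenanceStep:
            tool: str          # name of the analysis software
            version: str
            parameters: dict

        @dataclass
        class DerivedImage:
            subject_id: str
            term: str          # concept drawn from a structured terminology
            source_uri: str    # raw image this result was derived from
            provenance: list = field(default_factory=list)

            def record(self, step: ProvenanceStep):
                """Provenance-library role: append a processing step."""
                self.provenance.append(step)

        def to_service_payload(image: DerivedImage) -> str:
            """API role: a consistent, serializable view of the data model."""
            return json.dumps(asdict(image), indent=2)

        img = DerivedImage("sub-01", "gray matter volume", "db://site-a/raw/123")
        img.record(ProvenanceStep("segmentation-tool", "2.1", {"smoothing_mm": 4}))
        print(to_service_payload(img))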

    From missions to systems : generating transparently distributable programs for sensor-oriented systems

    Early Wireless Sensor Networks aimed simply to collect as much data as possible for as long as possible. While this remains true in selected cases, the majority of future sensor network applications will demand much more intelligent use of their resources as networks increase in scale and support multiple applications and users. Specifically, we argue that a computational model is needed in which the ways that data flows through networks, and the ways in which decisions are made based on that data, are transparently distributable and relocatable as requirements evolve. In this paper we present an approach to achieving this using high-level mission specifications from which we can automatically derive transparently distributable programs.
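
    A minimal sketch of the underlying idea, under assumed names: the mission is a small dataflow of tasks, and a separate placement table decides which node runs each task, so work can be relocated without editing the mission itself. This only illustrates the notion of transparent distribution; it is not the paper's mission language or compiler.

        # Illustrative sketch only: a mission as a dataflow of tasks whose
        # placement on nodes can change without touching the mission itself.
        # Names and structure are hypothetical, not the paper's system.
        from dataclasses import dataclass
        from typing import Callable, List

        @dataclass
        class Task:
            name: str
            run: Callable[[List[float]], List[float]]

        # Mission specification: what should happen, independent of where.
        mission = [
            Task("sense",     lambda _: [18.5, 19.1, 22.3]),             # stand-in readings
            Task("aggregate", lambda xs: [sum(xs) / len(xs)]),           # mean temperature
            Task("decide",    lambda xs: [1.0] if xs[0] > 20 else [0.0]),
        ]

        # Placement: which node executes which task; changing this relocates
        # work without editing the mission above.
        placement = {"sense": "node-7", "aggregate": "node-3", "decide": "gateway"}

        def execute(mission, placement):
            data: List[float] = []
            for task in mission:
                print(f"running {task.name!r} on {placement[task.name]}")
                data = task.run(data)
            return data

        print("decision:", execute(mission, placement))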

    A Domain-Specific Language and Editor for Parallel Particle Methods

    Domain-specific languages (DSLs) are of increasing importance in scientific high-performance computing to reduce development costs, raise the level of abstraction and, thus, ease scientific programming. However, designing and implementing DSLs is not an easy task, as it requires knowledge of the application domain and experience in language engineering and compilers. Consequently, many DSLs follow a weak approach using macros or text generators, which lack many of the features that make a DSL comfortable for programmers. Some of these features (e.g., syntax highlighting, type inference, error reporting, and code completion) are easily provided by language workbenches, which combine language engineering techniques and tools in a common ecosystem. In this paper, we present the Parallel Particle-Mesh Environment (PPME), a DSL and development environment for numerical simulations based on particle methods and hybrid particle-mesh methods. PPME uses the Meta Programming System (MPS), a projectional language workbench. PPME is the successor of the Parallel Particle-Mesh Language (PPML), a Fortran-based DSL that used conventional implementation strategies. We analyze and compare both languages and demonstrate how the programmer's experience can be improved using static analyses and projectional editing. Furthermore, we present an explicit domain model for particle abstractions and the first formal type system for particle methods. Comment: Submitted to ACM Transactions on Mathematical Software on Dec. 25, 201
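
    To give a feel for what an explicit domain model and a type system for particle methods might check, the sketch below declares typed properties on a particle list and rejects undeclared or wrongly typed accesses. All names are assumptions made for illustration; PPME itself is implemented in the MPS language workbench, not in Python.

        # Illustrative sketch only: the spirit of an explicit domain model for
        # particle abstractions with a small static check on property types.
        # Hypothetical names; not PPME's actual domain model or type system.
        from dataclasses import dataclass, field

        @dataclass
        class ParticleList:
            name: str
            dim: int                                         # spatial dimension
            properties: dict = field(default_factory=dict)   # property -> type tag

            def declare(self, prop: str, type_tag: str):
                self.properties[prop] = type_tag

        def check_access(particles: ParticleList, prop: str, expected: str):
            """'Type system' role: reject undeclared or wrongly typed property uses."""
            declared = particles.properties.get(prop)
            if declared is None:
                raise TypeError(f"{particles.name} has no property {prop!r}")
            if declared != expected:
                raise TypeError(f"{prop!r} is {declared}, expected {expected}")

        # Domain model: a 3D particle set with a scalar concentration field.
        parts = ParticleList("parts", dim=3)
        parts.declare("concentration", "real")

        check_access(parts, "concentration", "real")      # ok
        # check_access(parts, "velocity", "vector[3]")    # would raise: undeclared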