    Semantic Technologies for Manuscript Descriptions — Concepts and Visions

    The contribution at hand relates recent developments in the area of the World Wide Web to codicological research. In recent years, an informational extension of the internet has been discussed and extensively researched: the Semantic Web. It has already been applied in many areas, including digital information processing of cultural heritage data. The Semantic Web facilitates the organisation and linking of data across websites, according to a given semantic structure. Software can then process this structural and semantic information to extract further knowledge. In the area of codicological research, many institutions are making efforts to improve the online availability of handwritten codices. If these resources could also employ Semantic Web techniques, considerable research potential could be unleashed. However, data acquisition from less structured data sources will be problematic. In particular, data stemming from unstructured sources needs to be made accessible to Semantic Web tools through information extraction techniques. In the area of museum research, the CIDOC Conceptual Reference Model (CRM) has been widely examined and is being adopted successfully. The CRM translates well to Semantic Web research, and its concentration on contextualization of objects could support approaches in codicological research. Further concepts for the creation and management of bibliographic coherences and structured vocabularies related to the CRM will be considered in this chapter. Finally, a user scenario showing all processing steps in their context will be elaborated on.
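    The abstract's core idea, that data expressed as semantically structured links can be traversed by software to derive further knowledge, can be sketched in a few lines. The following is a minimal illustration of Semantic Web-style (subject, predicate, object) triples; all identifiers are hypothetical examples, not drawn from any real manuscript dataset or from the chapter itself.

    ```python
    # A minimal sketch of semantic triples; every URI-like name below
    # (ex:Codex42, ex:heldBy, ...) is an illustrative assumption.
    triples = [
        ("ex:Codex42", "ex:heldBy", "ex:StateLibrary"),
        ("ex:Codex42", "ex:writtenIn", "ex:Latin"),
        ("ex:StateLibrary", "ex:locatedIn", "ex:Berlin"),
    ]

    def query(s=None, p=None, o=None):
        """Return all triples matching the pattern; None acts as a wildcard."""
        return [t for t in triples
                if (s is None or t[0] == s)
                and (p is None or t[1] == p)
                and (o is None or t[2] == o)]

    # Software can follow links across triples to extract knowledge that is
    # only implicit in any single record, e.g. the city where a codex is held:
    holder = query(s="ex:Codex42", p="ex:heldBy")[0][2]
    city = query(s=holder, p="ex:locatedIn")[0][2]
    print(city)  # ex:Berlin
    ```

    In a real deployment the triples would live in an RDF store and the lookups would be SPARQL queries, but the traversal principle is the same.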

    Computer Software Management and Information Center

    Computer programs for a passive anti-roll tank, earth resources laboratory applications, the NIMBUS-7 coastal zone color scanner derived products, a transportable applications executive, plastic and failure analysis of composites, a velocity gradient method for calculating velocities in an axisymmetric annular duct, an integrated procurement management system, data I/O PROM for the Motorola exorcisor, aerodynamic shock-layer shape, kinematic modeling, a hardware library for a graphics computer, and a file archival system are documented.

    Using Provenance to support Good Laboratory Practice in Grid Environments

    Conducting experiments and documenting results is the daily business of scientists. Good and traceable documentation enables other scientists to confirm procedures and results, increasing credibility. Documentation and scientific conduct are regulated and termed "good laboratory practice." Laboratory notebooks are used to record each step in conducting an experiment and processing data. Originally, these notebooks were paper based. With computerised research systems, acquired data became more elaborate, increasing the need for electronic notebooks with data storage, computational features and reliable electronic documentation. As a new approach to this, a scientific data management system (DataFinder) is enhanced with features for traceable documentation. Provenance recording is used to meet requirements of traceability, and this information can later be queried for further analysis. DataFinder has further important features for scientific documentation: it employs a heterogeneous and distributed data storage concept, which enables access to different types of data storage systems (e.g. Grid data infrastructure, file servers). In this chapter we describe a number of building blocks that are available or close to finished development. These components are intended for assembling an electronic laboratory notebook for use in Grid environments, while retaining maximal flexibility in usage scenarios as well as maximal compatibility with each other. Through the use of such a system, provenance can successfully be used to trace the scientific workflow of preparation, execution, evaluation, interpretation and archiving of research data. The reliability of research results increases and the research process remains transparent to remote research partners.
    Comment: Book chapter for "Data Provenance and Data Management for eScience," Studies in Computational Intelligence series, Springer. 25 pages, 8 figures.
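    The provenance-recording idea described above, logging each workflow step so it can later be queried for traceability, can be sketched as follows. The record fields and step names are illustrative assumptions, not the actual DataFinder or PROV API.

    ```python
    # A minimal sketch of provenance recording for a lab-notebook workflow.
    # The schema (step, actor, data, time) is a hypothetical simplification.
    import datetime

    provenance_log = []

    def record(step, actor, data, **details):
        """Append one provenance entry for a workflow step touching `data`."""
        entry = {
            "step": step,    # e.g. preparation, execution, evaluation
            "actor": actor,
            "data": data,
            "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            **details,
        }
        provenance_log.append(entry)
        return entry

    def trace(data):
        """Query the log: every recorded step that touched a given data item."""
        return [e for e in provenance_log if e["data"] == data]

    record("preparation", "alice", "run-017.csv", instrument="spectrometer-2")
    record("execution", "alice", "run-017.csv")
    record("evaluation", "bob", "run-017.csv", script="analyse.py")

    for e in trace("run-017.csv"):
        print(e["step"], e["actor"])
    ```

    A production system would persist such entries in the data management layer and express them in a standard provenance model rather than an in-memory list, but the record-then-query pattern is the essence of the traceability requirement.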

    Development of an MSC language and compiler, volume 1

    Higher order programming language and compiler for advanced computer software system to be used with manned space flights between 1972 and 198

    Catalog of Approaches to Impact Measurement: Assessing Social Impact in Private Ventures

    To inform action impact investors could take to measure impact in a coordinated manner, The Rockefeller Foundation commissioned the study of impact assessment approaches presented here. It is natural to hope for a single, turnkey solution that can address all measurement needs. In this study we conducted a survey of impact investors, complemented it with seven years of experience in the field of impact investing to discover what these investors want from impact measurement, and conducted in-depth interviews with over twenty entities that have developed and implemented approaches to measuring impact. Our survey of existing approaches was thorough but surely not comprehensive; the approaches are, however, a good representation of the current state of play. We found that there is no single measurement answer. Instead, the answer depends on which solution is most appropriate for a particular investor's "impact profile," defined as the investor's level of risk tolerance and desired financial return, the particular sector in which the investor operates, geography, and the level of credibility of information about impact that the investor requires.

    Operational alternatives for LANDSAT in California

    Data integration is defined and examined as the means of promoting data sharing among the various governmental and private geobased information systems in California. Elements of vertical integration considered include technical factors (such as resolution and classification) and institutional factors (such as organizational control, and legal and political barriers). Attempts are made to fit the theoretical elements of vertical integration into a meaningful structure for looking at the problem from a statewide focus. Both manual (mapped) and machine-readable data systems are included. Special attention is given to LANDSAT imagery because of its strong potential for integrated use and its primacy in the California Integrated Remote Sensing System program.

    IMAGINE Final Report

    No full text available.