
    DAS: a data management system for instrument tests and operations

    The Data Access System (DAS) is a metadata and data management software system, providing a reusable solution for the storage of data acquired both from telescopes and auxiliary data sources during the instrument development phases and operations. It is part of the Customizable Instrument WorkStation system (CIWS-FW), a framework for the storage, processing and quick-look of the data acquired from scientific instruments. The DAS provides a data access layer mainly targeted at software applications: quick-look displays, pre-processing pipelines and scientific workflows. It is logically organized in three main components: an intuitive and compact Data Definition Language (DAS DDL) in XML format, aimed at user-defined data types; an Application Programming Interface (DAS API), automatically adding classes and methods supporting the DDL data types, and providing an object-oriented query language; and a data management component, which maps the metadata of the DDL data types into a relational Data Base Management System (DBMS) and stores the data in a shared (network) file system. With the DAS DDL, developers define the data model for a particular project, specifying for each data type the metadata attributes, the data format and layout (if applicable), and named references to related or aggregated data types. Together with the DDL user-defined data types, the DAS API acts as the only interface to store, query and retrieve the metadata and data in the DAS system, providing both an abstract interface and a data-model-specific one in C, C++ and Python. The mapping of metadata in the back-end database is automatic and supports several relational DBMSs, including MySQL, Oracle and PostgreSQL. Comment: Accepted for publication in the ADASS Conference Series
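    A DDL data type as described above might be sketched as follows. This is an illustrative fragment only: the element and attribute names (`ddl`, `attribute`, `associated`, etc.) are assumptions for the sake of the example, not the actual DAS DDL schema.

    ```xml
    <!-- Hypothetical DDL type; names are illustrative, not the real DAS schema -->
    <ddl type="RawFrame">
      <metadata>
        <attribute name="obsId"    type="int64"/>
        <attribute name="exposure" type="float"/>
      </metadata>
      <!-- data format and layout, if applicable -->
      <data format="binaryTable"/>
      <!-- named reference to a related data type -->
      <associated name="calibration" type="CalibrationSet"/>
    </ddl>
    ```

    From a definition like this, the DAS API would generate the corresponding classes and methods, while the data management component maps the metadata attributes into DBMS tables.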

    DAS User Manual

    TECNO-INAF 2010 Project – CIWS. The Data Access System (DAS) is software for storing, retrieving and querying data and metadata acquired from instrument workstations or other processing steps. The DAS provides a powerful Data Definition Language (DDL) for describing a custom data model through different ddl-types. Each ddl-type can contain a metadata section, a binary data section, which can describe a binary table or an image, and a set of relations with other ddl-types. Through the DAS API, the user can create new ddl-objects, populate their metadata and data, and then store the resulting objects in the DAS system. Furthermore, the user can retrieve a set of ddl-objects matching a filter specified as a query string, and then update the objects' content. The aim of this document is to: provide a complete guide to installing the DAS system on the CIWSdev machine; provide a comprehensive guide to configuring the DAS system through the configuration files; describe the key concepts of how the DAS handles ddl-object persistency; and describe the fundamental API functions and methods
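    The create/populate/store/query/update workflow described above can be illustrated with a minimal, self-contained sketch. The class and method names below (`DdlObject`, `DasStore`, `store`, `query`) are stand-ins invented for this example and are not the actual DAS API; in particular, the real DAS accepts a query string where a predicate callable is used here.

    ```python
    class DdlObject:
        """Stand-in for a DAS ddl-object: metadata plus a binary data section."""
        def __init__(self, ddl_type, **metadata):
            self.ddl_type = ddl_type
            self.metadata = metadata
            self.data = b""

    class DasStore:
        """Toy in-memory store mimicking the store/query/update workflow."""
        def __init__(self):
            self._objects = []

        def store(self, obj):
            self._objects.append(obj)

        def query(self, ddl_type, predicate):
            # The real DAS takes a query string; a callable stands in here.
            return [o for o in self._objects
                    if o.ddl_type == ddl_type and predicate(o.metadata)]

    # Create a ddl-object, populate its metadata and data, and store it.
    store = DasStore()
    frame = DdlObject("RawFrame", obsId=42, exposure=1.5)
    frame.data = b"\x00\x01"
    store.store(frame)

    # Retrieve the objects matching a filter, then update their content.
    for obj in store.query("RawFrame", lambda m: m["obsId"] == 42):
        obj.metadata["calibrated"] = True
    ```
    
    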

    CIWS DAS Software Specification Document

    TECNO-INAF 2010 Project – CIWS. The Data Access System (DAS) is part of the Customizable Instrument Workstation FrameWork software project, aimed at providing a framework, named CIWS-FW, for the storage, processing and quick-look of data acquired from space-borne and ground-based telescope observatories, to support the Assembly, Integration, Verification and Testing (AIV/AIT) activities on scientific instruments. The CIWS-FW should also facilitate the reuse of the instrument workstation software components in the subsequent Commissioning and Operations phases, to be carried out either in the mission Ground Segment of space-borne experiments or at the Observatory site of ground-based telescopes. The DAS system is a reusable software system that allows storage, retrieval and management of metadata and data acquired and processed by the instrument workstation (Level 1 data) or produced by the subsequent levels of data processing. It provides tools and application programming interfaces to: define the data model of a specific project; store the metadata and the associations between data types in a relational database system; and handle the persistence of binary data, ranging from small data objects to large binary data streams, keeping consistency between metadata and the corresponding data
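    The split described above — metadata and associations in a relational database, large binary payloads kept elsewhere with consistency between the two — can be sketched with a tiny relational mapping. The table and column names are hypothetical, and `sqlite3` stands in for the supported DBMSs (MySQL, Oracle, PostgreSQL); the binary payload is represented only by a file path, as if it lived on the shared file system.

    ```python
    import sqlite3

    # Hypothetical mapping of a "RawFrame" data type: metadata attributes
    # become columns; the binary data stays on shared storage, referenced
    # by path so metadata and data remain consistent.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE raw_frame (
            das_id    INTEGER PRIMARY KEY,
            obs_id    INTEGER NOT NULL,
            exposure  REAL,
            file_path TEXT  -- pointer to the binary payload on shared storage
        )
    """)
    conn.execute(
        "INSERT INTO raw_frame (obs_id, exposure, file_path) VALUES (?, ?, ?)",
        (42, 1.5, "/ciws/data/raw_frame/000042.bin"),
    )

    # Metadata queries run entirely in the DBMS; only matching payloads
    # would then be fetched from the file system.
    rows = conn.execute(
        "SELECT obs_id, file_path FROM raw_frame WHERE exposure > 1.0"
    ).fetchall()
    ```
    
    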

    Planck LFI DPC-DPC SGS1 and SGS2 Flag Mask

    The purpose of this Technical Note is to define the flag mask to be applied at the level of SGS1 and SGS2 for DPC purposes. A table indicating which flags of this general flag mask will be exported in the EFDD format is given in the last chapter
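    The general idea of a flag mask with an exported subset can be illustrated as a bitmask. The flag names below are invented for the example; the actual flag definitions and the export table are those given in the Technical Note.

    ```python
    from enum import IntFlag

    class SampleFlag(IntFlag):
        # Hypothetical flag names; the real definitions are in the note.
        STABLE_POINTING = 1 << 0
        MANEUVER        = 1 << 1
        GLITCH          = 1 << 2
        MISSING_DATA    = 1 << 3

    # A subset of the general mask selected for export in the EFDD format.
    EFDD_EXPORT_MASK = SampleFlag.MANEUVER | SampleFlag.MISSING_DATA

    # Applying the export mask keeps only the exported flags of a sample.
    sample = SampleFlag.STABLE_POINTING | SampleFlag.MISSING_DATA
    exported = sample & EFDD_EXPORT_MASK
    ```
    
    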

    CIWS-FW: a Customizable Instrument Workstation Software Framework for instrument-independent data handling

    The CIWS-FW is aimed at providing a common and standard solution for the storage, processing and quick-look of the data acquired from scientific instruments for astrophysics. The target system is the instrument workstation, either in the context of the Electrical Ground Support Equipment for space-borne experiments, or in the context of the data acquisition system for instrumentation. The CIWS-FW core includes software developed by team members for previous experiments and provides new components and tools that improve the software reusability, configurability and extensibility attributes. The CIWS-FW mainly consists of two packages: the data processing system and the data access system. The former provides the software components and libraries to support the acquisition, transformation, display and storage in near real time of either a data packet stream or a sequence of data files generated by the instrument. The latter is a metadata and data management system, providing a reusable solution for the archiving and retrieval of the acquired data. A built-in operator GUI allows the user to control and configure the instrument workstation. In addition, the framework provides mechanisms for error handling and logging. A web portal provides access to the CIWS-FW documentation, software repository and bug tracking tools for CIWS-FW developers. We describe the CIWS-FW architecture and summarize the project status. Comment: Accepted for publication in the ADASS Conference Series

    Euclid's US Science Data Center: lessons learned from building a small part of a big system

    Euclid is an ESA M-class mission to study the geometry and nature of the dark universe, slated for launch in mid-2022. NASA is participating in the mission through the contribution of the near-infrared detectors and associated electronics, the nomination of scientists for membership in the Euclid Consortium, and by establishing the Euclid NASA Science Center at IPAC (ENSCI) to support the US community. As part of ENSCI’s work, we will participate in the Euclid Science Ground Segment (SGS) and build and operate the US Science Data Center (SDC-US), which will be a node in the distributed data processing system for the mission. SDC-US is one of 10 data centers, and will contribute about 5% of the computing and data storage for the distributed system. We discuss lessons learned in developing a node in a distributed system. For example, there is a significant advantage to SDC-US development in sharing of knowledge, problem solving, and resource burden with other parts of the system. On the other hand, fitting into a system that is distributed geographically and relies on diverse computing environments results in added complexity in constructing SDC-US

    Planck LFI DPC Implementation Status Report

    Version 1.0 reviewed by ESA at the Planck SGS Implementation Review (Jan 2007); Version 2.0 reviewed by ESA at the Planck SGS Readiness Review (2008). This report, the Planck/LFI DPC Implementation Status Report, describes the status of the implementation of the LFI Data Processing Centre (DPC). It is a self-standing document summarizing the status of the DPC pipeline implementation for the Planck SGS Readiness Review. The report covers all the development activities, organized by Work Packages, and focuses on the most important topics at this stage of development. It should be noted that the SGS1 activity was reported, during the last three years, in the usual bimonthly reports as recommended in the SGS Design Review (Nov 2004), and the SGS2 activity was reported in the form of presentations during the Science Team Meetings, on a quarterly basis

    Halpha rotation curves: the soft core question

    We present high resolution Halpha rotation curves of 4 late-type dwarf galaxies and 2 low surface brightness (LSB) galaxies for which accurate HI rotation curves are available in the literature. Observations were carried out at the Telescopio Nazionale Galileo (TNG). For the LSB galaxy F583-1 an innovative dispersing element was used, a Volume Phase Holographic (VPH) grating with a dispersion of about 0.35 A/pxl. We find good agreement between the Halpha data and the HI observations and conclude that the HI data for these galaxies suffer very little from beam smearing. We show that the optical rotation curves of these dark matter dominated galaxies are best fitted by the Burkert profile. In the centers of galaxies, where N-body simulations predict cuspy cores and fast-rising rotation curves, our data seem to be in better agreement with the presence of soft cores. Comment: Accepted for publication in ApJ with minor changes required
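    For reference, the Burkert profile used in such fits has the standard cored form (Burkert 1995):

    ```latex
    \rho(r) = \frac{\rho_0\, r_0^3}{\left(r + r_0\right)\left(r^2 + r_0^2\right)}
    ```

    where $\rho_0$ is the central density and $r_0$ the core radius; at small radii the density flattens to $\rho_0$ (a soft core), in contrast to the $\rho \propto r^{-1}$ cusp of NFW-like profiles predicted by the N-body simulations mentioned above.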

    Management of the science ground segment for the Euclid mission

    Euclid is an ESA mission aimed at understanding the nature of dark energy and dark matter by simultaneously using two probes (weak lensing and baryon acoustic oscillations). The mission will observe galaxies and clusters of galaxies out to z~2, in a wide extra-galactic survey covering 15000 deg², plus a deep survey covering an area of 40 deg². The payload is composed of two instruments, an imager in the visible domain (VIS) and an imager-spectrometer (NISP) covering the near-infrared. The launch is planned in Q4 of 2020. The elements of the Euclid Science Ground Segment (SGS) are the Science Operations Centre (SOC), operated by ESA, and nine Science Data Centres (SDCs) in charge of data processing, provided by the Euclid Consortium (EC), which is formed by over 110 institutes spread over 15 countries. The SOC and the EC started a tight collaboration several years ago in order to design and develop a single, cost-efficient and truly integrated SGS. The distributed nature of the system, the size of the data set, and the required accuracy of the results are the main challenges expected in the design and implementation of the SGS. In particular, the huge volume of data (not only Euclid data but also ground-based data) to be processed in the SDCs will require distributed storage to avoid data migration across SDCs. This paper describes the management challenges that the Euclid SGS is facing while dealing with such complexity. The main aspect is the organisation of a geographically distributed software development team: algorithms and code are developed in a large number of institutes, while data are actually processed at fewer centers (the national SDCs), where the operational computational infrastructures are maintained. The software produced for data handling, processing and analysis is built within a common development environment defined by the SGS System Team, common to the SOC and the EC SGS, which has already been active for several years.
    The code is built incrementally through different levels of maturity, going from prototypes (developed mainly by scientists) to production code (engineered and tested at the SDCs). A number of incremental challenges (infrastructure, data processing and integration) have been included in the Euclid SGS test plan to verify the correctness and accuracy of the developed systems

    Planck/LFI: DPC Processing and Use of Pointing Information

    The current PDF is the final version. Beam attitude is the combination of beam pointing (where the beam is looking in the sky) and beam orientation (how the beam is oriented in the sky). The scope of this document is to describe how the pointing information is processed by the LFI/DPC pipeline to derive the beam attitude information
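    The basic operation behind deriving beam pointing — rotating a beam reference direction from the instrument frame into the sky frame using the attitude — can be sketched with a plain quaternion rotation. This is a generic illustration, not the LFI/DPC pipeline: the frames, conventions and attitude values below are assumptions made for the example.

    ```python
    import math

    def quat_rotate(q, v):
        """Rotate vector v by the unit quaternion q = (w, x, y, z).

        Uses the identity v' = v + 2 u x (u x v + w v),
        where u is the vector part of q.
        """
        w, ux, uy, uz = q

        def cross(a, b):
            return (a[1] * b[2] - a[2] * b[1],
                    a[2] * b[0] - a[0] * b[2],
                    a[0] * b[1] - a[1] * b[0])

        t = cross((ux, uy, uz), v)
        t = (t[0] + w * v[0], t[1] + w * v[1], t[2] + w * v[2])
        s = cross((ux, uy, uz), t)
        return (v[0] + 2 * s[0], v[1] + 2 * s[1], v[2] + 2 * s[2])

    # Hypothetical example: beam boresight along the z-axis of the
    # instrument frame; attitude = 90 deg rotation about the x-axis.
    half = math.radians(90.0) / 2.0
    attitude = (math.cos(half), math.sin(half), 0.0, 0.0)
    pointing = quat_rotate(attitude, (0.0, 0.0, 1.0))
    ```

    The beam orientation would follow the same pattern, rotating a second reference axis (e.g. the beam's polarization axis) by the same attitude quaternion.
    
    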