    The Planetary Materials Database

    NASA provides funds for a variety of research programs whose principal focus is to collect and analyze terrestrial analog materials. These data are used to (1) understand and interpret planetary geology; (2) identify and characterize habitable environments and pre-biotic/biotic processes; (3) interpret returned data from present and past missions; and (4) evaluate future mission and instrument concepts prior to selection for flight. Data management plans are now required for these programs, but the collected data are still not generally available to the community. There is also little possibility of re-analyzing the collected materials with other techniques, since there is no requirement to archive collected samples. The Planetary Materials Database (PMD) is a central, high-quality, long-term data repository that aims to promote the field of astrobiology and increase scientific returns from NASA-funded research by enabling data sharing, collaboration, and exposure of non-NASA scientists to NASA research initiatives and missions. The PMD is a linked collection of databases developed using the Open Data Repository (ODR) system. The PMD will include detailed descriptions of terrestrial analog planetary materials as well as data from the instruments used in their analysis. The goal is to provide example patterns, spectra, analyses, etc., and background information suitable for use by the Space Science community. An early example showing the utility of such databases (although not in the ODR format) is the RRUFF mineral database. RRUFF, comprising 4,000+ pure mineral standards, is the most popular and widely used dataset of minerals and receives more than 180,000 queries per week from geologists and mineralogists worldwide. The PMD will be patterned after the CheMin database [3], a resource that contains all of the data collected by the MSL CheMin XRD instrument on Mars. Raw and processed CheMin data can be viewed, downloaded, reprocessed and reanalyzed using cloud-based applications linked to the data.
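    As a rough, hypothetical sketch of the kind of record such a repository might hold (the field names and URL below are invented for illustration and are not the actual PMD schema), a single analog-sample entry could look like this in Python:

        from dataclasses import dataclass, field

        @dataclass
        class AnalogMaterialRecord:
            """Hypothetical entry for one terrestrial analog sample."""
            sample_id: str                  # repository accession identifier
            locality: str                   # where the sample was collected
            analog_target: str              # planetary environment the sample stands in for
            techniques: list = field(default_factory=list)  # e.g. ["XRD", "Raman"]
            spectra: dict = field(default_factory=dict)     # technique -> data file URL

        record = AnalogMaterialRecord(
            sample_id="PMD-0001",
            locality="Rio Tinto, Spain",
            analog_target="Mars acidic weathering environment",
            techniques=["XRD", "Raman"],
            spectra={"Raman": "https://example.org/pmd/PMD-0001/raman.csv"},
        )
        print(record.sample_id, record.techniques)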

    When Things Matter: A Data-Centric View of the Internet of Things

    With the recent advances in radio-frequency identification (RFID), low-cost wireless sensor devices, and Web technologies, the Internet of Things (IoT) approach has gained momentum in connecting everyday objects to the Internet and facilitating machine-to-human and machine-to-machine communication with the physical world. While IoT offers the capability to connect and integrate both digital and physical entities, enabling a whole new class of applications and services, several significant challenges need to be addressed before these applications and services can be fully realized. A fundamental challenge centers around managing IoT data, typically produced in dynamic and volatile environments, which are not only extremely large in scale and volume but also noisy and continuous. This article surveys the main techniques and state-of-the-art research efforts in IoT from a data-centric perspective, including data stream processing, data storage models, complex event processing, and searching in IoT. Open research issues in IoT data management are also discussed.
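    To make the "data stream processing" theme concrete, here is a minimal sketch (not from the article) of sliding-window smoothing over a continuous, noisy sensor stream; the readings are simulated:

        from collections import deque

        def sliding_window_mean(stream, window_size=5):
            """Yield the mean of the most recent `window_size` readings."""
            window = deque(maxlen=window_size)
            for reading in stream:
                window.append(reading)
                yield sum(window) / len(window)

        # Simulated temperature readings from an IoT sensor; 35.0 is a noise spike.
        readings = [21.0, 21.4, 35.0, 21.2, 21.1, 20.9, 21.3]
        for smoothed in sliding_window_mean(readings, window_size=3):
            print(round(smoothed, 2))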

    The need for aquatic tracking networks: the permanent Belgian Acoustic Receiver Network

    Aquatic biotelemetry techniques have proven to be valuable tools to generate knowledge on species behaviour, gather oceanographic data and help in assessing effects of anthropogenic disturbances. These data types support international policies and directives needed for species and habitat conservation. As aquatic systems are highly interconnected and cross administrative borders, optimal data gathering should be organized on a large scale. This need triggered the development of regional, national and international aquatic animal tracking network initiatives around the globe. In Belgium, a national acoustic receiver network for fish tracking, the Permanent Belgian Acoustic Receiver Network, was set up in 2014 as a collaboration between several research institutes. It is a permanent network of 160 acoustic receivers; since the start, over 800 animals from 16 fish species have been tagged, generating more than 17 million detections so far. To handle all the (meta)data generated, a data management platform was built. The central database stores all the data and has an interactive web interface that allows users to upload, manage and explore (meta)data. In addition, the database is linked to an R Shiny application that allows users to visualize and download the detection data. The permanent tracking network is not only a collaborative platform for exchanging data, analysis tools, devices and knowledge; it also creates opportunities to perform feasibility studies and Ph.D. studies in a cost-efficient way. The Belgian tracking network is a first step towards a pan-European aquatic tracking network.
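    As a toy illustration of the kind of exploration such a platform supports (the column names here are invented, not the network's actual schema), per-species detection counts could be tallied from an exported detections file like this:

        import csv
        from collections import Counter

        def detections_per_species(path):
            """Count detections per species in a CSV export with one row per detection.

            Assumed (hypothetical) columns: tag_id, species, receiver_id, detection_time.
            """
            counts = Counter()
            with open(path, newline="") as f:
                for row in csv.DictReader(f):
                    counts[row["species"]] += 1
            return counts

        # for species, n in detections_per_species("detections.csv").most_common():
        #     print(species, n)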

    Evaluation of standards and techniques for retrieval of geospatial raster data : a study for the ICOS Carbon Portal

    Evaluation of Standards and Techniques for Retrieval of Geospatial Raster Data - A study for the ICOS Carbon Portal. Geospatial raster data represent the world as a surface whose geographic information varies continuously. Such data can be grid-based, like Digital Terrain Elevation Data (DTED), or geographic image data, like multispectral images. The Integrated Carbon Observation System (ICOS), a European project, was launched to measure greenhouse gas emissions. The outputs of these measurements are data in both geospatial vector (raw data) and raster (elaborated data) formats. Using these measurements, scientists create flux maps over Europe. The flux maps are important for many groups, such as researchers, stakeholders and public users. In this regard, the ICOS Carbon Portal (ICOS CP) seeks a suitable way to make the ICOS elaborated data available to all of these groups in an online environment. Among other things, ICOS CP wants to design a geoportal that lets users download the modelled geospatial raster data in different formats and geographic extents. The Open Geospatial Consortium (OGC) Web Coverage Service (WCS) defines a geospatial web service for rendering geospatial raster data, such as flux maps, in any desired subset in space and time. This study presents two techniques for designing a geoportal compatible with WCS. The geoportal should be able to retrieve the ICOS data in both NetCDF and GeoTIFF formats and allow retrieval of subsets in time and space. In the first technique, a geospatial raster database (Rasdaman) is used to store the data; the Rasdaman OGC component (Petascope) acts as the server tool connecting the database to the client side through the WCS protocol. In the second technique, an advanced file-based system (NetCDF) is applied to maintain the data, and THREDDS, as the WCS server, ships the data to the client side through the WCS protocol. Both techniques returned good results for downloading the data in the desired formats and subsets.

    Geospatial data refer to objects or phenomena located at a specific position in space, in relation to other objects; they are linked to geometry and topology. Geospatial raster data are a subset of geospatial data. The challenges in working with geospatial raster data relate to three components: I) storage and management systems, II) standardized services, and III) the software interface to geospatial raster data. Each component plays its own role in improving interaction with geospatial raster data: a proper storage and management system makes it easy to classify, search and retrieve the data; a standardized service is needed to unify, download, process and share these data among users; and the final challenge is choosing a suitable software interface to support the standardized services on the web. The aim is to give users the ability to download geospatial raster data in different formats for any desired space and time subset. To this end, two different techniques are evaluated for connecting the three components. In the first technique, a geospatial raster database is used to store the data; this database is connected to the software interface through the standardized service. In the second technique, an advanced file-based system is applied to maintain the data, and the server ships the data to the software interface through the standardized service. Although each technique has its own difficulties, both returned good results: users can download the data in the desired formats on the web, for any specific area and any specific time.
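    For concreteness, a WCS 2.0 GetCoverage request with space and time subsets is a plain HTTP query; the sketch below shows the general pattern (the endpoint and coverage name are placeholders, not actual ICOS Carbon Portal URLs):

        import requests  # third-party: pip install requests

        WCS_ENDPOINT = "https://example.org/rasdaman/ows"  # placeholder endpoint

        params = {
            "service": "WCS",
            "version": "2.0.1",
            "request": "GetCoverage",
            "coverageId": "biosphere_flux",  # hypothetical coverage name
            "subset": [  # trim the coverage in space and time
                "Lat(45,55)",
                "Long(0,15)",
                'ansi("2015-01-01T00:00:00Z","2015-02-01T00:00:00Z")',
            ],
            "format": "application/netcdf",  # or "image/tiff" for GeoTIFF
        }

        response = requests.get(WCS_ENDPOINT, params=params)
        response.raise_for_status()
        with open("flux_subset.nc", "wb") as f:
            f.write(response.content)

    Passing a list for "subset" makes requests repeat the parameter in the query string, which is how WCS expresses multiple subset dimensions.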

    Neogeography: The Challenge of Channelling Large and Ill-Behaved Data Streams

    Neogeography is the combination of user-generated data and experiences with mapping technologies. In this article we present a research project to extract valuable structured information with a geographic component from unstructured user-generated text in wikis, forums, or SMS messages. The extracted information should be integrated to form collective knowledge about a certain domain. This structured information can then be used to help users from the same domain who want to obtain information through a simple question-answering system. The project intends to help worker communities in developing countries share their knowledge, providing a simple and cheap way to contribute and benefit using the available communication technology.
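    A minimal sketch of the geographic-extraction step (the gazetteer and message below are invented; the project's actual pipeline is not described at this level of detail): matching known place names in free text to attach coordinates:

        import re

        # Tiny hypothetical gazetteer mapping place names to (lat, lon).
        GAZETTEER = {
            "kampala": (0.3476, 32.5825),
            "jinja": (0.4244, 33.2041),
        }

        def extract_places(text):
            """Return (place, coordinates) pairs for gazetteer names found in text."""
            found = []
            for name, coords in GAZETTEER.items():
                if re.search(r"\b" + re.escape(name) + r"\b", text, re.IGNORECASE):
                    found.append((name, coords))
            return found

        sms = "Maize prices in Kampala rose again this week."
        print(extract_places(sms))  # [('kampala', (0.3476, 32.5825))]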

    Implanting Life-Cycle Privacy Policies in a Context Database

    Ambient intelligence (AmI) environments continuously monitor surrounding individuals' context (e.g., location, activity, etc.) to make existing applications smarter, i.e., able to make decisions without requiring user interaction. Such AmI smartness is tightly coupled to the quantity and quality of the available (past and present) context. However, context is often linked to an individual (e.g., the location of a given person) and as such falls under privacy directives. The goal of this paper is to enable the difficult wedding of privacy (automatically fulfilling users' privacy wishes) and smartness in the AmI. Interestingly, privacy requirements in the AmI differ from those in traditional environments, where systems usually manage durable data (e.g., medical or banking information), collected and updated trustfully either by the donor herself, her doctor, or an employee of her bank. In such traditional settings, proper information disclosure to third parties constitutes the major privacy concern.
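    As an illustrative sketch only (the paper's actual policy model is not detailed in the abstract), a life-cycle privacy policy can be pictured as a retention rule attached to each context item and enforced by the store itself:

        from dataclasses import dataclass
        from datetime import datetime, timedelta

        @dataclass
        class ContextItem:
            owner: str            # the individual the context refers to
            kind: str             # e.g. "location", "activity"
            value: str
            collected_at: datetime
            retention: timedelta  # life-cycle policy: delete after this long

        class ContextStore:
            def __init__(self):
                self.items = []

            def insert(self, item):
                self.items.append(item)

            def purge_expired(self, now=None):
                """Enforce each item's life-cycle policy by dropping expired context."""
                now = now or datetime.now()
                self.items = [i for i in self.items
                              if now - i.collected_at < i.retention]

        store = ContextStore()
        store.insert(ContextItem("alice", "location", "office",
                                 collected_at=datetime.now() - timedelta(hours=2),
                                 retention=timedelta(hours=1)))
        store.purge_expired()
        print(len(store.items))  # 0: the stale location fact was deleted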