
    Real-time satellite data processing platform architecture

    Remote sensing satellites produce massive amounts of data about the Earth every day. This Earth observation data can be used to solve real-world problems in many different fields. The Finnish space data company Terramonitor has been using satellite data to produce new information for its customers. The process for producing valuable information includes finding raw data, analysing it and visualizing it according to the client's needs. This process involves a significant amount of manual work done at local workstations. Because satellite data can quickly become very large, it is not efficient to use unscalable processes that require a lot of waiting time. This thesis addresses the problem by introducing an architecture for a cloud-based real-time processing platform that allows satellite image analysis to be done in a cloud environment. The architectural model is built using microservice patterns to ensure that the solution scales to match changing demand.
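    The scalability property described above rests on processing steps being stateless and independent. A minimal sketch of that idea (not the thesis's actual code; tile contents and the analysis step are hypothetical):

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def process_tile(tile):
        # Hypothetical per-tile analysis step: here just the mean pixel value.
        return sum(tile) / len(tile)

    def process_scene(tiles, workers=4):
        # Each tile is processed independently (stateless), so the number of
        # workers -- or service instances in a real cloud deployment -- can
        # be scaled up or down to match demand, which is the core property
        # the microservice pattern is meant to guarantee.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(process_tile, tiles))
    ```

    In a real platform each worker would be a separate service instance behind a message queue, but the independence of the per-tile work is the same.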

    Array-database Model (SciDB) for Standardized Storing of Hyperspectral Satellite Images

    Final project of the Erasmus Mundus Master's programme in Geospatial Technologies (2013 syllabus). Code: SIW013. Academic year 2015-2016.
    Hyperspectral imaging is a technique that collects information from the electromagnetic spectrum, storing the value of each spectral band for every pixel of the image. The technique stands out for the contiguous, wide range of wavelengths it covers, which enables accurate distinction of surfaces and materials. The large volumes of hyperspectral image datasets, called data cubes because the band values form a third dimension, have been a barrier to exploiting the full potential of these images, as there is no standardized way of storing them. On top of that, classical relational databases have proved to be an inconvenient storage medium for such images. Array databases have become a serious choice for storing scientific and large-volume data, and they represent a promising environment for hyperspectral images. We aim to study the efficiency of storing hyperspectral images in an array database by suggesting a convenient data model. Furthermore, to examine the feasibility of this model, we compare it with two relational databases using specific measurements of performance and query complexity.
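    The "data cube" model the abstract describes is a 3-D array with one cell value per (row, column, band) coordinate. A small illustrative sketch of that model in plain Python (the function names are ours, not from the thesis; an array database such as SciDB would evaluate such slices natively rather than in application code):

    ```python
    def make_cube(rows, cols, bands, fill=0.0):
        # One cell value per (row, col, band) coordinate, mirroring the
        # array-database schema: band is the third dimension of the cube.
        return [[[fill for _ in range(bands)] for _ in range(cols)]
                for _ in range(rows)]

    def band_slice(cube, band):
        # Extract a single spectral band as a 2-D image -- the kind of
        # dimension slice an array database computes without scanning
        # the whole cube.
        return [[cell[band] for cell in row] for row in cube]
    ```

    Relational databases must emulate this slicing with joins or filtered scans, which is one source of the performance gap the thesis measures.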

    Web-based management of time-series raster data

    Data discovery and data handling often present serious challenges to organizations that manage huge archives of raster datasets, such as those generated by satellite remote sensing. Satellite remote sensing produces a regular stream of raster datasets used in many applications, including environmental and agricultural monitoring. This thesis presents a system architecture for the management of time-series GIS raster datasets. The architecture is then applied in a prototype implementation for a department that uses remote sensing data for agricultural monitoring. The architecture centres on three key components. The first is a metadatabase to hold metadata for the raster datasets, together with an interface to manage the metadatabase and facilitate the search and discovery of raster metadata. The design of the metadatabase involved examining existing standards for geographic raster metadata and determining the metadata elements required for time-series raster data. The second component is an interactive tool for viewing the time-series raster data discovered via the metadatabase. The third component provides the basic image analysis functionality typically required by users of time-series raster datasets. A prototype was implemented using open source software, following the Open Geospatial Consortium specification for Web Map Services (WMS) version 1.3.0. After implementation, the prototype was evaluated by the target users from the RRSU (Regional Remote Sensing Unit) to assess its usability, its added value and its impact on the users' work. The evaluation showed that the prototype was generally well received: it allowed both data managers and users of time-series datasets to save significant amounts of time in their work routines, and it offered raster data analyses useful to a wider community of time-series raster data managers.
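    WMS 1.3.0, which the prototype follows, exposes time-series raster layers through GetMap requests with the standard TIME dimension parameter. A sketch of how a client would build such a request (the base URL and layer name are placeholders, not from the thesis):

    ```python
    from urllib.parse import urlencode

    def wms_getmap_url(base, layer, bbox, time, size=(512, 512)):
        # WMS 1.3.0 GetMap request; TIME is the standard dimension
        # parameter used to select one date from a time-series layer.
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.3.0",
            "REQUEST": "GetMap",
            "LAYERS": layer,
            "CRS": "EPSG:4326",
            "BBOX": ",".join(map(str, bbox)),
            "WIDTH": size[0],
            "HEIGHT": size[1],
            "FORMAT": "image/png",
            "TIME": time,
        }
        return base + "?" + urlencode(params)

    url = wms_getmap_url("https://example.org/wms", "ndvi_timeseries",
                         (-26.0, 20.0, -22.0, 34.0), "2016-05-01")
    ```

    The viewer component of the architecture would issue one such request per time step to animate the series.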

    Evaluation of standards and techniques for retrieval of geospatial raster data: a study for the ICOS Carbon Portal

    Geospatial raster data represent the world as a surface whose geographic information varies continuously. These data can be grid-based, like Digital Terrain Elevation Data (DTED), or geographic image data, like multispectral images. The Integrated Carbon Observation System (ICOS), a European project, was launched to measure greenhouse gas emissions. The outputs of these measurements are data in both geospatial vector (raw data) and raster (elaborated data) formats. Using these measurements, scientists create flux maps over Europe. The flux maps are important for many groups, such as researchers, stakeholders and public users. In this regard, the ICOS Carbon Portal (ICOS CP) seeks an efficient way to make the ICOS elaborated data available to all of these groups in an online environment. Among other things, ICOS CP wants to design a geoportal that lets users download the modelled geospatial raster data in different formats and geographic extents. The Open Geospatial Consortium (OGC) Web Coverage Service (WCS) defines a geospatial web service for rendering geospatial raster data, such as flux maps, in any desired subset in space and time. This study presents two techniques for designing a geoportal compatible with WCS. The geoportal should be able to retrieve the ICOS data in both NetCDF and GeoTIFF formats, and should allow retrieval of subsets in time and space. In the first technique, a geospatial raster database (rasdaman) is used to store the data; the rasdaman OGC component (Petascope) acts as the server tool, connecting the database to the client side through the WCS protocol. In the second technique, an advanced file-based system (NetCDF) is used to maintain the data, and THREDDS, as the WCS server, ships the data to the client side through the WCS protocol. Both techniques returned good results for downloading the data in the desired formats and subsets.
    Popular summary: Geospatial data refer to objects or phenomena located at a specific place in space, in relation to other objects; they are linked to geometry and topology. Geospatial raster data are a subset of geospatial data and represent the world as a surface whose geographic information varies continuously. These data can be grid-based, like Digital Terrain Elevation Data (DTED), or geographic image data, like multispectral images. The challenges of working with geospatial raster data relate to three components: (I) storage and management systems, (II) standardized services, and (III) the software interface. Each component matters for improving interaction with geospatial raster data. A proper storage and management system makes it easy to classify, search and retrieve the data; a standardized service is needed to unify, download, process and share the data among users; and a suitable software interface must support the standardized services on the web. The aim is to enable users to download geospatial raster data in different formats for any desired spatial and temporal subset. Two techniques are evaluated for connecting the three components to achieve this aim. In the first, a geospatial raster database stores the data and is connected to the software interface through a standardized service. In the second, an advanced file-based system maintains the data, and the server ships it to the software interface through the standardized service. Although each technique has its own difficulties, both returned good results: users can download the data in the desired formats on the web, for any specific area and any specific time.
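    Both techniques the study compares serve the same kind of request: a WCS GetCoverage with trim subsets in space and time. A sketch of how a client would form such a request against a WCS 2.0 endpoint like rasdaman's Petascope (the endpoint URL, coverage name and axis labels are illustrative assumptions, not taken from the thesis):

    ```python
    from urllib.parse import urlencode

    def wcs_getcoverage_url(base, coverage_id, lat, lon, time,
                            fmt="application/netcdf"):
        # WCS 2.0 GetCoverage with one SUBSET parameter per trimmed axis;
        # a list of pairs is used because SUBSET repeats.
        params = [
            ("SERVICE", "WCS"),
            ("VERSION", "2.0.1"),
            ("REQUEST", "GetCoverage"),
            ("COVERAGEID", coverage_id),
            ("SUBSET", f"Lat({lat[0]},{lat[1]})"),
            ("SUBSET", f"Long({lon[0]},{lon[1]})"),
            ("SUBSET", f'ansi("{time[0]}","{time[1]}")'),
            ("FORMAT", fmt),
        ]
        return base + "?" + urlencode(params)

    url = wcs_getcoverage_url("https://example.org/rasdaman/ows",
                              "eu_flux_map", (35.0, 70.0), (-10.0, 40.0),
                              ("2015-01-01", "2015-12-31"))
    ```

    Changing `fmt` to `"image/tiff"` would request the GeoTIFF encoding instead of NetCDF, covering the two output formats the geoportal is required to support.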