
    Geospatial Data Management Research: Progress and Future Directions

    Without geospatial data management, today's challenges in big data applications such as earth observation, geographic information system/building information modeling (GIS/BIM) integration, and 3D/4D city planning cannot be solved. Furthermore, geospatial data management plays a connecting role between data acquisition, data modelling, data visualization, and data analysis. It enables the continuous availability of geospatial data and the replicability of geospatial data analysis. In the first part of this article, five milestones of geospatial data management research achieved during the last decade are presented. The first reflects advancements in BIM/GIS integration at the data, process, and application levels. The second presents theoretical progress by introducing topology as a key concept of geospatial data management. In the third milestone, 3D/4D geospatial data management is described as a key concept for city modelling, including subsurface models. Progress in the modelling and visualization of massive geospatial features on web platforms is the fourth milestone, which includes discrete global grid systems as an alternative geospatial reference framework. The intensive use of geosensor data sources is the fifth milestone, which opens the way to parallel data storage platforms supporting data analysis on geosensors. In the second part of this article, five future directions of geospatial data management research are presented that have the potential to become key research fields in the next decade. Geo-data science will be tasked with extracting knowledge from unstructured and structured geospatial data and with bridging the gap between modern information technology concepts and the geo-related sciences.
Topology is presented as a powerful and general concept for analyzing GIS and BIM data structures and spatial relations, which will be of great importance in emerging applications such as smart cities and digital twins. Data-streaming libraries and "in-situ" geo-computing on objects, executed directly on the sensors, will revolutionize geo-information science and bridge geo-computing with geospatial data management. Advanced geospatial data visualization on web platforms will enable the representation of dynamically changing geospatial features or moving objects' trajectories. Finally, geospatial data management will support big geospatial data analysis, with graph databases expected to experience a revival on top of parallel and distributed data stores.
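To make the role of topology as an analysis concept concrete, here is a toy sketch, not taken from the article: a classifier for the topological relation between two axis-aligned rectangles, in the spirit of the interior/boundary intersection tests that GIS topology models formalize. The `Rect` type and relation names are invented for illustration.

```python
# Toy illustration of topological relations between axis-aligned rectangles,
# echoing the interior/boundary tests used by GIS topology models.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    xmin: float
    ymin: float
    xmax: float
    ymax: float

def relate(a: Rect, b: Rect) -> str:
    """Return a coarse topological relation between two rectangles."""
    # No shared points at all -> disjoint.
    if a.xmax < b.xmin or b.xmax < a.xmin or a.ymax < b.ymin or b.ymax < a.ymin:
        return "disjoint"
    # Interiors intersect only if the open intervals overlap on both axes.
    interiors = a.xmax > b.xmin and b.xmax > a.xmin and a.ymax > b.ymin and b.ymax > a.ymin
    if not interiors:
        return "touches"  # boundaries meet, interiors do not
    if b.xmin >= a.xmin and b.xmax <= a.xmax and b.ymin >= a.ymin and b.ymax <= a.ymax:
        return "contains"
    if a.xmin >= b.xmin and a.xmax <= b.xmax and a.ymin >= b.ymin and a.ymax <= b.ymax:
        return "within"
    return "overlaps"
```

A full DE-9IM implementation distinguishes many more cases, but even this sketch shows why topology generalizes across GIS and BIM data: the predicates depend only on point-set relations, not on the storage format.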

    Map Services Management

    About 20 years ago, Google and other companies introduced tiled maps, and nowadays it is possible to produce similar work using open data and open-source software. Web Map Service and Tile Map Service are open standards that provide ways for users to access and visualize maps by interacting with geospatial data over the internet. Most map-serving solutions make use of geospatial databases such as PostgreSQL/PostGIS or of file formats such as MBTiles/PMTiles. Dedicated servers follow the standards specified by organizations such as the Open Geospatial Consortium. The main goal of this work is to create a centralized and scalable solution that publishes basemaps for a predefined set of geographic regions. These basemaps are displayed as part of desktop or mobile applications with internet access. To fulfill this purpose, the best approach is, for each geographic region, to generate an MBTiles database using raw OpenStreetMap data extracts packaged by Geofabrik. The raw data are also combined with a second data source, Natural Earth, to complete the map information at smaller scales. The final result goes through a process of cartographic generalization so that only the relevant geospatial data are served at a given map scale or zoom level. The data are published as vector tiles using a tile server, and for legacy applications there is also the possibility of displaying the basemaps as raster tiles. Another available option is to use PMTiles files, which are similar to MBTiles but cloud-optimized and suitable for serverless solutions. To ensure good performance and stability, it is possible to keep everything together behind a reverse proxy, for example an Nginx server. Taking advantage of HTTP range requests, also available in Nginx, it is possible to bring the serverless PMTiles option and the standard tile server under the same umbrella.
Finally, two points were considered and explored as opportunities for improvement but were not fully implemented. The first is the ability to cache vector/raster tile requests, and the second is the ability to deploy the solution behind a Content Delivery Network.
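The mechanism that makes the PMTiles option "serverless" is byte-range access: a client requests only the bytes of one tile from a static archive instead of downloading the whole file. The following sketch, not the author's implementation, emulates that pattern with an in-memory archive; the directory contents and tile bytes are invented, and a real PMTiles file stores its directory in a structured header section.

```python
# Minimal sketch of the byte-range access pattern behind serverless PMTiles:
# the client asks for just the bytes of one tile (via an HTTP Range header
# in practice; simulated here with an in-memory archive).
import io

def fetch_range(archive: io.BytesIO, offset: int, length: int) -> bytes:
    """Emulate an HTTP GET with 'Range: bytes=offset-(offset+length-1)'."""
    archive.seek(offset)
    return archive.read(length)

# Hypothetical directory mapping (zoom, x, y) -> (offset, length); a real
# PMTiles reader parses this directory from the file header first.
directory = {(0, 0, 0): (0, 4), (1, 0, 1): (4, 6)}
archive = io.BytesIO(b"T000" + b"T101xx")

tile = fetch_range(archive, *directory[(1, 0, 1)])
```

Because Nginx already supports range requests on static files, the same reverse proxy can front both this access pattern and a conventional tile server.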

    Towards intelligent geo-database support for earth system observation: Improving the preparation and analysis of big spatio-temporal raster data

    The European COPERNICUS program provides an unprecedented breakthrough in the broad use and application of satellite remote sensing data. Maintained on a sustainable basis, the COPERNICUS system is operated under a free-and-open data policy. Its guaranteed long-term availability attracts a broader community to remote sensing applications. In general, the increasing amount of satellite remote sensing data opens the door to diverse and advanced analyses for earth system science. However, the preparation of the data for dedicated processing is still inefficient, as it requires time-consuming operator interaction based on advanced technical skills. Thus, the involved scientists have to spend significant parts of the available project budget on data preparation rather than on science. In addition, the analysis of the rich content of the remote sensing data requires new concepts for better extraction of promising structures and signals as an effective basis for further analysis. In this paper we propose approaches to improve the preparation of satellite remote sensing data by means of a geo-database, so that the time needed and the errors possibly introduced by human interaction are minimized. In addition, it is recommended to improve data quality and the analysis of the data by incorporating Artificial Intelligence methods. A use case for data preparation and analysis is presented for earth surface deformation analysis in the Upper Rhine Valley, Germany, based on Persistent Scatterer Interferometric Synthetic Aperture Radar data. Finally, we give an outlook on our future research.
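As a toy sketch of the idea, not the authors' system: once persistent scatterer (PS) points live in a database, repetitive selection steps that would otherwise require operator interaction become one-line queries. SQLite stands in here for a full geo-DBMS, and the table schema and values are invented for illustration.

```python
# Toy stand-in for a geo-database holding persistent scatterer (PS) points;
# schema and values are invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE ps_points ("
    "id INTEGER PRIMARY KEY, lon REAL, lat REAL, velocity_mm_yr REAL)"
)
con.executemany(
    "INSERT INTO ps_points (lon, lat, velocity_mm_yr) VALUES (?, ?, ?)",
    [(8.24, 48.99, -2.1), (8.40, 49.01, 0.3), (7.85, 48.58, -5.6)],
)

def points_in_bbox(con, lon_min, lat_min, lon_max, lat_max):
    """Fetch PS points inside a bounding box -- the kind of repetitive
    selection step that otherwise needs manual file handling."""
    cur = con.execute(
        "SELECT lon, lat, velocity_mm_yr FROM ps_points "
        "WHERE lon BETWEEN ? AND ? AND lat BETWEEN ? AND ?",
        (lon_min, lon_max, lat_min, lat_max),
    )
    return cur.fetchall()

subset = points_in_bbox(con, 8.0, 48.9, 8.5, 49.1)
```

A production geo-DBMS would add spatial indexing and coordinate reference system handling, but the automation benefit is the same: the selection logic is stored once and replayed without human interaction.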

    Improving Data Quality and Management for Remote Sensing Analysis: Use-Cases and Emerging Research Questions

    During the last decades, satellite remote sensing has become an established technology producing big data for various application fields every day. However, data quality checking and the long-term management of data and models are still issues to be improved. They are indispensable for guaranteeing smooth data integration and the reproducibility of data analyses such as those carried out by machine learning models. In this paper we clarify the emerging need to improve data quality and the management of data and models in a geospatial database management system before and during data analysis. In different use cases, various processes of data preparation and quality checking, integration of data across different scales and reference systems, efficient data and model management, and advanced data analysis are presented in detail. Motivated by these use cases, we then discuss emerging research questions concerning data preparation and quality checking, data management, model management, and data integration. Finally, conclusions drawn from the paper are presented and an outlook on future research work is given.
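The kind of quality checking the paper calls for can be sketched as a small rule set applied to each incoming record before integration. This is an illustrative example only; the record fields and validity rules are invented, not taken from the paper.

```python
# Illustrative sketch of automated quality checking before data integration;
# record fields and validity rules are invented for this example.

def check_record(rec: dict) -> list[str]:
    """Return a list of quality issues found in one observation record."""
    issues = []
    # Completeness: every required field must be present and non-null.
    for field in ("lon", "lat", "value"):
        if rec.get(field) is None:
            issues.append(f"missing {field}")
    # Validity: coordinates must lie in their legal ranges.
    lon, lat = rec.get("lon"), rec.get("lat")
    if lon is not None and not -180.0 <= lon <= 180.0:
        issues.append("lon out of range")
    if lat is not None and not -90.0 <= lat <= 90.0:
        issues.append("lat out of range")
    return issues

records = [
    {"lon": 8.4, "lat": 49.0, "value": 12.5},    # clean
    {"lon": 200.0, "lat": 49.0, "value": None},  # two problems
]
reports = [check_record(r) for r in records]
```

Running such checks inside the database layer, rather than in ad hoc scripts, is what makes the resulting analyses reproducible: the same rules fire for every ingest.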

    The Nexus Between Security Sector Governance/Reform and Sustainable Development Goal-16

    This Security Sector Reform (SSR) Paper offers a universal and analytical perspective on the linkages between Security Sector Governance (SSG)/SSR (SSG/R) and Sustainable Development Goal-16 (SDG-16), focusing on conflict and post-conflict settings as well as transitional and consolidated democracies. Against the background of the development and security literatures traditionally maintaining a separate and compartmentalized presence in both academic and policymaking circles, it maintains that contemporary security- and development-related challenges are inextricably linked, requiring effective measures based on an accurate understanding of the nature of these challenges. In that sense, SDG-16 is surely a good step in the right direction. After comparing and contrasting SSG/R and SDG-16, this SSR Paper argues that human security lies at the heart of the nexus between the 2030 Agenda of the United Nations (UN) and SSG/R. To do so, it first provides a brief overview of the scholarly and policymaking literature on the development-security nexus to set the background for the adoption of the 2030 Agenda. Next, it reviews the literature on SSG/R and the SDGs, and how each concept evolved over time. It then identifies the puzzle this study seeks to address by comparing and contrasting SSG/R with SDG-16. After making the case that human security lies at the heart of the nexus between the UN's 2030 Agenda and SSG/R, this paper analyses the strengths and weaknesses of human security as a bridge between SSG/R and SDG-16 and makes policy recommendations on how SSG/R, bolstered by human security, may help achieve better results on the SDG-16 targets. It specifically emphasizes the importance of transparency, oversight, and accountability on the one hand, and a participative approach and local ownership on the other. It concludes by arguing that a simultaneous emphasis on security and development is sorely needed for addressing the issues under the purview of SDG-16.

    Improving Data Acquisition Processes for Geospatial Building Information Applications

    This study presents different technologies for processing geospatial building information in 2D models and discusses potential problems arising from the enormous growth in data volume and in the availability of GIS software. The problems stem from collecting data from multiple sources (e.g., mobile devices, websites, sensors, computers, GPS, or WFS) with different context problems (e.g., missing data, incompatible data formats, invalid values) and from inefficient pre-processing pipelines for examining the complex structure of spatial datasets. Thus, there is a need for a system that can manage such data automation issues: a data processing pipeline and a data model for geospatial datasets. This process allows faster examination and visualization of the map to detect patterns. We present different GIS tools with various functionalities for handling geometric objects and introduce efficient data acquisition processing for these platforms. We conduct several experiments with these GIS applications to explore their possibilities and capabilities in terms of performance. The study analyzes the workflows for data collection, integration, and spatial data processing based on different formats, tools, and methods. The thesis studies and combines many techniques from GIS technologies to improve practices for software development teams and geospatial management systems. Data acquisition and integration apply these techniques to gain better optimization based on tool experiments and the user perspective. The findings provide a foundation for future work towards a standard methodology for working with geospatial applications in file conversion, loading, processing, and exporting.
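The multi-source acquisition problem described above can be sketched as a small normalization pipeline: each source gets an adapter that maps its raw records into one common schema, and invalid records are filtered out rather than propagated. The field names and adapter set here are invented for illustration, not taken from the study.

```python
# Sketch (invented field names) of an acquisition pipeline that normalizes
# records from heterogeneous sources into one common schema.

def from_gps(rec):   # raw shape, e.g. {"lat": ..., "lng": ...}
    return {"lon": rec["lng"], "lat": rec["lat"], "source": "gps"}

def from_wfs(rec):   # raw shape, e.g. {"coordinates": [lon, lat]}
    lon, lat = rec["coordinates"]
    return {"lon": lon, "lat": lat, "source": "wfs"}

ADAPTERS = {"gps": from_gps, "wfs": from_wfs}

def ingest(batches):
    """Run every raw record through its source adapter, skipping broken ones."""
    out = []
    for source, records in batches.items():
        for rec in records:
            try:
                out.append(ADAPTERS[source](rec))
            except (KeyError, ValueError):
                pass  # invalid record: a real pipeline would log and quarantine it
    return out

features = ingest({
    "gps": [{"lat": 49.0, "lng": 8.4}],
    "wfs": [{"coordinates": [7.8, 48.6]}, {"coords": [0, 0]}],  # last one invalid
})
```

Keeping the per-source quirks inside the adapters is what lets the downstream examination and visualization steps assume a single, clean schema.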

    Big Data in Bioeconomy

    This edited open access book presents the comprehensive outcome of the European DataBio Project, which examined new data-driven methods to shape a bioeconomy. These methods are used to develop new and sustainable ways to use forest, farm, and fishery resources. As a European initiative, the goal is to use these new findings to support decision-makers and producers, meaning farmers, land and forest owners, and fishermen. With their 27 pilot projects from 17 countries, the authors examine important sectors and highlight examples where modern data-driven methods were used to increase sustainability. How can farmers, foresters, or fishermen use these insights in their daily lives? The authors answer this and other questions for our readers. The first four parts of this book give an overview of the big data technologies relevant for optimal raw material gathering. The next three parts put these technologies into perspective by showing usable applications from farming, forestry, and fishery. The final part of the book gives a summary and a view on the future. With its broad outlook and variety of topics, this book is an enrichment for students and scientists in bioeconomy, biodiversity, and renewable resources.

    A Web GIS-based Integration of 3D Digital Models with Linked Open Data for Cultural Heritage Exploration

    This PhD project explores how geospatial semantic web concepts, 3D web-based visualisation, digital interactive maps, and cloud computing could be integrated to enhance digital cultural heritage exploration, to offer long-term archiving and dissemination of 3D digital cultural heritage models, and to better interlink heterogeneous and sparse cultural heritage data. The research findings were disseminated via four peer-reviewed journal articles and a conference article presented at the GISTAM 2020 conference (which received the 'Best Student Paper Award').