
    RichWPS Orchestration Environment for Geo Services

    Mini-Symposium: Data Management in Hydro-Engineering

    Geospatial Data Management Research: Progress and Future Directions

    Without geospatial data management, today's challenges in big data applications such as earth observation, geographic information system/building information modeling (GIS/BIM) integration, and 3D/4D city planning cannot be solved. Furthermore, geospatial data management plays a connecting role between data acquisition, data modelling, data visualization, and data analysis. It enables the continuous availability of geospatial data and the replicability of geospatial data analysis. In the first part of this article, five milestones of geospatial data management research are presented that were achieved during the last decade. The first one reflects advancements in BIM/GIS integration at data, process, and application levels. The second milestone presents theoretical progress by introducing topology as a key concept of geospatial data management. In the third milestone, 3D/4D geospatial data management is described as a key concept for city modelling, including subsurface models. Progress in modelling and visualization of massive geospatial features on web platforms is the fourth milestone, which includes discrete global grid systems as an alternative geospatial reference framework. The intensive use of geosensor data sources is the fifth milestone, which opens the way to parallel data storage platforms supporting data analysis on geosensors. In the second part of this article, five future directions of geospatial data management research are presented that have the potential to become key research fields of geospatial data management in the next decade. Geo-data science will have the task of extracting knowledge from unstructured and structured geospatial data and of bridging the gap between modern information technology concepts and the geo-related sciences. Topology is presented as a powerful and general concept to analyze GIS and BIM data structures and spatial relations that will be of great importance in emerging applications such as smart cities and digital twins. Data-streaming libraries and “in-situ” geo-computing on objects executed directly on the sensors will revolutionize geo-information science and bridge geo-computing with geospatial data management. Advanced geospatial data visualization on web platforms will enable the representation of dynamically changing geospatial features or moving objects’ trajectories. Finally, geospatial data management will support big geospatial data analysis, and graph databases are expected to experience a revival on top of parallel and distributed data stores that support such analysis.
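
    Illustrative sketch (not from the article): the abstract highlights topology as a key concept for analyzing spatial relations between GIS and BIM features. The minimal example below shows the kind of topological predicates involved; it assumes the Python library Shapely, and the geometries and names are hypothetical.

        from shapely.geometry import Polygon, Point

        # Two hypothetical parcel footprints and a sensor location
        parcel_a = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])
        parcel_b = Polygon([(4, 0), (8, 0), (8, 4), (4, 4)])  # shares an edge with parcel_a
        sensor = Point(2, 2)

        # Topological predicates of the kind used to relate geospatial features
        print(parcel_a.touches(parcel_b))     # True: common boundary, no interior overlap
        print(parcel_a.intersects(parcel_b))  # True: touching counts as intersecting
        print(sensor.within(parcel_a))        # True: the sensor lies inside parcel_a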

    Review of Web Mapping: Eras, Trends and Directions

    Web mapping and the use of geospatial information online have evolved rapidly over the past few decades. Almost everyone in the world uses mapping information, whether or not they realize it. Almost every mobile phone now has location services, and every event and object on the earth has a location. The use of this geospatial location data has expanded rapidly, thanks to the development of the Internet. Huge volumes of geospatial data are available and captured online daily, and are used in web applications and maps for viewing, analysis, modeling and simulation. This paper reviews the development of web mapping from the first static online map images to the current highly interactive, multi-sourced web mapping services that have increasingly moved to cloud computing platforms. The whole environment of web mapping captures the integration and interaction between three components found online, namely geospatial information, people and functionality. In this paper, the trends and interactions among these components are identified and reviewed in relation to technology developments. The review then concludes by exploring some of the opportunities and directions.
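
    Illustrative sketch (not from the review): the shift from static map images to interactive, tile-based web maps can be demonstrated with a minimal example. The snippet below assumes the Python library folium, which wraps the Leaflet web-mapping library; the coordinates and marker text are placeholders.

        import folium

        # Interactive web map backed by OpenStreetMap tile services (default tiles)
        web_map = folium.Map(location=[48.2082, 16.3738], zoom_start=12)  # placeholder: Vienna

        # A point feature rendered client-side, unlike a static map image
        folium.Marker([48.2082, 16.3738], popup="Example feature").add_to(web_map)

        # Writes a self-contained HTML page that pans and zooms in the browser
        web_map.save("web_map.html")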

    Problems of Designing Geoportal Interfaces

    The manuscript is devoted to an analysis of the problem of designing graphical geoportal interfaces. Reference points for solving the problem are formulated, and the rationale for each of them is given. The emphasis is placed on the following orientations: a flexible interface-development process, the need to introduce adaptability, progressive development, the motivated abandonment of geospatial content management systems and the use of third-party libraries where necessary, and a focus on problem-solving and achieving goals. Lists of the basic functional and qualitative requirements for graphical geoportal interfaces are given. In the last section, the authors share their experience in the development of geoportal solutions.

    Seafloor characterization using airborne hyperspectral co-registration procedures independent from attitude and positioning sensors

    Remote-sensing technology and data-storage capabilities have advanced over the last decade to the point of commercial multi-sensor data collection. There is a constant need to characterize, quantify and monitor coastal areas for habitat research and coastal management. In this paper, we present work on seafloor characterization that uses hyperspectral imagery (HSI). The HSI data allows the operator to extend seafloor characterization from multibeam backscatter towards land and thus creates a seamless ocean-to-land characterization of the littoral zone.

    Geospatial Web Services, Open Standards, and Advances in Interoperability: A Selected, Annotated Bibliography

    This paper is designed to help GIS librarians and information specialists follow developments in the emerging field of geospatial Web services (GWS). When built using open standards, GWS permit users to dynamically access, exchange, deliver, and process geospatial data and products on the World Wide Web, no matter what platform or protocol is used. Standards and specifications pertaining to geospatial ontologies, geospatial Web services, and interoperability are discussed in this bibliography. Finally, a selected, annotated list of bibliographic references by experts in the field is presented.
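
    Illustrative sketch (not from the bibliography): open-standard geospatial Web services such as the OGC Web Feature Service expose data through well-known request parameters, so any HTTP client can retrieve features regardless of platform. The example below assumes the Python requests library; the endpoint URL and layer name are hypothetical placeholders.

        import requests

        # Hypothetical WFS endpoint and feature type (layer) name
        WFS_URL = "https://example.org/geoserver/wfs"

        params = {
            "service": "WFS",                     # OGC service type
            "version": "2.0.0",                   # WFS 2.0 key-value-pair encoding
            "request": "GetFeature",              # standard operation name
            "typeNames": "demo:parcels",          # hypothetical layer name
            "outputFormat": "application/json",   # request GeoJSON where the server supports it
            "count": 10,                          # limit the number of returned features
        }

        response = requests.get(WFS_URL, params=params, timeout=30)
        response.raise_for_status()
        features = response.json().get("features", [])
        print(f"Retrieved {len(features)} features")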

    Harmonisation Initiatives of Copernicus Data Quality Control

    The Copernicus Space Component Data Access system (CSCDA) incorporates data contributions from a wide range of satellite missions. Through EO data handling and distribution, CSCDA serves a set of Copernicus Services related to Land, Marine and Atmosphere Monitoring, Emergency Management and Security, and Climate Change. The quality of the delivered EO products is the responsibility of each contributing mission, and the Copernicus data Quality Control (CQC) service supports and complements such data quality control activities. The mission of the CQC is to provide a service of quality assessment on the provided imagery, to support the investigation of product quality anomalies, and to guarantee harmonisation and traceability of the quality information. In terms of product quality control, the CQC carries out analysis of representative sample products for each contributing mission as well as coordinating data quality investigations related to issues found or raised by Copernicus users. Results from the product analysis are systematically collected and the derived quality reports stored in a searchable database. The CQC service can be seen as a privileged focal point with unique comparison capacities across the data providers. The comparison among products from different missions suggests the need for a strong, common effort of harmonisation. Technical terms, definitions, metadata, file formats, processing levels, algorithms, cal/val procedures, etc. are far from being homogeneous, and this may generate inconsistencies and confusion among users of EO data. The CSCDA CQC team plays a significant role in promoting harmonisation initiatives across the numerous contributing missions, so that a common effort can achieve optimal complementarity and compatibility among the EO data from multiple data providers. This effort is done in coordination with important initiatives already working towards these goals (e.g. the INSPIRE directive, CEOS initiatives, OGC standards, QA4EO, etc.). This paper describes the main actions being undertaken by the CQC to encourage harmonisation among space-based EO systems currently in service.

    How can municipalities benefit by using IoT and AI with geodata to enhance their services? The case of the City of Vienna

    This study investigates which benefits arise for the different stakeholder groups of a municipality when the municipality enhances its internal and external services using IoT and AI technologies with geodata. The stakeholder groups are public administration, enterprises, and citizens. The research method for this thesis was qualitative case study research. The case study focused on a single case: the City of Vienna. First, an analysis of strategy papers for Smart City, digitalization, AI, and IoT was done. Secondly, the documentation for the program "Wien gibt Raum", including the internal software "Kappazunder" and the One-Stop-Shop, was studied and described. Lastly, semi-structured interviews were conducted with five participants. The interviewees had diverse backgrounds. All interviewees were involved in the program "Wien gibt Raum" or have a long-term vision for the further usage of the gathered data. As prior literature did not provide a suitable benefit framework, an adapted benefit framework was created. This adapted benefit framework is based on multiple benefit evaluation frameworks from prior literature. Using this framework, it was possible to analyze the documentation and interview results. The analysis with the adapted benefit framework made it possible to demonstrate and categorize the benefits in an organized way. The analysis results show that the public administration's main benefits are efficiency and productivity improvements, better data quality, and data completeness. In a next step, using the newly generated data, further efficiency increases, better decision-making, and better forecasts can be expected. The study showed that citizens and enterprises mainly benefit indirectly from a better-managed city. A better-managed city will lead to time savings, easier accessibility, better quality of the information provided by municipalities, and increased quality of life.