
    FogGIS: Fog Computing for Geospatial Big Data Analytics

    Cloud Geographic Information Systems (GIS) have emerged as a tool for the analysis, processing, and transmission of geospatial data. Fog computing is a paradigm in which Fog devices increase throughput and reduce latency at the edge of the network, close to the client. This paper develops a Fog-based framework named FogGIS for mining analytics from geospatial data. We built a prototype using the Intel Edison, an embedded microprocessor, and validated FogGIS through preliminary analyses, including compression and overlay analysis. Results showed that Fog computing holds great promise for the analysis of geospatial data. We used several open source compression techniques to reduce transmission to the cloud.
    Comment: 6 pages, 4 figures, 1 table, 3rd IEEE Uttar Pradesh Section International Conference on Electrical, Computer and Electronics (09-11 December, 2016), Indian Institute of Technology (Banaras Hindu University), Varanasi, India
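    The abstract mentions using open source compression to reduce transmission to the cloud but does not name the specific codecs. A minimal sketch of the idea, assuming gzip over a GeoJSON-style payload (both the sample data and field names below are hypothetical, not taken from the paper):

    ```python
    import gzip
    import json

    def compress_for_upload(feature_collection: dict) -> bytes:
        """Serialize a GeoJSON-like dict and gzip it before sending to the cloud."""
        raw = json.dumps(feature_collection, separators=(",", ":")).encode("utf-8")
        return gzip.compress(raw, compresslevel=9)

    def decompress_on_cloud(payload: bytes) -> dict:
        """Reverse the edge-side compression on the cloud side."""
        return json.loads(gzip.decompress(payload).decode("utf-8"))

    # Hypothetical sample: a small FeatureCollection of repetitive point readings,
    # the kind of structured geospatial data that compresses well.
    sample = {
        "type": "FeatureCollection",
        "features": [
            {"type": "Feature",
             "geometry": {"type": "Point", "coordinates": [82.99 + i * 1e-4, 25.28]},
             "properties": {"sensor": "edge-node-1", "reading": i}}
            for i in range(200)
        ],
    }

    payload = compress_for_upload(sample)
    assert decompress_on_cloud(payload) == sample  # lossless round trip
    print(len(json.dumps(sample).encode("utf-8")), "->", len(payload), "bytes")
    ```

    Because GeoJSON repeats keys and structure across features, even generic lossless compression shrinks the payload substantially, which is the throughput/latency trade the Fog layer exploits.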

    Geospatial Data Management Research: Progress and Future Directions

    Without geospatial data management, today's challenges in big data applications such as earth observation, geographic information system/building information modeling (GIS/BIM) integration, and 3D/4D city planning cannot be solved. Furthermore, geospatial data management plays a connecting role between data acquisition, data modelling, data visualization, and data analysis. It enables the continuous availability of geospatial data and the replicability of geospatial data analysis. In the first part of this article, five milestones of geospatial data management research are presented that were achieved during the last decade. The first one reflects advancements in BIM/GIS integration at data, process, and application levels. The second milestone presents theoretical progress by introducing topology as a key concept of geospatial data management. In the third milestone, 3D/4D geospatial data management is described as a key concept for city modelling, including subsurface models. Progress in modelling and visualization of massive geospatial features on web platforms is the fourth milestone, which includes discrete global grid systems as an alternative geospatial reference framework. The intensive use of geosensor data sources is the fifth milestone, which opens the way to parallel data storage platforms supporting data analysis on geosensors. In the second part of this article, five future directions of geospatial data management research are presented that have the potential to become key research fields of geospatial data management in the next decade. Geo-data science will have the task to extract knowledge from unstructured and structured geospatial data and to bridge the gap between modern information technology concepts and the geo-related sciences.
Topology is presented as a powerful and general concept to analyze GIS and BIM data structures and spatial relations that will be of great importance in emerging applications such as smart cities and digital twins. Data-streaming libraries and “in-situ” geo-computing on objects executed directly on the sensors will revolutionize geo-information science and bridge geo-computing with geospatial data management. Advanced geospatial data visualization on web platforms will enable the representation of dynamically changing geospatial features or moving objects’ trajectories. Finally, geospatial data management will support big geospatial data analysis, and graph databases are expected to experience a revival on top of parallel and distributed data stores.
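The abstract presents topology as a general concept for analyzing spatial relations in GIS and BIM data. As a purely illustrative sketch (not drawn from the article), adjacency between two polygons can be derived topologically from shared boundary edges rather than from coordinates alone; the polygons and coordinates below are hypothetical:

```python
def shared_edges(poly_a, poly_b):
    """Edges (as unordered vertex pairs) common to two polygons given as vertex rings."""
    def edges(ring):
        # Each edge joins consecutive ring vertices; frozenset ignores direction.
        return {frozenset((ring[i], ring[(i + 1) % len(ring)])) for i in range(len(ring))}
    return edges(poly_a) & edges(poly_b)

# Two unit squares sharing the boundary edge between (1, 0) and (1, 1).
left = [(0, 0), (1, 0), (1, 1), (0, 1)]
right = [(1, 0), (2, 0), (2, 1), (1, 1)]

common = shared_edges(left, right)
print(len(common))  # one shared edge: the squares are topologically adjacent
```

Representing adjacency this way, independent of geometry, is what makes topological queries natural in graph databases, which the abstract expects to see revived for big geospatial analysis.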

    Geospatial data quality indicators

    Indicators which summarise the characteristics of spatiotemporal data coverages significantly simplify quality evaluation, decision making, and justification processes by providing a number of quality cues that are easy to manage and avoiding information overflow. Criteria which are commonly prioritised in evaluating spatial data quality and assessing a dataset’s fitness for use include lineage, completeness, logical consistency, positional accuracy, and temporal and attribute accuracy. However, user requirements may go far beyond these broadly accepted spatial quality metrics, to incorporate specific and complex factors which are less easily measured. This paper discusses the results of a study of high-level user requirements in geospatial data selection and data quality evaluation. It reports on the geospatial data quality indicators which were identified as user priorities, and which can potentially be standardised to enable intercomparison of datasets against user requirements. We briefly describe the implications for tools and standards to support the communication and intercomparison of data quality, and the ways in which these can contribute to the generation of a GEO label.
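    Completeness is one of the commonly prioritised criteria the abstract lists. A minimal sketch of how an attribute-completeness indicator might be computed over a dataset; the scoring rule, field names, and sample records are assumptions for illustration, not a standardised metric from the paper:

    ```python
    def attribute_completeness(records, required_fields):
        """Fraction of required attribute values that are present (non-None, non-empty)."""
        total = len(records) * len(required_fields)
        if total == 0:
            return 1.0  # vacuously complete
        present = sum(
            1
            for rec in records
            for field in required_fields
            if rec.get(field) not in (None, "")
        )
        return present / total

    # Hypothetical road features with gaps in their attribute table.
    roads = [
        {"name": "A1", "surface": "paved",  "lanes": 2},
        {"name": "B7", "surface": None,     "lanes": 1},
        {"name": "",   "surface": "gravel", "lanes": None},
    ]

    score = attribute_completeness(roads, ["name", "surface", "lanes"])
    print(round(score, 3))  # 6 of 9 required values present -> 0.667
    ```

    Reducing a quality dimension to a single number like this is exactly the kind of easy-to-manage cue the abstract argues for, while the harder-to-measure user requirements it identifies would need richer indicators.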

    Geospatial Data Preservation Primer

    This primer is one in a series of Operational Policy documents being developed by GeoConnections. It is intended to inform Canadian Geospatial Data Infrastructure (CGDI) stakeholders about the nature and scope of digital geospatial data archiving and preservation and the realities, challenges, and good practices of related operational policies. The burgeoning growth of online geospatial applications and the deluge of data, combined with the growing complexity of archiving and preserving digital data, have revealed a significant gap in the operational policy coverage of the CGDI. Currently there is no commonly accepted guidance for CGDI stakeholders wishing or mandated to preserve their geospatial data assets for long-term access and use. More specifically, there is little or no guidance available to inform operational policy decisions on how to manage, preserve, and provide access to a digital geospatial data collection. The preservation of geospatial data over a period of time is especially important when datasets are required to inform modeling applications such as climate change impact predictions, flood forecasts, and land use management. Furthermore, data custodians may have both a legal and moral responsibility to implement effective archiving and preservation programs. Based on research and analysis of the Canadian legislative framework and current international practices in digital data archiving and preservation, this primer provides guidance on the factors to be considered and the steps to be taken in planning and implementing a data archiving and preservation program. It describes an approach to establishing a geospatial data archive based on good practices from the literature and Canadian case studies.
This primer will provide CGDI stakeholders with information on how to incorporate archiving and preservation considerations into an effective data management process that covers the entire life cycle (DCC, 2013; LAC, 2006) of their geospatial data assets (i.e., creation and receipt, distribution, use, maintenance, and disposition). It is intended to inform CGDI stakeholders of the importance of long-term data preservation, and to provide them with the information and tools required to make policy decisions for creating an archive and preserving digital geospatial data.

    Open Access Geospatial Data

    The landscape of open-access geospatial data is growing rapidly, with an abundance of open source software, websites, and datasets available to the public. How do we find and access these resources? Find out from the Geospatial Services Center.