
    A method for matching crowd-sourced and authoritative geospatial data

    A method for matching crowd-sourced and authoritative geospatial data is presented. A level of tolerance is defined as an input parameter, since some difference in the geometric representation of a spatial object is to be expected. The method generates matches between spatial objects using location information and lexical information, such as names and types, and verifies the consistency of matches using reasoning in qualitative spatial logic and description logic. We test the method by matching geospatial data from OpenStreetMap and the national mapping agencies of Great Britain and France. We also analyse how the level of tolerance affects the precision and recall of matching results for the same geographic area, using 12 different levels of tolerance within a range of 1 to 80 meters. The generated matches show potential for helping to enrich and update geospatial data.
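    As a rough illustration of the kind of matching the abstract describes (not the paper's actual method), the sketch below pairs features when their locations fall within a distance tolerance and their names are lexically similar; the feature record layout, the 20 m tolerance and the similarity threshold are illustrative assumptions.

```python
import math
from difflib import SequenceMatcher

def metres_between(lon1, lat1, lon2, lat2):
    """Approximate great-circle distance in metres (haversine formula)."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def name_similarity(a, b):
    """Crude lexical similarity in [0, 1] based on character overlap."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def candidate_matches(osm_features, authoritative_features,
                      tolerance_m=20.0, min_name_sim=0.7):
    """Pair features whose locations and names agree within the thresholds."""
    matches = []
    for f1 in osm_features:
        for f2 in authoritative_features:
            d = metres_between(f1["lon"], f1["lat"], f2["lon"], f2["lat"])
            s = name_similarity(f1["name"], f2["name"])
            if d <= tolerance_m and s >= min_name_sim:
                matches.append((f1["id"], f2["id"], round(d, 1), round(s, 2)))
    return matches

# Tiny illustrative run with made-up features.
osm = [{"id": "osm/1", "name": "St Mary's Church", "lon": -1.2577, "lat": 51.7520}]
auth = [{"id": "os/42", "name": "Saint Marys Church", "lon": -1.2576, "lat": 51.7521}]
print(candidate_matches(osm, auth))
```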

    When worlds collide: combining Ordnance Survey and Open Street Map data

    The context of this paper is the progress of national and international spatial data infrastructures, such as the UK Location Programme and INSPIRE, contrasted against crowd-sourced geospatial databases such as Open Street Map. While initiatives such as INSPIRE tend towards a top-down process of harmonised data models and services using ISO and OGC standards, the OSM approach is one of tagged data with attribute tags agreed through consensus, but a tag set that can change over time (with inherent related issues of data quality). There is a danger that, should the more formal approaches simply ignore the crowd-sourced initiatives, they will miss an opportunity to evolve to better meet growing demands for geographic information. In any case, both formal and informal data will increasingly coexist, raising the question of how an end user gains maximum benefit from both. Ordnance Survey, as the national mapping agency of Great Britain, provides authoritative datasets with published data specifications driven by a combination of user need and the history of national mapping, with a remit to ensure real-world feature changes are reflected in the OS large-scale data within 6 months. OSM, in contrast, relies on the availability of local mapping enthusiasts to capture changes, but through its more informal structure can capture a broader range of features of interest to different sub-communities, such as cyclists or horse riders. This research has been carried out to understand the issues of data integration between crowd-sourced information and authoritative data. The aim of the research was to look into the mid-term and long-term effects of crowd-sourcing technologies, in order to understand their impact on the change intelligence operations of national mapping agencies (NMAs) in the future. Mobile phones, with more computing power than the desktop machine of 5 years ago and incorporating built-in GPS receivers and cameras, have become widespread and give people a multi-sensor capability. This, combined with CCTV, sensor webs, RFID and so on, offers the potential to make data capture pervasive and ubiquitous. All key sectors of modern economies will be affected by developments in the crowd sourcing of information. The synergies created by new technologies will create the conditions for exciting new developments in geospatial data integration. This has an impact on the spatial data collection domain, especially in collecting vernacular and crowd-sourced information. Individual users will be able to use these technologies to collect location data and make it available for multiple applications without needing prior geospatial skills. The basic question behind our research is: how do we combine data from authoritative OS data sets with feature-rich, informal OSM data, recognising the variable coverage of OSM while capturing the best of both worlds? There have been previous studies (Al-Bakri and Fairbairn, 2010) focussing on geometric accuracy assessment of crowd-sourced data (OSM) against OS data. Another important context is the rapid development of Open Source GIS. The availability of free and open source GIS has made it possible for a large number of government organisations and SMEs to make use of GIS tools in their work. The Open Source Geospatial Foundation (OSGeo) is an excellent example of a community initiative to support and promote the collaborative development of open geospatial technologies. OSGeo's key mission is to promote the use of open source software in the geospatial industry and to encourage the implementation of open standards and standards-based interoperability in its projects.

    A flexible framework for assessing the quality of crowdsourced data

    Papers, communications and posters presented at the 17th AGILE Conference on Geographic Information Science, "Connecting a Digital Europe through Location and Place", held at the Universitat Jaume I from 3 to 6 June 2014. Crowdsourcing as a means of data collection has produced previously unavailable data assets and enriched existing ones, but its quality can be highly variable. This presents several challenges to potential end users who are concerned with the validation and quality assurance of the data collected. Being able to quantify the uncertainty, define and measure the different quality elements associated with crowdsourced data, and introduce means for dynamically assessing and improving it is the focus of this paper. We argue that the required quality assurance and quality control depend on the studied domain, the style of crowdsourcing and the goals of the study. We describe a framework for qualifying geolocated data collected from non-authoritative sources that enables assessment for specific case studies by creating a workflow supported by an ontological description of a range of choices. The top levels of this ontology describe seven pillars of quality checks and assessments that present a range of techniques to qualify, improve or reject data. Our generic operational framework allows for the extension of this ontology to specific applied domains. This will facilitate quality assurance in real time, or during post-processing, to validate data and produce quality metadata. It enables a system that dynamically optimises the usability value of the data captured. A case study illustrates this framework.
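    A minimal sketch of how such a pluggable set of quality checks might be wired together is shown below; the check names, the Observation fields, the bounding box and the acceptance rule are illustrative assumptions, not the paper's ontology or its seven pillars.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Observation:
    lat: float
    lon: float
    attributes: Dict[str, str]
    quality: Dict[str, float] = field(default_factory=dict)  # quality metadata

# Each check scores one quality aspect of an observation in [0, 1].
Check = Callable[[Observation], float]

def positional_plausibility(obs: Observation) -> float:
    # Hypothetical check: coordinates must fall inside a study-area bounding box.
    return 1.0 if 49.0 <= obs.lat <= 61.0 and -8.0 <= obs.lon <= 2.0 else 0.0

def attribute_completeness(obs: Observation) -> float:
    # Hypothetical check: fraction of required attributes that are present.
    required = ("name", "type")
    return sum(1 for key in required if obs.attributes.get(key)) / len(required)

def assess(obs: Observation, checks: Dict[str, Check], threshold: float = 0.5) -> bool:
    """Run every check, store the scores as quality metadata, then accept or reject."""
    for name, check in checks.items():
        obs.quality[name] = check(obs)
    return all(score >= threshold for score in obs.quality.values())

obs = Observation(lat=51.5, lon=-0.1, attributes={"name": "Post box", "type": "amenity"})
accepted = assess(obs, {"position": positional_plausibility,
                        "completeness": attribute_completeness})
print(accepted, obs.quality)  # True {'position': 1.0, 'completeness': 1.0}
```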

    Towards the integration of authoritative and OpenStreetMap geospatial datasets in support of the European strategy for data

    Digital transformation is at the core of Europe's future, and the importance of data is well highlighted by the recently published European strategy for data, which envisions the establishment of so-called European data spaces enabling seamless data flows across actors and sectors to ultimately boost the economy and generate innovation. Integrating datasets produced by multiple actors, including citizen-generated data, is a key objective of the strategy. This study focuses on OpenStreetMap (OSM), the most popular crowdsourced geographic information project, and is the first step towards an exploration of the pros and cons of integrating its openly licensed data with authoritative geospatial datasets from European National Mapping Agencies. In contrast to previous work, which has only tested data integration at the local or regional level, an experiment is presented to integrate the national address dataset published by the National Land Survey (NLS) of Finland with the corresponding dataset from OSM. The process included the analysis of the two datasets, a mapping between their data models and a set of processing steps, performed using the open source QGIS software, to transform and finally combine their content. The resulting dataset confirms that, while addresses from the NLS are in general more complete across Finland, in some areas OSM addresses provide higher detail and more up-to-date information that usefully complements the authoritative data. Whilst the analysis confirms that an integration between OSM and authoritative geospatial datasets is technically and semantically feasible, future work is needed to evaluate the enablers and barriers that also exist at the legal and organisational level.
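    The schema-mapping-and-merge step could look roughly like the sketch below. The NLS-side field names and the merge rule (keep authoritative records, fill gaps from OSM) are invented for illustration and are not the actual NLS schema or the QGIS workflow used in the study; the OSM side uses the standard addr:* tags.

```python
from typing import Dict, List, Tuple

# Hypothetical mapping from source-specific fields to a common address model.
NLS_TO_COMMON = {"street_name": "street", "house_number": "housenumber", "municipality": "municipality"}
OSM_TO_COMMON = {"addr:street": "street", "addr:housenumber": "housenumber", "addr:city": "municipality"}

def to_common(record: Dict[str, str], mapping: Dict[str, str], source: str) -> Dict[str, str]:
    """Rename source-specific fields to the shared address model and tag the source."""
    out = {common: record[src] for src, common in mapping.items() if src in record}
    out["source"] = source
    return out

def address_key(rec: Dict[str, str]) -> Tuple[str, str, str]:
    """Key used to decide whether two records describe the same address."""
    return (rec.get("street", ""), rec.get("housenumber", ""), rec.get("municipality", ""))

def integrate(nls_records: List[Dict[str, str]], osm_records: List[Dict[str, str]]) -> List[Dict[str, str]]:
    """Keep every authoritative address; add OSM addresses missing from NLS."""
    combined: Dict[Tuple[str, str, str], Dict[str, str]] = {}
    for rec in nls_records:
        common = to_common(rec, NLS_TO_COMMON, "NLS")
        combined[address_key(common)] = common
    for rec in osm_records:
        common = to_common(rec, OSM_TO_COMMON, "OSM")
        combined.setdefault(address_key(common), common)
    return list(combined.values())
```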

    Matching disparate geospatial datasets and validating matches using spatial logic

    In recent years, the emergence and development of crowd-sourced geospatial data has presented both challenges and opportunities to national mapping agencies as well as commercial mapping organisations. Crowd-sourced data involves non-specialists in data collection, sharing and maintenance. Compared to authoritative geospatial data, which is collected by surveyors or other geodata professionals, crowd-sourced data is less accurate and less structured, but often provides richer user-based information and reflects real-world changes more quickly at a much lower cost. In order to maximise the synergistic use of authoritative and crowd-sourced geospatial data, this research investigates the problem of how to establish and validate correspondences (matches) between spatial features from disparate geospatial datasets. To reason about and validate matches between spatial features, a series of new qualitative spatial logics was developed, and their soundness, completeness, decidability and complexity theorems were proved for models based on a metric space. A software tool, 'MatchMaps', was developed, which generates matches using location and lexical information and verifies the consistency of matches using reasoning in description logic and qualitative spatial logic. MatchMaps was evaluated by the author and experts from Ordnance Survey, the national mapping agency of Great Britain. In experiments, it achieved high precision and recall, as well as reduced human effort. The methodology developed and implemented in MatchMaps has a wider application than matching authoritative and crowd-sourced data, and could be applied wherever it is necessary to match two geospatial datasets of vector data.
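    For context, precision and recall of generated matches can be computed against a manually verified reference set as sketched below; the (osm_id, authoritative_id) pair format is an assumption, not MatchMaps' actual output.

```python
from typing import Set, Tuple

Match = Tuple[str, str]  # (osm_id, authoritative_id) -- assumed pair format

def precision_recall(generated: Set[Match], reference: Set[Match]) -> Tuple[float, float]:
    """Compare generated matches to a manually verified reference set."""
    true_positives = len(generated & reference)
    precision = true_positives / len(generated) if generated else 0.0
    recall = true_positives / len(reference) if reference else 0.0
    return precision, recall

# Example: 2 of 3 generated matches are correct; 2 of 4 reference matches were found.
p, r = precision_recall({("a", "1"), ("b", "2"), ("c", "9")},
                        {("a", "1"), ("b", "2"), ("d", "3"), ("e", "4")})
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.67 recall=0.50
```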