
    Use of automated change detection and VGI sources for identifying and validating urban land use change

    © 2020, by the authors. Land use and land cover (LULC) mapping is often undertaken by national mapping agencies, and these LULC products are used for many types of monitoring and reporting applications. Updating of LULC databases is often done on a multi-year cycle because of the high costs involved, so changes are only detected when mapping exercises are repeated. Consequently, the information on LULC can quickly become outdated and hence may be incorrect in some areas. In the current era of big data and Earth observation, change detection algorithms can be used to identify changes in urban areas, which can then be used to update LULC databases automatically on a more continuous basis. However, the change detection algorithm must be validated before the changes can be committed to authoritative databases such as those produced by national mapping agencies. This paper outlines a change detection algorithm for identifying construction sites, which represent ongoing changes in land use, developed in the framework of the LandSense project. We then use volunteered geographic information (VGI), captured through mapathons with a range of different groups of contributors, to validate these changes. In total, 105 contributors took part in the mapathons, producing 2778 observations. The contributors were grouped into six different user profiles, which were analyzed to understand the impact of user experience on the accuracy assessment. Overall, the results show that the change detection algorithm is able to identify changes in residential land use to an adequate level of accuracy (85%), but changes in infrastructure and industrial sites had lower accuracies (57% and 75%, respectively), requiring further improvements. In terms of user profiles, the experts in LULC from local authorities, the researchers in LULC at the French national mapping agency (IGN), and first-year students with a basic knowledge of geographic information systems had the highest overall accuracies (86.2%, 93.2%, and 85.2%, respectively). Differences in how the users approached the task also emerged; e.g., local authorities used knowledge and context to try to identify types of change, while those with no knowledge of LULC (i.e., ordinary citizens) were quicker to choose 'Unknown' when the visual interpretation of a class was more difficult.
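The per-class accuracy figures quoted above amount to agreement rates between the algorithm's detected changes and the volunteers' validations. A minimal sketch of that computation, using hypothetical observation data rather than the LandSense mapathon results:

```python
from collections import defaultdict

def per_class_accuracy(observations):
    """Agreement rate per change class. Each observation is a
    (change_class, confirmed) pair, where `confirmed` is True when
    the volunteer agreed the detected change is real."""
    totals = defaultdict(int)
    correct = defaultdict(int)
    for change_class, confirmed in observations:
        totals[change_class] += 1
        if confirmed:
            correct[change_class] += 1
    return {c: correct[c] / totals[c] for c in totals}

# Hypothetical validation results, not the actual LandSense data.
obs = ([("residential", True)] * 17 + [("residential", False)] * 3
       + [("infrastructure", True)] * 4 + [("infrastructure", False)] * 3)
print(per_class_accuracy(obs))  # residential: 0.85
```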

    Volunteered geographic information in natural hazard analysis: a systematic literature review of current approaches with a focus on preparedness and mitigation

    With the rise of new technologies, citizens can contribute to scientific research via Web 2.0 applications for collecting and distributing geospatial data. Integrating local knowledge, personal experience, and up-to-date geoinformation is a promising approach for the theoretical framework and the methods of natural hazard analysis. Our systematic literature review aims at identifying current research and directions for future research on Volunteered Geographic Information (VGI) within natural hazard analysis. Focusing on both the preparedness and mitigation phases yields eleven articles from two literature databases. A qualitative analysis for in-depth information extraction reveals promising approaches regarding community engagement and data fusion, but also important research gaps. Mainly based in Europe and North America, the analysed studies deal primarily with floods and forest fires, applying geodata collected by trained citizens who are improving their knowledge and making their own interpretations. Yet, there is still a lack of common scientific terms and concepts. Future research can use these findings to adapt scientific models of natural hazard analysis in order to enable the fusion of data from technical sensors and VGI. The development of such general methods should contribute to establishing user integration in various contexts, such as natural hazard analysis.

    Quality Assessment of the Canadian OpenStreetMap Road Networks

    Volunteered geographic information (VGI) has been applied in many fields, such as participatory planning, humanitarian relief, and crisis management, because of its cost-effectiveness. However, the coverage and accuracy of VGI cannot be guaranteed. OpenStreetMap (OSM) is a popular VGI platform that allows users to create or edit maps using GPS-enabled devices or aerial imagery. The issue of geospatial data quality in OSM has become a trending research topic because of the large size of the dataset and the multiple channels of data access. The objective of this study is to examine the overall reliability of the Canadian OSM data. A systematic review is first presented to provide details on the quality evaluation process for OSM. A case study of London, Ontario follows as an experimental analysis of the completeness, positional accuracy, and attribute accuracy of the OSM street networks. Next, a national study of the Canadian OSM data assesses overall semantic accuracy and lineage in addition to the quality measures mentioned above. Results of the quality evaluation are compared with associated OSM provenance metadata to examine potential correlations. The Canadian OSM road networks were found to have accuracy comparable with that of the tested commercial database (DMTI). Although statistical analysis suggests that there are no significant relations between OSM accuracy and its editing history, the study reveals the complex processes behind OSM contributions, possibly influenced by data import and remote mapping. The findings of this thesis can guide cartographic product selection for interested parties and offer a better understanding of future quality improvement in OSM.
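Length-based completeness, one of the measures assessed in such road-network studies, is commonly computed by comparing total road length between OSM and a reference dataset. A minimal sketch under that assumption, with toy segments rather than the actual London, Ontario data:

```python
import math

def polyline_length(coords):
    """Planar length of a road polyline given (x, y) vertices in
    projected coordinates (e.g. UTM metres)."""
    return sum(math.dist(a, b) for a, b in zip(coords, coords[1:]))

def completeness(osm_roads, reference_roads):
    """Length-based completeness: total OSM road length divided by
    total reference road length. Values near 1.0 suggest comparable
    coverage; positional and attribute accuracy are separate checks."""
    osm_len = sum(polyline_length(r) for r in osm_roads)
    ref_len = sum(polyline_length(r) for r in reference_roads)
    return osm_len / ref_len

# Toy road segments, not real OSM or DMTI data.
osm = [[(0, 0), (100, 0)], [(0, 0), (0, 50)]]
ref = [[(0, 0), (100, 0)], [(0, 0), (0, 50)], [(0, 50), (40, 50)]]
print(round(completeness(osm, ref), 2))  # 0.79
```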

    A Survey of Volunteered Open Geo-Knowledge Bases in the Semantic Web

    Over the past decade, rapid advances in web technologies, coupled with innovative models of spatial data collection and consumption, have generated a robust growth in geo-referenced information, resulting in spatial information overload. Increasing 'geographic intelligence' in traditional text-based information retrieval has become a prominent approach to respond to this issue and to fulfill users' spatial information needs. Numerous efforts in the Semantic Geospatial Web, Volunteered Geographic Information (VGI), and the Linking Open Data initiative have converged in a constellation of open knowledge bases, freely available online. In this article, we survey these open knowledge bases, focusing on their geospatial dimension. Particular attention is devoted to the crucial issue of the quality of geo-knowledge bases, as well as of crowdsourced data. A new knowledge base, the OpenStreetMap Semantic Network, is outlined as our contribution to this area. Research directions in information integration and Geographic Information Retrieval (GIR) are then reviewed, with a critical discussion of their current limitations and future prospects.

    Mapping and the Citizen Sensor

    Maps are a fundamental resource in a diverse array of applications, ranging from everyday activities, such as route planning, through the legal demarcation of space, to scientific studies, such as those seeking to understand biodiversity and inform the design of nature reserves for species conservation. For a map to have value, it should provide an accurate and timely representation of the phenomenon depicted, and this can be a challenge in a dynamic world. Fortunately, mapping activities have benefitted greatly from recent advances in geoinformation technologies. Satellite remote sensing, for example, now offers unparalleled data acquisition, and authoritative mapping agencies have developed systems for the routine production of maps in accordance with strict standards. Until recently, much mapping activity was in the exclusive realm of authoritative agencies, but technological development has also allowed the rise of the amateur mapping community. The proliferation of inexpensive, highly mobile, and location-aware devices, together with Web 2.0 technology, has fostered the emergence of the citizen as a source of data. Mapping presently benefits from vast amounts of spatial data as well as people able to provide observations of geographic phenomena, which can inform map production, revision, and evaluation. The great potential of these developments is, however, often limited by concerns spanning issues from the nature of the citizens, through the way data are collected and shared, to the quality and trustworthiness of the data. This book reports on some of the key issues connected with the use of citizen sensors in mapping. It arises from a European Co-operation in Science and Technology (COST) Action, which explored issues linked to topics ranging from citizen motivation, data acquisition, and data quality to the use of citizen-derived data in the production of maps that rival, and sometimes surpass, maps arising from authoritative agencies.

    Modeling and improving Spatial Data Infrastructure (SDI)

    Spatial Data Infrastructure (SDI) development is widely known to be a challenging process owing to its complex and dynamic nature. Although great effort has been made to conceptually explain the complexity and dynamics of SDIs, few studies thus far have actually modeled these complexities. In fact, better modeling of SDI complexities will lead to more reliable plans for their development. A state-of-the-art simulation model of SDI development, hereafter referred to as SMSDI, was created by using the system dynamics (SD) technique. The SMSDI enables policy-makers to test various investment scenarios in different aspects of SDI and helps them to determine the optimum policy for further development of an SDI. This thesis begins with the adaptation of the SMSDI to a new case study in Tanzania by using the community-of-participants concept, and further development of the model is performed by using fuzzy logic. It is argued that the techniques and models proposed in this part of the study enable SDI planning to be conducted in a more reliable manner, which facilitates receiving the support of stakeholders for the development of the SDI. Developing a collaborative platform such as an SDI highlights the differences among stakeholders, including the heterogeneous data they produce and share. This makes the reuse of spatial data difficult, mainly because the shared data need to be integrated with other datasets and used in applications that differ from those for which they were originally produced. The integration of authoritative data and Volunteered Geographic Information (VGI), which has looser structure and production standards, is a new and challenging area. The second part of this study focuses on proposing techniques to improve the matching and integration of spatial datasets. It is shown that the proposed solutions, which are based on pattern recognition and ontology, can considerably improve the integration of spatial data in SDIs and enable the reuse or multipurpose usage of available data resources.
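The thesis proposes pattern-recognition- and ontology-based matching; as a much simpler illustration of the underlying matching problem, attribute-level matching of road names between an authoritative dataset and VGI can be sketched with normalized string similarity. The normalization table and threshold here are illustrative assumptions, not the thesis's method:

```python
from difflib import SequenceMatcher

# Illustrative abbreviation table; a real system would use a far
# richer, locale-aware ontology of road-type terms.
ABBREV = {"st": "street", "rd": "road", "ave": "avenue", "dr": "drive"}

def normalize(name):
    """Lowercase a road name and expand common abbreviations so that
    'Main St.' and 'main street' compare equal."""
    tokens = [ABBREV.get(t.strip(".").lower(), t.lower()) for t in name.split()]
    return " ".join(tokens)

def match_score(a, b):
    """Similarity in [0, 1] between two road names after normalization."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

print(match_score("Main St.", "main street"))  # 1.0
print(match_score("Main St.", "Oak Ave") > 0.9)  # False
```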

    Geospatial crowdsourced data fitness analysis for spatial data infrastructure based disaster management actions

    The reporting of disasters has changed from official media reports to citizen reporters who are at the disaster scene. This kind of crowd-based reporting, related to disasters or any other events, is often identified as 'Crowdsourced Data' (CSD). CSD are freely and widely available thanks to current technological advancements. The quality of CSD is often problematic, as it is created by citizens of varying skills and backgrounds. CSD is generally unstructured, and its quality remains poorly defined. Moreover, the location availability of CSD and the quality of any available locations may be incomplete. Traditional data quality assessment methods and parameters are also often incompatible with the unstructured nature of CSD because of its undocumented nature and missing metadata. Although other research has identified credibility and relevance as possible CSD quality assessment indicators, the available assessment methods for these indicators are still immature. In the 2011 Australian floods, citizens and disaster management administrators used the Ushahidi Crowdmap platform and the Twitter social media platform to communicate extensively about flood-related information, including hazards, evacuations, help services, road closures, and property damage. This research designed a CSD quality assessment framework and tested the quality of the 2011 Australian floods' Ushahidi Crowdmap and Twitter data. In particular, it explored a number of aspects, namely location availability and location quality assessment, semantic extraction of hidden location toponyms, and the analysis of the credibility and relevance of reports. The research was conducted based on a Design Science (DS) research method, which is often utilised in Information Science (IS) research. The location availability assessment of the Ushahidi Crowdmap and Twitter data evaluated the quality of available locations by comparing three different datasets, i.e., Google Maps, OpenStreetMap (OSM), and the Queensland Department of Natural Resources and Mines' (QDNRM) road data. Missing locations were semantically extracted using Natural Language Processing (NLP) and gazetteer lookup techniques. The credibility of the Ushahidi Crowdmap dataset was assessed using a naive Bayesian Network (BN) model commonly utilised in spam email detection. CSD relevance was assessed by adapting Geographic Information Retrieval (GIR) relevance assessment techniques, which are also utilised in the IT sector. Thematic and geographic relevance were assessed using the Term Frequency–Inverse Document Frequency Vector Space Model (TF-IDF VSM) and NLP based on semantic gazetteers. Results of the CSD location comparison showed that the combined use of non-authoritative and authoritative data improved location determination. The semantic location analysis results indicated some improvement in the location availability of the tweets and Crowdmap data; however, the quality of the new locations was still uncertain. The results of the credibility analysis revealed that spam email detection approaches are feasible for CSD credibility detection; however, it was critical to train the model in a controlled environment using structured training, including modified training samples. The use of GIR techniques for CSD relevance analysis provided promising results. A separate relevance-ranked list of the same CSD data was prepared through manual analysis. The results revealed that the two lists generally agreed, which indicated the system's potential to analyse relevance in a similar way to humans. This research showed that CSD fitness analysis can potentially improve the accuracy, reliability, and currency of CSD and may be utilised to fill information gaps in authoritative sources. The integrated and autonomous CSD qualification framework presented provides a guide for flood disaster first responders and could be adapted to support other forms of emergencies.
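Thematic relevance via a TF-IDF vector space model amounts to ranking reports by cosine similarity to a query in term-weight space. A minimal, self-contained sketch of that idea, with toy reports and a toy query rather than the Ushahidi/Twitter data or the thesis's actual implementation:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build TF-IDF weight vectors for a list of tokenized documents."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))  # document frequency
    idf = {t: math.log(n / df[t]) for t in df}
    vecs = [{t: tf / len(d) * idf[t] for t, tf in Counter(d).items()}
            for d in docs]
    return vecs, idf

def cosine(u, v):
    """Cosine similarity of two sparse term-weight vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Toy crowdsourced reports, not real flood data.
reports = ["flood water rising in street".split(),
           "road closed due to flood".split(),
           "concert tickets on sale".split()]
vecs, idf = tfidf_vectors(reports)

query = "flood road closed".split()
qtf = Counter(query)
qvec = {t: qtf[t] / len(query) * idf.get(t, 0.0) for t in qtf}

scores = [cosine(qvec, v) for v in vecs]
# The second report shares the most query terms, so it ranks first;
# the off-topic third report scores zero.
```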

    Geoinformatics in Citizen Science

    The book features contributions that report original research on the theoretical, technological, and social aspects of geoinformation methods as applied to supporting citizen science. Specifically, the book focuses on the technological aspects of the field and their application to the recruitment of volunteers and the collection, management, and analysis of geotagged information to support volunteer involvement in scientific projects. Internationally renowned research groups share research in three areas: first, the key methods of geoinformatics within citizen science initiatives to support scientists in discovering new knowledge in specific application domains or in performing relevant activities, such as reliable geodata filtering, management, analysis, synthesis, sharing, and visualization; second, the critical aspects of citizen science initiatives that call for emerging or novel approaches in geoinformatics to acquire and handle geoinformation; and third, novel geoinformatics research that could serve in support of citizen science.