4 research outputs found

    Modelling positional uncertainty of line features by accounting for stochastic deviations from straight line segments

    The assessment of positional uncertainty in line and area features is often based on uncertainty in the coordinates of their elementary vertices, which are assumed to be connected by straight lines. Such an approach disregards the uncertainty caused by sampling and by approximating a curvilinear feature with a sequence of straight line segments. In this article, a method is proposed that also accounts for the latter type of uncertainty by modelling random rectangular deviations from the conventional straight line segments. Applying the model to a dense network of sub-vertices emphasises the contribution of uncertainty due to approximation; the sampling effect can be assessed by applying it to a small set of randomly inserted sub-vertices. A case study demonstrates a feasible parameterisation based on joint normal distributions for the positional errors of the vertices and the rectangular deviations, and a uniform distribution of missed sub-vertices along the line segments. Depending on the magnitudes of the different sources of uncertainty, disregarding potential deviations from straight line segments may drastically underestimate the positional uncertainty of line features.
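    A Monte Carlo reading of the abstract can be sketched as follows. The snippet below is an illustrative assumption, not the authors' implementation: it perturbs the vertices of a polyline with independent normal errors and adds random perpendicular ("rectangular") deviations at sub-vertices placed either on a dense regular grid (approximation effect) or uniformly at random along each segment (sampling effect). All function names and parameter values are hypothetical.

```python
# Hypothetical sketch of the modelled uncertainty sources (not the authors' code).
import numpy as np

rng = np.random.default_rng(42)

def perturb_polyline(vertices, sigma_vertex, sigma_deviation,
                     n_sub=10, random_sub=False):
    """Return one random realisation of a polyline.

    vertices        : (n, 2) array of nominal vertex coordinates
    sigma_vertex    : std. dev. of independent normal vertex coordinate errors
    sigma_deviation : std. dev. of perpendicular deviations at sub-vertices
    n_sub           : number of sub-vertices per segment
    random_sub      : True  -> sub-vertices placed uniformly at random (sampling effect)
                      False -> dense regular grid of sub-vertices (approximation effect)
    """
    noisy = vertices + rng.normal(0.0, sigma_vertex, vertices.shape)
    points = []
    for p, q in zip(noisy[:-1], noisy[1:]):
        points.append(p)
        direction = q - p
        length = np.linalg.norm(direction)
        if length == 0.0:
            continue
        normal = np.array([-direction[1], direction[0]]) / length  # unit normal to the segment
        if random_sub:
            t = np.sort(rng.uniform(0.0, 1.0, n_sub))        # "missed" sub-vertices
        else:
            t = np.linspace(0.0, 1.0, n_sub + 2)[1:-1]        # dense regular grid
        offsets = rng.normal(0.0, sigma_deviation, t.size)    # rectangular deviations
        points.extend(p + ti * direction + oi * normal for ti, oi in zip(t, offsets))
    points.append(noisy[-1])
    return np.array(points)

# Usage: generate realisations with and without segment deviations and compare spreads.
line = np.array([[0.0, 0.0], [50.0, 10.0], [100.0, 0.0]])
realisations = [perturb_polyline(line, sigma_vertex=1.0, sigma_deviation=2.0)
                for _ in range(500)]
```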

    A Framework for Quality Evaluation of VGI linear datasets

    Spatial data collection, processing, distribution and understanding have traditionally been handled by professionals. However, as technology advances, non-experts can now collect Geographic Information (GI), create spatial databases and distribute GI through web applications. This Volunteered Geographic Information (VGI), as it is called, is a promising spatial data source. The most pressing issue, however, is its unknown and heterogeneous quality, which cannot be handled by traditional quality measurement methods; the quality elements that these methods measure were standardised long before the appearance of VGI and assume uniform quality behaviour. The lack of a suitable quality evaluation framework with an appropriate level of automation, which would enable the quality assessment to be repeated whenever the VGI is updated, makes using such data difficult or risky for potential users. This thesis proposes a framework for the quality evaluation of linear VGI datasets used to represent networks. The suggested automated methodology is based on comparing a VGI dataset with a dataset of known quality. The heterogeneity issue is handled by producing individual results for small areal units, using a tessellation grid. The quality elements measured are data completeness, attribute accuracy and positional accuracy, considered the most important for VGI. Compared to previous research, this thesis includes an automated data matching procedure specifically designed for VGI. It combines geometric and thematic constraints, shifting the weight of importance from geometry to non-spatial attributes, depending on whether the latter exist in the VGI dataset. Based on the data matching results, all quality elements are then measured for corresponding objects, providing a more accurate quality assessment. The method is tested on three case studies. Data matching proves to be quite efficient, leading to more accurate quality results. The data completeness approach also tackles VGI over-completeness, which broadens the method's usefulness for data fusion purposes.
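    The workflow described above (match VGI lines to a reference dataset using geometric and thematic constraints, then report quality per grid cell) can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the thesis implementation: the buffer distance, name-similarity threshold, attribute names and the length-based completeness measure are all illustrative choices.

```python
# Hypothetical per-cell quality evaluation of a linear VGI dataset (illustrative only).
from difflib import SequenceMatcher
from shapely.geometry import LineString, box

BUFFER_DIST = 10.0   # metres; geometric tolerance (assumed)
NAME_SIM    = 0.7    # minimum name similarity (assumed)

def names_match(a, b):
    """Thematic constraint; falls back to True when the VGI name attribute is missing."""
    if not a or not b:
        return True
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= NAME_SIM

def match_features(vgi, reference):
    """Return {reference index -> VGI index} for matched line features."""
    matches = {}
    for i, (ref_geom, ref_name) in enumerate(reference):
        buf = ref_geom.buffer(BUFFER_DIST)          # geometric constraint
        for j, (vgi_geom, vgi_name) in enumerate(vgi):
            if vgi_geom.within(buf) and names_match(vgi_name, ref_name):
                matches[i] = j
                break
    return matches

def completeness_per_cell(vgi, reference, cells):
    """Length-based completeness of matched reference features in each grid cell."""
    matches = match_features(vgi, reference)
    results = []
    for cell in cells:
        ref_len = sum(g.intersection(cell).length for g, _ in reference)
        matched_len = sum(reference[i][0].intersection(cell).length for i in matches)
        results.append(matched_len / ref_len if ref_len else None)
    return results

# Usage with toy data: one grid cell, one reference road, one matching VGI road.
reference = [(LineString([(0, 0), (100, 0)]), "High Street")]
vgi       = [(LineString([(0, 2), (100, 3)]), "high st")]
cells     = [box(0, -50, 100, 50)]
print(completeness_per_cell(vgi, reference, cells))   # -> [1.0]
```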