    Evaluating the Differences of Gridding Techniques for Digital Elevation Models Generation and Their Influence on the Modeling of Stony Debris Flows Routing: A Case Study From Rovina di Cancia Basin (North-Eastern Italian Alps)

    Debris flows are among the most hazardous phenomena in mountain areas. To cope with debris flow hazard, it is common to delineate risk-prone areas through routing models. The most important input to debris flow routing models is topographic data, usually in the form of Digital Elevation Models (DEMs). The quality of DEMs depends on the accuracy, density, and spatial distribution of the sampled points; on the characteristics of the surface; and on the applied gridding methodology. Therefore, the choice of the interpolation method affects the realistic representation of the channel and fan morphology, and thus potentially the debris flow routing modeling outcomes. In this paper, we first investigate the performance of common interpolation methods (i.e., linear triangulation, natural neighbor, nearest neighbor, Inverse Distance to a Power, ANUDEM, Radial Basis Functions, and ordinary kriging) in building DEMs of the complex topography of a debris flow channel located in the Venetian Dolomites (North-eastern Italian Alps), using small-footprint full-waveform Light Detection And Ranging (LiDAR) data. The investigation combines statistical analysis of vertical accuracy, algorithm robustness, and spatial clustering of vertical errors with a multi-criteria shape reliability assessment. We then examine the influence of the tested interpolation algorithms on the performance of a Geographic Information System (GIS)-based cell model for simulating stony debris flow routing. In detail, we investigate both the correlation between the DEM height uncertainty resulting from the gridding procedure and the uncertainty in the corresponding simulated erosion/deposition depths, and the effect of interpolation algorithms on simulated areas, erosion and deposition volumes, solid-liquid discharges, and channel morphology after the event. The comparison among the tested interpolation methods highlights that the ANUDEM and ordinary kriging algorithms are not suitable for building DEMs of complex topography. Conversely, the linear triangulation, the natural neighbor algorithm, and the thin-plate spline plus tension and completely regularized spline functions ensure the best trade-off between accuracy and shape reliability. Nevertheless, the evaluation of the effects of gridding techniques on debris flow routing modeling reveals that the choice of the interpolation algorithm does not significantly affect the model outcomes.
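
    As a hedged illustration of this kind of interpolator comparison (not the authors' workflow), the sketch below holds out a subset of scattered LiDAR ground points, builds surfaces with linear triangulation, nearest neighbor, and a thin-plate-spline radial basis function via SciPy, and scores each by vertical RMSE at the held-out points. The input file name and column layout are assumptions.

```python
# Minimal sketch: hold out a subset of LiDAR ground points, interpolate with
# several methods, and compare vertical RMSE at the held-out check points.
import numpy as np
from scipy.interpolate import griddata, RBFInterpolator

pts = np.loadtxt("ground_points.xyz")            # assumed columns: x, y, z
rng = np.random.default_rng(0)
idx = rng.permutation(len(pts))
train, test = pts[idx[5000:]], pts[idx[:5000]]   # hold out 5,000 check points

xy_train, z_train = train[:, :2], train[:, 2]
xy_test,  z_test  = test[:, :2],  test[:, 2]

def rmse(pred, obs):
    ok = ~np.isnan(pred)                         # linear method is NaN outside the hull
    return np.sqrt(np.mean((pred[ok] - obs[ok]) ** 2))

# Linear triangulation and nearest neighbor via griddata
for method in ("linear", "nearest"):
    z_hat = griddata(xy_train, z_train, xy_test, method=method)
    print(f"{method:>8s} RMSE: {rmse(z_hat, z_test):.3f} m")

# Thin-plate spline as one radial basis function variant
rbf = RBFInterpolator(xy_train, z_train, neighbors=64, kernel="thin_plate_spline")
print(f"     TPS RMSE: {rmse(rbf(xy_test), z_test):.3f} m")
```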

    Modelling the spatial distribution of DEM Error

    Assessment of a DEM’s quality is usually undertaken by deriving a measure of DEM accuracy – how close the DEM’s elevation values are to the true elevation. Measures such as root mean squared error (RMSE) and the standard deviation of the error are frequently used. These measures summarise elevation errors in a DEM as a single value. A more detailed description of DEM accuracy would allow better understanding of DEM quality and the consequent uncertainty associated with using DEMs in analytical applications. The research presented addresses the limitations of using a single RMSE value to represent the uncertainty associated with a DEM by developing a new technique for creating a spatially distributed model of DEM quality – an accuracy surface. The technique is based on the hypothesis that the distribution and scale of elevation error within a DEM are at least partly related to morphometric characteristics of the terrain. The technique involves generating a set of terrain parameters to characterise terrain morphometry and developing regression models to define the relationship between DEM error and morphometric character. The regression models form the basis for creating standard deviation surfaces to represent DEM accuracy. The hypothesis is shown to hold, and reliable accuracy surfaces are successfully created. These accuracy surfaces provide more detailed information about DEM accuracy than a single global estimate of RMSE.
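
    As an illustrative sketch of the accuracy-surface idea (not the paper's exact procedure), the snippet below derives slope and a crude curvature from a DEM grid, regresses the magnitude of check-point error on those morphometric parameters by least squares, and predicts a spatially distributed accuracy surface. The input files and cell size are assumptions.

```python
# Illustrative sketch: regress DEM error magnitude at check points on simple
# morphometric parameters, then predict an accuracy surface for the whole grid.
import numpy as np

dem = np.load("dem.npy")                    # 2-D elevation grid (assumed input)
err = np.load("checkpoint_error.npy")       # same shape, NaN where no check point

cell = 10.0                                 # cell size in metres (assumed)
dzdy, dzdx = np.gradient(dem, cell)
slope = np.hypot(dzdx, dzdy)                # slope as rise/run
d2y, _ = np.gradient(dzdy, cell)
_, d2x = np.gradient(dzdx, cell)
curvature = d2x + d2y                       # crude Laplacian curvature

# Design matrix: intercept plus morphometric predictors, at check-point cells only
mask = ~np.isnan(err)
X = np.column_stack([np.ones(mask.sum()), slope[mask], curvature[mask]])
y = np.abs(err[mask])                       # error magnitude as the response

coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Predict everywhere to obtain the spatially distributed accuracy surface
X_all = np.column_stack([np.ones(dem.size), slope.ravel(), curvature.ravel()])
accuracy_surface = (X_all @ coef).reshape(dem.shape)
```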

    Comparing Methods for Interpolation to Improve Raster Digital Elevation Models

    Digital elevation models (DEMs) are available as raster files at 100m, 30m, and 10m resolutions for the contiguous United States and are used in a variety of geographic analyses. Some projects may require a finer resolution. GIS software offers many options for interpolating data to higher resolutions. We compared ten interpolation methods using 10m sample data from the Ouachita Mountains in central Arkansas. We interpolated the 10m DEM to 5m, 2.5m, and 1m resolutions and compared the absolute mean difference (AMD) for each using surveyed control points. Overall, there was little difference in accuracy between interpolation methods at the resolutions tested and minimal departure from the original 10m raster.
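
    For a rough idea of this kind of comparison (not the study's actual workflow), the sketch below resamples a 10m grid to 5m with nearest, bilinear, and cubic interpolation and scores each against control points by absolute mean difference (AMD). The file names, grid layout, and control-point format are assumptions.

```python
# Hedged sketch: upsample a 10 m DEM to 5 m with three methods and compute the
# absolute mean difference (AMD) at surveyed control points.
import numpy as np
from scipy.ndimage import zoom, map_coordinates

dem10 = np.load("dem_10m.npy")                        # 2-D array, 10 m cells
controls = np.loadtxt("controls.csv", delimiter=",")  # assumed columns: row, col (10 m grid), z

methods = {"nearest": 0, "bilinear": 1, "cubic": 3}   # spline orders for zoom()
factor = 2                                            # 10 m -> 5 m

for name, order in methods.items():
    dem5 = zoom(dem10, factor, order=order)
    # Sample the finer grid at the control-point locations (scaled indices)
    rows, cols, z_ref = controls[:, 0] * factor, controls[:, 1] * factor, controls[:, 2]
    z_hat = map_coordinates(dem5, [rows, cols], order=1)
    amd = np.mean(np.abs(z_hat - z_ref))
    print(f"{name:>8s}: AMD = {amd:.3f} m")
```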

    Airborne LiDAR for DEM generation: some critical issues

    Airborne LiDAR is one of the most effective and reliable means of terrain data collection. Using LiDAR data for DEM generation is becoming a standard practice in spatial related areas. However, the effective processing of the raw LiDAR data and the generation of an efficient and high-quality DEM remain big challenges. This paper reviews the recent advances of airborne LiDAR systems and the use of LiDAR data for DEM generation, with special focus on LiDAR data filters, interpolation methods, DEM resolution, and LiDAR data reduction. Separating LiDAR points into ground and non-ground is the most critical and difficult step for DEM generation from LiDAR data. Commonly used and most recently developed LiDAR filtering methods are presented. Interpolation methods and choices of suitable interpolator and DEM resolution for LiDAR DEM generation are discussed in detail. In order to reduce the data redundancy and increase the efficiency in terms of storage and manipulation, LiDAR data reduction is required in the process of DEM generation. Feature specific elements such as breaklines contribute significantly to DEM quality. Therefore, data reduction should be conducted in such a way that critical elements are kept while less important elements are removed. Given the high-density characteristic of LiDAR data, breaklines can be directly extracted from LiDAR data. Extraction of breaklines and integration of the breaklines into DEM generation are presented.
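
    Ground/non-ground separation is named above as the most critical step. The toy sketch below is not one of the filters the paper reviews; it only illustrates the general idea with a simple block-minimum rule that keeps points within a vertical tolerance of the lowest return in each coarse cell. The file layout, cell size, and tolerance are assumptions.

```python
# Toy ground/non-ground separation: the lowest return in each coarse cell seeds
# the local ground estimate, and points within a vertical tolerance of it are
# classified as ground. A stand-in for production LiDAR filters.
import numpy as np

pts = np.loadtxt("lidar_points.xyz")        # assumed columns: x, y, z
cell = 5.0                                  # coarse cell size in metres (assumed)
tol = 0.5                                   # vertical tolerance in metres (assumed)

ix = np.floor((pts[:, 0] - pts[:, 0].min()) / cell).astype(int)
iy = np.floor((pts[:, 1] - pts[:, 1].min()) / cell).astype(int)
keys = ix * (iy.max() + 1) + iy             # flatten the 2-D cell index

# Lowest elevation per cell as a crude local ground estimate
ground_min = {}
for k, z in zip(keys, pts[:, 2]):
    if k not in ground_min or z < ground_min[k]:
        ground_min[k] = z

is_ground = np.array([pts[i, 2] - ground_min[keys[i]] <= tol
                      for i in range(len(pts))])
ground_points = pts[is_ground]
print(f"{is_ground.sum()} of {len(pts)} points classified as ground")
```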

    An Ensemble Approach to Space-Time Interpolation

    There has been much excitement and activity in recent years related to the relatively sudden availability of earth-related data and the computational capabilities to visualize and analyze these data. Despite the increased ability to collect and store large volumes of data, few individual data sets exist that provide both the requisite spatial and temporal observational frequency for many urban and/or regional-scale applications. The motivating view of this paper, however, is that the relative temporal richness of one data set can be leveraged with the relative spatial richness of another to fill in the gaps. We also note that any single interpolation technique has advantages and disadvantages; particularly when focusing on the spatial or on the temporal dimension, some techniques are more appropriate than others for specific types of data. We therefore propose a space-time interpolation approach whereby two interpolation methods – one for the temporal and one for the spatial dimension – are used in tandem in order to maximize the quality of the result. We call our ensemble approach the Space-Time Interpolation Environment (STIE). The primary steps within this environment include a spatial interpolator, a time-step processor, and a calibration step that enforces phenomenon-related behavioral constraints. The specific interpolation techniques used within the STIE can be chosen on the basis of suitability for the data and application at hand. In the current paper, we describe STIE conceptually, including the structure of the data inputs and output, details of the primary steps (the STIE processors), and the mechanism for coordinating the data and the processors. We then describe a case study focusing on urban land cover in Phoenix, Arizona. Our empirical results show that STIE was effective as a space-time interpolator for urban land cover with an accuracy of 85.2% and furthermore that it was more effective than a single technique.
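
    To make the tandem idea concrete, the toy sketch below (not the STIE implementation) interpolates per-pixel urban-cover fractions linearly in time between two observed dates, applies a simple spatial smoother in place of a purpose-chosen spatial interpolator, and then enforces an illustrative behavioral constraint as the calibration step. The input rasters, dates, and the monotonic-growth constraint are all assumptions.

```python
# Toy sketch of a temporal step, a spatial step, and a calibration step used in
# tandem to estimate a land-cover fraction map for an unobserved year.
import numpy as np
from scipy.ndimage import uniform_filter

cover_t0 = np.load("cover_2000.npy")        # assumed: fraction of urban cover, year 2000
cover_t1 = np.load("cover_2005.npy")        # assumed: fraction of urban cover, year 2005

def interpolate_year(year, y0=2000, y1=2005):
    # Temporal step: per-pixel linear interpolation between the two observed dates
    w = (year - y0) / (y1 - y0)
    temporal = (1 - w) * cover_t0 + w * cover_t1

    # Spatial step: a moving-window smoother stands in for whichever spatial
    # interpolator suits the data and application at hand
    spatial = uniform_filter(temporal, size=3)

    # Calibration step: enforce behavioral constraints on the phenomenon
    # (here, assumed: cover fractions stay in [0, 1] and do not shrink over time)
    return np.clip(np.maximum(spatial, cover_t0), 0.0, 1.0)

cover_2003 = interpolate_year(2003)
```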

    A Preliminary Assessment of Tidal Flooding along the New Hampshire Coast: Past, Present and Future

    This report presents the results of a preliminary study that examines several critical coastal issues for New Hampshire, including sea level fluctuations (past, present and future), shoreline migrations, and tidal flooding. Included are: 1) an analysis of sea level changes over the Holocene and resulting shoreline migrations, 2) an assessment of low-lying areas with elevations below selected tidal flooding datums in coastal areas, and 3) an assessment of increases in low-lying areas that are potentially at risk of tidal flooding over the next century due to sea level rise.
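
    A minimal sketch of the kind of low-lying-area assessment described above: threshold a coastal DEM at a chosen tidal datum, then again after adding a sea-level-rise increment, and compare the areas. The datum value, rise scenario, grid size, and input file are illustrative assumptions, not figures from the report.

```python
# Hedged sketch: count DEM cells at or below a tidal datum, and again after a
# hypothetical sea-level-rise increment, to see how the flood-prone area grows.
import numpy as np

dem = np.load("coastal_dem.npy")            # assumed: elevations in the datum's vertical reference
cell_area = 10.0 * 10.0                     # m^2 per cell (assumed 10 m grid)

datum = 1.3                                 # e.g. a high-water datum, in metres (assumed)
rise = 0.8                                  # hypothetical sea-level rise by 2100, in metres

area_now = np.count_nonzero(dem <= datum) * cell_area
area_future = np.count_nonzero(dem <= datum + rise) * cell_area

print(f"Area below datum today:       {area_now / 1e6:.2f} km^2")
print(f"Area below datum + {rise} m rise: {area_future / 1e6:.2f} km^2")
```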