    SciKit-GStat Uncertainty: A software extension to cope with uncertain geostatistical estimates

    This study focuses on an extension of a well-established geostatistical software package that enables users to cope effectively and interactively with uncertainty in geostatistical applications. The extension includes a rich component library, pre-built interfaces, and an online application. We discuss the concept of replacing the empirical variogram with its uncertainty bound. This makes it possible to acknowledge uncertainties characterizing the underlying geostatistical datasets and typical methodological approaches, and at the same time allows a probabilistic description of the variogram and its parameters. Our approach enables (1) multiple interpretations of a sample and (2) a multi-model context for geostatistical applications. We focus the sample application on propagating observation uncertainties into manual variogram parametrization and analyze its effects. Using two different datasets, we show how insights on uncertainty can be used to reject variogram models, thus constraining the space of formally equally probable models to tackle the issue of parameter equifinality.
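
    As a rough illustration of the idea of an uncertainty bound around the empirical variogram, the sketch below re-fits the experimental variogram on perturbed copies of a synthetic sample with the base SciKit-GStat package. It does not use the extension's actual interfaces; the data, perturbation level, and percentile band are assumptions.

        # Sketch: propagate an assumed observation uncertainty into the
        # empirical variogram by re-fitting it on perturbed observations.
        import numpy as np
        from skgstat import Variogram  # base SciKit-GStat package

        rng = np.random.default_rng(42)
        coords = rng.uniform(0, 100, size=(150, 2))                    # synthetic locations
        values = np.sin(coords[:, 0] / 20) + rng.normal(0, 0.1, 150)   # synthetic field
        obs_sigma = 0.05                                               # assumed observation error

        draws = []
        for _ in range(100):
            perturbed = values + rng.normal(0, obs_sigma, values.shape)
            V = Variogram(coords, perturbed, model='spherical', n_lags=15)
            draws.append(V.experimental)                               # empirical variogram per draw

        lower, upper = np.percentile(np.array(draws), [5, 95], axis=0)
        print("uncertainty band width per lag:", np.round(upper - lower, 3))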

    The design, deployment, and testing of kriging models in GEOframe with SIK-0.9.8

    This work presents a software package for the interpolation of climatological variables, such as temperature and precipitation, using kriging techniques. The purposes of the paper are (1) to present a geostatistical software package that is easy to use and easy to plug in to a hydrological model; (2) to provide a practical example of carefully designed software from the perspective of reproducible research; and (3) to demonstrate the quality of the software's results, thereby offering a reliable alternative to other, more traditional tools. A total of 11 types of theoretical semivariograms and four types of kriging were implemented and gathered into Object Modeling System-compliant components. The package provides real-time optimization of semivariogram and kriging parameters. The software was tested using a year's worth of hourly temperature readings and an 11 h rain storm event recorded in 2008, retrieved from 97 meteorological stations in the Isarco River basin, Italy. For both variables, good interpolation results were obtained and then compared with the results from the R package gstat.
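
    The components themselves are Java/OMS-based. As a language-agnostic illustration of the underlying workflow (fit a semivariogram to station readings, then krige them onto a grid), here is a minimal Python sketch using PyKrige on synthetic station data; the coordinates, temperatures, and variogram choice are assumptions, not the SIK defaults.

        # Sketch: ordinary kriging of synthetic station temperatures onto a grid.
        import numpy as np
        from pykrige.ok import OrdinaryKriging

        rng = np.random.default_rng(0)
        lon = rng.uniform(11.0, 11.6, 97)                       # hypothetical station longitudes
        lat = rng.uniform(46.3, 46.9, 97)                       # hypothetical station latitudes
        temp = 15.0 - 8.0 * (lat - 46.3) + rng.normal(0, 0.5, 97)  # synthetic temperatures (deg C)

        ok = OrdinaryKriging(lon, lat, temp, variogram_model="exponential", nlags=11)
        grid_lon = np.linspace(11.0, 11.6, 50)
        grid_lat = np.linspace(46.3, 46.9, 50)
        z, ss = ok.execute("grid", grid_lon, grid_lat)          # kriged field and kriging variance
        print(z.shape, float(ss.mean()))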

    Evaluating the Differences of Gridding Techniques for Digital Elevation Models Generation and Their Influence on the Modeling of Stony Debris Flows Routing: A Case Study From Rovina di Cancia Basin (North-Eastern Italian Alps)

    Debris flows are among the most hazardous phenomena in mountain areas. To cope with debris flow hazard, it is common to delineate the risk-prone areas through routing models. The most important input to debris flow routing models is topographic data, usually in the form of Digital Elevation Models (DEMs). The quality of DEMs depends on the accuracy, density, and spatial distribution of the sampled points; on the characteristics of the surface; and on the applied gridding methodology. Therefore, the choice of the interpolation method affects the realistic representation of the channel and fan morphology, and thus potentially the debris flow routing modeling outcomes. In this paper, we initially investigate the performance of common interpolation methods (i.e., linear triangulation, natural neighbor, nearest neighbor, Inverse Distance to a Power, ANUDEM, Radial Basis Functions, and ordinary kriging) in building DEMs of the complex topography of a debris flow channel located in the Venetian Dolomites (North-eastern Italian Alps), using small-footprint full-waveform Light Detection And Ranging (LiDAR) data. The investigation combines statistical analysis of vertical accuracy, algorithm robustness, and spatial clustering of vertical errors with a multi-criteria shape reliability assessment. We then examine the influence of the tested interpolation algorithms on the performance of a Geographic Information System (GIS)-based cell model for simulating stony debris flow routing. In detail, we investigate both the correlation between the DEM height uncertainty resulting from the gridding procedure and that of the corresponding simulated erosion/deposition depths, and the effect of the interpolation algorithms on simulated areas, erosion and deposition volumes, solid-liquid discharges, and channel morphology after the event. The comparison among the tested interpolation methods highlights that the ANUDEM and ordinary kriging algorithms are not suitable for building DEMs of complex topography. Conversely, linear triangulation, the natural neighbor algorithm, and the thin-plate spline with tension and completely regularized spline functions ensure the best trade-off between accuracy and shape reliability. Nevertheless, the evaluation of the effects of gridding techniques on debris flow routing modeling reveals that the choice of the interpolation algorithm does not significantly affect the model outcomes.
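
    For readers who want to reproduce this kind of comparison in miniature, the sketch below grids a synthetic scattered-elevation sample with a few of the interpolators named above (linear triangulation, nearest neighbour, a thin-plate-spline RBF) and scores vertical RMSE on a held-out subset. ANUDEM, natural neighbour, and kriging are omitted, and all data are invented.

        # Sketch: compare gridding methods on held-out elevation points (synthetic data).
        import numpy as np
        from scipy.interpolate import griddata, RBFInterpolator

        rng = np.random.default_rng(1)
        pts = rng.uniform(0, 1000, size=(2000, 2))        # stand-in for LiDAR ground points (m)
        z = 50 * np.exp(-((pts[:, 0] - 500) ** 2 + (pts[:, 1] - 500) ** 2) / 1e5) \
            + rng.normal(0, 0.1, 2000)                    # synthetic channel-like surface

        train, test = pts[:1800], pts[1800:]
        z_train, z_test = z[:1800], z[1800:]

        methods = {
            "linear triangulation": lambda q: griddata(train, z_train, q, method="linear"),
            "nearest neighbour":    lambda q: griddata(train, z_train, q, method="nearest"),
            "thin-plate spline":    lambda q: RBFInterpolator(train, z_train, neighbors=64,
                                                              kernel="thin_plate_spline")(q),
        }
        for name, f in methods.items():
            rmse = float(np.sqrt(np.nanmean((f(test) - z_test) ** 2)))
            print(f"{name:22s} RMSE = {rmse:.3f} m")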

    Insect phenology: a geographical perspective

    Accounting for a spatial trend in fine-scale ground-penetrating radar data: A comparative case study

    In geostatistics, one of the challenges is to account for the spatial trend that is evident in a data-set. Two well-known kriging algorithms, namely universal kriging (UK) and intrinsic random function of order k (IRF-k), are mainly used to deal with the trend apparent in the data-set. These two algorithms differ in the way they account for the trend, and each has different advantages and drawbacks. In this study, the performances of the UK, IRF-k, and ordinary kriging (OK) methods are compared on densely sampled ground-penetrating radar (GPR) data acquired to assist in delineation of the ore and waste contact within a laterite-type bauxite deposit. The original GPR data were first pre-processed to generate prediction and validation data-sets in order to compare the estimation performance of each kriging algorithm. The structural analysis required for each algorithm was carried out, and the resulting variograms and generalized covariance models were verified through cross-validation. The variable representing the elevation of the ore unit base was then estimated at the unknown locations using the prediction data-set. The estimated values were compared against the validation data using the mean absolute error (MAE) and mean squared error (MSE) criteria. The results show that, although IRF-k slightly outperformed OK and UK, all the algorithms produced satisfactory and similar results. The MSE values obtained from the comparison with the validation data were 0.1267, 0.1322, and 0.1349 for the IRF-k, OK, and UK algorithms, respectively. The similarity of the results generated by these algorithms is explained by the large size of the data-set and the chosen neighbourhood parameters for the kriging technique.
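
    The split-sample scoring described above is easy to mimic. The sketch below does so for ordinary and universal kriging with PyKrige on invented data (IRF-k is not available there and is left out), so the numbers it prints bear no relation to the MSE values reported in the study.

        # Sketch: score OK and UK on a held-out validation set using MAE and MSE.
        import numpy as np
        from pykrige.ok import OrdinaryKriging
        from pykrige.uk import UniversalKriging

        rng = np.random.default_rng(7)
        x, y = rng.uniform(0, 500, 400), rng.uniform(0, 500, 400)
        elev = 0.02 * x + rng.normal(0, 0.5, 400)        # synthetic ore-base elevation with a trend

        xp, yp, zp = x[:300], y[:300], elev[:300]        # prediction (training) set
        xv, yv, zv = x[300:], y[300:], elev[300:]        # validation set

        models = {
            "OK": OrdinaryKriging(xp, yp, zp, variogram_model="spherical"),
            "UK": UniversalKriging(xp, yp, zp, variogram_model="spherical",
                                   drift_terms=["regional_linear"]),
        }
        for name, m in models.items():
            pred, _ = m.execute("points", xv, yv)
            err = np.asarray(pred) - zv
            print(f"{name}: MAE = {np.abs(err).mean():.4f}  MSE = {(err ** 2).mean():.4f}")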

    Assessing the performance of several rainfall interpolation methods as evaluated by a conceptual hydrological model

    The objective of this study was to assess the performance of several rainfall interpolation methods as evaluated by a conceptual hydrological model. To this purpose, the upper Toro River catchment (43.15 km2) located in Costa Rica was selected as a case study. Deterministic and geostatistical interpolation methods were selected to generate time-series of daily and hourly average rainfall over a period of 10 years (2001-2010). These time-series were used as inputs to the HBV-TEC hydrological model, which was individually calibrated against observed streamflow data for each rainfall input. Based on the model results, the performance of the deterministic methods can be said to be comparable to that of the geostatistical methods at daily time-steps. However, at hourly time-steps, deterministic methods considerably outperformed geostatistical methods.
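
    As a toy example of the deterministic side of this comparison, the sketch below computes an inverse-distance-weighted areal rainfall series for a synthetic catchment from a handful of invented gauges, the kind of input series that would then drive a model such as HBV-TEC. The gauge locations, rainfall values, and power parameter are all assumptions.

        # Sketch: IDW interpolation of gauge rainfall to catchment cells, then areal means.
        import numpy as np

        def idw(gauge_xy, gauge_p, grid_xy, power=2.0):
            """Inverse-distance-weighted rainfall at grid cells from gauge values."""
            d = np.linalg.norm(grid_xy[:, None, :] - gauge_xy[None, :, :], axis=2)
            w = 1.0 / np.maximum(d, 1e-6) ** power
            return (w * gauge_p).sum(axis=1) / w.sum(axis=1)

        rng = np.random.default_rng(3)
        gauges = rng.uniform(0, 7000, size=(6, 2))        # six hypothetical rain gauges (m)
        cells = rng.uniform(0, 7000, size=(200, 2))       # catchment grid cells (m)

        for day in range(3):
            p = rng.gamma(2.0, 3.0, size=6)               # synthetic daily gauge rainfall (mm)
            areal = idw(gauges, p, cells).mean()          # areal-average rainfall for this step
            print(f"day {day}: areal rainfall = {areal:.2f} mm")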

    Spatial statistics and analysis of Earth's ionosphere

    Thesis (Ph.D.), Boston University. The ionosphere, a layer of Earth's upper atmosphere characterized by energetic charged particles, serves as a natural plasma laboratory and supplies proxy diagnostics of space weather drivers in the magnetosphere and the solar wind. The ionosphere is a highly dynamic medium, and the spatial structure of observed features (such as auroral light emissions, charge density, and temperature) is rich with information when analyzed in the context of fluid, electromagnetic, and chemical models. Obtaining measurements with higher spatial and temporal resolution is clearly advantageous. For instance, measurements obtained with a new electronically steerable incoherent scatter radar (ISR) present a unique space-time perspective compared to those of a dish-based ISR. However, this modality has unique ambiguities which must be carefully considered. The ISR target is stochastic, and the fidelity of fitted parameters (ionospheric densities and temperatures) requires integrated sampling, creating a trade-off between measurement uncertainty and spatio-temporal resolution. Spatial statistics formalizes the relationship between spatially dispersed observations and the underlying process(es) they represent. A spatial process is regarded as a random field with its distribution structured (e.g., through a correlation function) such that data, sampled over a spatial domain, support inference or prediction of the process. Quantification of uncertainty, an important component of scientific data analysis, is a core value of spatial statistics. This research applies the formalism of spatial statistics to the analysis of Earth's ionosphere using remote sensing diagnostics. In the first part, we consider the problem of volumetric imaging using phased-array ISR based on optimal spatial prediction ("kriging"). In the second part, we develop a technique for reconstructing two-dimensional ion flow fields from line-of-sight projections using Tikhonov regularization. In the third part, we adapt our spatial statistical approach to global ionospheric imaging using total electron content (TEC) measurements derived from navigation satellite signals.
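
    For the second part, the following minimal numerical sketch of Tikhonov-regularized inversion (not the thesis code) recovers a one-dimensional stand-in field from noisy linear projections by solving the regularized normal equations. The geometry matrix, noise level, and regularization weight are assumed.

        # Sketch: Tikhonov-regularized recovery of x from b = A x + noise,
        # solving (A^T A + lambda^2 I) x = A^T b with synthetic stand-ins.
        import numpy as np

        rng = np.random.default_rng(11)
        n = 100                                   # unknown field values
        m = 60                                    # line-of-sight measurements (under-determined)
        A = rng.normal(size=(m, n))               # projection geometry (synthetic)
        x_true = np.sin(np.linspace(0, 4 * np.pi, n))
        b = A @ x_true + rng.normal(0, 0.05, m)   # noisy projections

        lam = 1.0                                 # regularization strength (would be tuned)
        x_hat = np.linalg.solve(A.T @ A + lam ** 2 * np.eye(n), A.T @ b)
        print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))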