
    The Application of DBSCAN Algorithm to Improve Variogram Estimation and Interpretation in Irregularly-Sampled Fields

    The empirical variogram is a measure of spatial data correlation in geostatistical modeling and simulation. Typically, the empirical variogram is estimated over defined lag intervals by applying the method of moments to an underlying variogram cloud. Depending on the distribution of pair-wise lag values, the variogram cloud of an irregularly-sampled field may exhibit clustering. Noisy, uninterpretable and inconsistent empirical variogram plots are commonly encountered for irregularly-sampled fields with clustered variogram clouds. An insightful diagnosis of these problems and a practical solution are the subject of this paper. This research establishes that the problems are caused by neglecting the cluster configuration of the variogram cloud when defining lag intervals, and shows that such neglect hinders the optimal use of the spatial correlation information present in the variogram cloud. Specifically, four sub-optimal effects are articulated as consequences of this neglect. The paper therefore presents an efficient cluster-analysis-driven technique for variogram estimation in irregularly-sampled fields with clustered variogram clouds. The cluster analysis required for this technique is implemented using an unsupervised machine learning algorithm, Density-Based Spatial Clustering of Applications with Noise (DBSCAN). The technique has been applied to a real field, where it yields a stable, interpretable and geologically consistent variogram plot, and to a synthetic field, where it gives the lowest estimation error among the compared techniques. It should be useful in geo-modeling of natural resource deposits, where irregular sampling is prevalent.
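    The paper itself provides no code. As a rough illustration of the idea, the sketch below clusters the lag axis of a variogram cloud with scikit-learn's DBSCAN and computes a method-of-moments estimate per cluster rather than per fixed-width lag bin. The function name, the choice to cluster on lag distance alone, and the eps/min_samples values are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch (not the paper's implementation): cluster the variogram
# cloud by pairwise lag distance with DBSCAN, then compute a
# method-of-moments semivariance per cluster instead of per fixed lag bin.
import numpy as np
from scipy.spatial.distance import pdist
from sklearn.cluster import DBSCAN

def cluster_based_variogram(coords, values, eps=50.0, min_samples=10):
    """coords: (n, 2) sample locations; values: (n,) measured attribute."""
    lags = pdist(coords)                                           # pairwise separation distances
    semivar = 0.5 * pdist(values[:, None], metric="sqeuclidean")   # per-pair semivariance terms

    # Cluster the 1-D lag distribution; pairs labelled -1 are treated as noise.
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(lags.reshape(-1, 1))

    lag_centers, gamma = [], []
    for k in sorted(set(labels) - {-1}):
        mask = labels == k
        lag_centers.append(lags[mask].mean())   # representative lag of the cluster
        gamma.append(semivar[mask].mean())      # method-of-moments estimate for the cluster
    order = np.argsort(lag_centers)
    return np.asarray(lag_centers)[order], np.asarray(gamma)[order]
```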

    The Atlantic Ocean at the last glacial maximum: 1. Objective mapping of the GLAMAP sea-surface conditions

    Recent efforts of the German paleoceanographic community have resulted in a unique data set of reconstructed sea-surface temperatures for the Atlantic Ocean during the Last Glacial Maximum, plus estimates of the extent of glacial sea ice. Unlike prior attempts, the contributing research groups based their data on a common definition of the Last Glacial Maximum chronozone and used the same modern reference data for calibrating the different transfer techniques. Furthermore, the number of processed sediment cores was vastly increased. Thus the new data are a significant advance not only in quality but also in quantity. We integrate these new data and provide monthly data sets of global sea-surface temperature and ice cover, objectively interpolated onto a regular 1°x1° grid, suitable for forcing or validating numerical ocean and atmosphere models. This set is compared to an existing subjective interpolation of the same base data, in part by employing an ocean circulation model. For the latter purpose, we reconstruct sea-surface salinity from the new temperature data and the available oxygen isotope measurements.
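    For readers unfamiliar with objective mapping, the sketch below shows a generic Gauss-Markov (optimal-interpolation) scheme for gridding scattered sea-surface temperature estimates onto a regular 1°x1° grid. It is only a schematic of the general technique, not the GLAMAP mapping procedure: the Gaussian covariance in degrees, the length scale, and the noise-to-signal ratio are illustrative assumptions.

```python
# Schematic objective interpolation (optimal interpolation) of scattered
# observations onto a regular grid; parameters are illustrative only.
import numpy as np

def objective_map(obs_lon, obs_lat, obs_sst, grid_lon, grid_lat,
                  length_scale_deg=10.0, noise_to_signal=0.1):
    def cov(lon1, lat1, lon2, lat2):
        # Gaussian covariance in degrees; a real scheme would use great-circle distance.
        d2 = (lon1[:, None] - lon2[None, :]) ** 2 + (lat1[:, None] - lat2[None, :]) ** 2
        return np.exp(-0.5 * d2 / length_scale_deg ** 2)

    mean = obs_sst.mean()
    # Observation-observation covariance with noise added on the diagonal.
    C_oo = cov(obs_lon, obs_lat, obs_lon, obs_lat) + noise_to_signal * np.eye(obs_sst.size)
    weights = np.linalg.solve(C_oo, obs_sst - mean)

    glon, glat = np.meshgrid(grid_lon, grid_lat)
    # Grid-observation covariance spreads the weighted anomalies to every grid cell.
    C_go = cov(glon.ravel(), glat.ravel(), obs_lon, obs_lat)
    return (mean + C_go @ weights).reshape(glat.shape)

# Regular 1x1 degree grid with cell centres on half degrees.
grid_lon = np.arange(-179.5, 180.0, 1.0)
grid_lat = np.arange(-89.5, 90.0, 1.0)
```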

    Lithofacies uncertainty modeling in a siliciclastic reservoir setting by incorporating geological contacts and seismic information

    Deterministic modeling provides only a single boundary layout, which depends on the geological interpretation or on interpolation from the available hard data; changing the interpreter's assumptions or the interpolation parameters displaces these boundaries. In contrast, probabilistic modeling of geological domains such as lithofacies is critical for providing the information needed for sound decisions when evaluating oil reservoir parameters, because it quantifies the uncertainty along the boundaries. The benefit of such stochastic modeling becomes most apparent in these situations. Conventional probabilistic approaches (object- and pixel-based) mostly fail to account for contact relationships among the simulated domains. The plurigaussian simulation algorithm, in contrast, can reproduce the complex transitions among lithofacies domains and has found wide acceptance for modeling petroleum reservoirs. The stationarity assumption of this framework implies a homogeneous characterization of the lithofacies: the proportions are assumed constant, and the covariance function, as the typical descriptor of spatial continuity, depends only on the Euclidean distance between two points. Whenever the region is heterogeneous, however, this assumption does not allow the model to generate the desired variability of the underlying facies proportions over the domain. Geophysical attributes, used as secondary variables, then play an important role in generating realistic contact relationships between the simulated categories. In this paper, a hierarchical plurigaussian simulation approach is used to construct multiple realizations of lithofacies by incorporating acoustic impedance as soft data for an oil reservoir in Iran. This research was funded by the National Elites Foundation of Iran in collaboration with the Research Institute of Petroleum Industry in Iran under project number 9265005.
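    As a minimal, unconditional illustration of the truncation idea behind plurigaussian simulation (not the hierarchical, data-conditioned workflow used in the paper), the sketch below generates two Gaussian random fields and truncates them with a simple rock-type rule to produce three facies codes. The thresholds, correlation ranges and FFT-based field generator are assumptions for illustration only.

```python
# Unconditional truncated-plurigaussian sketch: two Gaussian fields plus a
# rock-type rule yield a categorical facies map.
import numpy as np

def gaussian_field(shape, corr_range, rng):
    """Unconditional Gaussian field via spectral (FFT) filtering of white noise."""
    ny, nx = shape
    ky = np.fft.fftfreq(ny)[:, None]
    kx = np.fft.fftfreq(nx)[None, :]
    # Gaussian-shaped power spectrum; larger corr_range gives a smoother field.
    spectrum = np.exp(-0.5 * corr_range ** 2 * (2 * np.pi) ** 2 * (kx ** 2 + ky ** 2))
    noise = rng.standard_normal(shape)
    field = np.real(np.fft.ifft2(np.fft.fft2(noise) * np.sqrt(spectrum)))
    return (field - field.mean()) / field.std()

rng = np.random.default_rng(0)
z1 = gaussian_field((200, 200), corr_range=15.0, rng=rng)
z2 = gaussian_field((200, 200), corr_range=30.0, rng=rng)

# Rock-type rule: facies 0 where z1 < t1; otherwise split facies 1 / 2 by z2.
t1, t2 = -0.3, 0.4  # thresholds would normally be set from target facies proportions
facies = np.where(z1 < t1, 0, np.where(z2 < t2, 1, 2))
```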

    Non-stationary covariance function modelling in 2D least-squares collocation

    Standard least-squares collocation (LSC) assumes 2D stationarity and 3D isotropy, and relies on a covariance function to account for spatial dependence in the observed data. However, the assumption that the spatial dependence is constant throughout the region of interest may sometimes be violated. Assuming a stationary covariance structure can result in over-smoothing of, e.g., the gravity field in mountains and under-smoothing in great plains. We introduce the kernel convolution method from spatial statistics for non-stationary covariance structures and demonstrate its advantage for dealing with non-stationarity in geodetic data. We then compare stationary and non-stationary covariance functions in 2D LSC on the empirical example of gravity anomaly interpolation near the Darling Fault, Western Australia, where the field is anisotropic and non-stationary. The results with non-stationary covariance functions are better than standard LSC in terms of formal errors and cross-validation against data not used in the interpolation, demonstrating that the use of non-stationary covariance functions can improve upon standard (stationary) LSC.
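    As an illustration of the kernel-convolution construction (in the Paciorek-Schervish closed form for isotropic Gaussian kernels), the sketch below builds a non-stationary covariance matrix from a spatially varying length scale. It is a generic sketch under assumed parameters, not the covariance model fitted near the Darling Fault.

```python
# Non-stationary covariance via kernel convolution: each location carries its
# own (isotropic) kernel width ell(x); the closed form below is valid in 2D.
import numpy as np

def nonstationary_cov(X1, X2, ell_fn, sigma2=1.0):
    """X1: (n, 2), X2: (m, 2) coordinates; ell_fn maps (k, 2) -> (k,) local length scales."""
    l1 = ell_fn(X1)[:, None]            # (n, 1) local kernel widths
    l2 = ell_fn(X2)[None, :]            # (1, m)
    avg = 0.5 * (l1 ** 2 + l2 ** 2)     # average of the squared kernel widths
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    # Prefactor comes from the kernel-convolution construction, which keeps the
    # function positive definite even though ell varies in space.
    prefactor = (l1 * l2) / avg
    return sigma2 * prefactor * np.exp(-d2 / avg)

# Example: shorter correlation length (rougher field) west of x = 0, longer to the east.
def ell_fn(X):
    return np.where(X[:, 0] < 0.0, 5.0, 20.0)
```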