
    Conditioning Facies Simulations with Connectivity Data

    When characterizing and simulating underground reservoirs for flow simulations, one of the key characteristics that must be reproduced accurately is connectivity. More precisely, field observations frequently allow the identification of specific points in space that are connected. For example, in hydrogeology, tracer tests frequently show which springs are connected to which sinkholes. Similarly, well tests often provide connectivity information in a petroleum reservoir. To account for this type of information, we propose a new algorithm to condition stochastic simulations of lithofacies to connectivity information. The algorithm is based on the multiple-point philosophy but does not necessarily imply the use of multiple-point simulation. The challenge lies in generating realizations, for example of a binary medium, such that the connectivity information is honored as well as any prior structural information (e.g. as modeled through a training image). The algorithm uses a training image to build a set of replicates of connected paths that are consistent with the prior model. This is done by scanning the training image to find point locations that satisfy the constraints. Any path (a string of connected cells) between these points is therefore consistent with the prior model. For each simulation, one path from this set is sampled to generate hard conditioning data prior to running the simulation algorithm. The paper presents the algorithm in detail together with examples of two-dimensional and three-dimensional applications with multiple-point simulation.
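
    The path-sampling step can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors' code: it assumes a 2D binary NumPy training image and 4-connectivity, and the function names (sample_connected_replicate, path_between) are hypothetical. It scans the training image for two facies-1 cells separated by the conditioning lag that lie in the same connected body and returns a connected path between them, to be used as hard conditioning data.

```python
import numpy as np
from collections import deque
from scipy import ndimage

def path_between(mask, start, end):
    """Breadth-first search for a string of 4-connected True cells linking start to end."""
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == end:                      # backtrack to recover the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (r + dr, c + dc)
            if (0 <= nb[0] < mask.shape[0] and 0 <= nb[1] < mask.shape[1]
                    and mask[nb] and nb not in prev):
                prev[nb] = cell
                queue.append(nb)
    return None

def sample_connected_replicate(ti, lag, rng, n_tries=10000):
    """Scan the binary training image `ti` for two facies-1 cells separated by the
    conditioning lag (dr, dc) that lie in the same connected body, and return a
    connected path between them (a replicate consistent with the prior model)."""
    labels, _ = ndimage.label(ti)            # connected components of facies 1 (4-connectivity)
    dr, dc = lag
    nrow, ncol = ti.shape
    for _ in range(n_tries):
        r = rng.integers(max(0, -dr), min(nrow, nrow - dr))
        c = rng.integers(max(0, -dc), min(ncol, ncol - dc))
        a, b = (r, c), (r + dr, c + dc)
        if labels[a] > 0 and labels[a] == labels[b]:
            return path_between(labels == labels[a], a, b)
    raise RuntimeError("no connected replicate found; the lag may be inconsistent with the prior")

# Usage sketch: draw one replicate per realization, translate it onto the two field points
# known to be connected, and pass the path cells as hard facies-1 data to the simulation.
rng = np.random.default_rng(0)
ti = (np.random.default_rng(1).random((200, 200)) < 0.65).astype(int)   # stand-in training image
path = sample_connected_replicate(ti, lag=(15, 40), rng=rng)
```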

    BetaZero: Belief-State Planning for Long-Horizon POMDPs using Learned Approximations

    Real-world planning problems, including autonomous driving and sustainable energy applications like carbon storage and resource exploration, have recently been modeled as partially observable Markov decision processes (POMDPs) and solved using approximate methods. To solve high-dimensional POMDPs in practice, state-of-the-art methods use online planning with problem-specific heuristics to reduce planning horizons and make the problems tractable. Algorithms that learn approximations to replace heuristics have recently found success in large-scale problems in the fully observable domain. The key insight is the combination of online Monte Carlo tree search with offline neural network approximations of the optimal policy and value function. In this work, we bring this insight to partially observed domains and propose BetaZero, a belief-state planning algorithm for POMDPs. BetaZero learns offline approximations based on accurate belief models to enable online decision making in long-horizon problems. We address several challenges inherent in large-scale partially observable domains, namely transitioning in stochastic environments, prioritizing action branching with a limited search budget, and representing beliefs as input to the network. We apply BetaZero to various well-established benchmark POMDPs found in the literature. As a real-world case study, we test BetaZero on the high-dimensional geological problem of critical mineral exploration. Experiments show that BetaZero outperforms state-of-the-art POMDP solvers on a variety of tasks.
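
    The core loop can be sketched compactly. The code below is a heavily simplified illustration of the general idea (online Monte Carlo tree search over belief states guided by learned policy and value approximations), not the authors' implementation: `policy_value` and `update` are stand-ins for the trained network and the belief update (e.g. a particle filter), the action set and PUCT constant are arbitrary, and progressive widening over stochastic belief transitions is omitted.

```python
import math
from dataclasses import dataclass, field

import numpy as np

ACTIONS = ["drill", "sense", "stop"]                      # hypothetical action set
rng = np.random.default_rng(0)

def policy_value(belief):
    """Stand-in for the offline-trained network: policy prior over actions and a value estimate."""
    return {a: 1.0 / len(ACTIONS) for a in ACTIONS}, 0.0

def update(belief, action):
    """Stand-in for the belief update (e.g. a particle-filter step) and a sampled reward."""
    return belief, float(rng.normal())

@dataclass
class Node:
    belief: object
    prior: dict                                           # action -> policy prior
    children: dict = field(default_factory=dict)
    N: dict = field(default_factory=dict)                 # visit counts per action
    Q: dict = field(default_factory=dict)                 # running value estimates per action

def puct_action(node, c=1.5):
    """Use the policy prior to prioritize action branching under a limited search budget."""
    total = sum(node.N.values()) + 1
    return max(node.prior, key=lambda a: node.Q.get(a, 0.0)
               + c * node.prior[a] * math.sqrt(total) / (1 + node.N.get(a, 0)))

def simulate(node, depth, gamma=0.95):
    """One tree-search rollout; leaves are evaluated with the learned value function."""
    if depth == 0:
        return policy_value(node.belief)[1]
    a = puct_action(node)
    next_belief, reward = update(node.belief, a)
    if a not in node.children:                            # expand and bootstrap with the network
        prior, value = policy_value(next_belief)
        node.children[a] = Node(next_belief, prior)
        ret = reward + gamma * value
    else:
        ret = reward + gamma * simulate(node.children[a], depth - 1, gamma)
    node.N[a] = node.N.get(a, 0) + 1
    node.Q[a] = node.Q.get(a, 0.0) + (ret - node.Q.get(a, 0.0)) / node.N[a]
    return ret

def plan(belief, n_sims=200, depth=10):
    """Run MCTS from the current belief and return the most-visited root action."""
    root = Node(belief, policy_value(belief)[0])
    for _ in range(n_sims):
        simulate(root, depth)
    return max(root.N, key=root.N.get)

print(plan(belief=None))
```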

    Universal kriging of functional data: trace-variography vs cross-variography? Application to forecasting in unconventional shales

    In this paper we investigate the practical and methodological use of Universal Kriging of functional data to predict unconventional shale gas production at undrilled locations from known production data. Two approaches to Universal Kriging of functional data are considered: (1) estimation by means of Cokriging of functional components (Universal Cokriging, UCok), which requires cross-variography, and (2) estimation by means of trace-variography (Universal Trace-Kriging, UTrK), which avoids cross-variogram modeling. While the two approaches may be essentially equivalent in theory under known variogram structures, their practical application implies different estimation procedures and modeling efforts. We investigate these differences from the methodological viewpoint and by means of a real field application in the Barnett shale play. An extensive Monte Carlo study inspired by this field application is used to support our conclusions.
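
    For concreteness, an empirical trace-variogram can be estimated as half the average integrated squared difference between curves at pairs of locations, binned by separation distance. The sketch below is illustrative only: it assumes a common, uniform time grid and that any drift has already been removed (in Universal Kriging the variogram is fitted on drift residuals), and the function name and binning scheme are not from the paper.

```python
import numpy as np

def empirical_trace_variogram(coords, curves, dt, bin_edges):
    """Empirical trace-variogram for functional data.
    coords: (n_sites, 2) spatial coordinates; curves: (n_sites, n_times) drift residuals
    on a uniform time grid with spacing dt; bin_edges: increasing lag-distance bin edges."""
    n_bins = len(bin_edges) - 1
    gamma = np.zeros(n_bins)
    counts = np.zeros(n_bins, dtype=int)
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            h = np.linalg.norm(coords[i] - coords[j])
            k = np.searchsorted(bin_edges, h) - 1
            if 0 <= k < n_bins:
                # trace semivariance: 0.5 * integral over time of the squared curve difference
                gamma[k] += 0.5 * np.sum((curves[i] - curves[j]) ** 2) * dt
                counts[k] += 1
    return np.divide(gamma, counts, out=np.full(n_bins, np.nan), where=counts > 0)

# Usage sketch with synthetic curves at 30 well locations
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(30, 2))
curves = rng.normal(size=(30, 50))
print(empirical_trace_variogram(coords, curves, dt=1.0, bin_edges=np.linspace(0, 10, 6)))
```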

    Optimizing Carbon Storage Operations for Long-Term Safety

    To combat global warming and mitigate the risks associated with climate change, carbon capture and storage (CCS) has emerged as a crucial technology. However, safely sequestering CO2 in geological formations for long-term storage presents several challenges. In this study, we address these issues by modeling the decision-making process for carbon storage operations as a partially observable Markov decision process (POMDP). We solve the POMDP using belief-state planning to optimize injector and monitoring well locations, with the goal of maximizing stored CO2 while maintaining safety. Empirical results in simulation demonstrate that our approach is effective in ensuring safe long-term carbon storage operations. We showcase the flexibility of our approach by introducing three different monitoring strategies and examining their impact on decision quality. Additionally, we introduce a neural network surrogate model for the POMDP decision-making process to handle the complex dynamics of multi-phase flow. We also investigate the effects of different fidelity levels of the surrogate model on decision quality.
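
    As an illustration of how such a problem can be encoded, the snippet below sketches one plausible reward and observation model that trades stored CO2 against safety. The terms, penalty weight, and function names are assumptions for illustration and are not the study's exact formulation.

```python
import numpy as np

def ccs_reward(injected_mass, trapped_fraction, leaked_mass, pressure, p_max, penalty=1e3):
    """Reward = CO2 securely stored, minus a large penalty for leakage or for formation
    pressure exceeding its safe limit (illustrative formulation only)."""
    stored = injected_mass * trapped_fraction
    unsafe = leaked_mass + max(0.0, pressure - p_max)
    return stored - penalty * unsafe

def monitor_observation(saturation_field, well_cells, noise_std=0.01, rng=None):
    """Observation model for monitoring wells: noisy CO2-saturation readings at well cells."""
    rng = rng or np.random.default_rng()
    return {cell: float(saturation_field[cell]) + rng.normal(0.0, noise_std)
            for cell in well_cells}

# Usage sketch on a toy 2D saturation field
sat = np.zeros((50, 50)); sat[20:30, 20:30] = 0.4
obs = monitor_observation(sat, well_cells=[(25, 25), (10, 40)])
r = ccs_reward(injected_mass=1e6, trapped_fraction=0.8, leaked_mass=0.0, pressure=18.0, p_max=20.0)
```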

    Uncertainty quantification of medium-term heat storage from short-term geophysical experiments using Bayesian Evidential Learning

    In theory, aquifer thermal energy storage (ATES) systems can recover in winter the heat stored in the aquifer during summer, increasing the energy efficiency of the system. In practice, the energy efficiency is often lower than expected from simulations due to spatial heterogeneity of hydraulic properties or unfavorable hydrogeological conditions. A proper design of ATES systems should therefore consider the uncertainty of the prediction related to those parameters. We use a novel framework called Bayesian Evidential Learning (BEL) to estimate the heat storage capacity of an alluvial aquifer using a heat-tracing experiment. BEL consists of two main stages: before and after field data acquisition. Before data acquisition, Monte Carlo simulations and global sensitivity analysis are used to assess the information content of the data for reducing the uncertainty of the prediction. After data acquisition, prior falsification and machine learning based on the same Monte Carlo simulations are used to assess the uncertainty of key prediction variables directly from observations. The result is a full quantification of the posterior distribution of the prediction conditioned on the observed data, without any explicit full model inversion. We demonstrate the methodology in field conditions and validate the framework using independent measurements.

    Plain Language Summary: Geothermal energy can be extracted from, or stored in, shallow aquifers through systems called aquifer thermal energy storage (ATES). In practice, the energy efficiency of those systems is often lower than expected because of uncertainty related to the subsurface. To assess this uncertainty, a common method is to generate multiple models of the subsurface that fit the available data, a process called stochastic inversion. However, this process is time consuming and difficult to apply in practice for real systems. In this contribution, we develop a novel approach, called Bayesian Evidential Learning, that avoids the inversion process. We still use many models of the subsurface, but we do not try to fit the available data. Instead, we use the models to learn a direct relationship between the data and the response of interest to the user. For ATES systems, this response is the energy extracted from the system. The approach predicts the amount of extracted energy together with a quantification of the uncertainty. This framework makes uncertainty assessment easier and faster, a prerequisite for robust risk analysis and decision making. We demonstrate the method in a feasibility study of ATES design.
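
    A compact way to see the idea is the sketch below. It is not the authors' implementation: BEL applications typically use canonical correlation analysis and kernel density estimation in the reduced space and include a prior-falsification check; here a PCA reduction with a linear-Gaussian regression stands in for that machinery, and all function and variable names are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def bel_posterior_samples(D_prior, H_prior, d_obs, n_samples=500, n_comp=3, rng=None):
    """Learn a direct data -> prediction relationship from Monte Carlo runs of the prior
    model, then sample the prediction conditioned on the observed data without inversion.
    D_prior: simulated data (n_runs, n_data); H_prior: simulated predictions (n_runs, n_pred)."""
    rng = rng or np.random.default_rng()
    pca_d, pca_h = PCA(n_comp), PCA(n_comp)
    D = pca_d.fit_transform(D_prior)                 # reduced simulated data
    H = pca_h.fit_transform(H_prior)                 # reduced simulated predictions
    reg = LinearRegression().fit(D, H)               # direct relationship in reduced space
    resid = H - reg.predict(D)
    mean = reg.predict(pca_d.transform(d_obs.reshape(1, -1)))[0]
    cov = np.cov(resid, rowvar=False)                # Gaussian model of the regression residuals
    scores = rng.multivariate_normal(mean, cov, size=n_samples)
    return pca_h.inverse_transform(scores)           # posterior samples of the prediction

# Usage sketch with a synthetic prior ensemble of 200 model runs
rng = np.random.default_rng(0)
m = rng.normal(size=(200, 5))                        # hidden model parameters
D_prior = m @ rng.normal(size=(5, 40))               # simulated data (e.g. temperature curves)
H_prior = m @ rng.normal(size=(5, 20))               # simulated prediction (e.g. recovered energy)
post = bel_posterior_samples(D_prior, H_prior, d_obs=D_prior[0])
```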