
    Integration of geostatistical realizations in data assimilation and reduction of uncertainty process using genetic algorithm combined with multi-start simulated annealing

    This paper introduces a new methodology, combining a Genetic Algorithm (GA) with multi-start simulated annealing, to integrate Geostatistical Realizations (GR) in the data assimilation and uncertainty reduction process. The proposed approach, named Genetic Algorithm with Multi-Start Simulated Annealing (GAMSSA), comprises two parts. The first part consists of running a GA several times, starting with a certain number of geostatistical realizations; the second part consists of running the Multi-Start Simulated Annealing with Geostatistical Realizations (MSSAGR). After each GA execution, the best individuals of each generation are selected and used as starting points for the MSSAGR. To preserve the diversity of the geostatistical realizations, a rule guarantees that a given realization is not repeated among the individuals selected from the GA, ensuring that each Simulated Annealing (SA) process starts from a different GR. Each SA process is responsible for the local improvement of the best individuals by perturbing other reservoir properties such as relative permeability and water-oil contact. The proposed methodology was applied to a complex benchmark case (UNISIM-I-H), based on the Namorado Field in the Campos Basin, Brazil, with 500 geostatistical realizations and 22 other attributes comprising relative permeability, oil-water contact, and rock compressibility. Comparisons with a conventional GA are also shown. The proposed method was able to find multiple solutions while preserving the diversity of the geostatistical realizations and the variability of the other attributes. The matched models found by GAMSSA provided more reliable forecasts than the matched models found by the GA.
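
    The two-part GAMSSA flow described above can be summarized in a short sketch. The following Python snippet is a hedged, minimal illustration, not the authors' code: the individual layout, the attribute names (kr_mult, woc, compressibility), and the misfit callable are assumptions made for the example.

        import math
        import random

        def simulated_annealing(individual, misfit, t0=1.0, t_min=1e-3, alpha=0.9):
            # Local improvement: perturb only scalar attributes, keep the realization fixed.
            current = dict(individual)
            current_of = misfit(current)
            t = t0
            while t > t_min:
                candidate = dict(current)
                for key in ("kr_mult", "woc", "compressibility"):  # illustrative attribute names
                    candidate[key] += random.gauss(0.0, 0.05 * t)
                candidate_of = misfit(candidate)
                # accept improvements always, worse moves with Boltzmann probability
                if candidate_of < current_of or random.random() < math.exp((current_of - candidate_of) / t):
                    current, current_of = candidate, candidate_of
                t *= alpha
            return current, current_of

        def gamssa_second_part(ga_best_individuals, misfit):
            # Multi-start SA seeded by the best GA individuals, one chain per distinct realization.
            used, matched = set(), []
            for ind in ga_best_individuals:
                if ind["realization_id"] in used:  # diversity rule: no repeated realization
                    continue
                used.add(ind["realization_id"])
                matched.append(simulated_annealing(ind, misfit))
            return matched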

    Flow Simulation Using Local Grid Refinements to Model Laminated Reservoirs

    Super-giant carbonate fields, such as Ghawar in Saudi Arabia and Lula in the Brazilian pre-salt, show highly heterogeneous behavior linked to high-permeability intervals in thin layers. This article applies Local Grid Refinements (LGR) integrated with upscaling procedures to improve the representation of highly laminated reservoirs in flow simulation, preserving the static properties and dynamic trends of the geological model. The work was developed in five main steps: (1) define a conventional coarse grid; (2) define LGR in the conventional coarse grid according to super-k intervals and well locations; (3) apply an upscaling procedure for all scenarios; (4) define LGR directly in the simulation model, without integrating geological trends into the LGR; and (5) compare the dynamic response for all cases. To check results and compare upscaling matches, the benchmark model UNISIM-II-R was used, a refined model based on a combination of Brazilian pre-salt and Ghawar field information. The main results show that upscaling geological models to a coarse grid with LGR in highly permeable thin layers provides a dynamic representation closer to the geological characterization than a conventional coarse grid or LGR only near wells. Pseudo-relative permeability curves should be considered for (a) conventional coarse grids or (b) LGR scenarios under dual-medium flow simulation, as the upscaling of discrete fracture networks and dual-medium flow models presents several limitations. The conventional approach of defining LGR directly in the simulation model gives worse results than LGR integrated with upscaling procedures, because the extrapolation of dynamic properties to the coarse blocks mismatches the dynamic behavior of the geological characterization. This work suggests further improvements to upscaling procedures that mask the flow behavior in highly laminated reservoirs.
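
    As a hedged illustration of step (2) in the workflow above, the sketch below flags coarse blocks for refinement where the underlying fine-scale layers exceed a super-k permeability cut-off or where wells are completed; the array shapes, coarsening ratio, and cut-off value are assumptions for the example, not values from the paper.

        import numpy as np

        def lgr_flags(fine_perm, coarsening=(4, 4, 10), superk_cutoff=5000.0, well_ijk=()):
            # Return a boolean array over the coarse grid marking blocks to refine.
            ci, cj, ck = coarsening
            ni, nj, nk = (s // c for s, c in zip(fine_perm.shape, coarsening))
            flags = np.zeros((ni, nj, nk), dtype=bool)
            for i in range(ni):
                for j in range(nj):
                    for k in range(nk):
                        block = fine_perm[i*ci:(i+1)*ci, j*cj:(j+1)*cj, k*ck:(k+1)*ck]
                        flags[i, j, k] = block.max() >= superk_cutoff  # super-k thin layer inside
            for (i, j, k) in well_ijk:  # always refine blocks crossed by wells
                flags[i, j, k] = True
            return flags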

    Evaluation of an uncertainty reduction methodology based on Iterative Sensitivity Analysis (ISA) applied to naturally fractured reservoirs

    History matching for naturally fractured reservoirs is challenging because of the complexity of flow behavior in the fracture-matrix combination. Calibrating these models in a history-matching procedure normally requires integration with geostatistical techniques (Big Loop, where history matching is integrated with reservoir modeling) for proper model characterization. In problems involving complex reservoir models, it is common to apply techniques such as sensitivity analysis to identify the most influential attributes and focus the effort on what most impacts the response. Conventional Sensitivity Analysis (CSA), in which a subset of attributes is fixed at a unique value, may over-reduce the search space so that it is not properly explored. An alternative is Iterative Sensitivity Analysis (ISA), in which CSA is applied multiple times throughout the iterations. ISA follows three main steps: (a) CSA identifies Group i of influential attributes (i = 1, 2, 3, …, n); (b) the uncertainty of Group i is reduced, with the other attributes fixed; and (c) the process returns to step (a) and repeats. Conducting CSA multiple times allows the identification of influential attributes hidden by the high uncertainty of the most influential attributes. In this work, we assess three methods: Method 1 – ISA, Method 2 – CSA, and Method 3 – no sensitivity analysis, i.e., varying all uncertain attributes (larger search space). Results showed that the number of simulation runs for Method 1 dropped 24% compared to Method 3 and 12% compared to Method 2 to reach a similar matching quality of acceptable models; in other words, Method 1 reached a similar quality of results with fewer simulations. Therefore, ISA can perform as well as CSA while demanding fewer simulations. All three methods identified the same five most influential attributes out of the initial 18. Even with many uncertain attributes, only a small percentage is responsible for most of the variability of the responses, and their identification is essential for efficient history matching. For the case presented in this work, a few fracture attributes were responsible for most of the variability of the responses.
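
    The three ISA steps (a)-(c) above map naturally onto a short loop. The sketch below is only a schematic reading of the procedure: conventional_sa and reduce_uncertainty are hypothetical placeholders for whatever sensitivity-analysis and assimilation tools are actually used.

        def iterative_sensitivity_analysis(attributes, conventional_sa, reduce_uncertainty, max_iterations=5):
            # attributes: dict mapping attribute name -> prior uncertainty range
            calibrated = {}
            remaining = dict(attributes)
            for _ in range(max_iterations):
                influential = conventional_sa(remaining)            # (a) CSA on the current attribute set
                if not influential:
                    break                                           # stop when no influential attribute remains
                reduced = reduce_uncertainty(influential, fixed=calibrated)  # (b) reduce uncertainty of Group i only
                calibrated.update(reduced)
                for name in influential:                            # (c) drop Group i and repeat
                    remaining.pop(name, None)
            return calibrated, remaining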

    Petroleum reservoir uncertainty mitigation through the integration with production history matching

    This paper presents a new methodology to deal with uncertainty mitigation using observed data, integrating the uncertainty analysis and history matching processes. The proposed method is robust and easy to use, offering an alternative to traditional history-matching methodologies. The main characteristic of the methodology is the use of observed data as constraints to reduce the uncertainty of the reservoir parameters. The integration of uncertainty analysis with history matching naturally yields prediction under uncertainty. The workflow allows a target range of uncertainty to be established, characterizing a confidence interval of the probabilistic distribution curves around the observed data. A complete workflow of the proposed methodology was carried out on a realistic model based on outcrop data, and the impact of the uncertainty reduction on the production forecast was evaluated. It was demonstrated that for complex cases, with a high number of uncertain attributes and several objective functions, the methodology can be applied in steps, beginning with a field analysis followed by regional and local (well-level) analyses. The main contribution of this work is to provide a practical way to quantify and reduce uncertainties in order to generate reliable scenario-based models for consistent production prediction. Funding: Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq).
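
    A minimal numerical illustration of the core idea, under our own assumptions rather than the paper's exact acceptance criterion: scenarios are kept only if their simulated history stays inside a tolerance band around the observed data, which is what shrinks the parameter uncertainty.

        import numpy as np

        def filter_scenarios(simulated, observed, tolerance=0.1):
            # simulated: (n_models, n_times); observed: (n_times,). Returns accepted model indices.
            lower = observed * (1.0 - tolerance)        # target uncertainty range around the data
            upper = observed * (1.0 + tolerance)
            inside = (simulated >= lower) & (simulated <= upper)
            accepted = np.where(inside.all(axis=1))[0]  # accept only models inside the band at every time step
            return accepted

        # usage: posterior forecasts come from the accepted subset only, e.g.
        # accepted = filter_scenarios(sim_water_rate, obs_water_rate, tolerance=0.05)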

    Developing a workflow to represent fractured carbonate reservoirs for simulation models under uncertainties based on flow unit concept

    Describing fractured reservoir rock under uncertainty in a 3D model and integrating it with reservoir simulation is still a challenging topic. In particular, mapping the potential zones with reservoir quality can be very useful for decision making and for supporting development planning. This mapping can be done through the concept of flow units. In this paper, an integrated approach including Hierarchical Cluster Analysis (HCA), geostatistical modeling, and uncertainty analysis is developed and applied to a fractured carbonate in order to integrate it into numerical simulation. The workflow begins with different HCA methods, applied to well logs from three wells, to identify flow units and rock types. Geostatistical techniques are then applied to extend the flow units, petrophysical properties, and fractures into the inter-well area. Finally, uncertainty analysis is applied to combine different types of uncertainties and generate ensemble reservoir simulation models. The clusters obtained from the different HCA methods are evaluated by the cophenetic coefficient, correlation coefficient, and variation coefficient, and the most appropriate clustering method is used to identify flow units for geostatistical modeling. We subsequently define uncertainties for static and dynamic properties such as permeability, porosity, net-to-gross, fractures, water relative permeability, fluid properties, and rock compressibility. The Discretized Latin Hypercube with Geostatistical realizations (DLHG) method is applied to combine the defined uncertainties and create an ensemble of 200 simulation models that spans the uncertainty space. Finally, a base production strategy is defined under operational conditions to check the consistency and reliability of the created models against UNISIM-II-R (the reference model), which plays the role of a real reservoir with known results. The results demonstrate the suitability of the methodology for characterizing fractured reservoirs, since the generated models are consistent with the reference model (which was used to generate the simulation models). The proposed workflow provides an efficient and useful means of supporting development planning under uncertainty.
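
    The first workflow step, choosing a clustering method for flow-unit identification, can be illustrated with a small self-contained sketch. The log curves, the linkage methods compared, and the number of flow units are assumptions made for the example, not the paper's settings.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster, cophenet
        from scipy.spatial.distance import pdist

        def rank_hca_methods(logs, methods=("ward", "average", "complete"), n_flow_units=5):
            # logs: (n_samples, n_curves) array of normalized well-log values
            distances = pdist(logs)                        # pairwise distances between log samples
            results = {}
            for method in methods:
                tree = linkage(logs, method=method)
                coph_coeff, _ = cophenet(tree, distances)  # how well the tree preserves the distances
                labels = fcluster(tree, t=n_flow_units, criterion="maxclust")
                results[method] = (coph_coeff, labels)
            # pick the method whose dendrogram best honours the original distances
            best = max(results, key=lambda m: results[m][0])
            return best, results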

    Study of a compositional model for reservoir simulation (Estudo de um modelo composicional para simulação de reservatórios)

    Advisor: Antonio Claudio de França Correa. Master's dissertation, Universidade Estadual de Campinas, Faculdade de Engenharia de Campinas. This work describes an isothermal compositional simulator that is implicit in pressure and explicit in saturations and compositions. This type of model is used to simulate the behavior of gas condensate and volatile oil reservoirs, enhanced oil recovery by CO2 or enriched-gas injection, and miscible displacements; it can also be applied to some black-oil problems. The equations are solved by an iterative, sequential method based on the molar balance of the hydrocarbons and the water, following the model of Nghiem, L.X., Fong, D.K., and Aziz, K. Phase equilibrium and phase properties are estimated by an equilibrium flash calculation using an equation of state, either Peng-Robinson or Soave-Redlich-Kwong, among others. An efficient flash calculation allows the model to be used near the critical region. The model was tested for several cases, some of which are presented and discussed. Owing to its simplicity of programming, computational time, and performance, this work can serve as a first step toward the development of a General Purpose Reservoir Simulator. Master's degree in Petroleum Engineering.
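
    The flash step mentioned above can be reduced, for illustration, to a constant-K Rachford-Rice solve. This is a hedged sketch: a real compositional simulator would update the K-values iteratively from the Peng-Robinson or Soave-Redlich-Kwong equation of state rather than taking them as given, and the bisection below assumes a two-phase mixture (max K > 1 > min K).

        def rachford_rice(z, K, tol=1e-10, max_iter=100):
            # Return the vapor fraction V solving sum_i z_i (K_i - 1) / (1 + V (K_i - 1)) = 0,
            # plus the liquid (x) and vapor (y) mole fractions.
            lo = 1.0 / (1.0 - max(K)) + tol   # asymptote bounds bracketing the two-phase root
            hi = 1.0 / (1.0 - min(K)) - tol

            def f(V):
                return sum(zi * (Ki - 1.0) / (1.0 + V * (Ki - 1.0)) for zi, Ki in zip(z, K))

            for _ in range(max_iter):         # bisection between the asymptotes (f is decreasing)
                mid = 0.5 * (lo + hi)
                if f(mid) > 0.0:
                    lo = mid
                else:
                    hi = mid
            V = 0.5 * (lo + hi)
            x = [zi / (1.0 + V * (Ki - 1.0)) for zi, Ki in zip(z, K)]  # liquid mole fractions
            y = [Ki * xi for Ki, xi in zip(K, x)]                      # vapor mole fractions
            return V, x, y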

    A new parameterization method for data assimilation and uncertainty assessment for complex carbonate reservoir models based on cumulative distribution function

    Data assimilation (also known as history matching) and uncertainty assessment is the process of conditioning reservoir models to dynamic data to improve their production forecast capacity. One of the main challenges of the process is representing and updating spatial properties in a geologically consistent way. The process is even more challenging for complex geological systems such as highly channelized reservoirs, fractured systems, and super-K layered reservoirs. Therefore, mainly for highly heterogeneous reservoirs, a proper parameterization scheme is crucial to ensure an effective and consistent process. This paper presents a new approach based on the cumulative distribution function (CDF) for the parameterization of complex geological models, focused on layered reservoirs with high-permeability zones (super-K). The main innovative aspect of this work is a new sampling procedure based on a cut-off frequency. The proposed method is simple to implement and, at the same time, very robust. It properly represents the super-K distribution along the reservoir during the data assimilation process, obtaining good data matches and reducing the uncertainty in the production forecast. The new method, which preserves the prior characteristics of the model, was tested on a complex carbonate reservoir model (the UNISIM-II-H benchmark case) built from a combination of Brazilian pre-salt characteristics and Ghawar field information available in the literature. Promising results, which indicate the robustness of the method, are shown. This work was conducted with the support of Petrobras (Grant Agreement No. 0050.0100204.16.9) and Energi Simulation within the ANP R&D tax as a “commitment to research and development investments”. The authors are grateful for the support of the Center of Petroleum Studies (CEPETRO-UNICAMP/Brazil), the Department of Energy (DE-FEM-UNICAMP/Brazil), and the Research Group in Reservoir Simulation and Management (UNISIM-UNICAMP/Brazil). In addition, special thanks to CMG and Schlumberger Information Solutions for software licenses.
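
    As a loose illustration of the kind of CDF-based parameterization described above (our reading of the idea, not the paper's implementation): each cell's permeability is expressed through a quantile of a prior distribution, and a cut-off frequency decides which quantiles are flagged as super-K.

        import numpy as np

        def cdf_parameterize(prior_perm, quantiles, superk_cutoff_freq=0.9):
            # prior_perm: 1-D prior permeability sample for one layer;
            # quantiles: values in [0, 1] proposed by the data-assimilation update.
            sorted_perm = np.sort(prior_perm)                          # empirical CDF support
            n = len(sorted_perm)
            q = np.asarray(quantiles)
            idx = np.clip((q * (n - 1)).astype(int), 0, n - 1)
            perm = sorted_perm[idx]                                    # quantile -> permeability value
            is_superk = q >= superk_cutoff_freq                        # cells above the cut-off frequency
            return perm, is_superk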