    Integration of geostatistical realizations in data assimilation and reduction of uncertainty process using genetic algorithm combined with multi-start simulated annealing

    This paper introduces a new methodology combining a Genetic Algorithm (GA) with multi-start simulated annealing to integrate Geostatistical Realizations (GR) into the data assimilation and uncertainty-reduction process. The proposed approach, named Genetic Algorithm with Multi-Start Simulated Annealing (GAMSSA), comprises two parts. The first part consists of running a GA several times, starting from a certain number of geostatistical realizations; the second part consists of running the Multi-Start Simulated Annealing with Geostatistical Realizations (MSSAGR). After each execution of the GA, the best individuals of each generation are selected and used as starting points for the MSSAGR. To preserve the diversity of the geostatistical realizations, a rule guarantees that a given realization is not repeated among the individuals selected from the GA. This ensures that each Simulated Annealing (SA) process starts from a different GR. Each SA process is responsible for locally improving the best individuals by perturbing other reservoir properties such as relative permeability and water-oil contact. The proposed methodology was applied to a complex benchmark case (UNISIM-I-H), based on the Namorado Field in the Campos Basin, Brazil, with 500 geostatistical realizations and 22 other attributes comprising relative permeability, oil-water contact, and rock compressibility. Comparisons with a conventional GA are also shown. The proposed method was able to find multiple solutions while preserving the diversity of the geostatistical realizations and the variability of the other attributes. The matched models found by GAMSSA provided more reliable forecasts than the matched models found by the GA.
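
    The GAMSSA loop described above lends itself to a compact sketch. The following Python sketch is a minimal, hypothetical illustration of the idea, not the paper's implementation: the misfit function stands in for a reservoir simulation, the attribute ranges and GA operators are placeholders, and the diversity rule keeps one elite per distinct geostatistical realization before each simulated annealing refinement.

```python
# Minimal GAMSSA-style sketch: a GA explores combinations of a geostatistical
# realization index plus continuous attributes, then the best individual per
# generation (with distinct realization indices) seeds an independent
# simulated-annealing refinement. All settings here are illustrative.
import math
import random

N_REALIZATIONS = 500   # geostatistical realization indices
N_ATTRS = 22           # e.g. relative permeability, WOC, compressibility

def misfit(individual):
    """Placeholder history-matching misfit; a real study would run a
    reservoir simulator and compare with observed production data."""
    realization, attrs = individual
    return (realization % 7) * 0.1 + sum((a - 0.5) ** 2 for a in attrs)

def random_individual():
    return (random.randrange(N_REALIZATIONS),
            [random.random() for _ in range(N_ATTRS)])

def ga(generations=20, pop_size=40):
    """Plain GA; returns the best individual of each generation."""
    pop = [random_individual() for _ in range(pop_size)]
    elites = []
    for _ in range(generations):
        pop.sort(key=misfit)
        elites.append(pop[0])
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            (r1, a1), (r2, a2) = random.sample(parents, 2)
            child_attrs = [random.choice(pair) for pair in zip(a1, a2)]
            if random.random() < 0.2:  # mutation
                child_attrs[random.randrange(N_ATTRS)] = random.random()
            children.append((random.choice([r1, r2]), child_attrs))
        pop = parents + children
    return elites

def sa_refine(individual, steps=200, t0=1.0):
    """Local SA: perturb only the continuous attributes, keeping the
    geostatistical realization fixed, as the abstract describes."""
    realization, attrs = individual
    best, best_f = attrs[:], misfit(individual)
    cur, cur_f = attrs[:], best_f
    for k in range(steps):
        t = t0 * (1 - k / steps)
        cand = cur[:]
        i = random.randrange(N_ATTRS)
        cand[i] = min(1.0, max(0.0, cand[i] + random.gauss(0, 0.05)))
        f = misfit((realization, cand))
        if f < cur_f or random.random() < math.exp((cur_f - f) / max(t, 1e-9)):
            cur, cur_f = cand, f
            if f < best_f:
                best, best_f = cand[:], f
    return (realization, best), best_f

# GAMSSA: run the GA, keep one elite per distinct realization (the
# diversity rule), then start one SA chain from each of them.
elites = ga()
seen, seeds = set(), []
for ind in sorted(elites, key=misfit):
    if ind[0] not in seen:
        seen.add(ind[0])
        seeds.append(ind)
results = [sa_refine(s) for s in seeds]
for (realization, _), f in sorted(results, key=lambda x: x[1])[:5]:
    print(f"realization {realization}: misfit {f:.4f}")
```

    In the paper's setting each misfit evaluation is a full flow simulation, which is why restarting SA only from distinct realizations matters: it avoids spending simulations refining the same region of the search space.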

    Flow Simulation Using Local Grid Refinements to Model Laminated Reservoirs

    Super-giant carbonate fields, such as Ghawar in Saudi Arabia and Lula in the Brazilian pre-salt, show highly heterogeneous behavior linked to high-permeability intervals in thin layers. This article applies Local Grid Refinements (LGR) integrated with upscaling procedures to improve the representation of highly laminated reservoirs in flow simulation, preserving the static properties and dynamic trends of the geological model. The work was developed in five main steps: (1) define a conventional coarse grid; (2) define LGR in the conventional coarse grid according to super-k intervals and well locations; (3) apply an upscaling procedure for all scenarios; (4) define LGR directly in the simulation model, without integrating geological trends into the LGR; and (5) compare the dynamic responses of all cases. To check results and compare upscaling matches, the benchmark model UNISIM-II-R was used, a refined model based on a combination of Brazilian pre-salt and Ghawar field information. The main results show that upscaling geological models to a coarse grid with LGR in highly permeable thin layers represents the dynamic behavior of the geological characterization more closely than a conventional coarse grid or LGR only near wells. Pseudo-relative permeability curves should be considered for (a) conventional coarse grids or (b) LGR scenarios under dual-medium flow simulation, as the upscaling of discrete fracture networks and dual-medium flow models presents several limitations. The conventional approach of defining LGR directly in the simulation model gives worse results than LGR integrated with upscaling procedures, because extrapolating dynamic properties to the coarse blocks mismatches the dynamic behavior of the geological characterization. This work suggests further improvements to upscaling procedures that would otherwise mask the flow behavior in highly laminated reservoirs.
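
    As a rough illustration of steps (2) and (3) combined, the sketch below upscales a fine permeability grid to coarse blocks but keeps the fine cells wherever a super-k value is detected, mimicking LGR placed in highly permeable thin layers. Grid sizes, the threshold, and the geometric-average rule are assumptions for the example, not values from the study.

```python
# Upscale a fine geological permeability grid to coarse blocks, except where
# a thin high-permeability ("super-k") layer is detected; there the fine
# (locally refined) cells are retained. Illustrative values throughout.
import numpy as np

rng = np.random.default_rng(0)
fine = rng.lognormal(mean=3.0, sigma=0.5, size=(12, 12))  # fine-grid perm, mD
fine[5, :] = 5000.0                                       # thin super-k layer

BLOCK = 4                  # 4x4 fine cells per coarse block
SUPER_K = 1000.0           # mD threshold flagging super-k intervals

def upscale_block(block):
    """Geometric average, a common single-phase upscaling choice."""
    return float(np.exp(np.log(block).mean()))

coarse = {}                # (i, j) -> scalar perm or the retained fine cells
for i in range(0, fine.shape[0], BLOCK):
    for j in range(0, fine.shape[1], BLOCK):
        block = fine[i:i + BLOCK, j:j + BLOCK]
        if (block > SUPER_K).any():
            coarse[(i // BLOCK, j // BLOCK)] = block.copy()  # keep LGR cells
        else:
            coarse[(i // BLOCK, j // BLOCK)] = upscale_block(block)

for key, val in coarse.items():
    tag = "LGR kept" if isinstance(val, np.ndarray) else f"{val:8.1f} mD"
    print(key, tag)
```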

    Evaluation of an uncertainty reduction methodology based on Iterative Sensitivity Analysis (ISA) applied to naturally fractured reservoirs

    History matching for naturally fractured reservoirs is challenging because of the complexity of flow behavior in the fracture-matrix combination. Calibrating these models in a history-matching procedure normally requires integration with geostatistical techniques (Big Loop, where history matching is integrated with reservoir modeling) for proper model characterization. In problems involving complex reservoir models, it is common to apply techniques such as sensitivity analysis to identify the most influential attributes and focus effort on what most impacts the response. Conventional Sensitivity Analysis (CSA), in which a subset of attributes is fixed at a unique value, may over-reduce the search space so that it is not properly explored. An alternative is Iterative Sensitivity Analysis (ISA), in which CSA is applied multiple times throughout the iterations. ISA follows three main steps: (a) CSA identifies Group i of influential attributes (i = 1, 2, 3, …, n); (b) the uncertainty of Group i is reduced, with the other attributes held at fixed values; and (c) the process returns to step (a) and repeats. Conducting CSA multiple times allows the identification of influential attributes hidden by the high uncertainty of the most influential ones. In this work, we assess three methods: Method 1 – ISA; Method 2 – CSA; and Method 3 – no sensitivity analysis, i.e., varying all uncertain attributes (a larger search space). Results showed that the number of simulation runs for Method 1 dropped by 24% compared with Method 3 and by 12% compared with Method 2 to reach a similar matching quality of acceptable models. In other words, Method 1 reached a similar quality of results with fewer simulations; ISA can therefore perform as well as CSA while demanding fewer simulations. All three methods identified the same five most influential attributes out of the initial 18. Even with many uncertain attributes, only a small percentage is responsible for most of the variability of the responses, and their identification is essential for efficient history matching. For the case presented in this work, a few fracture attributes were responsible for most of the variability of the responses.
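
    The three ISA steps map directly onto a simple loop. The sketch below is a schematic stand-in, with a toy linear response in place of a reservoir simulation and a one-at-a-time effect measure in place of the study's sensitivity analysis; it shows the iterate-screen-reduce structure, not the paper's actual procedure.

```python
# ISA-style loop: run a sensitivity screen, reduce the uncertainty of the
# influential group, fix it, and repeat on the remaining attributes until
# none are left. Response function and thresholds are placeholders.
import random

N_ATTRS = 18
TRUE_W = [random.uniform(0, 1) for _ in range(N_ATTRS)]

def response(x):
    """Placeholder reservoir response (stand-in for a simulation run)."""
    return sum(w * v for w, v in zip(TRUE_W, x))

def csa(ranges, active):
    """One-at-a-time screen: effect of sweeping each active attribute
    across its current range, with all others held at mid-range."""
    base = [0.5 * (lo + hi) for lo, hi in ranges]
    effects = {}
    for i in active:
        lo, hi = ranges[i]
        x_lo, x_hi = base[:], base[:]
        x_lo[i], x_hi[i] = lo, hi
        effects[i] = abs(response(x_hi) - response(x_lo))
    return effects

ranges = [(0.0, 1.0)] * N_ATTRS
active = set(range(N_ATTRS))
iteration = 0
while active:
    iteration += 1
    effects = csa(ranges, active)
    cutoff = 0.5 * max(effects.values())
    group = {i for i, e in effects.items() if e >= cutoff}
    # "Reduce uncertainty" of the influential group: shrink its ranges
    # around the midpoint (a stand-in for conditioning to observed data).
    for i in group:
        lo, hi = ranges[i]
        mid = 0.5 * (lo + hi)
        ranges[i] = (mid - 0.25 * (hi - lo), mid + 0.25 * (hi - lo))
    active -= group
    print(f"iteration {iteration}: influential group {sorted(group)}")
```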

    Petroleum reservoir uncertainty mitigation through the integration with production history matching

    This paper presents a new methodology for uncertainty mitigation using observed data, integrating the uncertainty analysis and history matching processes. The proposed method is robust and easy to use, offering an alternative to traditional history matching methodologies. Its main characteristic is the use of observed data as constraints to reduce the uncertainty of the reservoir parameters. The integration of uncertainty analysis with history matching naturally yields prediction under uncertainty. The workflow makes it possible to establish a target range of uncertainty that characterizes a confidence interval of the probabilistic distribution curves around the observed data. A complete workflow of the proposed methodology was carried out on a realistic model based on outcrop data, and the impact of the uncertainty reduction on the production forecast was evaluated. It was demonstrated that for complex cases, with a high number of uncertain attributes and several objective functions, the methodology can be applied in steps, beginning with a field-level analysis followed by regional and local (well-level) analyses. The main contribution of this work is to provide a practical way to quantify and reduce uncertainties in order to generate reliable scenario-based models for consistent production prediction. Funding: Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq).
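
    The central idea, observed data acting as an acceptance constraint on the parameter space, can be sketched as a simple rejection filter. Everything below is illustrative: the two multipliers, the response model, and the ±5% tolerance band are assumptions, and a real application would use simulation runs and the paper's objective functions.

```python
# Keep only parameter combinations whose simulated response falls inside a
# tolerance band around the observed value, and report how the accepted set
# narrows the parameter ranges (uncertainty reduction).
import random

def simulate(porosity_mult, perm_mult):
    """Placeholder for a reservoir simulation's cumulative oil response."""
    return 100.0 * porosity_mult * perm_mult ** 0.5

OBSERVED = 100.0
TOLERANCE = 0.05        # +/-5% band around the observed value

prior = [(random.uniform(0.7, 1.3), random.uniform(0.5, 2.0))
         for _ in range(5000)]
posterior = [(p, k) for p, k in prior
             if abs(simulate(p, k) - OBSERVED) / OBSERVED <= TOLERANCE]

def spread(samples, idx):
    """5th-95th percentile range of one parameter in a sample set."""
    vals = sorted(s[idx] for s in samples)
    return vals[int(0.05 * len(vals))], vals[int(0.95 * len(vals)) - 1]

print(f"accepted {len(posterior)} of {len(prior)} models")
print("porosity multiplier 90% range:", spread(prior, 0), "->", spread(posterior, 0))
print("perm multiplier 90% range:   ", spread(prior, 1), "->", spread(posterior, 1))
```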

    X-ray computed tomography as a technique for nondestructive testing of materials

    Advisors: Antonio Celso Fonseca de Arruda and Roberto de Alencar Lotufo. Master's dissertation, Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica, in Materials and Manufacturing Processes.
    Abstract: X-ray computed tomography (XCT), originally developed for medical purposes, has spread well beyond the medical field and is increasingly applied wherever the internal structure of an opaque material must be interpreted, qualitatively and quantitatively, without destroying it. The present work demonstrates the application of this technique to nondestructive testing, using samples of materials and components of different densities and geometries, with simulated and real defects, examined in medical scanners. The characterization by XCT of electrochemical filters used to remove contaminants (in this case, zinc) from industrial effluents is also described. Digital image processing techniques (the Khoros system) were used to characterize the detected defects, by measuring the attenuation coefficient of the material in regions of interest and by computing dimensional parameters such as area and perimeter. Mathematical filtering operations were applied to correct the beam-hardening effect observed in images of metallic materials, especially aluminum. Owing to its qualitative and quantitative nature, X-ray computed tomography proved to be a promising tool for nondestructive materials testing, and this work demonstrates and reinforces its applicability through digital image processing.
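
    The measurement steps described, mean attenuation in a region of interest plus area and perimeter of a segmented defect, are straightforward to sketch. The snippet below uses a synthetic slice and NumPy; the thesis used the Khoros system, so this is only a schematic restatement of the analysis, not its code.

```python
# Measure the mean attenuation in a region of interest of a CT slice and
# estimate a defect's area and perimeter by thresholding. Synthetic data.
import numpy as np

slice_ct = np.full((64, 64), 0.20)         # background attenuation (1/cm)
slice_ct[30:38, 30:38] = 0.02              # simulated void (low attenuation)

roi = slice_ct[25:45, 25:45]
print(f"mean attenuation in ROI: {roi.mean():.3f} 1/cm")

defect = slice_ct < 0.10                   # threshold segments the void
area = int(defect.sum())                   # pixels; multiply by pixel area

# Perimeter estimate: count defect pixels with at least one non-defect
# 4-neighbour (a simple boundary-pixel count).
padded = np.pad(defect, 1, constant_values=False)
boundary = defect & ~(padded[:-2, 1:-1] & padded[2:, 1:-1]
                      & padded[1:-1, :-2] & padded[1:-1, 2:])
print(f"defect area: {area} px, boundary pixels: {int(boundary.sum())}")
```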

    Development of methods for the evaluation of filtration processes through numerical simulation and X-ray tomography

    Advisor: Antonio Celso Fonseca de Arruda. Doctoral thesis, Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica, in Materials and Manufacturing Processes.
    Abstract: The objective of this work was to develop methodologies for the study of filters and filtration processes using X-ray computed tomography, digital image processing, and numerical simulation. The tests conventionally used by filter manufacturers measure parameters such as pressure and flow rate at points upstream and downstream of the filter; they are not efficient at characterizing defects and do not reveal how the saturation process occurs inside the filter. In this work, X-ray computed tomography was used for two basic purposes: to study the distribution of contaminants and to analyze defects inside filter elements. The filtration models found in the literature do not account for the effect of contaminant accumulation on filter performance over the course of the filtration process. Here, the filtration equations were coupled with the phenomenological equations (Darcy's law and the continuity equation) to develop a model in which the accumulation of particles in the porous medium (the filter) is considered in the simulation of the filtration process. The results showed that the saturation process inside the analyzed elements is not homogeneous: particle accumulation predominates in certain regions. In general, preferential channels form, and the internal volume of the filter is not fully used for particle capture. It was also shown that, in some cases, the filter behavior does not agree with the manufacturer's specifications. Finally, the experimental data obtained by tomography were used to validate the theoretical model developed.
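
    The coupling described, deposition reducing permeability, which in turn raises the pressure drop under Darcy flow, can be sketched in one dimension. The constants and the closure laws below (a linear capture rate and a 1/(1 + beta*sigma) permeability damage law) are illustrative assumptions, not the thesis model.

```python
# 1D sketch: couple a filtration (deposition) equation to Darcy flow so that
# accumulated particles reduce local permeability over time.
import numpy as np

NX, NT = 50, 400
dx, dt = 1.0 / NX, 1e-3
u = 1.0                      # superficial velocity (constant-rate filtration)
lam = 5.0                    # filtration coefficient (1/m)
k0, beta = 1.0, 30.0         # clean permeability and damage factor

c = np.zeros(NX)             # suspended particle concentration
sigma = np.zeros(NX)         # deposited specific volume
c_in = 1.0                   # inlet concentration

for _ in range(NT):
    # Deposition: d(sigma)/dt = lam * u * c
    dep = lam * u * c * dt
    sigma += dep
    # Advection with capture: upwind transport of the suspension.
    c_up = np.concatenate(([c_in], c[:-1]))
    c += dt * (-u * (c - c_up) / dx) - dep

# Permeability damage, e.g. k = k0 / (1 + beta * sigma); the pressure drop
# at constant rate then follows from Darcy's law: dp = u * dx / k per cell.
k = k0 / (1.0 + beta * sigma)
dp_total = float(np.sum(u * dx / k))
print(f"deposit near inlet: {sigma[0]:.3f}, pressure drop: {dp_total:.3f}")
```

    The non-homogeneous saturation reported in the thesis shows up even in this toy model: deposition concentrates near the inlet, so the downstream volume of the filter contributes little to particle capture.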

    A new parameterization method for data assimilation and uncertainty assessment for complex carbonate reservoir models based on cumulative distribution function

    Data assimilation (also known as history matching) and uncertainty assessment is the process of conditioning reservoir models to dynamic data to improve their production forecast capacity. One of the main challenges of the process is representing and updating spatial properties in a geologically consistent way. The process is even more challenging for complex geological systems such as highly channelized reservoirs, fractured systems, and super-K layered reservoirs. Therefore, especially for highly heterogeneous reservoirs, a proper parameterization scheme is crucial to ensure an effective and consistent process. This paper presents a new approach based on the cumulative distribution function (CDF) for the parameterization of complex geological models, focused on layered reservoirs with high-permeability zones (super-K). The main innovative aspect of this work is a new sampling procedure based on a cut-off frequency. The proposed method is simple to implement and, at the same time, very robust. It properly represents the super-K distribution along the reservoir during the data assimilation process, obtaining good data matches and reducing the uncertainty in the production forecast. The new method, which preserves the prior characteristics of the model, was tested on a complex carbonate reservoir model (the UNISIM-II-H benchmark case) built from a combination of Brazilian pre-salt characteristics and Ghawar field information available in the literature. Promising results, which indicate the robustness of the method, are shown. This work was conducted with the support of Petrobras (Grant Agreement No. 0050.0100204.16.9) and Energi Simulation within the ANP R&D tax as a “commitment to research and development investments”. The authors are grateful for the support of the Center of Petroleum Studies (CEPETRO-UNICAMP/Brazil), the Department of Energy (DE-FEM-UNICAMP/Brazil), and the Research Group in Reservoir Simulation and Management (UNISIM-UNICAMP/Brazil). Special thanks to CMG and Schlumberger Information Solutions for software licenses.
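
    Although the abstract does not spell out the sampling rule, the general mechanics of CDF-based parameterization can be sketched: work on uniform CDF scores so that any update, mapped back through the prior's inverse CDF, preserves the prior distribution, with a cut-off frequency marking the super-K tail. The sketch below is a generic illustration under those assumptions, not the paper's method.

```python
# Generic CDF-domain parameterization: map a permeability field to empirical
# CDF ranks, perturb the ranks (stand-in for a data-assimilation update),
# and map back through the prior's inverse CDF so the prior distribution,
# including the super-K tail above the cut-off, is preserved.
import numpy as np

rng = np.random.default_rng(1)
prior = rng.lognormal(3.0, 1.0, size=2000)      # prior permeability sample, mD
prior_sorted = np.sort(prior)

def to_cdf(values):
    """Empirical CDF rank (0..1) of each value against the prior."""
    return np.searchsorted(prior_sorted, values) / len(prior_sorted)

def from_cdf(u):
    """Inverse empirical CDF: quantile lookup into the sorted prior."""
    idx = np.clip((u * len(prior_sorted)).astype(int), 0, len(prior_sorted) - 1)
    return prior_sorted[idx]

CUTOFF = 0.95                                   # top 5% of the CDF = super-K
field = rng.lognormal(3.0, 1.0, size=500)       # one realization
u = to_cdf(field)

# Perturb the uniform scores; clipping keeps them valid, and mapping back
# preserves the shape of the prior distribution.
u_new = np.clip(u + rng.normal(0.0, 0.05, size=u.shape), 0.0, 1.0)
field_new = from_cdf(u_new)

print("super-K fraction before:", float((u >= CUTOFF).mean()))
print("super-K fraction after: ", float((u_new >= CUTOFF).mean()))
print("updated max within prior range:",
      float(field_new.max()), "<=", float(prior_sorted[-1]))
```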

    A new approach with multiple realizations for image perturbation using co-simulation and probability perturbation method

    History matching is an inverse problem with multiple possible answers. The petrophysical properties of a reservoir are highly uncertain because data points are scarce and widely scattered. Some methods reduce uncertainty in petrophysical characterization; however, they commonly use a single matched model as a reference, which may excessively reduce uncertainty. Choosing a single image may cause the model to converge to a local minimum, yielding less reliable history matching. This work improves on the history matching presented by Oliveira et al. ((2017a) J. Petrol. Sci. Eng. 153, 111–122) using a benchmark model (UNISIM-I-H, based on the Namorado field in Brazil). We use a new approach for the Probability Perturbation Method and image perturbation using co-simulation. Instead of using a single image as the reference, a set of best images is used to increase variability in the properties of the reservoir model while matching production data with history data. This approach mitigates the risk of the potentially excessive reduction of uncertainty that can occur when using a single model. Our methodology also introduces a new objective function for water breakthrough, improving model quality given the importance of matching the water breakthrough in the process. Our proposed methodology for image perturbation uses UNISIM-I-H, which comprises 25 wells and has 11 years of history data. Our methodology made the calibration process more effective than the history matching of Oliveira et al. ((2017a) J. Petrol. Sci. Eng. 153, 111–122). Cross-influence was minimized, making the history matching more objective and efficient and, consequently, the production forecasts more reliable.
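
    The key departure from the earlier work, a target probability built from several best images rather than one, fits in a few lines. The facies model, the blending rule, and all numbers below are simplified stand-ins for the paper's co-simulation workflow; the point is only that averaging over a set of best images keeps more variability than copying a single reference.

```python
# Probability-perturbation sketch with multiple reference images: the target
# probability comes from an ensemble of best matched models, and a
# deformation parameter r blends the prior toward it.
import numpy as np

rng = np.random.default_rng(2)
N_CELLS = 1000

prior_p = np.full(N_CELLS, 0.3)                 # prior P(facies = sand)
best_images = rng.random((5, N_CELLS)) < 0.3    # set of best matched models

# Target probability from the ensemble of best images, not a single one.
target_p = best_images.mean(axis=0)

def perturb(r):
    """Probability-perturbation step: blend prior toward the target with
    deformation parameter r in [0, 1], then draw a new realization."""
    p = (1.0 - r) * prior_p + r * target_p
    return rng.random(N_CELLS) < p

for r in (0.0, 0.5, 1.0):
    real = perturb(r)
    agree = (real[None, :] == best_images).mean()
    print(f"r={r:.1f}: sand fraction {real.mean():.3f}, "
          f"agreement with best images {agree:.3f}")
```

    In an actual history-matching loop, r would be tuned so the redrawn realization improves the production match, and the draw would be replaced by geostatistical co-simulation to honor spatial continuity.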