27 research outputs found

    Reservoir characterisation using artificial bee colony optimisation

    To obtain an accurate estimation of reservoir performance, the reservoir should be properly characterised. One of the main stages of reservoir characterisation is the calibration of rock property distributions with flow performance observations, which is known as history matching. The history matching procedure consists of three distinct steps: parameterisation, regularisation and optimisation. In this study, a Bayesian framework and a pilot-point approach are used for regularisation and parameterisation. The major focus of this paper is optimisation, which plays a crucial role in the reliability and quality of history matching. Several optimisation methods have been studied for history matching, including the genetic algorithm (GA), ant colony, particle swarm, Gauss-Newton, Levenberg-Marquardt and limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) methods. One of the most recent optimisation algorithms used in different fields is artificial bee colony (ABC). In this study, the application of ABC to history matching is investigated for the first time. ABC is derived from the intelligent foraging behaviour of honey bees. A colony of honey bees is comprised of employed bees, onlookers and scouts. Employed bees look for food sources based on their knowledge, onlookers make decisions for foraging using the employed bees' observations, and scouts search for food randomly. To investigate the application of ABC to history matching, its results for two different synthetic cases are compared with the outcomes of three other optimisation methods: a real-valued GA, simulated annealing (SA) and pre-conditioned steepest descent (PSD). In the first case, history matching using ABC afforded a better result than GA and SA: ABC reached a lower fitness value in a reasonable number of evaluations, which indicates the performance and execution-time capability of the method, although it did not appear as efficient as PSD. In the second case, SA and PSD did not perform acceptably.
GA achieved a better result than SA and PSD, but its results were not as good as ABC's. ABC is not concerned with the shape of the landscape, that is, whether it is smooth or rugged. Since there is no precise information about the landscape shape of the history matching function, it can be concluded that by using ABC, there is a high chance of achieving high-quality history matching and reservoir characterisation.
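The employed-onlooker-scout division described above can be sketched as a minimal, generic ABC loop. This is not the authors' implementation; the toy quadratic misfit, colony size and abandonment limit are illustrative assumptions only.

```python
import random

def abc_minimise(f, bounds, n_food=10, limit=20, max_evals=2000, seed=0):
    """Minimise f over box bounds with a basic artificial bee colony.

    Employed bees perturb their own food source, onlookers pick sources
    in proportion to fitness, and scouts re-initialise sources that have
    stagnated for more than `limit` trials.
    """
    rng = random.Random(seed)
    dim = len(bounds)

    def rand_point():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    def neighbour(i):
        # perturb one dimension towards/away from a random partner source
        k = rng.randrange(n_food - 1)
        k = k if k < i else k + 1
        j = rng.randrange(dim)
        x = foods[i][:]
        x[j] += rng.uniform(-1.0, 1.0) * (foods[i][j] - foods[k][j])
        lo, hi = bounds[j]
        x[j] = min(max(x[j], lo), hi)
        return x

    foods = [rand_point() for _ in range(n_food)]
    costs = [f(x) for x in foods]
    trials = [0] * n_food
    evals = n_food
    best_x, best_c = min(zip(foods, costs), key=lambda t: t[1])
    best_x = best_x[:]

    while evals < max_evals:
        # employed-bee phase: greedy local move around each source
        for i in range(n_food):
            cand = neighbour(i)
            c = f(cand); evals += 1
            if c < costs[i]:
                foods[i], costs[i], trials[i] = cand, c, 0
            else:
                trials[i] += 1
        # onlooker phase: fitness-proportional selection of sources
        fit = [1.0 / (1.0 + c) for c in costs]
        total = sum(fit)
        for _ in range(n_food):
            r, acc, i = rng.uniform(0.0, total), 0.0, 0
            for i, fi in enumerate(fit):
                acc += fi
                if acc >= r:
                    break
            cand = neighbour(i)
            c = f(cand); evals += 1
            if c < costs[i]:
                foods[i], costs[i], trials[i] = cand, c, 0
            else:
                trials[i] += 1
        # scout phase: abandon stagnated sources
        for i in range(n_food):
            if trials[i] > limit:
                foods[i], trials[i] = rand_point(), 0
                costs[i] = f(foods[i]); evals += 1
        i = min(range(n_food), key=costs.__getitem__)
        if costs[i] < best_c:
            best_x, best_c = foods[i][:], costs[i]
    return best_x, best_c

# toy stand-in for a history-matching misfit: a 3-variable sphere function
best_x, best_c = abc_minimise(lambda x: sum(v * v for v in x),
                              bounds=[(-5.0, 5.0)] * 3)
```

Onlookers reinforce promising food sources while the scout phase guards against stagnation, which is part of why ABC is relatively insensitive to whether the landscape is smooth or rugged.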

    Uncertainty quantification using a multi-objectivised randomised maximum likelihood method

    Extended abstract, Technical Programme, Tuesday, Paper TU P2.06; see zipped files at https://events.eage.org/-/media/files/events/2017/europe/paris-2017/tp/paris-2017-extended-abstracts_technical programme_tuesday.zip?la=en
    To propagate uncertainty in reservoir production forecasts, it is typically required to sample a nonlinear and multimodal posterior density function. To do so, different techniques have been proposed and used, such as Markovian algorithms, data assimilation methods and the randomised maximum likelihood (RML) method. Several studies have shown that the RML method provides a reasonable approximation of the posterior distribution, despite the fact that it has no rigorous theoretical foundation for nonlinear problems. In order to reduce the computation and also provide an extensive search of multimodal density functions, in this study the RML method is cast in the context of a multi-objective genetic algorithm, in which each of the equations is considered as a separate objective function. The proposed technique was compared against a Metropolis-Hastings algorithm and an RML with a Levenberg-Marquardt minimiser, using the IC-Fault model. The comparison showed that an acceptable set of samples for uncertainty quantification is obtained; given that the parallelisation of the algorithm is straightforward, the proposed algorithm is efficient in terms of total processing time. M. Sayyafzadeh
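The multi-objectivisation can be illustrated as follows: each perturbed realisation of the observed data defines its own weighted data misfit, and these misfits are handed to a multi-objective optimiser as separate objectives. This is only a sketch under stated assumptions; the toy forward model and noise levels are placeholders, and the model-prior term of the full RML objective is omitted for brevity.

```python
import random

def rml_objectives(forward, d_obs, sigma_d, n_real=4, seed=0):
    """Build one data-misfit objective per perturbed data realisation.

    forward(m): simulator stand-in mapping a model m to predicted data.
    Each objective is a noise-weighted sum of squared residuals against
    its own perturbed copy of the observations (RML-style perturbation).
    """
    rng = random.Random(seed)
    perturbed = [[d + rng.gauss(0.0, s) for d, s in zip(d_obs, sigma_d)]
                 for _ in range(n_real)]

    def make(dk):
        # factory avoids Python's late-binding of loop variables
        return lambda m: sum(((g - d) / s) ** 2
                             for g, d, s in zip(forward(m), dk, sigma_d))

    return [make(dk) for dk in perturbed]

# toy linear "simulator": one model parameter, two measurements
objs = rml_objectives(lambda m: [m[0], 2.0 * m[0]],
                      d_obs=[1.0, 2.0], sigma_d=[0.1, 0.1])
```

A multi-objective GA would then evolve models against all of these objectives at once, so a single run explores the multimodal posterior rather than performing independent minimisations.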

    Assessment of different model-management techniques in history matching problems for reservoir modelling

    History matching is a computationally expensive inverse problem. The computational costs are dominantly associated with the optimisation step. Fitness approximation (proxy-modelling) approaches are common methods for reducing computational costs, whereby the time-consuming original fitness function is substituted with an undemanding function known as the approximation function (proxy). Almost all of the fitness approximation methods applied in history-matching problems use a similar approach, called uncontrolled fitness approximation. It has been corroborated that the uncontrolled fitness approximation approach may mislead the optimisation towards a wrong optimum point. To prevent this error, it is recommended that the original function be utilised along with the approximation function during the optimisation process. To make use of the original function efficiently, a model-management (evolution-control) technique should be applied. There are three different techniques: individual-based, population-based and adaptive. By using each of these techniques, a controlled fitness approximation approach is assembled, which benefits from online learning. In the first two techniques, the number of original function evaluations in each evolution-control cycle is fixed and predefined, which may result in inefficient model management. In the adaptive technique, the number is altered based on the fidelity of the approximation function. In this study, a specific adaptive technique is designed based on heuristic fuzzy rules; then, for the first time, the applications of all three techniques are investigated in history matching.
To deliver an assessment of the four approaches (the uncontrolled approach and the three controlled approaches), a framework is developed in which ECLIPSE-E100 is coupled with MATLAB; an artificial neural network, a genetic algorithm with a customised crossover, and a Latin hypercube sampling strategy are used as the proxy model, optimiser and experimental design method, respectively. History matching is carried out using each of the four approaches for the PUNQ-S3 reservoir model, with the same amount of computation time allowed for each approach. The outcomes demonstrate that the uncontrolled approach cannot deliver reliable results in comparison with the controlled approaches, and among the controlled approaches, the developed adaptive technique is the most efficient. M. Sayyafzadeh and M. Haghighi. http://www.appeaconference.com.au/2013
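A generation-based controlled approach with an adaptive control frequency can be sketched as below. This is a simplified stand-in: a 1-nearest-neighbour proxy replaces the paper's neural network, a rank-agreement rule replaces the fuzzy rules, and all thresholds and rates are illustrative.

```python
import random

def rank_agreement(a, b):
    """Fraction of index pairs that two score lists order identically."""
    pairs = [(i, j) for i in range(len(a)) for j in range(i + 1, len(a))]
    same = sum(1 for i, j in pairs if (a[i] - a[j]) * (b[i] - b[j]) > 0)
    return same / len(pairs) if pairs else 1.0

def controlled_search(true_f, dim=2, pop_size=10, gens=60, seed=0):
    """Minimise true_f through a proxy, re-checked on controlled generations."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.0, 5.0) for _ in range(dim)]
           for _ in range(pop_size)]
    archive = [(x[:], true_f(x)) for x in pop]   # proxy training data

    def proxy(x):                                # 1-NN surrogate over archive
        return min(archive,
                   key=lambda pc: sum((u - v) ** 2
                                      for u, v in zip(pc[0], x)))[1]

    eta = 0.5                                    # control-generation frequency
    for _ in range(gens):
        kids = [[v + rng.gauss(0.0, 0.5) for v in rng.choice(pop)]
                for _ in range(2 * pop_size)]    # Gaussian-mutation offspring
        approx = [proxy(k) for k in kids]
        if rng.random() < eta:                   # controlled generation
            exact = [true_f(k) for k in kids]    # expensive evaluations
            archive.extend((k[:], c) for k, c in zip(kids, exact))
            # adapt: call the true function more often when the proxy ranks
            # the offspring poorly, less often when the ranking agrees
            agree = rank_agreement(approx, exact)
            eta = min(0.9, max(0.1, eta + (0.2 if agree < 0.7 else -0.05)))
            scores = exact
        else:
            scores = approx
        pop = [k for _, k in sorted(zip(scores, kids))[:pop_size]]
    return min(archive, key=lambda pc: pc[1])    # best truly evaluated point

# toy quadratic stand-in for the history-matching misfit
best_x, best_cost = controlled_search(lambda x: sum(v * v for v in x))
```

The returned optimum is always a truly evaluated point, which is the safeguard the controlled approaches provide over the uncontrolled one.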

    (μ + λ) Evolution strategy algorithm in well placement, trajectory, control and joint optimisation

    Available online 20 February 2019.
    Field development optimisation is a critical task in modern reservoir management processes. The optimum setting provides the best exploitation strategy and financial returns. However, finding such a setting is difficult due to the non-linearity between the reservoir response and the development strategy parameters. Therefore, growing attention is being paid to computer-assisted optimisation algorithms, due to their capability in handling optimisation problems with such complexities. In this paper, the performance of the (μ + λ) evolution strategy (ES) algorithm is compared to the genetic algorithm (GA), particle swarm optimisation (PSO) and the (μ, λ) covariance matrix adaptation evolution strategy (CMA-ES) using five different optimisation cases. The 1st and 2nd cases are well placement and trajectory optimisation, respectively, which have rough objective function surfaces and a small number of dimensions. The 3rd case is well control optimisation with a small number of dimensions, while the 4th case is a high-dimensional control optimisation. Lastly, the 5th case is joint optimisation that includes the number of wells, type, trajectory and control, and has a high-dimensional rugged surface. The results show that the use of ES as the optimisation algorithm delivers promising results in all cases except the 3rd. It converged to a higher NPV compared to the other algorithms with the same computational budget. The obtained solutions also outperformed those delivered by reservoir engineering judgement. Zaid Alrashdi and Mohammad Sayyafzadeh
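A minimal self-adaptive (μ + λ) ES can be sketched as follows: parents and offspring compete jointly for survival (the "+" selection), so the best solution found is never lost. This is a generic textbook variant on a toy quadratic objective, not the exact implementation benchmarked in the paper.

```python
import math
import random

def es_mu_plus_lambda(f, x0, mu=5, lam=20, sigma0=0.6, gens=80, seed=0):
    """(mu + lambda) ES with per-individual self-adaptive step sizes."""
    rng = random.Random(seed)
    dim = len(x0)
    tau = 1.0 / math.sqrt(2.0 * dim)             # step-size learning rate
    parents = []
    for _ in range(mu):
        x = [v + rng.gauss(0.0, sigma0) for v in x0]
        parents.append((x, sigma0, f(x)))
    for _ in range(gens):
        kids = []
        for _ in range(lam):
            x, s, _ = rng.choice(parents)
            s2 = s * math.exp(tau * rng.gauss(0.0, 1.0))  # mutate sigma first
            x2 = [v + rng.gauss(0.0, s2) for v in x]      # then the solution
            kids.append((x2, s2, f(x2)))
        # "+" selection: parents and offspring compete together
        parents = sorted(parents + kids, key=lambda t: t[2])[:mu]
    return parents[0]

# toy quadratic stand-in for an NPV-style objective (negated for minimisation)
best_x, best_sigma, best_cost = es_mu_plus_lambda(
    lambda x: sum(v * v for v in x), x0=[3.0, -2.0, 1.0])
```

The elitism of "+" selection suits rough surfaces like well placement, whereas a (μ, λ) scheme such as CMA-ES discards parents each generation and relies on the covariance adaptation instead.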

    Infill well placement optimization in coal bed methane reservoirs using genetic algorithm

    The unprecedented growth of coal bed methane drilling, expensive coal bed water treatment, and low gas rates urge the integration of petroleum engineering and optimization disciplines to meet production goals. An integrated framework is constructed to attain the best-obtained optimal locations of infill wells in coal bed methane reservoirs. This framework consists of a flow simulator (ECLIPSE E100), an optimization method (genetic algorithm), and an economic objective function. The objective function is the net present value of the infill project based on an annual discount rate. The best-obtained optimal well locations are attained using the integrated framework when the net present value is maximized. In this study, a semi-synthetic model is constructed based on the Tiffany unit coal bed data in the San Juan basin. The number of infill wells that results in the peak production profit is selected as the optimum number for the infill drilling plan. The cost of water treatment and disposal is a key economic parameter which controls infill well locations across the reservoir. When the cost of water treatment is low, infill wells are mostly located in the virgin section of the reservoir, where reservoir pressure is high and fracture porosity is low. Water content in fractures does not play a significant role in infill well selection when water treatment and disposal is a cheap operation. When the cost of water treatment is high, infill wells are mostly located in the transition section between the virgin and depleted sections of the reservoir to minimize water production. © 2013 Elsevier Ltd. All rights reserved. Alireza Salmachi, Mohammad Sayyafzadeh and Manouchehr Haghighi
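The kind of objective function described, the infill project's net present value under an annual discount rate, can be sketched as follows. All prices, costs and rates below are illustrative placeholders, not values from the study; in the framework itself the yearly volumes would come from the ECLIPSE E100 run for a candidate well placement.

```python
def infill_npv(gas_mscf, water_stb, gas_price=3.0, water_cost=1.0,
               capex_per_well=1.5e6, n_wells=4, discount=0.10):
    """NPV of an infill plan: discounted gas revenue minus water-handling
    cost, net of drilling capex (all figures illustrative)."""
    npv = -capex_per_well * n_wells              # up-front drilling cost
    for year, (qg, qw) in enumerate(zip(gas_mscf, water_stb), start=1):
        cash = qg * gas_price - qw * water_cost  # yearly net cash flow
        npv += cash / (1.0 + discount) ** year   # annual discounting
    return npv

# incremental yearly volumes attributable to the infill wells (made up)
npv = infill_npv(gas_mscf=[2.0e6, 1.5e6, 1.0e6],
                 water_stb=[5.0e5, 3.0e5, 2.0e5])
```

A GA calls such a function once per candidate set of well locations; raising `water_cost` relative to `gas_price` is what drives the optimal wells from the virgin section toward the transition section, as described above.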

    Optimisation and economical evaluation of infill drilling in CSG reservoirs using a multi-objective genetic algorithm

    Water production in the early life of Coal Seam Gas (CSG) recovery makes these reservoirs different from conventional gas reservoirs. Normally, a large amount of water is produced during the early production period, while the gas rate is negligible. It is essential to drill infill wells in optimum locations to reduce the water production and increase the gas recovery. To optimise infill locations in a CSG reservoir, an integrated framework is developed to couple the reservoir flow simulator (ECLIPSE) with the genetic algorithm (GA) optimisation toolbox of MATLAB. In this study, the desired objective function is the NPV of the infill drilling. To obtain the economics of the infill drilling project, the objective function is split into two objectives: the first is the gas income; the second is the cost associated with water production. The optimisation problem is then solved using the multi-objective solver. The economics of the infill drilling program is investigated, with variable gas price and water treatment cost, for a case study constructed based on the available data from the Tiffany unit in the San Juan basin. The best-obtained locations of 20 new wells in the reservoir are attained using this optimisation framework to maximise the profit of the project. The results indicate that when the gas price is less than $2/Mscf, the infill plan, regardless of the cost of water treatment, is not economical and drilling additional wells cannot be economically justified. When the cost of water treatment and disposal increases from $0.01/STB to $4/STB, the optimisation framework intelligently distributes the infill wells across the reservoir in a way that the total water production of the infill wells is reduced by 26%.
Simulation results also indicate that when water treatment is an expensive operation, lower water production is attained by placing the infill wells in depleted sections of the coal bed, close to the existing wells. When water treatment cost is low, however, infill wells are freely allocated in virgin sections of the coal bed, where both coal gas content and reservoir pressure are high. A. Salmachi, M. Sayyafzadeh and M. Haghighi. http://www.appeaconference.com.au/2013

    Regularization in history matching using multi-objective genetic algorithm and Bayesian framework

    Document ID: 154544-MS
    Abstract: The history matching procedure can be divided into three sections: decision variable definition, objective function formulation and optimization. The most widespread approach to objective function formulation is the Bayesian framework, which allows the incorporation of prior knowledge into the objective function and thereby acts as a regularization method. In this approach, the objective function consists of two terms: the likelihood and prior-knowledge functions. To maximize the posterior probability function, a summation of the prior and likelihood functions is usually minimized, in which the prior and observed-data covariance matrices relate the two functions. Inappropriate covariance matrices can lead to an incorrect domination of one of the functions over the other and accordingly result in a false optimum point. In this study, to decrease the chance of convergence to a false optimum point due to inaccurate covariance matrices, an application of multi-objective optimization to history matching is introduced, in which the likelihood and prior functions are the two objective functions. By making use of Pareto (multi-objective) optimization, a set of solutions named the Pareto front is provided, which consists of non-dominated solutions. Hence, an inaccuracy in the covariance matrices cannot allow one objective function to dominate over the other. After providing the set of solutions, a number of solutions are usually taken out of the set, based on post-optimization trade-offs, for uncertainty analysis purposes. For this study, a synthetic case is constructed and history matching is carried out with two different approaches: the conventional and the proposed approach. In order to compare the approaches, it is assumed that the covariance matrix of the observed data is not exactly known.
Then, history matching is carried out using a single-objective genetic algorithm with different covariance matrices, and also using a multi-objective genetic algorithm. A comparison between the outcomes of the conventional approach and the proposed approach demonstrates that decisions can be made with more confidence using the proposed approach. Mohammad Sayyafzadeh, Manouchehr Haghighi and Jonathan N. Carter
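Treating the likelihood and prior misfits as two separate objectives means retaining the whole non-dominated set instead of a single weighted optimum, so neither covariance matrix can silently decide the trade-off. A minimal sketch of extracting such a Pareto front (both objectives minimised; the sample points are illustrative):

```python
def dominates(q, p):
    """q dominates p if it is no worse in every objective, better in one."""
    return (all(a <= b for a, b in zip(q, p))
            and any(a < b for a, b in zip(q, p)))

def pareto_front(points):
    """Non-dominated subset of (likelihood misfit, prior misfit) pairs."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# illustrative (likelihood, prior) misfit pairs for five candidate models
front = pareto_front([(1.0, 5.0), (2.0, 2.0), (5.0, 1.0),
                      (3.0, 3.0), (4.0, 4.0)])
```

In practice a multi-objective GA with non-dominated sorting (NSGA-II style) maintains this front across generations, and a few solutions are then picked off it for uncertainty analysis, as described above.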

    Accelerating CMA-ES in history matching problems using an ensemble of surrogates with generation-based management

    Because of the quasi-gradient update embedded in the CMA-ES algorithm, it can outperform most population-based algorithms from a convergence-speed standpoint. However, due to the computationally expensive fitness function associated with history matching, reducing the number of function (simulation) calls is still desirable. In this study, an ensemble of surrogates (proxies) with generation-based model-management is proposed to reduce the number of simulation calls efficaciously. Since the fitness function is highly nonlinear, an ensemble of surrogates (Gaussian processes) is utilised: the likelihood term is divided into multiple functions, and each is represented via a separate surrogate. This improved the response-surface fitting. In generation-based management, a stochastically selected measure (surrogate or reservoir simulation) is used to evaluate all the individuals of each generation. CMA-ES requires a ranking of the individuals to select the parents; therefore, generation-based model-management fits CMA-ES well, as surrogates are normally better at ranking individuals than at approximating the fitness itself. History matching for a real problem with 59 variables and for PUNQ-S3 with eight variables was conducted via a standard CMA-ES and the proposed surrogate-assisted CMA-ES. The results showed that up to 65% and 50% fewer simulation calls were required for the two cases, respectively. M. Sayyafzadeh, R. Koochak and M. Barle

    A novel method to model water-flooding via transfer function approach

    Abstract: Water flooding is one of the most economical methods to increase oil recovery. To maximize oil recovery during water-flooding, it is essential to provide a forecast of reservoir performance. Hence, various methods are used to simulate reservoirs. Although grid-based simulation is the most common and accurate method, time-consuming computation and the demand for large quantities of data restrict its use. This study presents the development of a new method to predict the performance of water injection based on Transfer Functions (TFs). This method is faster since it requires less data; the only requirements are injection and production rates. In this method, it is assumed that a reservoir consists of a combination of black boxes (TFs). The order and arrangement of the TFs are chosen based on the physical condition of the reservoir, which is ascertained by checking several cases. The injection and production rates act as input and output signals to these black boxes, respectively. After analyzing the input and output signals, the unknown parameters of the TFs are calculated; it is then possible to predict the reservoir performance. Different cases are employed to validate the derived model. The simulation results show good agreement with those obtained from common grid-based simulators. In addition, we found that the TF parameters depend on the characteristics and the pattern of different sections of the reservoir. This method is a rapid way to simulate water-flooding and could open a new window to the future of fast simulators. It enables prediction of the performance of water-flooding and optimization of oil production by testing different injection scenarios. The method also provides key parameters such as well connectivity. Mohammad Sayyafzadeh, Peyman Pourafshary and Fariborz Rashidi

    Application of transfer functions to model water injection in hydrocarbon reservoir

    Water flooding is one of the most economical methods to increase oil recovery. In order to improve the ultimate oil recovery during waterflooding, it is essential to provide an accurate forecast of reservoir performance. Hence, various methods have been utilized to simulate reservoirs. Although grid-based simulation is the most common and accurate method, time-consuming computation and the demand for large quantities of data restrict its use. Sometimes a quick overview of reservoir performance is sufficient, or not all required data are accessible. Therefore, in this study a fast simulator is introduced to provide a quick overview with a minimum amount of data. A new method is presented to forecast the performance of water injection based on Transfer Functions (TFs). In this approach, it is assumed that a reservoir consists of a combination of TFs. The order and arrangement of the TFs are chosen based on the physical conditions of the reservoir, which are ascertained by examining several cases. The selected arrangement and orders can be extended to other reservoirs. Injection and production rates act as input and output signals to these TFs, respectively. After analyzing the input and output signals, the unknown parameters of the TFs are calculated; subsequently, it is possible to predict reservoir performance. Four different cases are employed to validate the derived equation. The results reveal good agreement with those obtained from common grid-based simulators. In addition, it has been demonstrated that the TF parameters depend on the characteristics and the pattern of different sections of the reservoir. This approach is a quick way to forecast waterflooding performance and can open a new window for the future of fast simulators. It provides predictions with higher certainty in comparison with other fast simulators. Furthermore, the only requirements of the method are injection and production rates.
The analytical solution of the method enables its use in finding optimum water injection rates in a short period of time. The method also presents key parameters such as well connectivity. The use of the model is limited to situations where a rapid estimation is sought and/or adequate data are not accessible. © 2011 Elsevier B.V. Mohammad Sayyafzadeh, Peyman Pourafshary, Manouchehr Haghighi and Fariborz Rashidi. http://www.journals.elsevier.com/journal-of-petroleum-science-and-engineering
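As one concrete possibility, a single black box can be taken to be a first-order transfer function G(s) = k/(τs + 1), which maps an injection-rate signal to a production response through a simple exponential recursion. The gain, time constant and rates below are illustrative assumptions; the papers' actual TF orders and arrangements are identified from the physical conditions of the reservoir, as described above.

```python
import math

def first_order_response(injection, k, tau, dt=1.0):
    """Discrete-time response of G(s) = k / (tau*s + 1) to an input signal.

    injection: list of injection rates sampled every dt time units.
    Uses the exponential recursion q[n] = a*q[n-1] + k*(1 - a)*u[n],
    with a = exp(-dt/tau), so a constant input u settles at k*u.
    """
    a = math.exp(-dt / tau)
    q, response = 0.0, []
    for u in injection:
        q = a * q + k * (1.0 - a) * u  # production lags the injection signal
        response.append(q)
    return response

# constant injection of 100 units: production approaches gain * input
resp = first_order_response([100.0] * 200, k=0.8, tau=5.0)
```

Fitting k and τ (roughly, well connectivity and response lag) to observed injection/production signals is the parameter-estimation step the abstracts describe, after which the fitted TFs can forecast new injection scenarios.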