44 research outputs found

    A review of hydrogen/rock/brine interaction: Implications for hydrogen geo-storage

    Hydrogen (H2) is currently considered a clean fuel for decreasing anthropogenic greenhouse gas emissions and will play a vital role in climate change mitigation. Nevertheless, one of the primary challenges of achieving a complete H2 economy is the large-scale storage of H2, which is unsafe on the surface because H2 is highly compressible, volatile, and flammable. Hydrogen storage in geological formations could be a potential solution to this problem because of the abundance of such formations and their high storage capacities. Wettability plays a critical role in the displacement of formation water and determines the containment safety, storage capacity, and amount of trapped H2 (or recovery factor). However, no comprehensive review article has been published explaining H2 wettability under geological conditions. Therefore, this review focuses on the influence of various parameters, such as salinity, temperature, pressure, surface roughness, and formation type, on wettability and, consequently, on H2 storage. Significant gaps exist in the literature on the effect of organic material on H2 storage capacity. Thus, this review summarizes recent advances in rock/H2/brine systems containing organic material in various geological reservoirs. The paper also presents influential parameters affecting H2 storage capacity and containment safety, including liquid–gas interfacial tension, rock–fluid interfacial tension, and adsorption. The paper aims to provide the scientific community with an expert opinion for understanding the challenges of H2 storage and identifying storage solutions. In addition, the essential differences between underground H2 storage (UHS), natural gas storage, and carbon dioxide geological storage are discussed, and directions for future research are presented. This review thereby promotes thorough knowledge of UHS, provides guidance on operating large-scale UHS projects, encourages climate engineers to focus more on UHS research, and provides an overview of advanced technology. It should also inspire researchers in the field of climate change to give more credit to UHS studies.
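    For orientation, the contact angle that quantifies wettability is linked to the interfacial tensions discussed in the abstract by Young's equation. The phase labelling below (rock/brine/H2, with the angle measured through the brine) is a standard convention, not notation taken from this review:

```latex
% Young's equation for a rock/brine/H2 system
% (contact angle theta measured through the brine phase)
\gamma_{\text{rock--H}_2} \;=\; \gamma_{\text{rock--brine}}
  \;+\; \gamma_{\text{brine--H}_2}\,\cos\theta
```

    A strongly brine-wet rock (theta near 0 degrees) keeps H2 as the non-wetting phase, which favours capillary sealing and residual trapping, whereas intermediate or gas-wet behaviour reduces containment safety.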

    Reservoir characterisation using artificial bee colony optimisation

    To obtain an accurate estimation of reservoir performance, the reservoir should be properly characterised. One of the main stages of reservoir characterisation is the calibration of rock property distributions with flow performance observations, which is known as history matching. The history matching procedure consists of three distinct steps: parameterisation, regularisation and optimisation. In this study, a Bayesian framework and a pilot-point approach are used for regularisation and parameterisation. The major focus of this paper is optimisation, which plays a crucial role in the reliability and quality of history matching. Several optimisation methods have been studied for history matching, including the genetic algorithm (GA), ant colony, particle swarm (PS), Gauss-Newton, Levenberg-Marquardt and limited-memory Broyden-Fletcher-Goldfarb-Shanno methods. One of the most recent optimisation algorithms applied in different fields is the artificial bee colony (ABC). In this study, the application of ABC to history matching is investigated for the first time. ABC is derived from the intelligent foraging behaviour of honey bees. A colony of honey bees comprises employed bees, onlookers and scouts. Employed bees look for food sources based on their knowledge, onlookers make foraging decisions using the employed bees' observations, and scouts search for food randomly. To investigate the application of ABC to history matching, its results for two different synthetic cases are compared with the outcomes of three other optimisation methods: a real-valued GA, simulated annealing (SA), and pre-conditioned steepest descent (PSD). In the first case, history matching using ABC afforded a better result than GA and SA. ABC reached a lower fitness value in a reasonable number of evaluations, which indicates the performance and execution-time capability of the method, although it did not appear as efficient as PSD. In the second case, SA and PSD did not perform acceptably. GA achieved a better result than SA and PSD, but its results were still inferior to ABC's. ABC is not concerned with the shape of the landscape, that is, whether it is smooth or rugged. Since there is no precise information about the landscape shape of the history matching function, it can be concluded that using ABC gives a high chance of providing high-quality history matching and reservoir characterisation.
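    Since the abstract summarises the ABC mechanics (employed bees, onlookers, scouts), a minimal sketch helps fix the idea. The Python loop below is an illustration only, with `f` standing in for the history-matching misfit; it is not the implementation used in the paper, and all parameter values are assumptions:

```python
import numpy as np

def abc_minimise(f, bounds, n_food=20, limit=30, max_evals=5000, rng=None):
    """Minimal artificial bee colony optimiser (illustrative sketch).

    f      : objective mapping a 1-D array to a scalar misfit
    bounds : (lower, upper) arrays defining the search box
    n_food : number of food sources (== employed bees == onlookers)
    limit  : abandonment counter before a scout replaces a source
    """
    rng = np.random.default_rng(rng)
    lo, hi = map(np.asarray, bounds)
    dim = lo.size
    foods = rng.uniform(lo, hi, size=(n_food, dim))
    fits = np.array([f(x) for x in foods])
    trials = np.zeros(n_food, dtype=int)
    evals = n_food

    def neighbour(i):
        # Perturb one random coordinate towards/away from a random partner.
        k = rng.integers(n_food - 1)
        k += k >= i                       # ensure partner != i
        j = rng.integers(dim)
        x = foods[i].copy()
        x[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
        return np.clip(x, lo, hi)

    while evals < max_evals:
        # Employed-bee phase: each source is refined locally.
        for i in range(n_food):
            x = neighbour(i); fx = f(x); evals += 1
            if fx < fits[i]:
                foods[i], fits[i], trials[i] = x, fx, 0
            else:
                trials[i] += 1
        # Onlooker phase: better sources attract more refinement.
        p = fits.max() - fits + 1e-12
        p /= p.sum()
        for i in rng.choice(n_food, size=n_food, p=p):
            x = neighbour(i); fx = f(x); evals += 1
            if fx < fits[i]:
                foods[i], fits[i], trials[i] = x, fx, 0
            else:
                trials[i] += 1
        # Scout phase: abandon stagnant sources and search randomly.
        for i in np.where(trials > limit)[0]:
            foods[i] = rng.uniform(lo, hi, dim)
            fits[i] = f(foods[i]); trials[i] = 0; evals += 1
    best = fits.argmin()
    return foods[best], fits[best]
```

    The three phases balance exploitation (employed bees and onlookers) against exploration (scouts), which is the "suitable balance" the related thesis entry below credits for ABC's performance.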

    OpenSRANE, a Flexible and Extensible Platform for Quantitative Risk Assessment of NaTech Events

    The effects of natural hazards triggering technological disasters (NaTech) on society, the economy and the environment are a multi-disciplinary research topic. The novelty of the issue and the lack of a standard procedure for risk assessment of this category of incidents show the need for more research in this area. This article introduces OpenSRANE, an open-source, extensible, flexible and object-oriented software package for calculating the quantitative risk of NaTech events in process plants. Implementing the software in the Python programming environment provides high flexibility for the modeling and evaluations desired by users. Because users can modify and extend the existing software, they can add their own algorithms, elements and models as needed. The software is based on the Monte Carlo method, but other algorithms and approaches can be implemented. Object-oriented programming and the separation of the different parts of the software increase the readability of the program, allowing researchers from different disciplines to focus on studying or developing the part they need with minimal interference from the other parts. The applicability of the software is demonstrated in a case study, including its ability to calculate results such as individual risk, scenarios that consider domino effects, and physical effects.
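    As an illustration of the Monte Carlo backbone the abstract describes, the sketch below samples a natural-hazard intensity, applies a fragility curve and accumulates a receptor-level individual risk. Every model, threshold and rate in it is hypothetical; it does not use OpenSRANE's actual API:

```python
import numpy as np
from scipy.stats import norm

# Illustrative Monte Carlo loop for NaTech risk: sample a seismic intensity,
# apply a tank fragility curve, and count scenarios whose physical effect
# exceeds a harm threshold at a fixed receptor. All numbers are hypothetical.
rng = np.random.default_rng(0)
n = 100_000                                         # Monte Carlo samples
pga = rng.lognormal(mean=-1.5, sigma=0.6, size=n)   # hazard intensity, g

def p_fail(pga, median=0.6, beta=0.5):
    """Hypothetical lognormal fragility curve for a storage tank."""
    return norm.cdf(np.log(pga / median) / beta)

failed = rng.random(n) < p_fail(pga)                # loss of containment?
# Hypothetical consequence model: thermal radiation at the receptor.
radiation = np.where(failed, rng.lognormal(1.2, 0.4, n), 0.0)  # kW/m2
fatal = radiation > 12.5                            # simple harm threshold
annual_freq = 1e-3                                  # hazard rate, events/yr
individual_risk = annual_freq * fatal.mean()        # fatalities/yr at receptor
print(f"individual risk ~ {individual_risk:.2e} per year")
```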

    Uncertainty reduction in reservoir characterisation through inverse modelling of dynamic data: an evolutionary computation approach.

    Precise reservoir characterisation is the basis for reliable flow performance predictions and unequivocal decision making concerning field development. History matching is an indispensable phase of reservoir characterisation in which the flow performance history is integrated into the initially constructed reservoir model to reduce uncertainties. It is a computationally intensive nonlinear inverse problem and typically suffers from ill-posedness. Developing an efficient automatic history matching framework is the core goal of almost all studies on this subject. To overcome some of the existing challenges in history matching, this thesis introduces new techniques which are mostly based on evolutionary computation concepts. To examine the techniques, the foundations of an automatic history matching framework are first developed, in which a reservoir simulator (ECLIPSE) is coupled with a programming language (MATLAB). The introduced methods, along with a number of conventional methods, are then installed on the framework and compared with each other using different case studies.

    Thus far, numerous optimisation algorithms have been studied for history matching problems to conduct the calibration step accurately and efficiently. In this thesis, the application of a recently developed algorithm, artificial bee colony (ABC), is assessed for the first time. It is compared with three conventional optimisers, Levenberg-Marquardt, genetic algorithm and simulated annealing, using a synthetic reservoir model. The comparison indicates that ABC can deliver better results and is not concerned with the landscape shape of the problem. The most likely reason for its success is a suitable balance between exploration and exploitation. Of course, as with all stochastic optimisers, its main drawbacks are its computational expense and its inefficiency in high-dimensional problems.

    Fitness approximation (proxy-modelling) approaches are common methods for reducing computational costs. All of the fitness approximation methods applied to history-matching problems use a similar approach called uncontrolled fitness approximation. It has been corroborated that the uncontrolled fitness approximation approach may mislead the optimisation direction. To prevent this issue, a new fitness approximation approach is developed in which a model-management (evolution-control) technique is included. The results of the controlled (proposed) approach are compared with those of the conventional one using a case study (the PUNQ-S3 model). It is shown that the computation can be reduced by up to 75% by the proposed method. Proxy-modelling methods should be applied when the problem is not high-dimensional.

    None of the current formats of the applied stochastic optimisers is capable of dealing with high-dimensional problems efficiently, and they should be applied in conjunction with a reparameterisation technique, which causes modelling errors. On the other hand, gradient-based optimisers may become trapped in a local minimum due to the nonlinearity of the problem. In this thesis, a new stochastic algorithm is developed for high-dimensional problems based on wavelet image-fusion and evolutionary algorithm concepts. The developed algorithm is compared with six algorithms (genetic algorithm with a pilot-point reparameterisation, BFGS with a zonation reparameterisation, BFGS with a spectral decomposition reparameterisation, artificial bee colony, genetic algorithm and BFGS in full parameterisation) using two different case studies. Notably, the best results are obtained by the introduced method.

    Furthermore, it is well known that achieving high-quality history-matched models with any of these methods depends on the reliability of the objective function formulation. The most widespread formulation is the Bayesian framework. Because of complexities in quantifying measurement, modelling and prior-model reliability, the weighting factors in the objective function may themselves be uncertain. The influence of these uncertainties on the outcome of history matching is studied in this thesis, and an approach based on Pareto optimisation (a multi-objective genetic algorithm) is developed to deal with this issue. The approach is compared with a conventional (random selection) one. The results confirm that a considerable amount of computation can be saved by the Pareto approach.

    In the last part of this thesis, a new analytical simulator is developed using the transfer function (TF) approach. The developed method does not need expensive history matching and can be used when a quick forecast is sought and/or history matching of a grid-based reservoir simulation is impractical. In the developed method, it is assumed that a reservoir consists of a combination of TFs, and the order and arrangement of the TFs are then chosen based on the physical conditions of the reservoir, ascertained by examining several cases. The results reveal a good agreement with those obtained from grid-based simulators. An additional piece of work estimates the optimal infill drilling plan for a coal seam gas reservoir (a semi-synthetic model constructed based on the Tiffany unit in the San Juan basin) using the developed framework, with the net present value as the objective function and the locations of the infill wells as the decision variables.
    Thesis (Ph.D.) -- University of Adelaide, Australian School of Petroleum, 201
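    The transfer-function simulator described in the final part lends itself to a compact illustration. The sketch below chains two first-order TFs in series to map a drive signal to a production response; the structure and time constants are assumptions for demonstration, not the thesis's calibrated models:

```python
import numpy as np

def first_order_response(u, tau, dt=1.0):
    """Discrete first-order TF, y/u = 1/(tau*s + 1), via exponential smoothing."""
    a = dt / (tau + dt)
    y = np.zeros_like(u, dtype=float)
    for t in range(1, len(u)):
        y[t] = y[t - 1] + a * (u[t] - y[t - 1])
    return y

drive = np.ones(120)                                  # step drive (e.g. injection)
stage1 = first_order_response(drive, tau=8.0)         # fast near-well region
production = first_order_response(stage1, tau=30.0)   # slower far-field region
print(production[[1, 10, 60, 119]])                   # response builds towards 1
```

    Choosing the number, order and arrangement of such TFs from the reservoir's physical conditions is the calibration step the thesis describes; the two-stage series above is only the simplest arrangement.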

    Uncertainty quantification using a multi-objectivised randomised maximum likelihood method

    Extended abstract, Technical Programme - Tuesday Paper TU P2.06; see zipped files at https://events.eage.org/-/media/files/events/2017/europe/paris-2017/tp/paris-2017-extended-abstracts_technical programme_tuesday.zip?la=en
    To propagate uncertainty in reservoir production forecasts, it is typically required to sample a nonlinear and multimodal posterior density function. To do so, different techniques have been proposed and used, such as Markovian algorithms, data assimilation methods and the randomised maximum likelihood (RML) method. Several studies have shown that the RML method provides a reasonable approximation of the posterior distribution, despite the fact that it has no rigorous theoretical foundation for nonlinear problems. In order to reduce the computation and also provide an extensive search for multimodal density functions, in this study the RML method is proposed in the context of a multi-objective genetic algorithm in which each of the equations is considered as a separate objective function. The proposed technique was compared against a Metropolis-Hastings algorithm and an RML with a Levenberg-Marquardt minimiser, using the IC-Fault model. The comparison showed that an acceptable set of samples for uncertainty quantification is obtained and, given that parallelisation of the algorithm is straightforward, the proposed algorithm is efficient in terms of total processing time.
    M. Sayyafzadeh
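    To make the RML set-up concrete, the sketch below builds a family of perturbed least-squares objectives of the kind the method minimises, each of which could then be handed to a multi-objective genetic algorithm as a separate objective. The forward model, covariances and sample count are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def make_rml_objective(g, d_obs, C_D, m_pr, C_M):
    """One RML objective: perturbed data d_j and perturbed prior mean m_j,
    each drawn once, define a sample-specific weighted least-squares misfit."""
    d_j = rng.multivariate_normal(d_obs, C_D)     # perturbed observations
    m_j = rng.multivariate_normal(m_pr, C_M)      # perturbed prior mean
    C_D_inv, C_M_inv = np.linalg.inv(C_D), np.linalg.inv(C_M)
    def objective(m):
        r_d = g(m) - d_j
        r_m = m - m_j
        return r_d @ C_D_inv @ r_d + r_m @ C_M_inv @ r_m
    return objective

# Toy linear forward model and Gaussian set-up (hypothetical numbers).
G = np.array([[1.0, 0.5], [0.2, 1.5]])
g = lambda m: G @ m
objectives = [make_rml_objective(g, d_obs=np.array([1.0, 2.0]),
                                 C_D=0.1 * np.eye(2),
                                 m_pr=np.zeros(2), C_M=np.eye(2))
              for _ in range(8)]     # 8 objectives for a multi-objective GA
print(objectives[0](np.array([0.5, 1.0])))
```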

    Assessment of different model-management techniques in history matching problems for reservoir modelling

    History matching is a computationally expensive inverse problem. The computation costs are dominantly associated with the optimisation step. Fitness approximation (proxy-modelling) approaches are common methods for reducing computational costs, in which the time-consuming original fitness function is substituted with an undemanding function known as the approximation function (proxy). Almost all of the fitness approximation methods applied to history-matching problems use a similar approach called uncontrolled fitness approximation. It has been corroborated that the uncontrolled fitness approximation approach may mislead the optimisation towards a wrong optimum point. To prevent this error, it is endorsed that the original function be utilised along with the approximation function during the optimisation process. To make use of the original function efficiently, a model-management (evolution-control) technique should be applied. There are three different techniques: individual-based, population-based, and adaptive. By using each of these techniques, a controlled fitness approximation approach is assembled, which benefits from online learning. In the first two techniques, the number of original function evaluations in each evolution-control cycle is fixed and predefined, which may result in inefficient model management. In the adaptive technique, the number is altered based on the fidelity of the approximation function. In this study, a specific adaptive technique is designed based on heuristic fuzzy rules; then, for the first time, the applications of all three techniques are investigated in history matching. To assess the four approaches (the uncontrolled approach and the three controlled approaches), a framework is developed in which ECLIPSE-E100 is coupled with MATLAB; an artificial neural network, a genetic algorithm (with a customised crossover) and a Latin hypercube sampling strategy are used as the proxy model, optimiser and experimental design method, respectively. History matching is carried out using each of the four approaches for the PUNQ-S3 reservoir model, with the same amount of computation time allowed for each approach. The outcomes demonstrate that the uncontrolled approach cannot deliver reliable results in comparison with the controlled approaches, and that among the controlled approaches, the developed adaptive technique is the most efficient.
    M. Sayyafzadeh and M. Haghighi
    http://www.appeaconference.com.au/2013
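    A minimal sketch of individual-based evolution control with an adaptive twist may clarify the idea: each generation is scored on the proxy, the most promising individuals are re-evaluated with the expensive true function, and the number of controlled individuals is adapted to the observed proxy error. The thresholds below are crude stand-ins for the paper's fuzzy rules, and the toy proxy replaces the ANN/ECLIPSE pairing:

```python
import numpy as np

def adaptive_control_step(pop, proxy, true_f, n_ctrl, archive):
    """Score a generation on the proxy, re-check the n_ctrl proxy-best
    individuals with the expensive true function, and adapt n_ctrl from
    the observed proxy error (minimisation assumed)."""
    approx = np.array([proxy(x) for x in pop])
    order = np.argsort(approx)                        # proxy-best first
    exact = {i: true_f(pop[i]) for i in order[:n_ctrl]}
    archive += [(pop[i], exact[i]) for i in exact]    # data to retrain proxy
    # Fidelity check: relative proxy error on the controlled individuals.
    err = np.mean([abs(exact[i] - approx[i]) / (abs(exact[i]) + 1e-12)
                   for i in exact])
    if err > 0.2:                                     # crude stand-in for
        n_ctrl = min(len(pop), n_ctrl + 2)            # the fuzzy rules
    elif err < 0.05:
        n_ctrl = max(1, n_ctrl - 1)
    fitness = approx.copy()
    for i, fx in exact.items():                       # trust exact values
        fitness[i] = fx
    return fitness, n_ctrl

# Toy usage with a quadratic "simulator" and a deliberately noisy proxy.
true_f = lambda x: float(np.sum(x**2))
proxy = lambda x: true_f(x) * (1 + 0.1 * np.sin(x.sum()))
pop = np.random.default_rng(2).normal(size=(20, 3))
fitness, n_ctrl = adaptive_control_step(pop, proxy, true_f, n_ctrl=4, archive=[])
print(n_ctrl, fitness.min())
```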

    (mu + Lambda) Evolution strategy algorithm in well placement, trajectory, control and joint optimisation

    Available online 20 February 2019
    Field development optimisation is a critical task in modern reservoir management processes. The optimum setting provides the best exploitation strategy and financial returns. However, finding such a setting is difficult because of the non-linear relationship between the reservoir response and the development strategy parameters. Therefore, growing attention is being paid to computer-assisted optimisation algorithms, due to their capability to handle optimisation problems with such complexities. In this paper, the performance of the (μ + Λ) Evolution Strategy (ES) algorithm is compared with the Genetic Algorithm (GA), Particle Swarm Optimisation (PSO) and the (μ, Λ) Covariance Matrix Adaptation Evolution Strategy (CMA-ES) using five different optimisation cases. The 1st and 2nd cases are well placement and trajectory optimisation, respectively, which have rough objective function surfaces and a small number of dimensions. The 3rd case is well control optimisation with a small number of dimensions, while the 4th case is a high-dimensional control optimisation. Lastly, the 5th case is a joint optimisation that includes the number of wells, type, trajectory and control, and has a high-dimensional rugged surface. The results show that the use of ES as the optimisation algorithm delivers promising results in all cases except case 3. It converged to a higher net present value (NPV) than the other algorithms with the same computational budget. The obtained solutions also outperformed the ones delivered by reservoir engineering judgement.
    Zaid Alrashdi, Mohammad Sayyafzadeh
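    For reference, the selection scheme that gives the (μ + Λ) ES its name keeps the best μ individuals from the union of parents and offspring. The sketch below is a simplified illustration (isotropic Gaussian mutation with a 1/5th-success-style step-size rule), not the paper's implementation; the NPV evaluator and all settings are placeholders:

```python
import numpy as np

def es_mu_plus_lambda(npv, x0, sigma=0.1, mu=5, lam=20, n_gen=50, rng=None):
    """Minimal (mu + lambda) evolution strategy maximising npv(x)."""
    rng = np.random.default_rng(rng)
    parents = x0 + sigma * rng.normal(size=(mu, x0.size))
    scores = np.array([npv(x) for x in parents])
    for _ in range(n_gen):
        # Each offspring mutates a randomly chosen parent.
        idx = rng.integers(mu, size=lam)
        offspring = parents[idx] + sigma * rng.normal(size=(lam, x0.size))
        off_scores = np.array([npv(x) for x in offspring])
        # Plus selection: best mu from the union of parents AND offspring.
        pool = np.vstack([parents, offspring])
        pool_scores = np.concatenate([scores, off_scores])
        best = np.argsort(pool_scores)[::-1][:mu]
        # Grow/shrink the step from the fraction of improving offspring.
        success = (off_scores > scores.max()).mean()
        sigma *= 1.1 if success > 0.2 else 0.9
        parents, scores = pool[best], pool_scores[best]
    return parents[0], scores[0]

# Toy usage: a concave stand-in for an NPV surface with optimum at x = 3.
best_x, best_npv = es_mu_plus_lambda(lambda x: -np.sum((x - 3) ** 2),
                                     x0=np.zeros(4), rng=0)
print(best_x.round(2), best_npv)
```

    Plus selection is elitist by construction (a good parent can never be lost), which suits the rugged placement and joint-optimisation surfaces the paper describes.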

    Infill well placement optimization in coal bed methane reservoirs using genetic algorithm

    The unprecedented growth of coal bed methane drilling, expensive coal bed water treatment, and low gas rates urge the integration of the petroleum engineering and optimization disciplines to meet production goals. An integrated framework is constructed to attain the best-obtained optimal locations of infill wells in coal bed methane reservoirs. This framework consists of a flow simulator (ECLIPSE E100), an optimization method (genetic algorithm), and an economic objective function. The objective function is the net present value of the infill project based on an annual discount rate. The best-obtained optimal well locations are those attained by the integrated framework when the net present value is maximized. In this study, a semi-synthetic model is constructed based on the Tiffany unit coal bed data in the San Juan basin. The number of infill wells resulting in peak production profit is selected as the optimum number for the infill drilling plan. The cost of water treatment and disposal is a key economic parameter which controls infill well locations across the reservoir. When the cost of water treatment is low, infill wells are mostly located in the virgin section of the reservoir, where reservoir pressure is high and fracture porosity is low. The water content in fractures does not play a significant role in infill well selection when water treatment and disposal is a cheap operation. When the cost of water treatment is high, infill wells are mostly located in the transition section between the virgin and depleted sections of the reservoir to minimize water production.
    © 2013 Elsevier Ltd. All rights reserved.
    Alireza Salmachi, Mohammad Sayyafzadeh, Manouchehr Haghighi
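    The economic objective the framework maximizes can be illustrated with a discounted cash-flow sketch. The prices, costs and discount rate below are hypothetical placeholders, not the study's inputs:

```python
# Illustrative NPV objective for an infill programme (all economics assumed).
def npv_infill(gas_rate, water_rate, capex,
               gas_price=2.5,        # $/Mscf (hypothetical)
               water_cost=1.0,       # $/STB treatment + disposal (hypothetical)
               discount=0.10):       # annual discount rate (hypothetical)
    """gas_rate/water_rate: yearly production totals (Mscf, STB) per
    forecast year; capex: up-front drilling cost in $."""
    npv = -capex
    for year, (qg, qw) in enumerate(zip(gas_rate, water_rate), start=1):
        cash = qg * gas_price - qw * water_cost
        npv += cash / (1 + discount) ** year
    return npv

# Toy usage with made-up decline profiles for one infill well.
gas = [8e4, 6e4, 4.5e4, 3.4e4, 2.5e4]        # Mscf/yr
water = [5e4, 3e4, 1.5e4, 8e3, 4e3]          # STB/yr
print(f"NPV ~ ${npv_infill(gas, water, capex=4e5):,.0f}")
```

    In the study's framework, the genetic algorithm proposes well locations, the simulator produces the rate profiles, and an objective of this general shape scores each candidate; raising the water cost term is what pushes optimal wells towards the transition zone.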

    Regularization in history matching using multi-objective genetic algorithm and Bayesian framework

    Document ID: 154544-MS
    Abstract: The history matching procedure can be divided into three sections: decision-variable definition, objective function formulation and optimization. The most widespread approach to objective function formulation is the Bayesian framework. A Bayesian framework allows the incorporation of prior knowledge into the objective function, which acts as a regularization method. In this approach, the objective function consists of two terms: the likelihood and prior-knowledge functions. To maximize the posterior probability function, a summation of the prior and likelihood functions is usually minimized, in which the observed-data and prior covariance matrices weight the two terms. Inappropriate covariance matrices can lead to an incorrect domination of one of the functions over the other and accordingly result in a false optimum point. In this study, to decrease the chance of convergence to a false optimum point due to inaccurate covariance matrices, an application of multi-objective optimization in history matching is introduced in which the likelihood and prior functions are the two objective functions. By making use of Pareto optimization (multi-objective optimization), a set of solutions named the Pareto front is provided, which consists of non-dominated solutions. Hence, an inaccuracy in the covariance matrices cannot allow one objective function to dominate the other. After providing the set of solutions, a number of solutions are usually taken out of the set, based on post-optimization trade-offs, for uncertainty analysis purposes. For this study, a synthetic case is constructed and history matching is carried out with two different approaches: the conventional approach and the proposed approach. In order to compare the approaches, it is assumed that the covariance matrix of the observed data is not exactly known. History matching is then carried out using a single-objective genetic algorithm with different covariance matrices, and also using a multi-objective genetic algorithm. A comparison between the outcomes of the conventional approach and the proposed approach demonstrates that decisions can be made with more confidence using the proposed approach.
    Mohammad Sayyafzadeh, Manouchehr Haghighi, Jonathan N. Carter
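    In symbols, the two-term objective the abstract describes is commonly written as follows; the notation is the usual Bayesian history-matching form, not taken verbatim from the paper:

```latex
O(\mathbf{m}) =
  \underbrace{\tfrac{1}{2}\big(g(\mathbf{m})-\mathbf{d}_{\mathrm{obs}}\big)^{T}
              \mathbf{C}_D^{-1}\big(g(\mathbf{m})-\mathbf{d}_{\mathrm{obs}}\big)}_{\text{likelihood term}}
  \;+\;
  \underbrace{\tfrac{1}{2}\big(\mathbf{m}-\mathbf{m}_{\mathrm{prior}}\big)^{T}
              \mathbf{C}_M^{-1}\big(\mathbf{m}-\mathbf{m}_{\mathrm{prior}}\big)}_{\text{prior term}}
```

    Treating the two quadratic terms as separate objectives removes the need for C_D and C_M to be weighed against each other in a single sum, which is exactly the source of the false optima that the Pareto approach sidesteps.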

    Optimisation and economical evaluation of infill drilling in CSG reservoirs using a multi-objective genetic algorithm

    Water production in the early life of coal seam gas (CSG) recovery makes these reservoirs different from conventional gas reservoirs. Normally, a large amount of water is produced during the early production period, while the gas rate is negligible. It is essential to drill infill wells in optimum locations to reduce the water production and increase the gas recovery. To optimise infill locations in a CSG reservoir, an integrated framework is developed to couple the reservoir flow simulator (ECLIPSE) and the genetic algorithm (GA) optimisation toolbox of MATLAB. In this study, the desired objective function is the net present value (NPV) of the infill drilling. To obtain the economics of the infill drilling project, the objective function is split into two objectives: the first objective is the gas income; the second is the cost associated with water production. The optimisation problem is then solved using the multi-objective solver. The economics of the infill drilling program is investigated for a case study constructed based on the available data from the Tiffany unit in the San Juan basin, with variable gas price and water treatment cost. The best-obtained optimal locations of 20 new wells in the reservoir are attained using this optimisation framework to maximise the profit of the project. The results indicate that when the gas price is less than $2/Mscf, the infill plan, regardless of the cost of water treatment, is not economical and drilling additional wells cannot be economically justified. When the cost of water treatment and disposal increases from $0.01/STB to $4/STB, the optimisation framework intelligently distributes the infill wells across the reservoir in a way that the total water production of the infill wells is reduced by 26%. Simulation results also indicate that when water treatment is an expensive operation, lower water production is attained by placing the infill wells in depleted sections of the coal bed, close to the existing wells. When water treatment cost is low, however, infill wells are freely allocated in virgin sections of the coal bed, where both coal gas content and reservoir pressure are high.
    A. Salmachi, M. Sayyafzadeh and M. Haghighi
    http://www.appeaconference.com.au/2013
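    Splitting NPV into a gas-income objective and a water-cost objective makes Pareto dominance the selection criterion. The toy sketch below filters candidate plans down to the non-dominated set; the candidate values are random placeholders, and the paper instead couples ECLIPSE with MATLAB's multi-objective GA:

```python
import numpy as np

def pareto_front(points):
    """Return the non-dominated points for (maximise f1, minimise f2)."""
    front = []
    for p in points:
        dominated = any(q[0] >= p[0] and q[1] <= p[1] and tuple(q) != tuple(p)
                        for q in points)
        if not dominated:
            front.append(p)
    return np.array(front)

# Hypothetical (gas income, water cost) outcomes for candidate infill plans.
rng = np.random.default_rng(3)
candidates = np.column_stack([rng.uniform(1, 10, 50),    # $MM gas income
                              rng.uniform(0.1, 5, 50)])  # $MM water cost
front = pareto_front(candidates)
print(f"{len(front)} non-dominated plans out of {len(candidates)}")
```

    Post-optimisation, plans are picked from this front according to the prevailing gas price and water treatment cost, which is how the study's economics sweep translates into different preferred well distributions.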