
    Neuro-Simulation Tool for Enhanced Oil Recovery Screening and Reservoir Performance Prediction

    Selecting a suitable enhanced oil recovery (EOR) method for an oilfield is one of the decisions made before the natural drive production mechanism is exhausted. In some cases, in-depth knowledge of the reservoir's rock and fluid properties and of the available equipment is needed, along with an economic evaluation. Feeding such data into a simulator, and the processing that follows, is generally very time consuming and costly. To reduce the number of study cases, an appropriate screening tool is required before any operations are performed, which shortens the design time for either a pilot section or production under field conditions. In this research, two different screening tools are presented through a graphical user interface. The output of just over 900 simulations and verified screening-criteria tables were employed to design these tools. By means of the gathered data and the development of artificial neural networks, two distinct screening tools for assessing a suitable EOR method were introduced. The first tool screens EOR processes based on published tables and charts; the second, the Neuro-Simulation tool, concerns the economic evaluation of miscible and immiscible injection of carbon dioxide, nitrogen, and natural gas into the reservoir. Both tools are provided as a graphical user interface through which the user can identify a suitable method from a plot of oil recovery over 20 years of production, the cost of gas injection per produced barrel, and cumulative oil production, and finally design the most efficient scenario

    Estimating Ultimate Recovery in Shale Wells Based on Facts

    Natural gas, as one of the nation's major energy sources, plays a vital role in the US energy mix. In recent years, production from shale has drawn much attention to this source of hydrocarbon. As an essential step in production planning, natural gas professionals estimate production and ultimate recovery (EUR) throughout the life of a well. The fluid production rate (q) usually varies as a function of rock properties, well, and completion design parameters. The variation associated with these parameters is a source of uncertainty in estimating the long-term production of unconventional reservoirs.

    A number of methodologies have been suggested to estimate the long-term production of shale wells. Decline curve analysis is the most widely used method for estimating the future production profile, but its results have been found to be over-optimistic. Discrepancies between actual production and values estimated by Arps decline curves have been observed. This is most pronounced in low-permeability reservoirs, where production over-estimation is a consequence of large values of the hyperbolic component (b-values higher than 1). A combination of Arps hyperbolic decline (in early time) and exponential decline (in later time) is employed to overcome this over-estimation; this combination of Arps decline curves is referred to as Combined Decline Curves (CDC).

    The major objective of this research is to condition the CDC-EUR of shale wells to rock properties, well characteristics, and completion design parameters in a given shale asset. The first step of this study is CDC-EUR estimation using the combined Arps decline curves. To obtain a more accurate (conservative) estimate, the hyperbolic curve is switched to exponential decline in the later part of the well's life. Artificial intelligence is then employed to condition the CDC-EUR to rock properties, well characteristics, and completion design parameters.

    The major rock properties studied as input parameters include porosity, total organic carbon, net thickness, and water saturation. Moreover, the effect of several design parameters, such as well trajectories, completion, and hydraulic fracturing variables, on CDC-EUR will be investigated. This model will help natural gas professionals better understand the effect of rock properties and design parameters on future gas production from shale
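The hyperbolic-to-exponential switch behind a Combined Decline Curve can be sketched as follows. The rate equations are the standard Arps forms; the switch time is where the hyperbolic decline rate falls to a terminal limit. The numeric values (qi, di, b, d_lim) are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def combined_decline_rate(t, qi, di, b, d_lim):
    """Combined Decline Curve (CDC) sketch: Arps hyperbolic decline in
    early time, switching to exponential decline once the instantaneous
    decline rate drops to the terminal value d_lim (rates in 1/year)."""
    # Hyperbolic instantaneous decline rate: D(t) = di / (1 + b*di*t).
    # Solving D(t_switch) = d_lim gives the switch time:
    t_switch = (di / d_lim - 1.0) / (b * di)
    t = np.asarray(t, dtype=float)
    q_hyp = qi / (1.0 + b * di * t) ** (1.0 / b)        # hyperbolic segment
    q_sw = qi / (1.0 + b * di * t_switch) ** (1.0 / b)  # rate at the switch
    q_exp = q_sw * np.exp(-d_lim * (t - t_switch))      # exponential tail
    return np.where(t < t_switch, q_hyp, q_exp)

# Illustrative well: qi = 1000 Mscf/d, di = 0.9/yr, b = 1.4 (> 1, as is
# typical for shale), terminal decline of 5 %/yr over a 30-year life.
t = np.linspace(0.0, 30.0, 301)
q = combined_decline_rate(t, qi=1000.0, di=0.9, b=1.4, d_lim=0.05)
```

Integrating such a rate profile over the well life, instead of the pure hyperbolic, is what yields the more conservative CDC-EUR described above.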

    Conditioning the Estimating Ultimate Recovery of Shale Wells to Reservoir and Completion Parameters

    In recent years, gas production from shale has increased significantly in the United States. Consequently, many studies have focused on shale formations in areas such as fracturing, reservoir simulation, and production forecasting. Forecasting production, or estimating ultimate recovery (EUR), is considered one of the most important items in production development planning. The certainty of an EUR calculation is questionable because different parameters impact production, and consequently the EUR, such as rock properties and well completion design.

    Different methods of calculating EUR have been used in the industry. Traditionally, the decline curve analysis method of Arps (1945) was considered the most common tool for estimating ultimate recovery and reserves. However, Arps' equations overestimate reserves when applied to unconventional reservoirs (extremely low-permeability formations), because they are valid only for boundary-dominated flow (BDF) decline. Many research papers show that production from unconventional tight reservoirs is distinguished by an extended period of late transient flow before boundary-dominated flow is reached. To overcome these problems and improve production forecasts for unconventional reservoirs, researchers have developed new empirical methods that can be applied in all flow regimes.

    These new and traditional methods have been applied in this research to calculate the EUR for more than 200 shale wells. The EUR results are then studied and conditioned with rock properties, well characteristics, and completion design parameters. Porosity, total organic carbon, net thickness, and water saturation are the main rock properties considered. Furthermore, the impact of different well design configurations (for instance, well trajectories, completion, and hydraulic fracturing variables) on EUR is inspected in this study. In addition, this research determines whether reservoir or completion parameters have the greater impact on EUR. This study will provide natural gas professionals insight and clarification regarding the effects of rock properties and well design configurations on the estimated ultimate recovery of gas shale

    The Prospect of Electrical Enhanced Oil Recovery for Heavy Oil: A Review

    This paper reviews electrical heating for the recovery of heavy oil, covering methods used in the past and the prospects for future crude oil recovery. Heavy oil, crude with an API gravity below 22°, has the potential to offset the current light oil shortage; however, its high viscosity and density make recovery challenging. Thermal injection methods are often used to overcome these challenges, but they raise economic and environmental issues. The electrical heating method could replace conventional thermal methods: heat is transferred into the reservoir to increase oil mobility. As the temperature rises, oil viscosity decreases and the heavy oil flows more easily. Past applications of electrical heating are surveyed in this paper, and its prospects are examined to serve as guidelines for future use. The main challenge of electrical heating is that excessive heat can damage the formation, so any prospective design must meet energy-efficiency requirements. Artificial intelligence is emerging as a technology for overcoming problems often found in conventional thermal methods; it can help avoid steam breakthrough and excessive heat, making the process more efficient and less costly

    A Data-Driven Smart Proxy Model for a Comprehensive Reservoir Simulation

    Numerical reservoir simulation is the preferred tool for estimating the performance of oil and gas fields under different production scenarios. A comprehensive numerical reservoir model can have tens of millions of grid blocks. The massive potential of existing numerical reservoir simulation models has gone unrealized because they are computationally expensive and time-consuming; an effective alternative tool is therefore required for fast and reliable decision making. To reduce the required computational time, proxy models have been developed. Traditional proxy models are either statistical or reduced-order models (ROM); they were developed to substitute for complex numerical simulation by producing a representation of the system at a lower computational cost. However, these approaches have shortcomings when applied to complex systems.

    In this study, a novel proxy-model approach is presented to overcome the computational burden and the challenges of traditional proxy models. The smart proxy model presented here is based on artificial intelligence and data-mining techniques. The objective of this study was to develop two types of smart proxy models at each grid block. The first was generated to identify dynamic reservoir properties (pressure and saturation); the other was created to determine the production profile of a well. The two smart proxy models can be coupled to examine field production performance under different operational constraints and geological realizations.

    The field of study in this work is the SACROC unit, a depleted oil field located in Scurry County, Texas, whose production history began in the late 1940s. Based on this long period of production and the different drive mechanisms employed throughout the field's exploitation, its performance history was divided into three phases in this study. Each phase was investigated and smart proxy models were applied to each.

    To develop a smart proxy model, multiple reservoir simulation scenarios are designed for different operational constraints and geological realizations. The geological parameters, along with the results from the designed simulation runs, are collected to build a spatial-temporal database. The parameters in the database are studied and key performance indicators are measured to select the data required to build the smart proxy model. The smart proxy is trained, calibrated, and validated using a series of neural networks; to validate it, the model is deployed to replicate a blind numerical simulation run.

    The developed smart proxy models are capable of supplying reservoir properties along with production profiles very quickly (in seconds) and within an acceptable range of error compared to numerical reservoir models
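The core idea of a smart proxy, a neural network trained on simulator output so it can replicate the simulator's response far faster, can be sketched as below. The feature set, the synthetic target function, and the use of scikit-learn's MLPRegressor are illustrative assumptions standing in for the study's actual spatial-temporal database and network design.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for the spatial-temporal database: each row represents
# one grid block at one timestep (e.g. porosity, permeability, time,
# injection rate), and the target is a dynamic property such as pressure.
n = 2000
X = rng.uniform(size=(n, 4))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 2] * X[:, 3] \
    + rng.normal(0.0, 0.01, n)  # small noise mimicking numerical scatter

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small neural network acts as the proxy: once trained on designed
# simulation runs, it reproduces the simulator's response in milliseconds.
proxy = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                     random_state=0)
proxy.fit(X_train, y_train)
score = proxy.score(X_test, y_test)  # R^2 on a held-out "blind" set
```

The held-out score plays the role of the blind-run validation described in the abstract: the proxy is only trusted if it replicates a simulation run it never saw during training.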

    Confirmation of TDM Capabilities in Modeling Compartmentalized WAG EOR

    Data-Driven Reservoir Modeling (DDRM), commonly referred to as Top-Down Modeling (TDM), is a relatively new alternative to traditional numerical reservoir modeling and simulation techniques. DDRM uses artificial intelligence and machine learning in tandem to construct full-field models from measured data instead of calculations based on equations derived from averaged values and type curves. TDM allows all of the measured data from a field to be combined and used to generate well-by-well production predictions for a specific field. Because TDM does not use the traditional physics-based approach, it is subject to a plethora of criticisms within the industry. Therefore, the purpose of this thesis is to confirm the capabilities of TDM against data synthetically generated by a Numerical Reservoir Simulator (NRS). To do this, fluid flow through porous media is first modeled with a traditional NRS, so that everything is known about the reservoir in question. The generated data are then exported and used to construct the TDM. To complete the proposed objectives of this thesis, an application is used to aid in the development of the TDM. All of the data used to develop and history-match the TDM are generated by the NRS; this is done to confirm the ability of TDM to forecast the behavior of existing wells. Once the TDM has been constructed, its forecast is compared to that of the NRS to validate the TDM's capability

    Enhanced oil recovery by nanoparticles flooding: From numerical modeling improvement to machine learning prediction

    Nowadays, enhanced oil recovery using nanoparticles is considered an innovative approach to increasing oil production. This paper focuses on predicting nanoparticle transport in porous media using machine learning techniques, including random forest, gradient boosting regression, decision tree, and artificial neural networks. Because of the scarcity of data on nanoparticle transport in porous media, this work generates artificial datasets using a numerical model that is validated against experimental data from the literature. Six experiments with different nanoparticle types and various physical features are selected to validate the numerical model. The researchers therefore produce six datasets from the experiments and create an additional dataset by combining all the others. Data preprocessing, correlation, and feature-importance methods are investigated using the Scikit-learn library, and hyperparameter tuning is optimized using the GridSearchCV algorithm. The performance of the predictive models is evaluated using the mean absolute error, the R-squared correlation, the mean squared error, and the root mean squared error. The results show that the decision tree model has the best performance and highest accuracy on one of the datasets, while the random forest model has the lowest root mean squared error and highest R-squared values on the remaining datasets, including the combined one.

    Cited as: Alwated, B., El-Amin, M.F. Enhanced oil recovery by nanoparticles flooding: From numerical modeling improvement to machine learning prediction. Advances in Geo-Energy Research, 2021, 5(3): 297-317, doi: 10.46690/ager.2021.03.0
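The GridSearchCV-over-random-forest workflow the abstract names can be sketched as below. The feature names, dataset, and parameter grid are illustrative assumptions; the paper's real datasets come from its validated numerical transport model, which is not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)

# Synthetic stand-in for a nanoparticle-transport dataset: the four columns
# might represent injection velocity, particle size, salinity, and porosity;
# the target would be a retained or effluent concentration.
X = rng.uniform(size=(500, 4))
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2 - X[:, 2] * X[:, 3]

# Exhaustive search over a small hyperparameter grid with 3-fold
# cross-validation, as done with GridSearchCV in the paper's workflow.
param_grid = {"n_estimators": [50, 100], "max_depth": [4, 8, None]}
search = GridSearchCV(RandomForestRegressor(random_state=0), param_grid, cv=3)
search.fit(X, y)

best = search.best_estimator_                       # refit on all data
rmse = mean_squared_error(y, best.predict(X)) ** 0.5  # one of the paper's
                                                      # evaluation metrics
```

The same fitted `search` object exposes `best_params_` and `cv_results_`, which is how the winning model family (random forest vs. decision tree, etc.) would be compared per dataset.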

    Statistical and deep learning methods for geoscience problems

    Machine learning is the new frontier for technology development in the geosciences and has developed extremely fast in the past decade. With the increased compute power provided by distributed computing and Graphics Processing Units (GPUs), exploited through machine learning (ML) frameworks such as Keras, PyTorch, and TensorFlow, ML algorithms can now solve complex scientific problems. Although powerful, ML algorithms need to be applied to suitable, well-conditioned problems for optimal results; they therefore require a deep understanding not only of the problem but also of the algorithm's capabilities. In this dissertation, I show that simple statistical techniques can often outperform ML-based models if applied correctly, and I also show the success of deep learning in addressing two difficult problems.

    In the first application, I use deep learning to auto-detect leaks in a carbon capture project using pressure field data acquired from the DOE Cranfield site in Mississippi; the history of pressure, rates, and cumulative injection volumes is used to detect leaks as pressure anomalies. In the second, I use a different deep learning workflow to forecast high-energy electrons in Earth's outer radiation belt using in situ measurements of space weather parameters such as solar wind density and pressure. I focus on predicting electron fluxes of 2 MeV and higher energy and introduce an ensemble of deep learning models to further improve the results compared to a single deep learning architecture.

    I also show an example where a carefully constructed statistical approach, guided by the human interpreter, outperforms deep learning algorithms implemented by others. Here, the goal is to correlate multiple well logs across a survey area in order to map not only the thickness but also the behavior of stacked gamma-ray parasequence sets. Tools including maximum likelihood estimation (MLE) and dynamic time warping (DTW) provide a means of generating quantitative maps of upward fining and upward coarsening across the oil field. The ultimate goal is to link such extensive well control with the spectral attribute signature of 3D seismic data volumes to provide detailed maps of the depositional history, as well as insight into the lateral and vertical variation of mineralogy important to the effective completion of shale resource plays
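Dynamic time warping, one of the two statistical tools named above, can be sketched in its minimal form as below. This is a textbook DTW recurrence on synthetic curves, not the dissertation's implementation; a production well-log correlation would add windowing constraints and operate on depth-resampled gamma-ray logs.

```python
import numpy as np

def dtw_distance(a, b):
    """Minimal dynamic time warping distance between two 1-D curves.
    Fills the cumulative cost matrix D with the classic recurrence:
    each cell adds the local mismatch to the cheapest of the three
    admissible predecessor cells (match, insertion, deletion)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two synthetic "gamma-ray" curves: the second is a stretched copy of the
# first, mimicking a thickened parasequence set in an offset well. DTW
# aligns them despite the different sample counts.
log_a = np.sin(4.0 * np.pi * np.linspace(0.0, 1.0, 50))
log_b = np.sin(4.0 * np.pi * np.linspace(0.0, 1.0, 70))
```

Because the warp path is monotonic in depth, the recovered alignment is what lets stretched or condensed intervals be matched bed-for-bed between wells.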

    Production Allocation of Reservoir Layers using Data-Driven Reservoir Modeling

    The benefits of a commingled-layer production scheme can be high with successful reservoir management; without it, the drawbacks can impact production drastically, with unfortunate consequences including reservoir fluid communication, well integrity issues, and production termination. Although the plan calls for optimizing production with minimal capital investment and operating expense, this is an enormous challenge given the frequent surveillance and workover requirements of commingled layers. Because the value of information drives the surveillance frequency, the oil industry often uses static assumptions, such as static KH modeling, as an economical replacement for dynamic measurements. However, this approach is misleading because it ignores the effect of dynamic attributes such as reservoir pressure and fluid properties. Meanwhile, the evolution of Artificial Intelligence (AI) and Machine Learning (ML) has made the allocation of commingled-layer production possible, since AI does not build assumptions on static properties alone but instead picks up the static and dynamic patterns associated with rock and fluid properties. Accordingly, AI and ML were applied in this research as a new approach for estimating commingled-layer allocation, known technically as Top-Down Modeling (TDM). TDM incorporates all of the acquired static and dynamic field measurements and uses machine learning with fuzzy and crisp logic via neural networks to develop a reservoir model. TDM was tested on a synthetic heterogeneous reservoir model with three commingled layers across 63 wells, in conjunction with multiple random commingling schemes throughout the wells' lifespans. Whereas static KH modeling proved ambiguous in capturing the effect of reservoir pressure on the per-layer production profile, high-certainty TDM modeling was achieved both horizontally and vertically on a layer basis, confirming the capability of TDM to allocate commingled-layer production in terms of certainty and operational cost