806 research outputs found

    A BP Neural Network Model to Predict Reservoir Parameters

    This paper proposes an artificial neural network (ANN) method to calculate reservoir parameters. By improving the algorithm of the BP neural network, convergence speed is enhanced and better results are achieved. Practical applications show that the neural network technique is of significant importance for reservoir description.
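The abstract does not say which BP improvement was used; a classic way to speed up BP convergence is adding a momentum term to the weight updates. The sketch below is a minimal, self-contained illustration of that idea on synthetic (log response → reservoir parameter) data; the network size, learning rate, and data are all assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for (well-log responses -> reservoir parameter) pairs.
X = rng.uniform(-1.0, 1.0, (200, 3))
y = (0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 2]).reshape(-1, 1)

W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)   # hidden layer (tanh)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)   # linear output
vW1, vW2 = np.zeros_like(W1), np.zeros_like(W2)       # momentum buffers
lr, mom = 0.05, 0.9

losses = []
for _ in range(500):
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y                          # gradient of 0.5 * MSE w.r.t. out
    losses.append(float((err ** 2).mean()))
    gW2 = h.T @ err / len(X)
    gh = (err @ W2.T) * (1.0 - h ** 2)     # back-propagate through tanh
    gW1 = X.T @ gh / len(X)
    # Momentum update: v <- mom * v - lr * grad; weight <- weight + v
    vW2 = mom * vW2 - lr * gW2; W2 += vW2
    vW1 = mom * vW1 - lr * gW1; W1 += vW1
    b2 -= lr * err.mean(axis=0); b1 -= lr * gh.mean(axis=0)
```

The momentum buffers accumulate a running direction, damping oscillations across ravines in the loss surface and accelerating progress along consistent gradients.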

    Oil and Gas flow Anomaly Detection on offshore naturally flowing wells using Deep Neural Networks

    Dissertation presented as the partial requirement for obtaining a Master's degree in Data Science and Advanced Analytics, specialization in Data Science.

    The Oil and Gas industry faces multiple challenges as never before. It is criticized as dirty and polluting, hence the growing demand for green alternatives. Nevertheless, the world still relies heavily on hydrocarbons, which remain the most traditional and stable source of energy compared with the extensively promoted hydro, solar, and wind power. Major operators are challenged to produce oil more efficiently, with a smaller climate footprint and more scrutinized expenditure, to counteract newly arising energy sources in the face of high skepticism about the industry's future. While most of the tools used by the hydrocarbon E&P industry are expensive and have been in use for many years, it is paramount for the industry's survival and prosperity to apply predictive-maintenance technologies that foresee potential failures, making production safer, lowering downtime, increasing productivity, and diminishing maintenance costs. Many efforts have been made to define the most accurate and effective predictive methods, but data scarcity limits the speed and capacity for further experimentation. Since it would be highly beneficial for the industry to invest in Artificial Intelligence, this research explores the subject of Anomaly Detection in depth, using the expert-labeled open public data from Petrobras. Deep learning neural networks, namely recurrent neural networks with LSTM and GRU backbones, were implemented for multi-class classification of undesirable events on naturally flowing wells. Several hyperparameter optimization tools were also explored, mainly focusing on Genetic Algorithms as among the most advanced methods for such tasks.
The best-performing model used 2 stacked GRU layers with the hyperparameter vector [1, 47, 40, 14], standing for a timestep of 1, 47 hidden units, 40 epochs, and a batch size of 14, and produced an F1 score of 0.97. As the world faces many issues, one of which is the detrimental effect of heavy industry on the environment and the resulting adverse global climate change, this project is an attempt to contribute to the application of Artificial Intelligence in the Oil and Gas industry, with the intention of making it more efficient, transparent, and sustainable.
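For readers unfamiliar with the GRU backbone mentioned above, the following sketch shows a single GRU cell forward step with the 47 hidden units reported as the best hyperparameter. The weights are random and untrained, and the number of sensor channels and event classes are assumptions; this only illustrates the gate structure, not the study's trained model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid = 8, 47   # 47 hidden units as reported; 8 sensor channels is an assumption
n_cls = 9             # number of event classes is also an assumption

# Randomly initialised, untrained GRU parameters -- structural illustration only.
def init(shape):
    return rng.normal(0.0, 0.1, shape)

Wz, Uz, bz = init((n_in, n_hid)), init((n_hid, n_hid)), np.zeros(n_hid)
Wr, Ur, br = init((n_in, n_hid)), init((n_hid, n_hid)), np.zeros(n_hid)
Wh, Uh, bh = init((n_in, n_hid)), init((n_hid, n_hid)), np.zeros(n_hid)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h):
    z = sigmoid(x @ Wz + h @ Uz + bz)              # update gate
    r = sigmoid(x @ Wr + h @ Ur + br)              # reset gate
    h_cand = np.tanh(x @ Wh + (r * h) @ Uh + bh)   # candidate state
    return (1.0 - z) * h + z * h_cand

# The reported best model used a timestep of 1, so one update per sample.
x_t = rng.normal(size=n_in)
h = gru_step(x_t, np.zeros(n_hid))

# A softmax head maps the hidden state to class probabilities.
logits = h @ init((n_hid, n_cls))
probs = np.exp(logits - logits.max()); probs /= probs.sum()
```

Stacking a second GRU, as in the best model, simply feeds this hidden state sequence into another cell of the same form.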

    Black shale lithofacies prediction and distribution pattern analysis of Middle Devonian Marcellus Shale in the Appalachian Basin, northeastern U.S.A.

    The Marcellus Shale, a marine organic-rich mudrock deposited during the Middle Devonian in the Appalachian basin, is considered the largest unconventional shale-gas resource in the United States. Although homogeneous in appearance, the mudstone is heterogeneous in mineral composition, organic-matter richness, gas content, and fracture density. Two critical factors for unconventional mudstone reservoirs are units amenable to hydraulic fracture stimulation and units rich in organic matter. The effectiveness of hydraulic fracture stimulation is influenced by rock geomechanical properties, which are related to rock mineralogy. The natural gas content of mudrock reservoirs is strongly related to organic matter, measured as total organic carbon (TOC). Instead of using petrographic information and sedimentary structures, Marcellus Shale lithofacies were defined by mineral composition and organic-matter richness and predicted from conventional logs, making the lithofacies 'meaningful', 'predictable', and 'mappable' at multiple scales from the well bore to the basin. Core X-ray diffraction (XRD) and TOC data were used to classify the Marcellus Shale into seven lithofacies according to three criteria: clay volume, the ratio of quartz to carbonate, and TOC. Pulsed neutron spectroscopy (PNS) logs provide similar mineral concentrations and TOC content, and were used to classify shale lithofacies by the same three criteria. An artificial neural network (ANN) with improvements (i.e., learning algorithms, performance function, and topology design) was utilized to predict Marcellus Shale lithofacies in 707 wells with conventional logs. To improve the effectiveness of wireline logs for predicting lithofacies, the effects of barite and pyrite were partly removed, and eight petrophysical parameters commonly used in conventional reservoir analysis were derived from the conventional logs by petrophysical analysis. These parameters were used as input to the ANN analysis.
Geostatistical analysis was used to develop experimental variogram models and the vertical proportion of each lithofacies. Indicator kriging, truncated Gaussian simulation (TGS), and sequential indicator simulation (SIS) were compared, and the SIS algorithm performed well for modeling Marcellus Shale lithofacies in three dimensions. Controlled primarily by sediment dilution, organic-matter productivity, and organic-matter preservation/decomposition, the Marcellus Shale lithofacies distribution was dominantly affected by water depth and distance to the shoreline. The lithofacies with the greatest organic content and highest brittleness is concentrated along a crescent-shaped region paralleling the inferred shelf and shoreline. Normalized average gas production rates from horizontal wells supported the proposed approach to modeling Marcellus Shale lithofacies. The proposed 3-D modeling approach may be helpful for (1) investigating the distribution of each lithofacies at basin scale; (2) developing a better understanding of the factors controlling the deposition and preservation of organic matter and the depositional model of marine organic-rich mudrock; (3) identifying organic-rich and brittle units and areas in shale-gas reservoirs; (4) assisting in the design of horizontal drilling trajectories and the location of stimulation activity; and (5) providing input parameters for the simulation of gas flow and production in mudrock (e.g., porosity, permeability, and fractures).
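The three-criteria classification described above (clay volume, quartz/carbonate ratio, TOC) lends itself to a simple decision rule. The sketch below is a deliberately simplified variant of the study's seven-class scheme; the cutoff values and class names are hypothetical, chosen only to show how three measurements map to a lithofacies label.

```python
def classify_lithofacies(clay_vol, qtz_carb_ratio, toc,
                         clay_cut=0.40, ratio_cut=3.0, toc_cut=6.5):
    """Assign a lithofacies label from the three criteria used in the study.

    Cutoffs and class names are hypothetical; the study's actual seven-class
    scheme is calibrated against core XRD and TOC data.
    """
    organic = "organic" if toc >= toc_cut else "gray"
    if clay_vol >= clay_cut:                   # clay-dominated
        return organic + " mudstone"
    if qtz_carb_ratio >= ratio_cut:            # quartz-dominated
        return organic + " siliceous shale"
    return organic + " mixed/carbonate shale"  # carbonate-rich remainder

label = classify_lithofacies(clay_vol=0.25, qtz_carb_ratio=4.2, toc=8.1)
```

In the study itself this labeling is done on cored wells and then an ANN learns to reproduce the labels from conventional logs in the remaining wells.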

    An advanced computational intelligent framework to predict shear sonic velocity with application to mechanical rock classification

    Shear sonic wave velocity (Vs) has a wide variety of applications, from reservoir management and development to geomechanical and geophysical studies. In the current study, two approaches were adopted to predict Vs from several petrophysical well logs, including gamma ray (GR), density (RHOB), neutron (NPHI), and compressional sonic wave velocity (Vp). For this purpose, five intelligent models were implemented: random forest (RF), extra trees (ET), Gaussian process regression (GPR), and the integration of an adaptive neuro-fuzzy inference system (ANFIS) with differential evolution (DE) and imperialist competitive algorithm (ICA) optimizers. In the first approach, the target was estimated from Vp only; the second scenario predicted Vs from the integration of Vp, GR, RHOB, and NPHI inputs. In each scenario, 8061 data points belonging to an oilfield located in the southwest of Iran were investigated. The ET model showed a lower average absolute percent relative error (AAPRE) than the other models in both approaches. In the first approach, in which Vp was the only input, the AAPRE values for the RF, ET, GPR, ANFIS + DE, and ANFIS + ICA models are 1.54%, 1.34%, 1.54%, 1.56%, and 1.57%, respectively. In the second scenario, the AAPRE values for the same models are 1.25%, 1.03%, 1.16%, 1.63%, and 1.49%, respectively. The Williams plot confirmed the validity of both the one-input and four-input ET models. For the ET model constructed from only one variable, the Williams plot interestingly showed that all 8061 data points are valid. The Leverage approach applied to the ET model designed with four inputs highlighted only 240 "out of leverage" data points, with only a further 169 data points suspect. Sensitivity analysis indicated that Vp has a greater effect on the target parameter (Vs) than the other inputs.
Overall, the second scenario yielded more satisfactory Vs predictions owing to the lower errors of its models. Finally, the two ET models, together with a linear regression model of the kind widely used in industry, were applied to diagnose candidate layers along the formation for hydraulic fracturing. While the linear regression model fails to accurately trace variations in rock properties, the intelligent models successfully detect brittle intervals consistent with field measurements.
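The AAPRE metric used to rank the five models above is straightforward to compute. The sketch below implements it in numpy; the Vs values are illustrative, not the study's data.

```python
import numpy as np

def aapre(actual, predicted):
    """Average absolute percent relative error, in percent."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100.0)

# Illustrative Vs values in km/s (not the study's data).
vs_true = np.array([2.0, 2.5, 3.0, 3.5])
vs_pred = np.array([2.02, 2.45, 3.06, 3.47])
err = aapre(vs_true, vs_pred)
```

Because the error is relative to each measured value, AAPRE weights a 0.05 km/s miss more heavily on slow rocks than on fast ones, which suits a target spanning a wide velocity range.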

    Storage Capacity Estimation of Commercial Scale Injection and Storage of CO2 in the Jacksonburg-Stringtown Oil Field, West Virginia

    Geological carbon capture, utilization, and storage (CCUS) of carbon dioxide (CO2) in depleted oil and gas reservoirs is one method to reduce greenhouse gas emissions while enhancing oil recovery (EOR) and extending the life of a field. CCUS coupled with EOR is therefore considered an economic approach to demonstrating commercial-scale injection and storage of anthropogenic CO2. Several critical issues should be taken into account prior to injecting large volumes of CO2, such as storage capacity, project duration, and long-term containment. Reservoir characterization and 3D geological modeling are the best way to estimate the theoretical CO2 storage capacity in mature oil fields.

    The Jacksonburg-Stringtown field, located in northwestern West Virginia, has produced over 22 million barrels of oil (MMBO) since 1895. The sandstone of the Late Devonian Gordon Stray is the primary reservoir, and these Upper Devonian fluvial sandstone reservoirs are an ideal candidate for CO2 sequestration coupled with EOR. Supercritical depth (>2,500 ft), minimum miscibility pressure (941 psi), favorable API gravity (46.5°), and good waterflood response are indicators that favor CO2-EOR operations. Moreover, the field is adjacent to a large concentration of CO2 sources located along the Ohio River that could potentially supply enough CO2 for sequestration and EOR without constructing new pipeline facilities.

    Permeability is a critical parameter for understanding subsurface fluid flow and for reservoir management in primary and enhanced hydrocarbon recovery and efficient carbon storage. In this study, a rapid, robust, and cost-effective artificial neural network (ANN) model is constructed to predict permeability, using the model's strong ability to recognize possible interrelationships between input and output variables.
Two commonly available conventional well logs, gamma ray and bulk density, and three log-derived variables, the slope of GR, the slope of bulk density, and Vsh, were selected as input parameters, with permeability as the desired output, to train and test the ANN. The results indicate that the ANN model can be applied effectively to permeability prediction.

    Porosity is another fundamental property, characterizing the storage capability of fluid- and gas-bearing formations in a reservoir. In this study, a support vector machine (SVM) with a mixed kernel function (MKF) is utilized to construct the relationship between limited conventional well log suites and sparse core data. The input parameters for the SVM model consist of core porosity values and the same log suite as the ANN's inputs, with porosity as the desired output. Compared with an SVM model with a single kernel function, the mixed-kernel-function SVM model provides more accurate porosity predictions.

    Based on the well log analysis, four reservoir subunits within a marine-dominated estuarine depositional system are defined: barrier sand, central bay shale, tidal channel, and fluvial channel subunits. A 3-D geological model, used to estimate theoretical CO2 sequestration capacity, is constructed by integrating core data, wireline log data, and geological background knowledge. Based on the proposed 3-D geological model, the best regions for coupled CCUS-EOR are located in the southern portions of the field, and the estimated theoretical CO2 storage capacity for the Jacksonburg-Stringtown oil field varies between 24 and 383 million metric tons. The estimates of CO2 sequestration and EOR potential indicate that the Jacksonburg-Stringtown oilfield has significant potential for CO2 storage and value-added EOR.
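The abstract does not specify which kernels the mixed kernel function combines; a common construction is a convex combination of an RBF kernel (local behavior) and a polynomial kernel (global trend), which remains a valid positive semi-definite kernel. The sketch below builds such a Gram matrix in numpy; the kernel choice, weights, and the five log inputs are assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Squared Euclidean distances between every pair of rows.
    sq_dist = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dist)

def poly_kernel(X, Y, degree=2, coef0=1.0):
    return (X @ Y.T + coef0) ** degree

def mixed_kernel(X, Y, w=0.7):
    # A convex combination of valid kernels is itself a valid (PSD) kernel.
    return w * rbf_kernel(X, Y) + (1.0 - w) * poly_kernel(X, Y)

# 5 log-derived inputs per sample (e.g. GR, bulk density, their slopes, Vsh).
X = np.random.default_rng(2).normal(size=(20, 5))
K = mixed_kernel(X, X)   # Gram matrix for the training samples
```

A callable like `mixed_kernel` can be passed directly as the kernel of a standard SVM regressor, so the mixed formulation drops into an ordinary SVM training loop unchanged.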

    Assisted history matching using pattern recognition technology

    Reservoir simulation and modeling is utilized throughout field development in different capacities. Sensitivity analysis, history matching, operations optimization, and uncertainty assessment are the conventional analyses in full field model studies. Realistic modeling of the complexities of a reservoir requires a large number of grid blocks. As the complexity of a reservoir, and consequently the number of grid blocks, increases, so does the time required to accomplish these tasks.

    This study examines the application of pattern recognition technologies to reduce the time and effort required to complete successful history matching projects. The pattern recognition capabilities of Artificial Intelligence and Data Mining (AI&DM) techniques are used to develop a Surrogate Reservoir Model (SRM) and use it as the engine that drives the history matching process. The SRM is a prototype of the full field reservoir simulation model that runs in fractions of a second and is built using a small number of geological realizations.

    To accomplish the objectives of this work, a three-step process was envisioned:

    • Part one, a proof-of-concept study: the goal of the first step was to prove that the SRM can substitute for the reservoir simulation model in a history matching project. In this part, the history match was accomplished by tuning only one property (permeability) throughout the reservoir.

    • Part two, a feasibility study: this step examined the feasibility of the SRM as an effective tool for a more complicated history matching problem, particularly as the degree of uncertainty in the reservoir increases. The number of uncertain reservoir properties was therefore increased to three (permeability, porosity, and thickness). The SRM was trained, calibrated, and validated using a few geological realizations of the base reservoir model.
To complete an automated history matching workflow, the SRM was coupled with a global optimization algorithm called Differential Evolution (DE), a robust optimization method from the class of evolutionary algorithms.

    • Part three, a real-life challenge: the final step applied the lessons learned to the history match of a real-life problem, challenging the strength of the SRM in a more complicated case study. A standard test case, known in the petroleum engineering literature as the PUNQ-S3 reservoir model, was selected. PUNQ-S3 represents a small industrial reservoir engineering model formulated to test the ability of various methods in history matching and uncertainty quantification. The surrogate reservoir model was developed using ten geological realizations of the model. The uncertain properties in this model are the distributions of porosity, horizontal permeability, and vertical permeability. As in the second part of this study, the DE optimization method was connected to the SRM to form an automated history matching workflow. This workflow is able to produce multiple realizations of the reservoir that match past performance, and the successful matches were used to quantify the uncertainty in the prediction of cumulative oil production.

    The results of this study prove the ability of surrogate reservoir models, as fast and accurate tools, to address the practical issues of reservoir simulation models in the history matching workflow. The achievements of this dissertation are not limited to the history matching procedure; they also benefit other time-consuming operations in the reservoir management workflow, such as sensitivity analysis, production optimization, and uncertainty assessment.
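The DE loop that drives the SRM can be sketched compactly. Below is the classic DE/rand/1/bin scheme over three normalized uncertain properties; the quadratic misfit is a stand-in for the SRM-evaluated history-match error (in the study, the objective compares SRM-predicted and observed production), and the population size and control parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
TRUE = np.array([0.2, 0.5, 0.8])   # hypothetical "true" normalized properties

def misfit(theta):
    # Stand-in for the SRM-evaluated history-match error.
    return float(((theta - TRUE) ** 2).sum())

# Classic DE/rand/1/bin over 3 normalized uncertain properties.
NP, D, F, CR, GENS = 20, 3, 0.8, 0.9, 200
pop = rng.uniform(0.0, 1.0, (NP, D))
fit = np.array([misfit(p) for p in pop])

for _ in range(GENS):
    for i in range(NP):
        idx = rng.choice([j for j in range(NP) if j != i], size=3, replace=False)
        a, b, c = pop[idx]
        mutant = np.clip(a + F * (b - c), 0.0, 1.0)   # mutation
        cross = rng.random(D) < CR
        cross[rng.integers(D)] = True                 # guarantee >=1 gene crosses
        trial = np.where(cross, mutant, pop[i])       # binomial crossover
        f_trial = misfit(trial)
        if f_trial <= fit[i]:                         # greedy selection
            pop[i], fit[i] = trial, f_trial

best = pop[fit.argmin()]
```

Because every trial evaluation is one call to the objective, swapping the full simulator for an SRM that runs in fractions of a second is what makes the thousands of DE evaluations affordable.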

    Neural network applications to reservoirs: Physics-based models and data models

    No full text
    Editorial

    Reservoir Simulation of the Volve Oil field using AI-based Top-Down Modeling Approach

    With the rise of high-performance computers, numerical reservoir simulators became popular among engineers for evaluating reservoirs and developing fields. However, this technology is still unable to fully model reservoirs with commingled production and highly complex geology, especially for uncertainty quantification and sensitivity analysis, where hundreds of runs are required. This dissertation provides a successful case study of history matching a complex reservoir in the North Sea (the Volve field). The proposed model relies only on measured field variables such as well, formation, and completion characteristics, production rates, and operational conditions, while staying away from interpretation and assumption. More than eight years of data from the Volve field were used to generate a comprehensive dataset, and key parameters were then extracted using fuzzy pattern recognition. A system of fully coupled artificial neural networks (feed-forward and LSTM networks) was used to train, calibrate, and validate the model. The artificial neural networks make it possible to extract hidden patterns in the field by learning from historical data. The model successfully history-matched the wellhead pressure, wellhead temperature, and production rates of all the wells through a completely automated process. The forecasting capability of the model was verified through blind validation in time and space, using data the model had not seen before. In contrast to a numerical simulator, which is only a reservoir model, this technology is a coupled reservoir and wellbore model that learns fluid motion behavior in complex porous media with fewer resources and higher speed. The efficiency of this approach makes it a suitable tool for uncertainty quantification when a large number of runs is required.
The combination of artificial intelligence and domain expertise makes this technology more reliable and closer to reality by staying faithful to field measurements.

    Machine Learning Assisted Framework for Advanced Subsurface Fracture Mapping and Well Interference Quantification

    The oil and gas industry has historically spent a significant amount of capital to acquire large volumes of analog and digital data, often left unused due to a lack of digital awareness. It has instead relied on individual expertise and numerical modeling for reservoir development, characterization, and simulation, which is extremely time consuming and expensive and inevitably introduces significant human bias and error. One of the major questions with significant impact on unconventional reservoir development (e.g., completion design, production, and well spacing optimization), CO2 sequestration in geological formations (e.g., well and reservoir integrity), and engineered geothermal systems (e.g., maximizing fluid flow and well capacity) is how to quantify and map subsurface natural fracture systems, both locally (near the wellbore) and more generally at the scale of the well pad or region. In this study, conventional near-wellbore natural fracture mapping techniques are first discussed and integrated with more advanced technologies such as fiber optics, specifically Distributed Acoustic Sensing (DAS) and Distributed Strain Sensing (DSS), to upscale fracture mapping to the region. Next, a physics-based automated machine learning (AutoML) workflow is developed that incorporates an advanced data acquisition system collecting high-resolution drilling acceleration data to infer near-wellbore natural fracture intensities. The new AutoML workflow aims to minimize human bias and accelerate near-wellbore natural fracture mapping in real time; it shows great promise, reducing fracture mapping time and cost ten-fold and producing more accurate, robust, reproducible, and measurable results.
Finally, to remove human intervention entirely and thereby accelerate fracture mapping while drilling, computer vision and deep learning techniques were integrated into new workflows to automate the identification of natural fractures and other lithological features in borehole image logs. Different structures and workflows were tested, and two specific workflows were designed for this purpose. In the first workflow, fracture footprints on actual acoustic image logs (i.e., full or partial sigmoidal signatures with a range of amplitudes and vertical and horizontal displacements) are detected and classified into different categories with varying success. The second workflow uses the actual amplitude values recorded by the borehole image log and a binary representation of the produced images to detect and quantify the major fractures and beddings. The first workflow is more detailed and capable of identifying different classes of fractures, albeit computationally more expensive; the second workflow is faster at detecting the major fractures and beddings. In conclusion, a regional subsurface natural fracture mapping technique integrating conventional logging, microseismic, and fiber optic data is presented. A new AutoML workflow, designed and tested in a Marcellus Shale gas reservoir, was used to predict near-wellbore fracture intensities from high-frequency drilling acceleration data. Two integrated workflows were designed and validated using three wells in the Marcellus Shale to extract natural fractures from acoustic image logs and from amplitude recordings obtained during logging while drilling.
The new workflows have: i) minimized human bias in all aspects of fracture mapping, from image log analysis to machine learning model selection and hyperparameter optimization; ii) generated more accurate fracture predictions, quantified with different scoring metrics; iii) decreased the time and cost of fracture interpretation ten-fold; and iv) produced more robust and reproducible results.
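The second workflow's core idea, thresholding amplitude values into a binary image and flagging depths with enough low-amplitude azimuthal coverage, can be sketched on synthetic data. The image, amplitude cutoff, and coverage threshold below are all hypothetical stand-ins, not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic acoustic amplitude image: rows = depth samples, cols = azimuth bins.
# Background amplitude ~1.0; fractures appear as low-amplitude traces.
img = rng.normal(1.0, 0.05, (100, 72))
img[40, :] -= 0.6        # planar fracture crossing all azimuths
img[70, 10:50] -= 0.6    # partial fracture trace

binary = img < 0.7                            # amplitude cutoff (hypothetical)
coverage = binary.mean(axis=1)                # flagged fraction of azimuths per depth
fracture_rows = np.where(coverage > 0.3)[0]   # depths interpreted as fractured
```

Real image logs add the sinusoidal distortion of dipping planes and tool artifacts, which is why the study's first, deep-learning workflow is needed to classify fracture types rather than merely flag fractured depths.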