Neural networks in petroleum geology as interpretation tools
Abstract
Three examples of the use of neural networks in analyses of geologic data from hydrocarbon reservoirs are presented. All networks are trained with data originating from clastic reservoirs of Neogene age located in the Croatian part of the Pannonian Basin. Training always included similar reservoir variables, i.e. electric logs (resistivity, spontaneous potential) and lithology determined from cores or logs and described as sandstone or marl, with categorical values assigned over intervals. Selected variables also include hydrocarbon saturation, likewise represented by a categorical variable, average reservoir porosity calculated from interpreted well logs, and seismic attributes. In all three neural models, subsets of these inputs were used to analyze data collected from three different oil fields in the Croatian part of the Pannonian Basin. It is shown that the selection of geologically and physically linked variables plays a key role in network training, validation, and processing. The aim of this study was to establish relationships between log-derived data, core data, and seismic attributes. Three case studies are described in this paper to illustrate the use of neural networks for the prediction of sandstone-marl facies (Case Study #1, Okoli Field), carbonate breccia porosity (Case Study #2, Beničanci Field), and lithology and saturation (Case Study #3, Kloštar Field). The results of these studies indicate that this method can provide a better understanding of some clastic Neogene reservoirs in the Croatian part of the Pannonian Basin.
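A minimal sketch of the kind of log-to-facies prediction described above, using scikit-learn's MLPClassifier as a stand-in for the networks in the paper; the feature names (resistivity, spontaneous potential, porosity) follow the abstract, but the synthetic data and the sandstone/marl labelling rule are illustrative assumptions only.

```python
# Illustrative sketch only: sandstone vs. marl facies prediction from
# log-derived inputs with a small feed-forward network (scikit-learn).
# Data and the lithology rule are synthetic assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500
resistivity = rng.lognormal(mean=2.0, sigma=0.5, size=n)   # ohm-m (synthetic)
sp = rng.normal(loc=-40.0, scale=15.0, size=n)             # spontaneous potential, mV (synthetic)
porosity = rng.uniform(0.05, 0.30, size=n)                 # fraction (synthetic)
X = np.column_stack([resistivity, sp, porosity])
# Toy stand-in for core-described lithology: 1 = sandstone, 0 = marl
y = ((resistivity > 7.0) & (porosity > 0.15)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
)
model.fit(X_tr, y_tr)
print("facies accuracy:", round(model.score(X_te, y_te), 3))
```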
Modeling and simulating of reservoir operation using the artificial neural network, support vector regression, deep learning algorithm
Reservoirs and dams are vital human-built infrastructures that play essential roles in flood control, hydroelectric power generation, water supply, navigation, and other functions. Realizing these functions requires efficient reservoir operation and effective control of the outflow from a reservoir or dam. Over the last decade, artificial intelligence (AI) techniques have become increasingly popular in streamflow forecasting and in reservoir operation planning and scheduling. In this study, three AI models, namely the backpropagation (BP) neural network, support vector regression (SVR) technique, and long short-term memory (LSTM) model, are employed to simulate reservoir operation at monthly, daily, and hourly time scales, using approximately 30 years of historical reservoir operation records. This study aims to summarize the influence of parameter settings on model performance and to explore the applicability of the LSTM model to reservoir operation simulation. The results show the following: (1) for the BP neural network and LSTM model, the effect of the maximum number of iterations on model performance should be prioritized; for the SVR model, simulation performance is directly related to the selection of the kernel function, and the sigmoid and RBF kernel functions should be prioritized; (2) the BP neural network and SVR models are suitable for learning the operation rules of a reservoir from a small amount of data; and (3) the LSTM model effectively reduces the time consumption and memory storage required by other AI models and demonstrates good capability in simulating low-flow conditions and the outflow curve during the peak operation period.
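As a rough illustration of the kernel comparison mentioned in finding (1), the sketch below fits SVR models with RBF and sigmoid kernels to a toy monthly operation record; the inflow/storage features, the operating rule generating the outflow, and all numbers are assumptions, not the study's data or code.

```python
# Illustrative sketch: simulating reservoir outflow from inflow and storage
# with SVR, comparing RBF and sigmoid kernels on synthetic monthly records.
import numpy as np
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n = 360  # roughly 30 years of monthly records (synthetic)
inflow = rng.gamma(shape=2.0, scale=50.0, size=n)
storage = np.clip(np.cumsum(inflow - inflow.mean()) + 1000.0, 200.0, 2000.0)
outflow = 0.7 * inflow + 0.05 * storage + rng.normal(0.0, 5.0, n)  # toy operating rule

X = np.column_stack([inflow, storage])
X_tr, X_te, y_tr, y_te = train_test_split(X, outflow, test_size=0.25, shuffle=False)

for kernel in ("rbf", "sigmoid"):
    model = make_pipeline(StandardScaler(), SVR(kernel=kernel, C=10.0))
    model.fit(X_tr, y_tr)
    print(kernel, "R2:", round(r2_score(y_te, model.predict(X_te)), 3))
```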
Catch-22s of reservoir computing
Reservoir Computing (RC) is a simple and efficient model-free framework for forecasting the behavior of nonlinear dynamical systems from data. Here, we show that there exist commonly-studied systems for which leading RC frameworks struggle to learn the dynamics unless key information about the underlying system is already known. We focus on the important problem of basin prediction -- determining which attractor a system will converge to from its initial conditions. First, we show that the predictions of standard RC models (echo state networks) depend critically on warm-up time, requiring a warm-up trajectory containing almost the entire transient in order to identify the correct attractor. Accordingly, we turn to Next-Generation Reservoir Computing (NGRC), an attractive variant of RC that requires negligible warm-up time. By incorporating the exact nonlinearities in the original equations, we show that NGRC can accurately reconstruct intricate and high-dimensional basins of attraction, even with sparse training data (e.g., a single transient trajectory). Yet, a tiny uncertainty in the exact nonlinearity can render prediction accuracy no better than chance. Our results highlight the challenges faced by data-driven methods in learning the dynamics of multistable systems and suggest potential avenues to make these approaches more robust.
Comment: Published version (slight change to the title due to journal policy). Code at https://github.com/spcornelius/RCBasin
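The following is a minimal sketch of the NGRC construction the abstract refers to (time-delayed states plus quadratic monomials, fit with ridge regression), applied here to one-step forecasting of the Lorenz system rather than to the paper's basin-prediction experiments; for the authors' actual code, see the repository linked above.

```python
# Toy Next-Generation Reservoir Computing (NGRC) sketch: delayed states plus
# quadratic monomials, fit by ridge regression to predict next-step increments.
# One-step Lorenz forecasting only; not the paper's basin-prediction setup.
import numpy as np

def lorenz_trajectory(n_steps, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with a simple Euler scheme (adequate for a sketch)."""
    x = np.array([1.0, 1.0, 1.0])
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        dx = np.array([
            sigma * (x[1] - x[0]),
            x[0] * (rho - x[2]) - x[1],
            x[0] * x[1] - beta * x[2],
        ])
        x = x + dt * dx
        traj[i] = x
    return traj

def ngrc_features(traj, k=2):
    """Constant + k delayed copies of the state + all quadratic monomials."""
    delayed = np.hstack([traj[i:len(traj) - k + 1 + i] for i in range(k)])
    iu = np.triu_indices(delayed.shape[1])
    quad = np.einsum("ni,nj->nij", delayed, delayed)[:, iu[0], iu[1]]
    return np.hstack([np.ones((len(delayed), 1)), delayed, quad])

k = 2
traj = lorenz_trajectory(8000)
feats = ngrc_features(traj, k)[:-1]            # features at each "current" step
targets = traj[k:] - traj[k - 1:-1]            # next-step increments to learn
ridge = 1e-6
W = np.linalg.solve(feats.T @ feats + ridge * np.eye(feats.shape[1]), feats.T @ targets)
pred = traj[k - 1:-1] + feats @ W              # predicted next states
print("one-step RMSE:", np.sqrt(np.mean((pred - traj[k:]) ** 2)))
```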
An Integrative Remote Sensing Application of Stacked Autoencoder for Atmospheric Correction and Cyanobacteria Estimation Using Hyperspectral Imagery
Hyperspectral image sensing can be used to effectively detect the distribution of harmful cyanobacteria. To accomplish this, physical- and/or model-based simulations have been conducted to perform atmospheric correction (AC) and to estimate pigments in cyanobacteria, including phycocyanin (PC) and chlorophyll-a (Chl-a). However, such simulations can be undesirable in certain cases, owing to the difficulty of representing dynamically changing aerosol and water vapor in the atmosphere and the optical complexity of inland water. Thus, this study focused on the development of a deep neural network model for AC and cyanobacteria estimation without considering the physical formulation. The stacked autoencoder (SAE) network was adopted for feature extraction and dimensionality reduction of the hyperspectral imagery. An artificial neural network (ANN) and support vector regression (SVR) were then applied sequentially to achieve AC and estimate cyanobacteria concentrations (i.e., SAE-ANN and SAE-SVR). Further, ANN and SVR models without SAE were compared with the SAE-ANN and SAE-SVR models for performance evaluation. In terms of AC performance, both SAE-ANN and SAE-SVR displayed reasonable accuracy, with a Nash-Sutcliffe efficiency (NSE) > 0.7. For PC and Chl-a estimation, the SAE-ANN model showed the best performance, yielding NSE values > 0.79 and > 0.77, respectively. SAE with fine-tuning improved the accuracy of the original ANN and SVR estimations in terms of both AC and cyanobacteria estimation, which is primarily attributed to the high-level feature extraction of SAE and its ability to represent the spatial features of cyanobacteria. Therefore, this study demonstrates that deep neural networks have strong potential for integrative remote sensing applications.
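A compact sketch of the two-stage SAE-SVR idea, with a single-hidden-layer autoencoder (an MLPRegressor trained to reconstruct its input) standing in for the stacked autoencoder and synthetic spectra standing in for hyperspectral imagery; everything here, including the toy phycocyanin signal, is an assumption for illustration.

```python
# Illustrative two-stage pipeline: autoencoder-style compression of synthetic
# "spectra", then SVR on the compressed features (an SAE-SVR analogue).
# All data, dimensions, and the pigment signal are assumptions.
import numpy as np
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(2)
n_pixels, n_bands, n_latent = 800, 100, 8
spectra = rng.random((n_pixels, n_bands))
# Toy phycocyanin-like target driven by a handful of bands
pc = spectra[:, 20:30].mean(axis=1) * 50.0 + rng.normal(0.0, 1.0, n_pixels)

# Stage 1: single-hidden-layer autoencoder (reconstruct the spectra themselves)
ae = MLPRegressor(hidden_layer_sizes=(n_latent,), activation="relu",
                  max_iter=2000, random_state=0)
ae.fit(spectra, spectra)
encoded = np.maximum(0.0, spectra @ ae.coefs_[0] + ae.intercepts_[0])  # hidden activations

# Stage 2: SVR regression on the compressed features
X_tr, X_te, y_tr, y_te = train_test_split(encoded, pc, test_size=0.3, random_state=0)
svr = SVR(kernel="rbf", C=10.0).fit(X_tr, y_tr)
print("R2 on encoded features:", round(r2_score(y_te, svr.predict(X_te)), 3))
```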
Machine-learning-based prediction of oil recovery factor for experimental CO2-Foam chemical EOR: Implications for carbon utilization projects
Enhanced oil recovery (EOR) using CO2 injection is promising, with economic and environmental benefits as an active climate-change mitigation approach. Nevertheless, the low sweep efficiency of CO2 injection remains a challenge. CO2-foam injection has been proposed as a remedy, but its laboratory screening for specific reservoirs is costly and time-consuming. In this study, machine-learning models are employed to predict the oil recovery factor (ORF) during CO2-foam flooding cost-effectively and accurately. Four models, including a general regression neural network (GRNN), a cascade forward neural network with Levenberg-Marquardt optimization (CFNN-LM), a cascade forward neural network with Bayesian regularization (CFNN-BR), and extreme gradient boosting (XGBoost), are evaluated based on experimental data from previous studies. Results demonstrate that the GRNN model outperforms the others, with an overall mean absolute error of 0.059 and an R2 of 0.9999. The GRNN model's applicability domain is verified using a Williams plot, and an uncertainty analysis for CO2-foam flooding projects is conducted. The novelty of this study lies in developing a machine-learning-based approach that provides an accurate and cost-effective prediction of ORF in CO2-foam experiments. This approach has the potential to significantly reduce the screening costs and time required for CO2-foam injection, making it a more viable carbon utilization and EOR strategy.
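Since a GRNN is essentially Gaussian-kernel (Nadaraya-Watson) regression over the training set, a few lines of NumPy suffice to sketch the idea; the CO2-foam features, their ranges, and the recovery-factor rule below are synthetic placeholders, not the experimental data used in the study.

```python
# Minimal GRNN (Gaussian-kernel regression) sketch for recovery-factor
# prediction. Features, units, and the toy ORF rule are synthetic placeholders.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """Weight each training sample by a Gaussian of its distance to the query."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(3)
n = 200
pressure = rng.uniform(1000.0, 3000.0, n)   # psi (synthetic)
surfactant = rng.uniform(0.1, 1.0, n)       # wt% (synthetic)
foam_quality = rng.uniform(0.5, 0.9, n)
X = np.column_stack([pressure, surfactant, foam_quality])
orf = 0.30 + 0.20 * foam_quality + 0.10 * surfactant + rng.normal(0.0, 0.01, n)

X_tr_raw, X_te_raw, y_tr, y_te = X[:150], X[150:], orf[:150], orf[150:]
mu, sd = X_tr_raw.mean(axis=0), X_tr_raw.std(axis=0)   # standardize with training stats
X_tr, X_te = (X_tr_raw - mu) / sd, (X_te_raw - mu) / sd

pred = grnn_predict(X_tr, y_tr, X_te, sigma=0.5)
print("mean absolute error:", round(float(np.abs(pred - y_te).mean()), 4))
```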
The Significance of Machine Learning in Clinical Disease Diagnosis: A Review
The global need for effective disease diagnosis remains substantial, given the complexities of various disease mechanisms and diverse patient symptoms. To tackle these challenges, researchers, physicians, and patients are turning to machine learning (ML), an artificial intelligence (AI) discipline, to develop solutions. By leveraging sophisticated ML and AI methods, healthcare stakeholders gain enhanced diagnostic and treatment capabilities. However, there is a scarcity of research focused on ML algorithms for enhancing accuracy and computational efficiency. This research investigates the capacity of machine learning algorithms to improve the transmission of heart rate data in time-series healthcare metrics, concentrating particularly on optimizing accuracy and efficiency. By exploring various ML algorithms used in healthcare applications, the review presents the latest trends and approaches in ML-based disease diagnosis (MLBDD). The factors under consideration include the algorithm utilized, the types of diseases targeted, the data types employed, the applications, and the evaluation metrics. This review aims to shed light on the prospects of ML in healthcare, particularly in disease diagnosis. By analyzing the current literature, the study provides insights into state-of-the-art methodologies and their performance metrics.
Comment: 8 page
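As a generic illustration of the evaluation metrics such MLBDD studies report, the sketch below scores a simple classifier on synthetic heart-rate series; the features, labels, and model choice are assumptions for demonstration and are not drawn from the review.

```python
# Illustrative sketch: classifier evaluation on synthetic heart-rate series
# using common diagnostic metrics. All data and labels are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score, roc_auc_score)
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_patients, n_samples = 600, 120
hr = rng.normal(75.0, 8.0, (n_patients, n_samples)) + rng.normal(0.0, 5.0, (n_patients, 1))
labels = (hr.mean(axis=1) > 78.0).astype(int)  # toy "at-risk" label

# Summary features extracted from each heart-rate series
X = np.column_stack([hr.mean(axis=1), hr.std(axis=1), hr.max(axis=1) - hr.min(axis=1)])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3,
                                          random_state=0, stratify=labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]
pred = (proba >= 0.5).astype(int)
print("accuracy :", round(accuracy_score(y_te, pred), 3))
print("precision:", round(precision_score(y_te, pred), 3))
print("recall   :", round(recall_score(y_te, pred), 3))
print("F1       :", round(f1_score(y_te, pred), 3))
print("ROC-AUC  :", round(roc_auc_score(y_te, proba), 3))
```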
Drought index downscaling using AI-based ensemble technique and satellite data
This study introduces and validates an artificial intelligence (AI)-based downscaling method for the Standardized Precipitation Index (SPI) in the northwest of Iran, utilizing PERSIANN-CDR data and MODIS-derived drought-dependent variables. The correlation between SPI and two drought-dependent variables at a spatial resolution of 0.25° from 2000 to 2015 served as the basis for predicting SPI values at a finer spatial resolution of 0.05° for the period spanning 2016 to 2021. Shallow AI models (Support Vector Regression, Adaptive Neuro-Fuzzy Inference System, Feedforward Neural Network) and the Long Short-Term Memory (LSTM) deep learning method are employed for downscaling, followed by an ensemble post-processing technique applied to the shallow AI models. Validation against rain gauge data indicates that all methods improve SPI simulation compared to PERSIANN-CDR products. The ensemble technique improves performance by 20% and 25% in the training and test phases, respectively, achieving a mean determination coefficient (DC) of 0.67 in the validation phase. The results suggest that the deep learning LSTM method is less suitable than the ensemble technique when observed data are limited. Additionally, the proposed methodology successfully detects approximately 80% of drought conditions. Notably, SPI-6 outperforms the other temporal scales. This study advances the understanding of AI-driven downscaling for SPI, emphasizing the efficacy of ensemble approaches and providing valuable insights for regions with limited observational data.
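A minimal sketch of the ensemble idea: two shallow regressors (SVR and a small feed-forward network, omitting ANFIS for brevity) are trained to map coarse-resolution SPI plus MODIS-style predictors to a finer-scale SPI, and their predictions are averaged; all variables and data below are synthetic assumptions, not PERSIANN-CDR or MODIS products.

```python
# Illustrative sketch: averaging-ensemble of two shallow models for SPI
# downscaling. Predictors, targets, and relationships are synthetic.
import numpy as np
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(4)
n = 1000
coarse_spi = rng.normal(0.0, 1.0, n)                                   # coarse-grid SPI (synthetic)
ndvi = np.clip(0.4 + 0.1 * coarse_spi + rng.normal(0.0, 0.05, n), 0, 1)  # MODIS-style NDVI (synthetic)
lst = 30.0 - 2.0 * coarse_spi + rng.normal(0.0, 1.0, n)                # land-surface temperature (synthetic)
fine_spi = coarse_spi + 0.5 * (ndvi - 0.4) - 0.05 * (lst - 30.0) + rng.normal(0.0, 0.2, n)

X = np.column_stack([coarse_spi, ndvi, lst])
X_tr, X_te, y_tr, y_te = train_test_split(X, fine_spi, test_size=0.3, random_state=0)

svr = SVR(kernel="rbf").fit(X_tr, y_tr)
ffnn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0).fit(X_tr, y_tr)
ensemble = (svr.predict(X_te) + ffnn.predict(X_te)) / 2.0   # simple ensemble mean
print("DC (R2) of ensemble:", round(r2_score(y_te, ensemble), 3))
```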