
    Driven by Compression Progress: A Simple Principle Explains Essential Aspects of Subjective Beauty, Novelty, Surprise, Interestingness, Attention, Curiosity, Creativity, Art, Science, Music, Jokes

    I argue that data becomes temporarily interesting by itself to some self-improving, but computationally limited, subjective observer once he learns to predict or compress the data in a better way, thus making it subjectively simpler and more beautiful. Curiosity is the desire to create or discover more non-random, non-arbitrary, regular data that is novel and surprising not in the traditional sense of Boltzmann and Shannon but in the sense that it allows for compression progress because its regularity was not yet known. This drive maximizes interestingness, the first derivative of subjective beauty or compressibility, that is, the steepness of the learning curve. It motivates exploring infants, pure mathematicians, composers, artists, dancers, comedians, yourself, and (since 1990) artificial systems.
    Comment: 35 pages, 3 figures, based on KES 2008 keynote and ALT 2007 / DS 2007 joint invited lectures
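
    As an illustrative aside, the following is a minimal sketch of the compression-progress idea, using Python's zlib as a crude stand-in for the observer's adaptive compressor: the intrinsic reward is the number of bytes saved when an improved encoder re-describes the same history. The compressor choice, the toy history, and the function names are assumptions for illustration, not part of the paper.

```python
import zlib

def compressed_length(data: bytes, level: int) -> int:
    """Length of data after compression; a crude proxy for the observer's coding cost."""
    return len(zlib.compress(data, level))

def compression_progress(history: bytes, level_before: int, level_after: int) -> int:
    """Intrinsic reward = bytes saved when the (improved) compressor re-encodes the same history.

    Positive values mean the observer has learned a regularity it did not exploit before.
    """
    return compressed_length(history, level_before) - compressed_length(history, level_after)

# Toy usage: a regular pattern yields progress once the "better" compressor sees it
# (the saving may be small for trivially regular data; this is only an illustration).
history = b"abab" * 1000
print(compression_progress(history, level_before=1, level_after=9))
```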

    Predictive models for density correction factor of natural gas and comparison with standard methods

    Two intelligent models that do not require complete gas compositions are presented to estimate the natural gas density correction factor, using comprehensive datasets (nearly 60 000 instances) generated from the AGA8-DCM (Detail Characterization Method) standard: (1) the NGDC-ANN model (Natural Gas Density Calculator based on an Artificial Neural Network) and (2) the AGA8-GCMD model (Gross Characterization Method Developed by applying a genetic algorithm technique). The suggested models employ only five input variables: specific gravity at base conditions, operating temperature and pressure, and the molar compositions of CO2 and N2. Experimental datasets obtained from this work (68 instances) and the literature (505 instances) are used to validate the developed models, showing very good agreement between experimental and estimated data. The simplicity, improved accuracy, and satisfactory results of the suggested models over a wide range of operating conditions indicate that they are excellent alternatives to the traditional standard methods; in particular, the NGDC-ANN model, besides being simple to use, shows the highest accuracy over a wide operational range in comparison with similar models.
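
    A minimal sketch of the kind of five-input neural-network regressor described above, assuming scikit-learn's MLPRegressor and synthetic placeholder data in place of the AGA8-DCM-derived dataset; the network topology, value ranges, and the toy target relation are illustrative assumptions, not the published NGDC-ANN design.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Five inputs named in the abstract: specific gravity, temperature, pressure, x_CO2, x_N2.
# The values below are random placeholders standing in for the AGA8-DCM-derived dataset.
rng = np.random.default_rng(0)
X = rng.uniform([0.55, 250.0, 1.0, 0.0, 0.0],
                [0.90, 340.0, 120.0, 0.1, 0.1], size=(5000, 5))
y = 1.0 + 0.002 * X[:, 2] - 0.05 * X[:, 0]      # synthetic correction factor, illustration only

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small feed-forward network; layer sizes are an assumption, not the published topology.
model = MLPRegressor(hidden_layer_sizes=(20, 10), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```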

    Parametric and predictive analysis of horizontal well configurations for coalbed methane reservoirs in Appalachian Basin

    It is a well-established fact that the Appalachian Basin represents a high-potential region for coalbed methane (CBM) production. The thin coal beds in the Appalachian Basin are characterized by low porosity and permeability values. Due to the highly complex reservoir characteristics, different drilling techniques have been developed in order to improve ultimate gas recovery in the shortest possible time. It has been claimed that horizontal drilling is the optimum completion technique used in this region to maximize methane recovery from coalbed reservoirs.

    Horizontal wells are considered to be effective in relatively thin, naturally fractured reservoirs that are characterized by permeability anisotropy. With today's advanced drilling technology, the direction of a horizontal wellbore can be controlled, maximizing gas production. The objective of this study is to review the various horizontal well configurations used for the recovery of coalbed methane. This study discusses the different coalbed properties and horizontal well patterns that should be applied in different cases. In addition, the gas recovery and the flow rate associated with the drainage area for each pattern are discussed.

    Various reservoir models with a diversity of reservoir properties, and different horizontal well configurations with various spacings between laterals, have been investigated for the best possible gas recovery, using detailed sensitivity analysis, a parametric study, and an intelligent modeling approach.
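
    A minimal sketch of the kind of parametric sweep such a study implies, with a hypothetical simulate_recovery function standing in for the reservoir simulator; the patterns, spacings, anisotropy ratios, and the toy response surface are illustrative assumptions, not results from the thesis.

```python
from itertools import product

def simulate_recovery(pattern: str, lateral_spacing_ft: float, kx_over_ky: float) -> float:
    """Hypothetical stand-in for a reservoir-simulator run; returns cumulative gas recovery (MMscf)."""
    base = {"single_lateral": 800.0, "dual_lateral": 1100.0, "quad_lateral": 1400.0}[pattern]
    # Toy response surface: tighter spacing and stronger anisotropy both help recovery here.
    return base * (1000.0 / lateral_spacing_ft) ** 0.3 * kx_over_ky ** 0.1

patterns = ["single_lateral", "dual_lateral", "quad_lateral"]
spacings = [500.0, 1000.0, 1500.0]          # ft between laterals
anisotropies = [1.0, 3.0, 10.0]             # kx / ky

# Exhaustive parametric sweep, ranked by recovery, mirroring a sensitivity-study workflow.
results = sorted(
    ((simulate_recovery(p, s, a), p, s, a)
     for p, s, a in product(patterns, spacings, anisotropies)),
    reverse=True,
)
for recovery, p, s, a in results[:5]:
    print(f"{p:14s} spacing={s:6.0f} ft  kx/ky={a:4.1f}  recovery={recovery:7.1f} MMscf")
```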

    Monitoring the Integrity of CO2 Storage Sites Using Smart Field Technology

    The capability of underground carbon dioxide storage to confine and sustain injected CO2 for a very long time is the main concern in geologic CO2 sequestration. If a leakage from a geological sink occurs, it is crucial to find the approximate amount and location of the leak in order to implement a proper remediation activity.

    The overwhelming majority of research and development for storage-site monitoring has been concentrated on atmospheric, surface, or near-surface monitoring of the sequestered CO2. This study is different: it aims to monitor the integrity of CO2 storage at the reservoir level. This work proposes developing in-situ CO2 monitoring and verification technology based on the implementation of Permanent Down-hole Gauges (PDG), or Smart Wells, along with Artificial Intelligence and Data Mining (AI&DM). The technology attempts to identify the characteristics of the CO2 leakage by de-convolving the pressure signals collected at the Smart Well sites.

    The Citronelle field, a saline reservoir located in Mobile County (Alabama, US), was considered for this study. A reservoir simulation model for CO2 sequestration in the Citronelle field was developed and history matched. The presence of the PDGs was represented in the reservoir model at the injection well and an observation well. High-frequency pressure data from the sensors were collected based on different synthetic CO2 leakage scenarios in the model. Due to the complexity of the pressure signal behaviors, a machine-learning-based technique was introduced to build an Intelligent Leakage Detection System (ILDS).

    The ILDS was able to detect leakage characteristics in a short time (less than a day), demonstrating high precision in quantifying leakage characteristics subject to complex rate behaviors. The performance of the ILDS was examined under different conditions such as multiple well leakages, cap-rock leakage, availability of an additional monitoring well, presence of pressure drift and noise in the sensors, and uncertainty in the reservoir model.
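
    A minimal sketch of an ILDS-style workflow, assuming synthetic PDG pressure windows and a scikit-learn random forest as a stand-in for the study's machine-learning technique; the signal model, window length, and leak-rate ranges are illustrative assumptions, not the history-matched Citronelle model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

def synthetic_pressure_window(leak_rate: float, n_samples: int = 288) -> np.ndarray:
    """Toy high-frequency PDG pressure record for one day; larger leaks depress pressure faster."""
    t = np.linspace(0.0, 1.0, n_samples)
    return 3000.0 - 50.0 * leak_rate * t + rng.normal(0.0, 2.0, n_samples)  # psi, with sensor noise

# Build a training set from simulated leakage scenarios (the study used a reservoir simulation model).
leak_rates = rng.uniform(0.0, 10.0, size=500)
X = np.stack([synthetic_pressure_window(q) for q in leak_rates])
y = leak_rates

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)

# Detection step: feed a new day's pressure record and estimate the leak rate behind it.
unseen = synthetic_pressure_window(leak_rate=4.2)
print("estimated leak rate:", model.predict(unseen.reshape(1, -1))[0])
```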

    Adoption of machine learning in estimating compressibility factor for natural gas mixtures under high temperature and pressure applications

    One of the essential properties of natural gas is its compressibility factor (Z-factor), which is required for the efficient design of natural gas pipelines, storage facilities, gas well testing, gas reserve estimation, etc. Its importance has led to the development of several approaches for estimating the gas compressibility factor, involving new laboratory methods, equations of state (EOS), empirical correlations, and artificial intelligence. Most of the developed Z-factor models have a limited range of applicability; they are unsuitable for predicting the Z-factors of highly pressurized gas reservoirs and of natural gas systems with pseudo-reduced temperatures less than 1. Where such models exist, they are scarce and less accurate. In this study, three machine learning models, namely the Gradient Boosted Decision Tree (GBDT), Support Vector Regression (SVR), and Radial Basis Function Neural Network (RBF-NN), were developed for predicting the Z-factor of natural gas mixtures over Ppr and Tpr ranges of 0–30 and 0.92–3.0, respectively. The results showed that the GBDT model outperformed the other selected machine learning algorithms and published correlations. The proposed model gave a superior coefficient of determination (R2 score) of 0.99962 and a root mean square error (RMSE) of 0.01033. Also, the variation of the Z-factor predicted by the GBDT model with pseudo-reduced pressure at different pseudo-reduced temperatures, examined using the isotherm plot, was found to be adequate; the plot revealed that the GBDT model performed extremely well in predicting the compressibility factor, with a MAPE of about 1%. Hence, the GBDT model in this study is a reliable method for predicting the Z-factor of natural gas mixtures with Ppr of 0–30 and Tpr of 0.92–3.0. The findings of this study show that the proposed intelligent model can be utilized to predict the gas Z-factor.
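
    A minimal sketch of a gradient-boosted regressor over (Ppr, Tpr), assuming scikit-learn's GradientBoostingRegressor and a synthetic placeholder surface in place of the study's reference Z-factor data; the hyperparameters and the toy relation are assumptions, not the published GBDT model.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Inputs follow the abstract's ranges: Ppr in 0-30, Tpr in 0.92-3.0.
ppr = rng.uniform(0.0, 30.0, 20000)
tpr = rng.uniform(0.92, 3.0, 20000)
# Placeholder Z-factor surface (NOT a real EOS); stands in for the study's reference data.
z = 1.0 - 0.06 * ppr / tpr + 0.002 * (ppr / tpr) ** 2

X = np.column_stack([ppr, tpr])
X_train, X_test, y_train, y_test = train_test_split(X, z, random_state=0)

gbdt = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05, max_depth=4, random_state=0)
gbdt.fit(X_train, y_train)

pred = gbdt.predict(X_test)
print("R2  :", r2_score(y_test, pred))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
```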

    Intelligent time-successive production modeling

    A new framework is presented that uses production data history in order to build a field-wide performance prediction model. In this work, artificial intelligence techniques and data-driven modeling are utilized to predict future production for both synthetic and real field cases.

    Production history is paired with geological information from the field to build a large dataset containing the spatio-temporal dependencies amongst different wells. These spatio-temporal dependencies are addressed by information from the Closest Offset Wells (COWs). This information includes the geological characteristics (spatial) and dynamic production data (temporal) of all COWs.

    Upon creation of the dataset, this framework calls for the development of a series of single-layer neural networks, trained by the back-propagation algorithm. These networks are then fused together to form the Intelligent Time-Successive Production Modeling (ITSPM) technique. Using only well-log information along with the production history of existing wells, this technique can provide performance predictions for new wells and estimate the initial hydrocarbon in place (IHIP) using a volumetric-geostatistical method.

    A synthetic oil reservoir is built and simulated using a commercial reservoir numerical simulation package. Production and well-log data are extracted and converted into an all-inclusive dataset. Following the dataset generation, several neural networks are trained and verified to predict different stages of production. The ITSPM method is utilized to estimate the production profile for nine new wells in the reservoir. ITSPM is also applied to data from a real field: a giant oil field in the Middle East that includes more than 200 wells with forty years of production history. ITSPM's production predictions for the four newest wells in this reservoir are compared to the real production data.
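
    A minimal sketch of the time-successive idea: one single-hidden-layer network per production step, trained on a well's own logs, its closest-offset-well (COW) features, and its earlier rates, then rolled forward to forecast a new well. The feature layout, well and step counts, and the use of scikit-learn networks are illustrative assumptions, not the thesis implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

n_wells, n_cows, n_steps = 60, 4, 10
# Features per well: its own log properties plus the logs of its closest offset wells.
static_logs = rng.normal(size=(n_wells, 3))           # e.g. porosity, thickness, saturation proxies
cow_logs = rng.normal(size=(n_wells, n_cows * 3))     # same properties for each COW
rates = np.abs(rng.normal(size=(n_wells, n_steps)))   # placeholder production history

models = []
for t in range(1, n_steps):
    # One single-hidden-layer network per production step, as in the time-successive scheme.
    X_t = np.hstack([static_logs, cow_logs, rates[:, :t]])
    net = MLPRegressor(hidden_layer_sizes=(15,), max_iter=3000, random_state=t)
    net.fit(X_t, rates[:, t])
    models.append(net)

# Fused forecast for a "new" well: roll the networks forward, feeding each prediction to the next step.
new_static, new_cow = rng.normal(size=3), rng.normal(size=n_cows * 3)
history = [abs(rng.normal())]                          # assumed first-step rate
for net in models:
    x = np.hstack([new_static, new_cow, history]).reshape(1, -1)
    history.append(float(net.predict(x)[0]))
print("predicted profile:", np.round(history, 3))
```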

    Neural Network with Genetic Algorithm Prediction Model of Energy Consumption for Billing Integrity in Gas Pipeline

    With the development of the oil and gas industry, missing data has become one of the factors that hampers data analysis and processing tasks in databases. By monitoring and maintaining a metering system, reliability and billing integrity can be ensured and trust can be developed between distributors and customers. In this context, PETRONAS Gas Berhad (PGB), as a gas distributor, and the existing system at the Nur Metering Station, Kulim, are responsible for evaluating the energy consumption from the sales gas produced. The system is standalone and consists of measuring equipment including pressure and temperature transmitters, a turbine meter, gas chromatography, and a flow computer, but it does not have any reference system to verify its integrity. Customers are charged according to the calculated amount of energy consumption, and any error in the calculation will cause a loss of profit to the company and affect PETRONAS's business credibility. It is therefore vital to have a sound analysis in order to maintain sustainability. In this paper, several techniques are discussed and evaluated, including a neural network prediction model, least-squares vector regression, and the combination of either of these methods with a genetic algorithm, as the preferred technique to estimate the missing data. The model selected on the basis of this evaluation will predict the missing data and compare them with the results of the existing metering system to ensure the reliability and accuracy of the system. In this way, billing integrity between the oil and gas company, especially PETRONAS, and its customers can be maintained, and if the project is expanded in the future, it has the potential to save millions of dollars for Malaysian oil and gas companies.
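
    A minimal sketch of coupling a neural network with a genetic algorithm for this kind of missing-data prediction, assuming synthetic metering records and a tiny GA that searches MLP hyperparameters by cross-validated score; the field names, value ranges, and GA design are illustrative assumptions, not PGB's system.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)

# Placeholder metering records: pressure, temperature, volumetric flow -> energy consumption.
X = rng.uniform([30.0, 15.0, 100.0], [70.0, 45.0, 900.0], size=(800, 3))
y = 0.04 * X[:, 2] * (1.0 + 0.002 * X[:, 0]) + rng.normal(0.0, 0.5, 800)   # synthetic energy values

def fitness(genome):
    """Cross-validated R^2 of an MLP whose hidden sizes and learning rate come from the genome."""
    h1, h2, lr = int(genome[0]), int(genome[1]), genome[2]
    net = MLPRegressor(hidden_layer_sizes=(h1, h2), learning_rate_init=lr,
                       max_iter=1500, random_state=0)
    return cross_val_score(net, X, y, cv=3).mean()

# Tiny genetic algorithm: truncation selection with multiplicative Gaussian mutation.
pop = np.column_stack([rng.integers(5, 40, 12), rng.integers(3, 20, 12), rng.uniform(1e-4, 1e-2, 12)])
for generation in range(5):
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)[-4:]]                       # keep the best four genomes
    children = parents[rng.integers(0, 4, 8)] * rng.normal(1.0, 0.1, (8, 3))
    children[:, :2] = np.clip(np.round(children[:, :2]), 2, 64)  # keep hidden sizes valid integers
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(g) for g in pop])]
print("best genome (h1, h2, learning rate):", best)
```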