96 research outputs found
A Machine Learning Approach for Big Data in Oil and Gas Pipelines
Abstract-Experienced pipeline operators utilize Magnetic Flux Leakage (MFL) sensors to probe oil and gas pipelines in order to localize and size different defect types. A large number of sensors, equally distributed around the circumference of the pipeline, is typically used to cover the targeted pipelines, and the sensors record MFL signals every three millimeters. The volume of collected raw data therefore makes the pipeline probing process difficult, exhausting, and error-prone. Machine learning approaches such as neural networks make it possible to manage the complexity of such big data effectively and to learn its intrinsic properties. In this work, we concentrate on the applicability of artificial neural networks to defect depth estimation and present a detailed study of various network architectures. Discriminant features, which characterize different defect depth patterns, are first obtained from the raw data. Neural networks are then trained on these features. The Levenberg-Marquardt back-propagation learning algorithm is adopted in the training process, during which the weight and bias parameters of the networks are tuned to optimize their performance. Compared with the performance of pipeline inspection techniques reported by service providers such as GE and ROSEN, the results obtained using the proposed method are promising. For instance, within a ±10% error-tolerance range, the proposed approach yields an estimation accuracy of 86%, compared to the 80% reported by GE; and within a ±15% error-tolerance range, it yields an estimation accuracy of 89%, compared to the 80% reported by ROSEN.
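A minimal sketch of this training scheme under assumed data, not the paper's code: a one-hidden-layer network whose weights and biases are tuned by a Levenberg-Marquardt least-squares fit, with the ±10% error-tolerance accuracy computed at the end. The feature matrix and depth labels are synthetic placeholders.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
n_features, n_hidden = 8, 10
X = rng.normal(size=(200, n_features))   # hypothetical MFL-derived features
y = rng.uniform(0.1, 1.0, size=200)      # hypothetical defect depths

def unpack(p):
    i = n_features * n_hidden
    W1 = p[:i].reshape(n_features, n_hidden)
    b1 = p[i:i + n_hidden]
    W2 = p[i + n_hidden:i + 2 * n_hidden]
    b2 = p[-1]
    return W1, b1, W2, b2

def predict(p, X):
    W1, b1, W2, b2 = unpack(p)
    return np.tanh(X @ W1 + b1) @ W2 + b2

def residuals(p):
    return predict(p, X) - y

p0 = rng.normal(scale=0.1, size=n_features * n_hidden + 2 * n_hidden + 1)
fit = least_squares(residuals, p0, method="lm")   # Levenberg-Marquardt

# Accuracy within a +/-10% error-tolerance range, mirroring the paper's metric.
rel_err = np.abs(predict(fit.x, X) - y) / y
print("within +/-10%:", np.mean(rel_err <= 0.10))
```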
An Adaptive Neuro-Fuzzy Inference System-Based Approach for Oil and Gas Pipeline Defect Depth Estimation
Abstract-To determine the severity of metal-loss defects in oil and gas pipelines, the depth of potential defects, along with their length, first needs to be estimated. For this purpose, pipeline engineers use intelligent Magnetic Flux Leakage (MFL) sensors that scan the metal pipelines and collect defect-related data. However, due to the huge amount of collected MFL data, the defect depth estimation task is cumbersome, time-consuming, and error-prone. In this paper, we propose an adaptive neuro-fuzzy inference system (ANFIS)-based approach to estimate defect depths from MFL signals. Depth-related features are first extracted from the MFL signals and then used to train the neural network that tunes the parameters of the membership functions of the fuzzy inference system. A hybrid learning algorithm that combines least squares and the back-propagation gradient descent method is adopted. Moreover, to achieve optimal performance, highly discriminant features are selected from the obtained features using a weight-based support vector machine (SVM). Experimental work shows encouraging results. Within error-tolerance ranges of ±15%, ±20%, ±25%, and ±30%, the depth estimation accuracies obtained by the proposed technique are 80.39%, 87.75%, 91.18%, and 95.59%, respectively. Further improvement can easily be achieved by incorporating new and more discriminant features.
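A minimal sketch of the weight-based SVM selection step under assumed data, not the paper's implementation: features are ranked by the magnitude of a linear SVM's weights, with synthetic binary labels standing in for depth classes.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 12))                        # hypothetical MFL features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=300) > 0).astype(int)

Xs = StandardScaler().fit_transform(X)                # scale before weighting
svm = LinearSVC(C=1.0, max_iter=10000).fit(Xs, y)

ranking = np.argsort(-np.abs(svm.coef_[0]))           # most discriminant first
print("selected feature indices:", ranking[:5])
```

The selected subset would then feed the ANFIS training stage; scaling matters here because unscaled features distort the weight magnitudes used for ranking.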
Machine Learning Approach for Risk-Based Inspection Screening Assessment
Risk-based inspection (RBI) screening assessment is used to identify equipment that makes a significant contribution to the system's total risk of failure (RoF), so that the RBI detailed assessment can focus on analyzing higher-risk equipment. Due to its qualitative nature and high dependency on sound engineering judgment, screening assessment is vulnerable to human biases and errors; it is thus subject to output variability, which threatens the integrity of the assets. This paper attempts to tackle these challenges by utilizing a machine learning approach to conduct screening assessment. A case study using a dataset of RBI assessments for oil and gas production and processing units illustrates the development of an intelligent system, based on a machine learning model, for performing RBI screening assessment. The best-performing model achieves accuracy and precision of 92.33% and 84.58%, respectively. A comparative analysis between the performance of the intelligent system and the conventional assessment examines the benefits of applying the machine learning approach to RBI screening assessment. The result shows that the machine learning approach potentially improves the quality of the conventional RBI screening assessment output by reducing output variability and increasing accuracy and precision.
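A minimal sketch under assumed data, not the paper's model: a generic classifier flags significant RoF contributors and is scored with the accuracy and precision metrics the study reports. Features, labels, and the model choice are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 10))               # hypothetical equipment attributes
y = (X[:, 0] + X[:, 1] > 0.5).astype(int)    # 1 = significant RoF contributor

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

pred = clf.predict(X_te)
print("accuracy:", accuracy_score(y_te, pred))
print("precision:", precision_score(y_te, pred))
```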
Pipeline leakage detection and characterisation with adaptive surrogate modelling using particle swarm optimisation.
Pipelines are often subject to leakage due to ageing, corrosion, and weld defects, and leakage is difficult to avoid because its sources are diverse. Several studies have demonstrated the applicability of machine learning models for the timely prediction of pipeline leakage. However, most of these studies rely on large datasets to train accurate models; collecting experimental data for model training is costly, while generating simulation data is computationally expensive and time-consuming. To tackle this problem, the present study proposes a novel data sampling optimisation method, an adaptive particle swarm optimisation (PSO)-assisted surrogate model, which trains machine learning models on a limited dataset while achieving good accuracy. The proposed model incorporates the population density of training data samples and the model's prediction fitness to determine new data samples for improved model fitting accuracy. The method is applied to 3-D pipeline leakage detection and characterisation, and the predicted leak sizes and locations match the actual leakage. The significance of this study is two-fold: it enables practical pipeline leak prediction with limited training samples, and it provides a general framework for improving computational efficiency with adaptive surrogate modelling in a variety of real-life applications.
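A minimal sketch under simplifying assumptions, not the study's implementation: a basic PSO proposes the next training sample where surrogate uncertainty is high and existing samples are sparse, then the surrogate is refit. A 1-D function stands in for the expensive 3-D leak simulation, and a Gaussian process stands in for the surrogate.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

f = lambda x: np.sin(3 * x) + 0.3 * x            # stand-in for the simulator
rng = np.random.default_rng(3)
X = rng.uniform(0, 4, size=(5, 1))               # small initial design
y = f(X).ravel()

def acquisition(cand, gp, X):
    _, std = gp.predict(cand, return_std=True)   # model prediction uncertainty
    dist = np.min(np.abs(cand - X.T), axis=1)    # sparsity of nearby samples
    return std + dist                            # fitness + population density

for it in range(10):                             # adaptive sampling rounds
    gp = GaussianProcessRegressor().fit(X, y)
    # --- basic PSO over the 1-D domain [0, 4] ---
    pos = rng.uniform(0, 4, size=(20, 1))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), acquisition(pos, gp, X)
    for _ in range(30):
        gbest = pbest[np.argmax(pbest_val)]
        vel = (0.7 * vel + 1.5 * rng.random(pos.shape) * (pbest - pos)
                         + 1.5 * rng.random(pos.shape) * (gbest - pos))
        pos = np.clip(pos + vel, 0, 4)
        val = acquisition(pos, gp, X)
        better = val > pbest_val
        pbest[better], pbest_val[better] = pos[better], val[better]
    x_new = pbest[np.argmax(pbest_val)]          # next sample to simulate
    X, y = np.vstack([X, x_new]), np.append(y, f(x_new))
```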
Investigations on corrosion monitor reliability, calibration, and coverage
Thickness loss due to internal corrosion and erosion is a critical issue in ferromagnetic steel structures and can cause catastrophic failures. Ultrasonic thickness gauges are widely used to measure wall thickness, and permanently installed ultrasonic sensors have recently become popular for inspecting areas suspected of wall thickness loss. However, these are limited by high cost and the need for coupling agents. To address these problems, a novel, cost-effective smart corrosion monitor based on the magnetic eddy current technique is developed in this research. The performance and reliability of the monitor in tracking internal wall thickness loss are tested successfully through accelerated and real-life aging corrosion tests.
To mitigate the handling and safety issues associated with the powerful magnets used in magnetic techniques, a particle swarm-based optimisation method is proposed and validated through two test cases. The results indicate that the area of the magnetic excitation circuit can be reduced by 38% without compromising sensitivity.
The reliability of the corrosion monitor is improved by using an active redundancy approach to identify and isolate sensor faults. A real-life aging test is conducted for eight months in an ambient environment using an accelerated corrosion setup. The results obtained from the two corrosion monitors confirm that the proposed monitor reliably tracks thickness loss and is stable against environmental variations.
A new in-situ calibration method based on a zero-crossing frequency feature is introduced to evaluate the in-situ relative permeability. With it, the thickness of the test specimen can be estimated to within ±0.6 mm.
The series of studies conducted in the project reveals that the magnetic corrosion monitor can reliably detect and quantify uniform wall thickness loss.
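A minimal sketch of the zero-crossing frequency feature underlying the calibration method above, not the project's code: a synthetic decaying sinusoid stands in for the real eddy-current monitor response, and the sampling rate and signal parameters are assumptions.

```python
import numpy as np

fs = 10_000                                      # sampling rate, Hz (assumed)
t = np.arange(0, 0.05, 1 / fs)
signal = np.exp(-40 * t) * np.sin(2 * np.pi * 180 * t)   # synthetic response

# Indices just before the signal changes sign -> zero crossings.
crossings = np.where(np.signbit(signal[:-1]) != np.signbit(signal[1:]))[0]
periods = 2 * np.diff(t[crossings])              # two crossings per cycle
zc_frequency = 1 / np.mean(periods)
print(f"zero-crossing frequency: {zc_frequency:.1f} Hz")
```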
Denoising autoencoder in damage detection of pipeline using guided ultrasonic wave
Pipeline condition monitoring is essential in critical sectors such as the petrochemical, nuclear, and energy industries. Guided ultrasonic wave (GUW) monitoring is an available pipeline condition monitoring approach that is gaining much attention owing to its portability, long coverage, and high sensitivity to damage. However, environmental and operational condition (EOC) effects, especially temperature and random noise, may generate unwanted peaks that are falsely identified as damage. Attempts to deal with EOC effects have not solved the problem, especially for small damage (equal to or less than 5% cross-sectional area loss (CSAL)). In this study, a new damage feature extraction method based on the residual reliability criterion (RRC) is proposed. Its performance is measured using the established receiver operating characteristic (ROC) evaluation method. The findings show that the method performs well, with an AUC value greater than 0.9 on a numerical model under 40 °C temperature variations and a 10% random noise level, and that the application of the RRC is intuitively simple. To test the practicality of the method, a 6-metre-long, 8-inch-diameter experimental pipe model filled with liquid is used to build a GUW database of small damage under 30 °C temperature variations, using the torsional T(0,1) excitation mode at a 26 kHz centre frequency. However, the RRC underperforms on the experimental data because the random noise generated by healthy and damaged signals interferes and produces high-amplitude noise. This study therefore proposes a denoising autoencoder (DAE) neural network to deal with the effects of EOCs. A DAE encodes high-dimensional data into low-dimensional features and reconstructs the original data from those features. By providing GUW signals at a reference temperature, this structure forces the DAE to learn the essential features hidden within complex data. The proposed DAE shows perfect detection (AUC of 1.0) on the numerical model and performs well (AUC greater than 0.9) on the experimental model for small damage identification. Moreover, it outperforms other advanced EOC compensation techniques on both the numerical and experimental models.
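A minimal sketch, not the paper's architecture: a fully connected denoising autoencoder in PyTorch that learns to reconstruct a reference-temperature GUW baseline from noisy inputs. The signal length, layer sizes, and noise level are illustrative assumptions.

```python
import torch
import torch.nn as nn

sig_len, latent = 256, 16
dae = nn.Sequential(                      # encoder: high-dim -> low-dim features
    nn.Linear(sig_len, 64), nn.ReLU(),
    nn.Linear(64, latent), nn.ReLU(),
    nn.Linear(latent, 64), nn.ReLU(),     # decoder: reconstruct the signal
    nn.Linear(64, sig_len),
)

ref = torch.sin(torch.linspace(0, 20, sig_len)).repeat(128, 1)  # reference GUW
noisy = ref + 0.1 * torch.randn_like(ref)                       # EOC-like noise

opt = torch.optim.Adam(dae.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(dae(noisy), ref)       # reconstruct reference from noisy input
    loss.backward()
    opt.step()

# The residual between reconstruction and measurement flags damage-induced change.
residual = (dae(noisy) - noisy).abs().mean()
```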
A model development for reconstruction of three-dimensional defects based on MFL signals
Corrosion is statistically the primary cause of pipeline failures, well ahead of other factors. The inability to accurately size corrosion defects in pipelines can result in erroneous integrity strategies with fatal consequences, even when appropriate inspection processes have been conducted: underestimating the defect size leads to pipeline failures, while overestimating it leads to unnecessary assessments. Several strategies for defect sizing based on MFL signals have been developed in recent years, yet the industry still calls for reliability improvements. This thesis develops a model based on calibration curves for the reconstruction of defects from MFL signals. A thorough study of the parameters involved clarifies the relationships between defect dimensions and MFL signal features. The methodology includes theoretical, numerical, and experimental assessments, resulting in a reliable three-dimensional model. Calibration curves are reported for both inner and outer defect configurations; such curves permit accurate determination of the defect length and depth from the signal duration and amplitude. The results obtained for a single defect can be further used to investigate the superposition of MFL signals from adjacent defects, which is demonstrated through simulations and experiments.
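A minimal sketch of the calibration-curve idea, not the thesis model: low-order curves fitted to illustrative (not measured) data relate signal amplitude to defect depth and signal duration to defect length, and a new defect is then sized from its signal features.

```python
import numpy as np

depth = np.array([10, 20, 30, 40, 50])             # % wall thickness (assumed)
amplitude = np.array([0.8, 1.7, 2.9, 4.4, 6.0])    # signal amplitude (a.u.)
length = np.array([5, 10, 20, 30, 40])             # defect length, mm (assumed)
duration = np.array([6, 11, 21, 30, 41])           # signal duration (assumed)

# Calibration curves: low-order polynomial fits of dimension vs. signal feature.
depth_curve = np.polynomial.Polynomial.fit(amplitude, depth, deg=2)
length_curve = np.polynomial.Polynomial.fit(duration, length, deg=1)

# Sizing a new defect from its measured signal features.
print("estimated depth:", depth_curve(3.5))
print("estimated length:", length_curve(18.0))
```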
30th International Conference on Condition Monitoring and Diagnostic Engineering Management (COMADEM 2017)
Proceedings of COMADEM 2017