Machine learning based adaptive soft sensor for flash point inference in a refinery realtime process
In industrial control processes, certain characteristics are sometimes difficult to measure with a physical sensor due to technical and/or economic limitations. This is especially true in the petrochemical industry, where some of these quantities are crucial for operators and process safety. This is the case for the automotive diesel Flash Point Temperature (FT). Traditional methods for FT estimation are based on the empirical relationship between flammability properties and the target magnitude: samples are taken from the process and analyzed in the laboratory. This procedure takes time (often hours from sample collection to flash temperature measurement), which makes real-time monitoring very difficult and results in safety and economic losses. This study defines a procedure based on Machine Learning modules that demonstrates the power of real-time monitoring over real data from a major international refinery. Easily measured values available in real time, such as temperature, pressure, and hydraulic flow, are used as inputs, and a benchmark of different regression algorithms for FT estimation is presented. The study highlights the importance of sequencing preprocessing techniques for the correct inference of values. The implementation of adaptive learning strategies achieves considerable economic benefits in the productization of this soft sensor, and the validity of the method is tested under real refinery conditions. In addition, real-world industrial data sets tend to be unstable and volatile, and the data is often affected by noise, outliers, irrelevant or unnecessary features, and missing data.
This contribution demonstrates, through the introduction of a new concept, the adaptive soft sensor, the importance of dynamically adapting the Machine Learning schemes by combining them with feature selection, dimensionality reduction, and signal processing techniques. The economic benefits of applying this soft sensor in the refinery's production plant are presented as potential semi-annual savings. This work has received funding support from the SPRI-Basque Government through the ELKARTEK program (OILTWIN project, ref. KK-2020/00052).
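The adaptive, regression-based estimation described above can be illustrated with a minimal sketch: a soft sensor that refits a linear model over a sliding window of the most recent lab calibrations, so the fit slowly tracks process drift. The class name, variable names, and window size are illustrative assumptions, not the paper's actual algorithm.

```python
from collections import deque

class AdaptiveSoftSensor:
    """Sliding-window least-squares soft sensor for a single input feature.

    Each call to update() stores a (feature, lab_value) pair; predict()
    fits a line y = a*x + b to the most recent `window` samples, so the
    model re-adapts as the process drifts.
    """

    def __init__(self, window=50):
        self.samples = deque(maxlen=window)  # old samples fall out automatically

    def update(self, x, y):
        self.samples.append((x, y))

    def predict(self, x):
        n = len(self.samples)
        if n < 2:
            raise ValueError("need at least two calibration samples")
        sx = sum(p[0] for p in self.samples)
        sy = sum(p[1] for p in self.samples)
        sxx = sum(p[0] * p[0] for p in self.samples)
        sxy = sum(p[0] * p[1] for p in self.samples)
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # least-squares slope
        b = (sy - a * sx) / n                          # least-squares intercept
        return a * x + b

# Hypothetical calibration pairs: process temperature vs. lab flash point.
sensor = AdaptiveSoftSensor(window=3)
for temp, flash in [(300.0, 60.0), (310.0, 62.0), (320.0, 64.0)]:
    sensor.update(temp, flash)
print(round(sensor.predict(330.0), 1))  # 66.0 -- extrapolates the recent trend
```

A production soft sensor would of course use multiple inputs and a richer model family; the windowed refit is the point being illustrated.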
Data-driven Soft Sensors in the Process Industry
In the last two decades, Soft Sensors have established themselves as a valuable alternative to traditional means for the acquisition of critical process variables, process monitoring, and other tasks related to process control. This paper discusses characteristics of process industry data which are critical for the development of data-driven Soft Sensors. These characteristics are common to a large number of process industry fields, such as the chemical, bioprocess, and steel industries. The focus of this work is on data-driven Soft Sensors because of their growing popularity, demonstrated usefulness, and huge, though not yet fully realised, potential. The main contributions of this work are a comprehensive selection of case studies covering the three most important Soft Sensor application fields, a general introduction to the most popular Soft Sensor modelling techniques, and a discussion of open issues in Soft Sensor development and maintenance together with their possible solutions.
Industry and Tertiary Sectors towards Clean Energy Transition
The clean energy transition is the transition from the use of nonrenewable energy sources to renewable sources and is part of the wider transition to sustainable economies through the use of renewable energy, the adoption of energy-saving measures, and sustainable development techniques. The clean energy transition is a long and complex process that will lead to an epochal change, and it will allow safeguarding the health of the environment in the long term. Its success requires contributions from everyone, from individual citizens to large multinationals, passing through SMEs; national and international policies play a key role in paving the way for this process. This Special Issue is focused on technical, financial, and policy-related aspects linked to the transition of industrial and service sectors towards energy saving and decarbonization. These different aspects are interrelated and, as such, they have been analyzed with an interdisciplinary approach, for example, by combining economic and technical information. The collected papers focus on energy efficiency and clean-energy key technologies, renewable sources, energy management and monitoring systems, energy policies and regulations, and economic and financial aspects.
On robust and adaptive soft sensors
In process industries, there is a great demand for additional process information such as the product quality
level or the exact process state estimation. At the same time, there is a large amount of process data like temperatures, pressures, etc. measured and stored every moment. This data is mainly measured for process
control and monitoring purposes, but its potential reaches far beyond these applications. The task of soft
sensors is the maximal exploitation of this potential by extracting and transforming the latent information
from the data into more useful process knowledge. Theoretically, achieving this goal should be straightforward, since the process data, as well as the tools for soft sensor development in the form of computational learning methods, are readily available. However, there are still several obstacles which prevent soft sensors from broader application in the process industry. Identifying the sources of these obstacles and proposing a concept for dealing with them is the general purpose of this work. The proposed solution addressing the issues of current soft sensors is a conceptual architecture for the development of robust and adaptive soft sensing algorithms. The architecture reflects the results of two review studies conducted during this project. The first focuses on the process industry aspects of soft sensor development and application. Its main conclusions are that soft sensor development is currently done in a non-systematic, ad-hoc way, which results in a large amount of manual work for development and maintenance, and that a large part of the issues can be related to the process data upon which the soft sensors are built. The second review study dealt with the same topic but from the machine learning viewpoint. It focused on the identification of machine learning tools which support the goals of this work. The machine learning concepts considered are: (i) general regression techniques for building soft sensors; (ii) ensemble methods; (iii) local learning; (iv) meta-learning; and (v) concept drift detection and handling. The proposed architecture arranges these techniques into a three-level hierarchy, where the actual prediction-making models operate at the bottom level. Their predictions are flexibly merged by applying ensemble methods at the next higher level. Finally, at the top level, the underlying algorithm is managed by means of meta-learning methods. The architecture has a modular structure that allows new pre-processing, predictive, or adaptation methods to be plugged in. Another important property of the architecture is that each of the levels can be equipped with adaptation mechanisms, which aim at prolonging the lifetime of the resulting soft sensors.
The relevance of the architecture is demonstrated by means of a complex soft sensing algorithm, which can be seen as one of its instances. This algorithm provides mechanisms for the autonomous selection of data preprocessing and predictive methods and their parameters. It also includes five different adaptation mechanisms, some of which can be applied on a sample-by-sample basis without any requirement to store the on-line data. Other, more complex ones are started only on demand, if the performance of the soft sensor drops below a defined level. The actual soft sensors are built by applying the soft sensing algorithm to three industrial data sets. The different application scenarios aim at analysing the fulfilment of the defined goals. It is shown that the soft sensors are able to follow changes in a dynamic environment and keep a stable performance level by exploiting the implemented adaptation mechanisms. It is also demonstrated that, although the algorithm is rather complex, it can be applied to develop simple and transparent soft sensors. In another experiment, the soft sensors are built without any manual model selection or parameter tuning, which demonstrates the ability of the algorithm to reduce the effort required for soft sensor development. If desired, however, the algorithm is at the same time very flexible and provides a number of parameters that can be manually optimised. Evidence of the ability of the algorithm to deploy soft sensors with minimal training data, and thus to save time-consuming and costly training data collection, is also given in this work.
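The three-level hierarchy described above (prediction-making models at the bottom, ensemble merging in the middle, meta-level adaptation on top) can be sketched as follows. This is a deliberately simplified illustration under an assumed error-based weighting rule, not the thesis's actual algorithm; the class and parameter names are hypothetical.

```python
class EnsembleSoftSensor:
    """Bottom level: a pool of simple predictors (callables x -> y_hat).
    Middle level: predictions merged with performance-based weights.
    Top level: a meta rule that re-weights the pool when a model's
    error grows, standing in for drift detection and handling.
    """

    def __init__(self, models):
        self.models = models
        self.weights = [1.0] * len(models)

    def predict(self, x):
        # Weighted average of all base-model predictions.
        total = sum(self.weights)
        return sum(w * m(x) for w, m in zip(self.weights, self.models)) / total

    def adapt(self, x, y_true, decay=0.5):
        # Meta-level: shrink the weight of models whose error on the
        # latest reference measurement is large.
        for i, m in enumerate(self.models):
            err = abs(m(x) - y_true)
            self.weights[i] = self.weights[i] * decay + (1.0 - decay) / (1.0 + err)

# Two toy base models; the second is systematically biased.
models = [lambda x: 2 * x, lambda x: 2 * x + 10]
sensor = EnsembleSoftSensor(models)
for x in [1.0, 2.0, 3.0]:
    sensor.adapt(x, 2 * x)  # true process: y = 2x
print(sensor.weights[0] > sensor.weights[1])  # True: the biased model is down-weighted
```

The modularity claimed for the architecture corresponds here to the fact that base models, the merging rule, and the adaptation rule can each be swapped independently.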
Plant-Wide Diagnosis: Cause-and-Effect Analysis Using Process Connectivity and Directionality Information
Production plants used in modern process industry must produce products that meet stringent
environmental, quality and profitability constraints. In such integrated plants, non-linearity and
strong process dynamic interactions among process units complicate root-cause diagnosis of
plant-wide disturbances because disturbances may propagate to units at some distance away
from the primary source of the upset. Similarly, implemented advanced process control
strategies, backup and recovery systems, use of recycle streams and heat integration may
hamper detection and diagnostic efforts.
It is important to track down the root-cause of a plant-wide disturbance because once
corrective action is taken at the source, secondary propagated effects can be quickly eliminated
with minimum effort and reduced downtime, with a resultant positive impact on process
efficiency, productivity and profitability.
In order to diagnose the root-cause of disturbances that manifest plant-wide, it is crucial to
incorporate and utilize knowledge about the overall process topology or interrelated physical
structure of the plant, such as is contained in Piping and Instrumentation Diagrams (P&IDs).
Traditionally, process control engineers have intuitively referred to the physical structure of
the plant by visual inspection and manual tracing of fault propagation paths within the process
structures, such as the process drawings on printed P&IDs, in order to make logical
conclusions based on the results from data-driven analysis. This manual approach, however, is
prone to various sources of errors and can quickly become complicated in real processes.
The aim of this thesis, therefore, is to establish innovative techniques for the electronic
capture and manipulation of process schematic information from large plants such as
refineries in order to provide an automated means of diagnosing plant-wide performance
problems. This report also describes the design and implementation of a computer application
program that integrates: (i) process connectivity and directionality information from intelligent
P&IDs (ii) results from data-driven cause-and-effect analysis of process measurements and (iii)
process know-how to aid process control engineers and plant operators gain process insight.
This work explored process intelligent P&IDs, created with AVEVA® P&ID, a Computer
Aided Design (CAD) tool, and exported as an ISO 15926 compliant platform and vendor
independent text-based XML description of the plant. The XML output was processed by a
software tool developed in Microsoft® .NET environment in this research project to
computationally generate a connectivity matrix that shows plant items and their connections.
The connectivity matrix produced can be exported to the Excel® spreadsheet application as a basis for other applications and has served as a precursor to other research work. The final version of
the developed software tool links statistical results of cause-and-effect analysis of process data
with the connectivity matrix to simplify and gain insights into the cause and effect analysis
using the connectivity information. Process know-how and understanding are incorporated to generate logical conclusions.
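The core idea of turning plant topology into a connectivity matrix and then tracing disturbance propagation over it can be sketched as follows. The plant items and connections here are hypothetical stand-ins; the actual tool derives them by parsing an ISO 15926-compliant XML export of an intelligent P&ID rather than from hard-coded lists.

```python
from collections import deque

# Hypothetical plant items and directed connections (src flows to dst).
items = ["feed_pump", "heater", "column", "condenser"]
connections = [("feed_pump", "heater"), ("heater", "column"), ("column", "condenser")]

# Build the connectivity matrix: matrix[i][j] == 1 means item i feeds item j.
index = {name: i for i, name in enumerate(items)}
n = len(items)
matrix = [[0] * n for _ in range(n)]
for src, dst in connections:
    matrix[index[src]][index[dst]] = 1

def downstream(matrix, start):
    """Breadth-first trace of every item a disturbance at `start` can reach."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt, connected in enumerate(matrix[node]):
            if connected and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen)

reachable = downstream(matrix, index["heater"])
print([items[i] for i in reachable])  # ['column', 'condenser']
```

Intersecting such a reachability set with the candidate list from data-driven cause-and-effect analysis is one simple way the two information sources can be combined to rule out spurious root-cause candidates.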
The thesis presents a case study of an atmospheric crude heating unit as an illustrative example
to drive home key concepts and also describes an industrial case study involving refinery
operations. In the industrial case study, in addition to confirming the root-cause candidate, the
developed software tool was tasked with determining the physical sequence of the fault propagation path within the plant.
This was then compared with the hypothesis about the disturbance propagation sequence generated by a purely data-driven method. The results show a high degree of overlap, which helps to validate the statistical data-driven technique and easily identify any spurious results from the data-driven multivariable analysis. This significantly increases control engineers' confidence in the data-driven methods used for root-cause diagnosis.
The thesis concludes with a discussion of the approach and presents ideas for further
development of the methods.
Context-Enabled Visualization Strategies for Automation Enabled Human-in-the-loop Inspection Systems to Enhance the Situation Awareness of Windstorm Risk Engineers
Insurance loss prevention surveying, specifically the windstorm risk inspection survey, is the process of investigating potential damage associated with a building or structure in the event of an extreme weather condition such as a hurricane or tornado. Traditionally, the risk inspection process is highly subjective and depends on the skills of the engineer performing it. This dissertation investigates the sensemaking process of risk engineers while performing risk inspection, with special focus on the various factors influencing it. This research then investigates how context-based visualization strategies enhance the situation awareness and performance of windstorm risk engineers.
An initial study investigated the sensemaking process and situation awareness requirements of windstorm risk engineers. The data frame theory of sensemaking was used as the framework for this study. Ten windstorm risk engineers were interviewed, and the data collected were analyzed following an inductive thematic approach. The themes that emerged from the data explained the sensemaking process of risk engineers, the process of making sense of contradicting information, the importance of their experience level, the internal and external biases influencing the inspection process, the difficulty of developing mental models, and potential technology interventions. More recently, human-in-the-loop systems such as drones have been used to improve the efficiency of windstorm risk inspection. This study provides recommendations to guide the design of such systems to support the sensemaking process and situation awareness of windstorm visual risk inspection.
The second study investigated the effect of context-based visualization strategies on enhancing the situation awareness of windstorm risk engineers. More specifically, the study investigated how different types of information contribute towards the three levels of situation awareness. Following a between-subjects study design, 65 civil/construction engineering students completed this study. Checklist-based and predictive-display-based decision aids were tested and found to be effective in supporting the situation awareness requirements as well as the performance of windstorm risk engineers. However, the predictive display only helped with certain tasks, such as understanding the interaction among different components on the rooftop; for the remaining tasks, the checklist alone was sufficient. Moreover, the decision aids did not place any additional cognitive demand on the participants. This study helped us understand the advantages and disadvantages of the decision aids tested.
The final study evaluated the transfer-of-training effect of the checklist-based and predictive-display-based decision aids. One week after the previous study, participants completed a follow-up study without any decision aids. The performance and situation awareness of participants in the checklist and predictive display groups did not change significantly from the first trial to the second. However, the performance and situation awareness of participants in the control condition improved significantly in the second trial. These participants attributed this to their exposure to the SAGAT questionnaire in the first study: they knew what issues to look for and what tasks needed to be completed in the simulation. The confounding effect of SAGAT questionnaires needs to be studied in future research efforts.
Improving process monitoring and modeling of batch-type plasma etching tools
Manufacturing equipment in semiconductor factories (fabs) provides abundant data and opportunities for data-driven process monitoring and modeling. In particular, virtual metrology (VM) is an active area of research. Traditional monitoring techniques using univariate statistical process control charts do not provide immediate feedback on quality excursions, hindering the implementation of fab-wide advanced process control initiatives. VM models, or inferential sensors, aim to bridge this gap by predicting quality measurements instantaneously from tool fault detection and classification (FDC) sensor measurements. Existing research in the field of inferential sensors and VM has focused on comparing regression algorithms to demonstrate their feasibility in various applications. However, two important areas, data pretreatment and post-deployment model maintenance, are usually neglected in these discussions. Since it is well known that industrial data is of poor quality, and that semiconductor processes undergo drifts and periodic disturbances, these two issues are the roadblocks to furthering the adoption of inferential sensors and VM models. In data pretreatment, batch data collected from FDC systems usually contain inconsistent trajectories of various durations. Most analysis techniques require the data from all batches to be of the same duration with similar trajectory patterns. These inconsistencies, if unresolved, will propagate into the developed model, cause challenges in interpreting the modeling results, and degrade model performance. To address this issue, a Constrained selective Derivative Dynamic Time Warping (CsDTW) method was developed to perform automatic alignment of trajectories. CsDTW is designed to preserve the key features that characterize each batch and can be solved efficiently in polynomial time. Variable selection after trajectory alignment is another topic that requires improvement.
To this end, the proposed Moving Window Variable Importance in Projection (MW-VIP) method yields a more robust set of variables with demonstrably more long-term correlation with the predicted output. In model maintenance, model adaptation has been the standard solution for dealing with drifting processes. However, most case studies have already preprocessed the model update data offline. This implicitly assumes that the adaptation data is free of faults and outliers, which is often not true in practical implementations. To this end, a moving window scheme using Total Projection to Latent Structure (T-PLS) decomposition screens incoming updates to separate harmless process noise from the outliers that negatively affect the model. The integrated approach was demonstrated to be more robust. In addition, model adaptation is very inefficient when there are multiplicities in the process; multiplicities can occur due to process nonlinearity, switches in product grade, or different operating conditions. A growing-structure multiple-model system using local PLS and PCA models has been proposed to improve model performance around process conditions with multiplicity. The use of local PLS and PCA models allows the method to handle a much larger set of inputs and overcome several challenges in mixture model systems. In addition, fault detection sensitivity is also improved by using the multivariate monitoring statistics of these local PLS/PCA models. The proposed methods are tested on two plasma etch data sets provided by Texas Instruments. In addition, a proof of concept using virtual metrology in a controller performance assessment application was also tested.
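Aligning batch trajectories of different durations is the job that DTW-style methods perform. The sketch below implements plain, unconstrained dynamic time warping for illustration only; the CsDTW method described in the text adds selectivity constraints and derivative features on top of this basic recursion.

```python
def dtw_distance(a, b):
    """Classic dynamic time warping cost between two 1-D trajectories.

    Fills a cumulative-cost table where each cell extends the cheapest
    of the three admissible warping moves (insert, delete, match).
    """
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# Two toy batch trajectories with the same shape but different durations:
batch_a = [0.0, 1.0, 2.0, 3.0]
batch_b = [0.0, 0.0, 1.0, 2.0, 3.0]
print(dtw_distance(batch_a, batch_b))  # 0.0 -- DTW absorbs the time shift
```

A plain Euclidean comparison of these two batches is not even defined (unequal lengths), which is exactly the inconsistency in FDC batch data that motivates alignment before modeling.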
Knock: A Century of Research
Knock is one of the main limitations on increasing spark-ignition (SI) engine efficiency. This has been known for at least 100 years, and it is still the case today. Knock occurs when conditions ahead of the flame front in an SI engine result in one or more autoignition events in the end gas. The autoignition reaction rate is typically much higher than that of the flame-front propagation. This may lead to the creation of pressure waves in the combustion chamber and, hence, an undesirable noise that gives knock its name. The resulting increased mechanical and thermal loading on engine components may eventually lead to engine failure. Reducing the compression ratio lowers end-gas temperatures and pressures, reducing end-gas reactivity and, hence, mitigating knock. However, this has a detrimental effect on engine efficiency. Automotive companies must significantly reduce their fleet carbon dioxide (CO2) values in the coming years to meet targets resulting from the 2015 Paris Agreement. One path towards meeting these targets is through partial or full electrification of the powertrain. However, the vast majority of automobiles in the near future will still feature a gasoline-fueled SI engine; hence, improvements in combustion engine efficiency remain fundamental. As knock has been a key limitation for so long, there is a huge amount of literature on the subject. A number of reviews on knock have already been published, including in recent years. These generally concentrate on current understanding and status. The present work, in contrast, aims to track the progress of research on knock from the 1920s right through to the present day. It is hoped that this can be a useful reference for new and existing researchers of the subject and give further weight to occasionally neglected historical activity, which can still provide important insights today.