
    Data-driven Soft Sensors in the Process Industry

    In the last two decades, Soft Sensors have established themselves as a valuable alternative to traditional means for the acquisition of critical process variables, process monitoring and other tasks related to process control. This paper discusses characteristics of process industry data which are critical for the development of data-driven Soft Sensors. These characteristics are common to a large number of process industry fields, such as the chemical, bioprocess and steel industries. The focus of this work is on data-driven Soft Sensors because of their growing popularity, demonstrated usefulness and huge, though not yet fully realised, potential. The main contributions of this work are a comprehensive selection of case studies covering the three most important Soft Sensor application fields, a general introduction to the most popular Soft Sensor modelling techniques, and a discussion of open issues in Soft Sensor development and maintenance together with their possible solutions.
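As a minimal illustration of the data-driven modelling the abstract refers to, the sketch below fits a linear soft sensor by ordinary least squares, inferring a hard-to-measure quality variable from an easily measured one. All data and variable names here are synthetic, not from the paper.

```python
# Minimal data-driven soft sensor sketch: a least-squares linear model
# that infers a lab-measured quality variable from an easily measured
# process variable. All numbers are synthetic illustrations.

def fit_linear(x, y):
    """Return (slope, intercept) of the least-squares line through (x, y)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical historical data: reactor temperature (easy to measure)
# against product purity (slow lab assay).
temps = [350.0, 355.0, 360.0, 365.0, 370.0]
purity = [0.90, 0.91, 0.92, 0.93, 0.94]

slope, intercept = fit_linear(temps, purity)

def soft_sensor(t):
    """Estimate purity on-line from a temperature reading."""
    return slope * t + intercept

print(round(soft_sensor(362.0), 4))  # inferred purity between lab samples
```

In practice the papers surveyed use richer regressors (PLS, neural networks) over many inputs, but the workflow — fit on historical data, then predict the hard variable on-line — is the same.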

    Evaluation of a batch process by means of batch statistical process control and system identification

    Batch processes play an important role in the production of high quality specialty chemicals. Examples include the production of polymers, pharmaceuticals and formulated products. In this master thesis, the transformation of materials by batch distillation and mixing is studied. The study is done by means of batch statistical process control and system identification methods in order to build soft sensors that can predict product quality and end-point, but also to use the batch trajectory features for early fault detection. In contrast to a continuous process, a batch process is a finite duration process, from initialization to completion. The physical state of the process is derived from measured variables, for example temperatures, pressures and flows, and comes from on-line measurements of the on-going process. Since there are many variables, in terms of inputs and outputs, multivariate data analysis is a suitable choice for extracting systematic information, which is used to find relationships among the variables but also to visualize the batch trajectories and deviations from normal batch evolution. The results suggest that the end point can be predicted during distillation and mixing, and it appears possible to separate normal batches from deviating batches by means of batch statistical process control strategies. However, estimating product purity during distillation was not possible due to limited variation in the output data; instead, system identification methodologies were a better choice. Product quality after mixing was poorly estimated with system identification tools, due to the lack of variability within the time averages of the different variables used, but was better predicted with batch statistical process control.
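The batch statistical process control idea the abstract describes — comparing a running batch's trajectory against limits learned from normal batches — can be sketched as follows. The trajectories and limits below are toy, hypothetical numbers, not the thesis data.

```python
# Sketch of batch statistical process control on trajectories: build the
# mean trajectory and +/-3-sigma limits from normal batches, then flag
# time points where a new batch leaves the limits. Toy synthetic data.

def control_limits(batches):
    """Per-time-point mean and 3-sigma limits from normal batch trajectories."""
    n = len(batches)
    length = len(batches[0])
    means, los, his = [], [], []
    for t in range(length):
        vals = [b[t] for b in batches]
        m = sum(vals) / n
        sd = (sum((v - m) ** 2 for v in vals) / n) ** 0.5
        means.append(m)
        los.append(m - 3 * sd)
        his.append(m + 3 * sd)
    return means, los, his

def out_of_control(batch, los, his):
    """Indices where the batch trajectory breaches the control limits."""
    return [t for t, v in enumerate(batch) if not (los[t] <= v <= his[t])]

# Three historical "normal" batches of a measured variable over time.
normal = [[1.0, 2.0, 3.0], [1.1, 2.1, 3.1], [0.9, 1.9, 2.9]]
_, lo, hi = control_limits(normal)
flags = out_of_control([1.0, 2.0, 4.0], lo, hi)
print(flags)  # the deviating final point is flagged
```

Real implementations unfold many variables per time point and monitor latent-variable scores (e.g. from PLS) rather than each raw signal, but the normal-region logic is the same.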

    Batch-to-batch iterative learning control of a fed-batch fermentation process

    PhD Thesis. Recently, iterative learning control (ILC) has been used in the run-to-run control of batch processes to directly update the control trajectory. The basic idea of ILC is to update the control trajectory for a new batch run using information from previous batch runs, so that the output trajectory converges asymptotically to the desired reference trajectory. The control policy update is calculated using linearised models around the nominal reference process input and output trajectories. The linearised models are typically identified using multiple linear regression (MLR), partial least squares (PLS) regression, or principal component regression (PCR). ILC has been shown to be a promising method to address model-plant mismatches and unknown disturbances. This work presents several improvements to the batch-to-batch ILC strategy, with applications to a simulated fed-batch fermentation process. In order to enhance the reliability of ILC, model prediction confidence is incorporated into the ILC optimisation objective function. As a result, wide model prediction confidence bounds are penalised in order to avoid unreliable control policy updating. This method proved very effective for suitably selected penalty factors on the model prediction confidence bounds. To further improve the performance of ILC, averaged reference trajectories and sliding window techniques were introduced. To reduce the influence of measurement noise, the control policy is updated on the average input and output trajectories of the past few batches instead of just the immediately previous batch. The linearised models are re-identified using a sliding window of past batches, in which the earliest batch is removed as the newest batch is added to the model identification data set. The effects of various parameters were investigated for the MLR, PCR and PLS methods. The technique significantly improves the control performance.
In model-based ILC the weighting matrices, Q and R, in the objective function have a significant impact on the control performance. Therefore, to exploit the potential of the objective function, adaptive weighting parameters were introduced and the performance of batch-to-batch ILC with updated models was studied. Significant improvements in the stability of the performance were observed for all three methods. All three proposed techniques delivered improvements in stability, reliability and/or convergence speed. To further investigate the versatility of ILC, the above techniques were combined and the results are discussed in this thesis.
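The core ILC update the abstract builds on — next batch's control equals this batch's control plus a learning gain times the tracking error — can be shown converging on a toy linear plant. The plant gain and learning gain below are illustrative choices, not values from the thesis.

```python
# Sketch of batch-to-batch iterative learning control on a toy static
# linear plant y = G*u. The update u_{k+1} = u_k + L*(r - y_k) converges
# when |1 - L*G| < 1. All gains are hypothetical.

PLANT_GAIN = 2.0   # G
LEARN_GAIN = 0.3   # L; here 1 - L*G = 0.4, so the error contracts by 0.4/batch

reference = [1.0, 2.0, 3.0]   # desired output trajectory
u = [0.0, 0.0, 0.0]           # initial control trajectory

errors = []
for batch in range(30):
    y = [PLANT_GAIN * ui for ui in u]                  # run the batch
    e = [r - yi for r, yi in zip(reference, y)]        # tracking error
    errors.append(max(abs(ei) for ei in e))
    u = [ui + LEARN_GAIN * ei for ui, ei in zip(u, e)]  # ILC update law

print(errors[0], errors[-1])  # error shrinks batch to batch
```

The thesis's contributions (confidence-bound penalties, trajectory averaging, sliding-window model re-identification, adaptive Q and R) all modify how this basic update is computed, not the batch-to-batch structure itself.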

    Data-Driven Fault Detection and Reasoning for Industrial Monitoring

    This open access book assesses the potential of data-driven methods in industrial process monitoring engineering. Process modeling, fault detection, classification, isolation, and reasoning are studied in detail. These methods can be used to improve the safety and reliability of industrial processes. Fault diagnosis, including fault detection and reasoning, has attracted engineers and scientists from various fields such as control, machinery, mathematics, and automation engineering. Combining diagnosis algorithms and application cases, this book establishes a basic framework for the topic and implements various statistical analysis methods for process monitoring. The book is intended for senior undergraduate and graduate students interested in fault diagnosis technology, researchers investigating automation and industrial security, and professional practitioners and engineers working on engineering modeling and data processing applications.
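A minimal instance of the statistical fault detection the book surveys is to learn normal-operation statistics offline and flag online samples that fall outside a 3-sigma region. The data below are synthetic; real schemes in this literature use multivariate statistics (e.g. Hotelling's T-squared) over many sensors.

```python
# Sketch of basic data-driven fault detection: learn the mean and standard
# deviation of a variable under normal operation, then flag samples whose
# deviation exceeds k standard deviations. Synthetic toy data.

def train(normal_samples):
    """Fit normal-operation statistics (mean, population std dev)."""
    n = len(normal_samples)
    m = sum(normal_samples) / n
    sd = (sum((v - m) ** 2 for v in normal_samples) / n) ** 0.5
    return m, sd

def is_fault(sample, mean, sd, k=3.0):
    """True if the sample lies outside the mean +/- k*sigma normal region."""
    return abs(sample - mean) > k * sd

mean, sd = train([10.0, 10.2, 9.8, 10.1, 9.9])   # normal-operation data
verdicts = [is_fault(s, mean, sd) for s in [10.0, 10.3, 12.0]]
print(verdicts)  # only the large excursion is declared a fault
```

Fault isolation and reasoning, as treated in the book, then ask *which* variables and causal paths explain a detected fault — a step this univariate sketch does not cover.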

    Integration of Batch-to-Batch and Within Batch Control Techniques: Application to a Simulated Nylon-6,6 Process

    Using a simulated nylon-6,6 batch process, this work presents three batch control schemes: 1) within batch, 2) batch-to-batch, and 3) integrated batch-to-batch and within batch, as improvements over fixed-recipe operation alone for disturbance rejection. The control schemes were developed using process understanding gained through analysis of a historical database of easily measured batch profiles. Various concerns regarding the development and implementation of each strategy were discussed, as were the strengths and weaknesses of each controller's performance. The analysis method used focused on separating batch measurement variability into time-axis and magnitude-axis components. Partitioning the data in this way generated time and magnitude scale parameters that described the normal variability in the process. These scale parameters provided improved process understanding and formed the basis for the improved control schemes developed in this work. The within batch controller was a feedforward strategy that made mid-course recipe adjustments based on predicted deviation from target quality. The batch-to-batch controller utilized quality measurements to provide feedback adjustments to subsequent batches. The integrated control scheme utilized the predictive feedforward performance of the within batch controller, tempered by the off-line feedback of the batch-to-batch controller in a cascade arrangement. The three control schemes were compared to fixed-recipe operation, and all three provided significant improvement in quality control. The within batch controller resulted in a 91% reduction in mean squared target error (MSE) over fixed-recipe operation, and the batch-to-batch controller provided an 87% reduction in MSE. The integrated control scheme was found to be the most effective, providing a 99% reduction in MSE over fixed-recipe operation.
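The batch-to-batch feedback loop described above — measure each batch's quality, then adjust the next batch's recipe by a gain times the quality error — can be sketched on a toy quality model. The plant model, disturbance and gain below are hypothetical stand-ins, not the nylon-6,6 simulation.

```python
# Sketch of batch-to-batch feedback: after each batch, the measured quality
# error adjusts the recipe for the next batch, rejecting a persistent
# disturbance that a fixed recipe cannot. All numbers are hypothetical.

TARGET = 5.0
GAIN = 0.7            # feedback gain on the quality error
DISTURBANCE = -1.0    # persistent shift affecting every batch

def run_batch(recipe):
    """Toy quality model: quality = recipe + disturbance."""
    return recipe + DISTURBANCE

recipe = TARGET       # the fixed-recipe starting point
qualities = []
for k in range(20):
    q = run_batch(recipe)
    qualities.append(q)
    recipe += GAIN * (TARGET - q)   # batch-to-batch recipe update

mse_fixed = (run_batch(TARGET) - TARGET) ** 2   # fixed recipe, every batch
mse_final = (qualities[-1] - TARGET) ** 2       # after feedback converges
print(mse_fixed, mse_final)
```

The paper's integrated scheme cascades this off-line feedback with the within-batch feedforward mid-course corrections, which is why it outperforms either scheme alone.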

    Doctor of Philosophy

    Dissertation. In order to ensure a high production yield of semiconductor devices, it is desirable to characterize intermediate progress towards the final product by using metrology tools to acquire relevant measurements after each sequential processing step. The metrology data are commonly used in the feedback and feed-forward loops of Run-to-Run (R2R) controllers to improve process capability and optimize recipes from lot to lot or batch to batch. In this dissertation, we focus on two related issues. First, we propose a novel non-threaded R2R controller that utilizes all available metrology measurements, even when the data were acquired during prior runs that differed in their contexts from the current fabrication thread. The developed controller is the first known implementation of a non-threaded R2R control strategy successfully deployed in a high-volume production semiconductor fab. Its introduction improved process capability by 8% compared with traditional threaded R2R control and significantly reduced out-of-control (OOC) events at one of the most critical steps in NAND memory manufacturing. The second contribution demonstrates the value of developing virtual metrology (VM) estimators using insight gained from multiphysics models. Unlike traditional statistical regression techniques, which lead to linear models that depend on a linear combination of the available measurements, we develop VM models whose structure, and the functional interdependence between their input and output variables, are determined from the insight provided by the multiphysics describing the operation of the processing step for which the VM system is being developed. We demonstrate this approach for three different processes and describe the superior performance of the developed VM systems after their first-of-a-kind deployment in a high-volume semiconductor manufacturing environment.
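For context on the R2R control the dissertation extends: a common baseline (not the dissertation's non-threaded controller) is an EWMA run-to-run controller, where an exponentially weighted estimate of the process offset corrects the recipe each run. The process numbers below are illustrative.

```python
# Sketch of a classic EWMA run-to-run controller: each run, metrology
# feedback updates an exponentially weighted estimate of the unknown
# process offset, which the next recipe cancels. Numbers are hypothetical.

TARGET = 100.0
PROCESS_GAIN = 1.0
LAMBDA = 0.4          # EWMA weight on the newest metrology measurement

true_offset = 5.0     # unknown tool drift the controller must cancel
offset_est = 0.0
outputs = []
for run in range(25):
    recipe = (TARGET - offset_est) / PROCESS_GAIN   # compensate the estimate
    y = PROCESS_GAIN * recipe + true_offset         # metrology measurement
    outputs.append(y)
    # EWMA update of the offset estimate from the observed residual
    residual = y - PROCESS_GAIN * recipe
    offset_est = LAMBDA * residual + (1 - LAMBDA) * offset_est

print(outputs[0], outputs[-1])  # output converges to the target
```

A threaded deployment keeps one such state per context (tool, product, layer); the dissertation's contribution is to use metrology from other contexts as well, rather than starting each thread's estimate cold.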

    The Enhanced Definition and Control of Downstream Processing Operations

    Monitoring product and contaminants is critically important at all stages of bioprocess operation, development and control. The availability of rapid measurements of product and key contaminants will yield a higher resolution of data points, will allow more intelligent operation of a process, and will thereby enhance the definition and characterisation of a bioprocess. The need to control a bioseparation process arises from the variable nature of upstream conditions, process additives and the sub-optimal performance of processing equipment, which may lead to different requirements for the operating conditions either within batches or on a batch-to-batch basis. Potential operations for the downstream processing of intracellular proteins are selective flocculation and packed-bed and expanded-bed chromatography. These processes involve the removal of a large number of contaminants in a single dynamic step and hence are difficult unit operations to characterise and operate in an efficient and reproducible manner. In order to achieve rapid characterisation and control of these processes, some form of rapid monitoring was required. A sampling and monitoring system has been constructed for the analysis of an enzyme produced intracellularly in S. cerevisiae, alcohol dehydrogenase (ADH), together with cell debris, protein and RNA contaminants, with a measurement cycle time of 135 s. Both an extended Kalman filter and the Levenberg-Marquardt nonlinear least squares parameter identification technique have been implemented for rapid process characterisation. Estimation of model parameters from at-line data enabled process performance predictions to be represented graphically in an optimal manner, and the subsequent determination of ideal operating conditions in a feedback model-based control configuration. The application of such a control strategy to the batch flocculation process yielded on average 92% accuracy in achieving optimum operating conditions.
A structured and intelligent use of the at-line data would improve process characterisation in terms of speed and stability. It was demonstrated that rapid monitoring of the packed-bed and expanded-bed chromatographic operations yielded improved characterisation in terms of higher-resolution data points, and enabled real-time process analysis and control of the load cycle. For control of the expanded-bed operation, a predictive technique was applied to compensate for the large dead volume associated with this unit operation. The feedback control achieved approximately 80% accuracy in breakthrough setpoint regulation.
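The dead-volume compensation described above can be sketched as a simple prediction: since the at-line signal lags the column by a known delay, extrapolate the rising breakthrough curve over that delay and stop loading when the *predicted* value reaches the setpoint. The delay, setpoint and curve below are hypothetical, and the real controller in the thesis is model-based rather than this linear extrapolation.

```python
# Sketch of predictive load-cycle termination with dead-volume compensation:
# extrapolate the measured breakthrough signal DEAD_TIME steps ahead and
# stop loading when the prediction reaches the setpoint. Toy numbers.

DEAD_TIME = 3         # measurement delay, in sampling intervals
SETPOINT = 0.10       # breakthrough fraction at which loading should stop

def predicted(signal):
    """Linear extrapolation of the last two samples, DEAD_TIME steps ahead."""
    slope = signal[-1] - signal[-2]
    return signal[-1] + DEAD_TIME * slope

# Simulated breakthrough curve rising by 0.01 per interval after onset.
curve = [0.0, 0.0, 0.01, 0.02, 0.03, 0.04, 0.05, 0.06, 0.07, 0.08]
stop_at = None
for i in range(2, len(curve)):
    if predicted(curve[: i + 1]) >= SETPOINT:
        stop_at = i
        break

print(stop_at, curve[stop_at])  # stops while the measured value is still low
```

Without the prediction, the stop decision would trigger only once the delayed measurement itself crossed the setpoint, by which time the column effluent would already have exceeded it.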