
    Operationalizing Heterogeneous Data-Driven Process Models for Various Industrial Sectors through Microservice-Oriented Cloud-Based Architecture

    Industrial performance optimization increasingly makes use of various analytical data-driven models. Prominent examples include modern machine learning capabilities to predict future production quality outcomes, model predictive control to better account for the complex multivariable environments of the process industry, and Bayesian networks enabling improved decision-support systems for diagnostics and fault detection. The key challenge is to integrate these highly heterogeneous models into a holistic system that is also suitable for applications from very different industries. The core elements of the underlying solution architecture are highly decoupled model microservices, which enable the creation of largely customizable model runtime environments. Deployment of isolated user-space instances, called containers, further extends the possibilities for integrating heterogeneous models. Strong requirements on high availability, scalability, and security are satisfied through the application of cloud-based services. Tieto successfully applied the outlined approach during its participation in FUture DIrections for Process industry Optimization (FUDIPO), a project funded by the European Commission under the H2020 programme, call SPIRE-02-2016.
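
    To make the architecture concrete, the following is a minimal sketch of what one such decoupled model microservice could look like: a containerizable HTTP service exposing a single prediction endpoint. The Flask framework, the /predict route, and the toy linear model are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of a model microservice (illustrative; the framework,
    # route name, and toy model are assumptions, not from the paper).
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    # Placeholder "model": a fixed linear predictor standing in for any
    # heterogeneous model (ML, MPC, Bayesian network) behind the same API.
    WEIGHTS = [0.4, 1.3, -0.7]
    BIAS = 2.0

    @app.route("/predict", methods=["POST"])
    def predict():
        # Expect JSON like {"features": [x1, x2, x3]}
        features = request.get_json()["features"]
        y = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
        return jsonify({"prediction": y})

    if __name__ == "__main__":
        # Each model service runs in its own container; the port is
        # configured per service by the orchestration layer.
        app.run(host="0.0.0.0", port=8080)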

    Utilizing a model predictive control structure in a continuous cooking stage (original title in Finnish: Malliprediktiivisen säätörakenteen hyödyntäminen jatkuvatoimisessa keittovaiheessa)

    Continuous cooking is an important part of the pulping process, and a high-quality product is essential to preserve the competitiveness of the mill. Modern automation increases profitability by improving product quality and reducing expenses. The objective of this thesis was to become familiar with the pulping process and to utilize model predictive control in a continuous cooking application, improving performance and usability as a result. A familiar MPC software package was used for this complex process with long delays and strong interactions. The work was done as part of a pilot project, and the created MPC controller was tested successfully with the MPC software. The experience gained from this project will probably increase the use of modern control technology in future projects.
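
    For readers unfamiliar with the receding-horizon idea behind MPC, the following toy sketch shows the core loop: optimize a sequence of future moves against a process model, apply only the first move, then re-optimize at the next step. The first-order plant, horizon length, and weights are invented for illustration; the thesis's actual MPC software and cooking models are not reproduced here.

    # Minimal receding-horizon (MPC) sketch on a toy first-order plant.
    # The model, horizon, and weights are illustrative assumptions only.
    import numpy as np
    from scipy.optimize import minimize

    A, B = 0.9, 0.1          # toy discrete-time plant: x[k+1] = A*x[k] + B*u[k]
    HORIZON = 10
    SETPOINT = 1.0

    def cost(u_seq, x0):
        """Sum of squared tracking errors plus a small move-suppression term."""
        x, total = x0, 0.0
        for u in u_seq:
            x = A * x + B * u
            total += (x - SETPOINT) ** 2 + 0.01 * u ** 2
        return total

    def mpc_step(x0):
        """Optimize the whole input sequence, but return only the first move."""
        res = minimize(cost, np.zeros(HORIZON), args=(x0,))
        return res.x[0]

    x = 0.0
    for k in range(20):
        u = mpc_step(x)
        x = A * x + B * u    # apply the first move, then re-optimize next step
    print(f"state after 20 steps: {x:.3f}")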

    Real-time observer model for Kraft wood digester.

    Thesis (M.Sc.Eng.)--University of KwaZulu-Natal, 2005. At Sappi-Tugela a continuous Kraft wood-chip digester operates in EMCC mode (extended modified continuous cooking). Chips are initially exposed to a NaOH / Na2S liquor at high temperature in the top section. The chips move downward in plug flow, passing circumferential screens used to draw liquor for various circulations. About midway down, the spent black liquor is removed and the chips enter the cooler bottom section, where some further reaction and washing occur. Liquor level and chip level are maintained close to each other near the top. Chips require 8-12 hours to pass through the digester, depending on the chip feed rate. The key parameter of interest at the digester exit is the Kappa number, which is a measure of the extent of delignification that has occurred. Different board and paper products require different Kappa number pulp feed. (Final properties such as tensile, tear and bursting strengths will also depend on the way fibres have been modified during digestion.) The objective of this investigation is to predict the Kappa number of the product pulp in real time, thus facilitating quicker reaction than the present dependence on laboratory analysis permits, possibly even allowing closed-loop control. The extent of delignification depends on liquor strength, temperature and exposure time, with the final Kappa number also depending on the properties of the chip feed (wood type and moisture content). Compensation to maintain a steady Kappa number is made difficult by the long and varying residence time, and by the fact that any changes apply to the whole profile held up in the digester. A number of static models for Kappa number prediction have been developed by previous workers, but these do not compare well with plant measurements. Data collected from the Sappi-Tugela reactor, together with the pulp quality reports, have been used to determine an efficient model. This step required a considerable data-collection exercise, and results similar to the quality reports have been obtained using a simple linear model based on these data. The problem of model error is reduced by arrangement as a Smith Predictor, in which the model is intermittently corrected by available laboratory analyses. At the same time, an interface was created in order to synchronise measurement data for the chips presently leaving the reactor. To deal with the dead time, each parcel of chips entering the reactor is effectively tracked, and the changes in Kappa number are integrated over the reaction time under the varying conditions in transit. Knowing the present inventory of the reactor, this model can also be run forward in time as a predictive controller, to determine optimal control actions for maintaining the target Kappa number.
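
    The parcel-tracking idea can be illustrated with a toy simulation in which each batch of chips entering the digester is integrated forward under the prevailing conditions until it exits. The first-order kinetics, Arrhenius constants, and residence handling below are invented for illustration and are not the fitted plant model from the thesis.

    # Toy sketch of tracking chip "parcels" through a digester and integrating
    # delignification over their residence time. All constants are
    # illustrative assumptions, not the thesis's fitted model.
    import math
    from collections import deque

    RATE_REF = 0.17      # 1/h, assumed delignification rate at T_REF
    E_OVER_R = 8000.0    # K, assumed activation energy over gas constant
    T_REF = 440.0        # K, reference cooking temperature

    def rate(temp_k):
        """Arrhenius-style temperature dependence of the reaction rate."""
        return RATE_REF * math.exp(-E_OVER_R * (1.0 / temp_k - 1.0 / T_REF))

    parcels = deque()                         # each parcel: current kappa number
    KAPPA_IN, RESIDENCE_STEPS = 160.0, 10     # ten one-hour steps in transit

    for hour in range(24):
        temp = 440.0 + 3.0 * math.sin(hour / 4.0)      # toy temperature profile
        for i in range(len(parcels)):                  # every parcel in transit
            parcels[i] *= math.exp(-rate(temp) * 1.0)  # reacts for one hour
        parcels.appendleft(KAPPA_IN)                   # new chips enter at the top
        if len(parcels) > RESIDENCE_STEPS:
            kappa_out = parcels.pop()                  # oldest parcel exits below
            print(f"hour {hour:2d}: blowline kappa = {kappa_out:.1f}")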

    Latest in modelling symposium - in honour of Professor Pertti Koukkari's 65th birthday

    Data-driven Soft Sensors in the Process Industry

    In the last two decades, Soft Sensors have established themselves as a valuable alternative to traditional means for the acquisition of critical process variables, process monitoring, and other tasks related to process control. This paper discusses characteristics of process industry data which are critical for the development of data-driven Soft Sensors. These characteristics are common to a large number of process industry fields, such as the chemical industry, the bioprocess industry, and the steel industry. The focus of this work is on data-driven Soft Sensors because of their growing popularity, demonstrated usefulness, and huge, though not yet completely realised, potential. The main contributions of this work are a comprehensive selection of case studies covering the three most important Soft Sensor application fields, a general introduction to the most popular Soft Sensor modelling techniques, and a discussion of some open issues in Soft Sensor development and maintenance together with their possible solutions.
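
    As a minimal illustration of a data-driven soft sensor, the sketch below infers a lab-measured quality variable from routine process measurements using partial least squares (PLS) regression, a latent-variable method widely used for collinear process data. The synthetic data and model settings are assumptions for demonstration only.

    # Minimal soft-sensor sketch: infer a hard-to-measure quality variable
    # from routine process measurements via PLS regression. The synthetic
    # data and model settings are illustrative assumptions.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 8))             # routine sensors (T, P, flows, ...)
    y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=500)  # lab quality value

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

    sensor = PLSRegression(n_components=4)    # latent-variable model copes well
    sensor.fit(X_tr, y_tr)                    # with collinear process data
    print("R^2 on held-out data:", sensor.score(X_te, y_te))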

    Implementation and performance analysis of a model-based controller on a batch pulp digester

    The control of batch pulp digesters is hampered by insufficient measurements as well as by nonlinearity and weak correlation between consecutive cooks, which makes a model-based approach to control attractive. Due to the age of the industry, many legacy controllers are in place on digesters around the world. The theoretical variance obtained by Monte Carlo modelling of a new controller is used as a benchmark for a performance comparison between an old control system (S-factor) and a new model-based controller developed by the University of Pretoria (the UP controller). This study covers the development of the controller, Monte Carlo modelling of the old and new controllers, and in-situ testing of the UP controller on an operating digester. During Monte Carlo simulation, the UP controller outperformed the legacy controller, obtaining a theoretical overall variance of 3.07 (used as the baseline for performance measurement) while also showing larger responses to tuning factors. During in-situ testing, the S-factor controller performed at 6.8 times the theoretical optimum variance, while the UP controller performed at 3.9 times the theoretical optimum (43% better than the S-factor controller). The UP controller also achieved an average error 90% lower than that of the S-factor controller. Additional benefits of the new controller include the easy inclusion of new measurements and clear relations between the tuning parameters used and the conditions in the digester. Dissertation (MEng (Control))--University of Pretoria, 2003. Chemical Engineering.
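
    The Monte Carlo benchmarking idea can be sketched as follows: simulate many cooks of a noisy toy process under two feedback laws and compare the resulting error variances. The drifting disturbance model, noise levels, and both control laws are invented for illustration and do not reproduce the S-factor or UP controllers.

    # Monte Carlo benchmarking sketch: compare the kappa-error variance of
    # two feedback laws over many simulated cooks. The disturbance model,
    # noise levels, and control laws are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(42)
    N_COOKS = 10_000

    def simulate(gain):
        """Variance of the kappa error under proportional cook-to-cook feedback."""
        drift, correction, errors = 0.0, 0.0, []
        for _ in range(N_COOKS):
            drift += rng.normal(0.0, 1.0)        # slow process drift (random walk)
            noise = rng.normal(0.0, 1.0)         # cook-to-cook measurement noise
            error = drift + correction + noise   # deviation of kappa from target
            errors.append(error)
            correction -= gain * error           # controller adjusts the next cook
        return np.var(errors)

    for name, gain in [("legacy-like, weak feedback", 0.2),
                       ("model-based-like, strong feedback", 0.9)]:
        print(f"{name}: error variance = {simulate(gain):.2f}")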

    Model-based optimization of a CompactCooking G2 digesting process stage

    A CompactCooking™ G2 (Valmet) digesting system represents a challenging process stage to optimize in the context of a kraft pulp mill. Its highly non-linear behaviour, due to liquor recycling and heat integration, poses a barrier to traditional trial-and-error optimization conducted by physical lab-scale simulation. Hence, this thesis aims to design a solution based on numerical simulation and mathematical optimization whose results can be applied directly at industrial scale as computed optimal set-points for the supervisory control. Based on published first-principles pulp-digester models, a customized dynamic model was developed in Matlab/Simulink to simulate a complete CompactCooking™ G2 stage. The process model is founded on Purdue wood-reaction kinetics and the Härkönen chip-bed compaction model, and it seamlessly takes the process characteristics mentioned above into account. The non-linear model was validated by comparison against historical data of an industrial unit (200 h), and then employed in the design of a steady-state optimizer for this process stage by means of linear programming. Simulation results showed very good agreement in terms of liquor residual alkali, weak-black-liquor solids, and blowline kappa, despite high uncertainty in the disturbance data and model simplifications. However, the simulated kappa showed higher sensitivity to temperature fluctuations than the plant signal, likely indicating the need for more detail in modelling heat-transfer phenomena. As to the optimization goal, a base-case scenario (plant steady state) was identified from industrial data in order to attempt process-economics optimization. The results showed a potential for increasing profit, or reducing variable costs, by at least 2 USD/ADt, which for a modern pulp mill represents annual benefits between 1 and 2 million USD depending on production rate and mill availability. Further, the simulation model showed remarkable results when used in a novel process-analysis technique, called here simulated contribution, which makes it possible to explain the variability of blowline kappa in terms of multiple-time-scale process dynamics. In conclusion, a model-based optimization method has been successfully designed for the CompactCooking™ G2 system, and the potential economic benefits should encourage industrial testing and further work towards a real-time optimizer software technology.
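
    A steady-state economic optimizer of the kind described can be posed as a linear program once the process model is linearized around the base case. The decision variables, cost coefficients, and constraints below are invented for illustration; the thesis derives its LP from the validated digester model.

    # Sketch of a steady-state economic optimizer posed as a linear program.
    # All variables, coefficients, and bounds are illustrative assumptions.
    from scipy.optimize import linprog

    # Decision variables: [production (ADt/h), steam (t/h), alkali (kg/ADt)].
    # Maximize profit = revenue - costs, i.e. minimize its negation.
    c = [-550.0, 18.0, 0.9]

    # Assumed operating constraints linearized around the base-case steady state:
    A_ub = [[1.0, -2.0, 0.0],   # steam supply must keep pace with production
            [0.0, 1.0, 0.1]]    # shared utility limit on steam and alkali use
    b_ub = [0.0, 45.0]

    bounds = [(40.0, 60.0),     # production-rate window
              (20.0, 40.0),     # available steam
              (120.0, 160.0)]   # alkali window keeping kappa on target

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    print("optimal set-points:", res.x, "profit proxy:", -res.fun)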