27 research outputs found

    Modeling and Simulation of Metallurgical Processes in Ironmaking and Steelmaking

    In recent years, improving the sustainability of the steel industry and reducing its CO2 emissions have become a global focus. To achieve this goal, further process optimization in terms of energy and resource efficiency, as well as the development of new processes and process routes, is necessary. Modeling and simulation have established themselves as invaluable sources of information for otherwise unknown process parameters and as an alternative to plant trials that involves lower costs, risks, and time. Models also open up new possibilities for model-based control of metallurgical processes. This Special Issue focuses on recent advances in the modeling and simulation of unit processes in iron- and steelmaking. It includes reviews on the fundamentals of modeling and simulation of metallurgical processes, as well as contributions from the areas of iron reduction/ironmaking, steelmaking via the primary and secondary route, and continuous casting.

    Process Modeling in Pyrometallurgical Engineering

    The Special Issue presents almost 40 papers on recent research in the modeling of pyrometallurgical systems, including physical models, first-principles models, detailed CFD and DEM models, as well as statistical models and models based on machine learning. The models cover the whole production chain, from raw-materials processing through the reduction and conversion unit processes to ladle treatment, casting, and rolling. The papers illustrate how models can shed light on complex and inaccessible processes characterized by high temperatures and hostile environments, in order to improve process performance, product quality, or yield, to reduce the consumption of virgin raw materials, and to suppress harmful emissions.

    A Novel Black Box Process Quality Optimization Approach based on Hit Rate

    Hit rate is a key performance metric in predicting product quality in integrated industrial processes. It represents the percentage of products accepted by downstream processes within a controlled quality range. However, optimizing hit rate is a non-convex and challenging problem. To address this issue, we propose a data-driven quasi-convex approach that combines factorial hidden Markov models, multitask elastic net, and quasi-convex optimization. Our approach converts the original non-convex problem into a set of convex feasibility problems, achieving an optimal hit rate. We verify the convex optimization property and the quasi-convex frontier through Monte Carlo simulations and real-world experiments in steel production. Results demonstrate that our approach outperforms classical models, improving hit rates by at least 41.11% and 31.01% on two real datasets. Furthermore, the quasi-convex frontier provides a reference explanation and visualization for the deterioration of solutions obtained by conventional models.
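    The generic bisection scheme behind this kind of quasi-convex optimization can be sketched as follows. This is a textbook illustration with a made-up one-dimensional objective, not the paper's actual hit-rate model or its FHMM/elastic-net components:

```python
# Minimal sketch: quasi-convex minimization by bisection over convex
# feasibility problems.  The objective below is hypothetical.
def quasiconvex_minimize(feasible, lo, hi, tol=1e-6):
    """Shrink [lo, hi] around the optimal value: if the sublevel set
    {x : f(x) <= t} is non-empty, the optimum is at most t."""
    while hi - lo > tol:
        t = 0.5 * (lo + hi)
        if feasible(t):
            hi = t
        else:
            lo = t
    return hi

# Toy objective f(x) = sqrt(|x - 3|) restricted to x in [5, 10]: it is
# quasi-convex but not convex, and its t-sublevel set is the interval
# [3 - t**2, 3 + t**2], so each feasibility check is an interval-overlap test.
def feasible(t):
    return max(5.0, 3 - t ** 2) <= min(10.0, 3 + t ** 2)

t_star = quasiconvex_minimize(feasible, 0.0, 4.0)  # converges to about sqrt(2)
```

    Each bisection step halves the interval bracketing the optimal value, so the cost is one convex feasibility problem per bit of accuracy.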

    Towards A Computational Intelligence Framework in Steel Product Quality and Cost Control

    Steel is a fundamental raw material for all industries. It is widely used in various fields, including construction, bridges, ships, containers, medical devices, and cars. However, the production process of iron and steel is very complex, consisting of four stages: ironmaking, steelmaking, continuous casting, and rolling. Controlling the quality of steel across the full manufacturing process is also extremely complicated, so steel quality control is considered a major challenge for the whole industry. This thesis studies quality control, taking the case of Nanjing Iron and Steel Group, and provides new approaches for quality analysis, management, and control in the industry. At present, Nanjing Iron and Steel Group has established a quality management and control system that oversees many of the systems involved in steel manufacturing. It places high statistical demands on business professionals, resulting in limited use of the system. A large amount of quality data has been collected in each system. Currently, all systems focus mainly on processing and analysing data after the manufacturing process, and product quality problems are detected mainly by sampling-based experiments. This approach cannot detect quality issues in a timely manner or predict hidden problems in advance. Within the quality control system, the responsibilities and functions of the different information systems involved are intricate. Each information system is responsible only for storing the data of its corresponding functions, so the data in each information system is relatively isolated, forming data islands. Iron and steel production belongs to the process industries, and the data in multiple information systems can be combined to analyse and predict product quality in depth and to provide early-warning alerts.
Therefore, it is necessary to introduce new product quality control methods in the steel industry. With the waves of Industry 4.0 and intelligent manufacturing, intelligent technology has also been introduced into the field of quality control to improve the competitiveness of iron and steel enterprises. Applying intelligent technology can generate accurate quality analysis and optimal prediction results based on the data distributed across the factory and determine online adjustments to the production process. This not only improves product quality control but also helps reduce product costs. Inspired by this, this thesis provides an in-depth discussion in three chapters: (1) how to use artificial intelligence algorithms to evaluate the quality grade of scrap steel used as a raw material is studied in Chapter 3; (2) the probability that longitudinal cracks occur on the surface of continuous casting slabs is studied in Chapter 4; (3) the prediction of the mechanical properties of finished steel plate is covered in Chapter 5. All three chapters serve as technical support for quality control in iron and steel production.

    Metaheuristics algorithms to identify nonlinear Hammerstein model: A decade survey

    Metaheuristics have been acknowledged as an effective solution for many difficult optimization problems. Metaheuristics, especially swarm intelligence and evolutionary computing algorithms, have gained popularity within a short time over the past two decades. New metaheuristic algorithms are introduced on an annual basis, and new applications are continually being discovered. This paper presents a survey for the years 2011-2021 of multiple metaheuristic algorithms, particularly swarm and evolutionary algorithms, for identifying a nonlinear block-oriented model called the Hammerstein model, mainly because this model has garnered much interest among researchers working on nonlinear system identification. Besides a complete survey of the various population-based algorithms used to identify the Hammerstein model, this paper also examines results that have been empirically verified on actual process plants. As such, this article serves as a guideline on the fundamentals of identifying nonlinear block-oriented models for new practitioners, as well as a comprehensive summary of cutting-edge trends in this topic area.
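    As a sketch of the survey's subject matter, the example below identifies a small, hypothetical Hammerstein model (a polynomial static nonlinearity followed by first-order linear dynamics) with a minimal particle swarm optimizer. The system, parameter values, and PSO settings are all invented for illustration and do not come from any surveyed paper:

```python
import random

random.seed(0)

# Hypothetical Hammerstein system: static nonlinearity v = u + c2*u^2
# followed by linear dynamics y[k] = a*y[k-1] + b*v[k-1].
TRUE = (0.6, 0.4, 0.5)  # (a, b, c2) to be recovered

def simulate(theta, u):
    a, b, c2 = theta
    y, out = 0.0, []
    for k in range(len(u)):
        out.append(y)
        v = u[k] + c2 * u[k] ** 2
        y = a * y + b * v
    return out

u = [random.uniform(-1, 1) for _ in range(200)]
y_meas = simulate(TRUE, u)  # noiseless "measurements"

def mse(theta):
    y_hat = simulate(theta, u)
    return sum((p - q) ** 2 for p, q in zip(y_meas, y_hat)) / len(u)

def pso(n=24, iters=200, w=0.7, c_p=1.5, c_g=1.5):
    """Minimal particle swarm: personal bests pull each particle,
    the global best pulls the whole swarm."""
    pos = [[random.uniform(0, 1) for _ in range(3)] for _ in range(n)]
    vel = [[0.0] * 3 for _ in range(n)]
    pbest = [p[:] for p in pos]
    pcost = [mse(p) for p in pos]
    g = pbest[min(range(n), key=lambda i: pcost[i])][:]
    for _ in range(iters):
        for i in range(n):
            for d in range(3):
                vel[i][d] = (w * vel[i][d]
                             + c_p * random.random() * (pbest[i][d] - pos[i][d])
                             + c_g * random.random() * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = mse(pos[i])
            if c < pcost[i]:
                pcost[i], pbest[i] = c, pos[i][:]
                if c < mse(g):
                    g = pos[i][:]
    return g

theta_hat = pso()  # should land close to TRUE on this noiseless toy problem
```

    Fixing the leading coefficient of the nonlinearity at 1 avoids the usual gain ambiguity between the static and linear blocks, which is why only three parameters are searched.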

    Metallurgical Process Simulation and Optimization

    Metallurgy involves the art and science of extracting metals from their ores and modifying the metals for use. Over thousands of years of development, many interdisciplinary technologies have been introduced into this traditional and large-scale industry. In modern metallurgical practice, modelling and simulation are widely used to provide solutions in the areas of design, control, optimization, and visualization, and are becoming increasingly significant in the progress of digital transformation and intelligent metallurgy. This Special Issue (SI), entitled “Metallurgical Process Simulation and Optimization”, has been organized as a platform to present recent advances in the modelling and optimization of metallurgical processes, covering electric/oxygen steelmaking, secondary metallurgy, (continuous) casting, and processing. Eighteen articles concerning various aspects of the topic have been included.

    Optimisation of the heat treatment of steel using neural networks.

    Heat treatments are used to develop the required mechanical properties in a range of alloy steels. The typical process involves a hardening stage (including a quench) and a tempering stage. The variation in the mechanical properties achieved is influenced by a large number of parameters, including tempering temperature, alloying elements added to the cast, quench media, and product geometry, along with measurement and process errors. The project aim was to predict the mechanical properties, such as Ultimate Tensile Strength, Proof Stress, Impact Energy, Reduction of Area, and Elongation, that would be obtained from the treatment for a wide range of steel types. The project initially investigated a number of data modelling techniques; the neural network technique was found to provide the best modelling accuracy, particularly when the data set of heat treatment examples was expanded to include a greater variety of examples. The total data collected through the project comprised over 6000 heat treatment examples drawn from six sites. Having defined a target modelling accuracy, a variety of modelling and data decomposition techniques were employed to cope with an uneven data distribution between variables, which encompassed nonlinearity and complex interactions. As the target accuracy was not reached, the quality of the data set was brought into question, and a structured procedure for improving data quality was developed using a combination of existing and novel techniques. The stability of the model predictions was then further improved through the use of an ensemble approach, in which multiple networks contribute to each predicted data point. This technique also had the advantage of enabling the reliability of a given prediction to be indicated. Methods of extracting information from the model were then investigated, and a graphical user interface was developed to enable industrial evaluation of the modelling technique.
This led to further improvements, enabling a user to be provided with an indication of prediction reliability, which is particularly important in an industrial setting. Application areas of the models developed were then demonstrated, together with a genetic algorithm optimisation technique, showing that automatic alloy design under optimal constraints can now be performed.
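    The ensemble idea described above, where multiple models contribute to each prediction and their spread serves as a reliability indicator, can be sketched with bootstrap-trained linear regressors standing in for the neural networks. The data and numbers below are purely illustrative, not real heat-treatment measurements:

```python
import random
import statistics

random.seed(1)

# Toy data: a property falling linearly with tempering temperature plus
# noise (illustrative numbers only, not real steel data).
data = [(t, 900 - 0.8 * t + random.gauss(0, 10)) for t in range(400, 700, 10)]

def fit_linear(sample):
    """Ordinary least-squares slope and intercept on one bootstrap sample."""
    n = len(sample)
    mx = sum(x for x, _ in sample) / n
    my = sum(y for _, y in sample) / n
    sxx = sum((x - mx) ** 2 for x, _ in sample)
    sxy = sum((x - mx) * (y - my) for x, y in sample)
    slope = sxy / sxx
    return slope, my - slope * mx

# Each ensemble member sees a different bootstrap resample, standing in
# for independently trained networks.
members = [fit_linear([random.choice(data) for _ in data]) for _ in range(30)]

def predict(temp):
    """Mean of member predictions plus their spread as a reliability hint."""
    preds = [slope * temp + icpt for slope, icpt in members]
    return statistics.mean(preds), statistics.stdev(preds)

mean, spread = predict(550)
```

    A wide spread flags inputs far from the training data, which is the kind of per-prediction reliability indication the thesis describes.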

    Data mining for fault diagnosis in steel making process under industry 4.0

    The concept of Industry 4.0 (I4.0) refers to the intelligent networking of machines and processes in industry, enabled by cyber-physical systems (CPS) - a technology that utilises embedded networked systems to achieve intelligent control. CPS enable full traceability of production processes as well as comprehensive data assignment in real time. Through real-time communication and coordination between "manufacturing things", production systems, in the form of Cyber-Physical Production Systems (CPPS), can make intelligent decisions. Meanwhile, with the advent of I4.0, it is possible to collect heterogeneous manufacturing data across various facets for fault diagnosis using industrial internet of things (IIoT) techniques. In this data-rich environment, the ability to diagnose and predict production failures provides manufacturing companies with a strategic advantage by reducing the number of unplanned production outages. This advantage is particularly desirable for the steel-making industry. Because steel-making is a continuous and compact manufacturing process, downtime is a major concern for steel-making companies, since most operations must be conducted within a certain temperature range. In addition, steel-making consists of complex processes involving physical, chemical, and mechanical elements, emphasising the need for data-driven approaches to handle high-dimensionality problems. In a modern steel-making plant, various measurement devices are deployed throughout the manufacturing process with the advancement of I4.0 technologies, facilitating data acquisition and storage. However, even though data-driven approaches are showing merit and are widely applied in manufacturing, how to build a deep learning model for fault prediction in the steel-making process that considers multiple contributing facets and their temporal characteristics has not been investigated.
Additionally, apart from the multitudinous data, it is also worthwhile to study how to represent and utilise the vast and scattered domain knowledge distributed along the steel-making process for fault modelling. Moreover, the state of the art does not address how such accumulated domain knowledge and its semantics can be harnessed to facilitate the fusion of multi-sourced data in steel manufacturing. The purpose of this thesis is therefore to pave the way for fault diagnosis in steel-making processes using data mining under I4.0. The research is structured around four themes. Firstly, in contrast to conventional data-driven research that focuses only on modelling numerical production data, a framework for data mining for fault diagnosis in steel-making based on multi-sourced data and knowledge is proposed. The framework has five layers: multi-sourced data and knowledge acquisition; data and knowledge processing; KG construction and graphical data transformation; KG-aided modelling for fault diagnosis; and decision support for steel manufacturing. Secondly, this thesis proposes a predictive, data-driven approach to model severe faults in the steel-making process, where the faults usually have multi-faceted causes. Specifically, strip breakage in cold rolling is selected as the modelling target, since it is a typical production failure with serious consequences and multitudinous contributing factors. In actual steel-making practice, if such a failure can be modelled at a micro level with an adequate prediction window, a planned stop can be performed in advance instead of a reactive fast stop, which often results in severe damage to equipment. To this end, a multi-faceted modelling approach with a sliding window strategy is proposed.
First, historical multivariate time-series data of a cold rolling process were extracted in a run-to-failure manner, and a sliding window strategy was adopted for data annotation. Second, breakage-centric features were identified from physics-based approaches, empirical knowledge, and data-driven features. Finally, these features were used as inputs for strip breakage modelling using a Recurrent Neural Network (RNN). Experimental results demonstrated the merits of the proposed approach. Thirdly, among the heterogeneous data surrounding the multi-faceted concepts in steel-making, a significant amount carries rich semantic information, such as technical documents and production logs generated through the process. There also exists vast domain knowledge regarding production failures in steel-making, which has a long history. In this context, appropriate semantic technologies are needed to utilise semantic data and domain knowledge in steel-making. In recent studies, the Knowledge Graph (KG) has displayed powerful expressive ability and a high degree of modelling flexibility, making it a promising semantic network. However, building a reliable KG is usually time-consuming and labour-intensive, and a KG commonly needs to be refined or completed before use in industrial scenarios. A fault-centric KG construction approach is therefore proposed, based on hierarchy structure refinement and relation completion. Firstly, ontology design based on hierarchy structure refinement is conducted to improve reliability. Then, the missing relations between each pair of entities are inferred from the existing knowledge in the KG, with the aim of increasing the number of edges that complete and refine the KG. Lastly, the KG is constructed by importing data into the ontology. An illustrative case study on strip breakage is conducted for validation.
Finally, multi-faceted modelling is often conducted on multi-sourced data covering indispensable aspects, and information fusion is typically applied to cope with the high dimensionality and data heterogeneity. Besides its ability to support knowledge management and sharing, a KG can aggregate the relationships of features from multiple aspects through semantic associations, which can be exploited to facilitate information fusion for multi-faceted modelling while taking intra-facet relationships into account. To this end, process data are transformed into a stack of temporal graphs under the fault-centric KG backbone. A Graph Convolutional Network (GCN) model is then applied to extract temporal and attribute correlation features from the graphs, with a Temporal Convolutional Network (TCN) conducting conceptual modelling using these features. Experimental results obtained with the proposed approach and the GCN-TCN model reveal the impact of the proposed KG-aided fusion approach. This thesis investigates data mining in steel-making processes based on multi-sourced data and scattered, distributed domain knowledge, providing a feasibility study for achieving Industry 4.0 in steel-making, specifically in support of improving quality and reducing the costs of production failures.
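    The sliding-window, run-to-failure annotation strategy mentioned in the abstract can be sketched as follows. The window length, prediction horizon, and labelling rule here are assumptions for illustration, not the thesis's actual settings:

```python
# Sketch of sliding-window annotation for a run-to-failure series:
# windows whose end falls within a prediction horizon before the
# breakage are labelled positive.
def annotate(series_len, failure_idx, window, horizon):
    """Return (start, end, label) for each full window that ends at or
    before the failure; label is 1 when the window ends within
    `horizon` steps of the failure index."""
    samples = []
    for start in range(0, failure_idx - window + 1):
        end = start + window
        label = 1 if failure_idx - end <= horizon else 0
        samples.append((start, end, label))
    return samples

# Hypothetical series of 100 steps with a breakage at step 90:
samples = annotate(series_len=100, failure_idx=90, window=10, horizon=5)
```

    Each `(start, end)` span would index the multivariate sensor readings fed to the sequence model, so the horizon directly controls how early a planned stop could be triggered.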

    Source Apportionment and Forecasting of Aerosol in a Steel City - Case Study of Rourkela

    Urban air pollution is one of the biggest problems arising from rapid urbanization and industrialization. The improvement of air quality in an urban area generally consists of three phases: monitoring, modeling, and control measures. The present research addresses the requirements of the urban air quality management programme (UAQMP) in the steel city of Rourkela. A typical UAQMP covers three aspects: monitoring of air pollution, modeling of air pollution, and taking control measures. The present study focuses on modeling particulate air pollution for a steel city. Modeling of particulate matter (PM) pollution is essentially the application of different mathematical models to the source apportionment and forecasting of PM. PM (PM10 and TSP) was collected twice a week for two years (2011-2012) during working hours in Rourkela. The study of seasonal variations showed that the aerosol concentration was high during summer and low during the monsoon. A detailed chemical characterization of both PM10 and TSP was carried out to determine the concentrations of different metal ions, anions, and carbon content. Spearman rank correlation analysis between the different chemical species of PM indicated that the particulate matter has both crustal and anthropogenic origins, and enrichment factor analysis highlighted the presence of anthropogenic sources. Three major receptor models were used for the source apportionment of PM: the chemical mass balance model (CMB), principal component analysis (PCA), and positive matrix factorization (PMF). In selecting source profiles for CMB, an effort was made to choose profiles that represent local conditions; two of the profiles, namely soil dust and road dust, were developed in the present study for better accuracy. All three receptor models showed that industrial (40-45%) and combustion (30-35%) sources were the major contributors to particulate pollution in Rourkela.
Artificial neural networks (ANN) were used for the prediction of particulate pollution using meteorological parameters as inputs. The emphasis is on comparing the performance of MLP and RBF algorithms in forecasting and providing a rigorous inter-comparison as a first step toward operational PM forecasting models. The training, testing, and validation errors of the MLP networks were significantly lower than those of the RBF networks. The results indicate that both MLP and RBF networks showed good prediction capability, with MLP networks performing better than RBF networks. No pronounced bias is seen in the models, which suggests that few or no external factors influence the dispersion and distribution of particulate matter in the study area.
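    The RBF side of the MLP/RBF comparison can be illustrated with a tiny radial-basis-function network fitted by exact interpolation (one Gaussian centre per training point, weights from a linear solve). The readings and the kernel width below are invented for the sketch, not the study's data:

```python
import math

# Hypothetical 1-D training set: (meteorological input, PM reading).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.0, 0.5, 1.5, 1.0]
gamma = 1.0  # assumed Gaussian kernel width

def phi(x, c):
    """Gaussian radial basis function centred at c."""
    return math.exp(-gamma * (x - c) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting (fine for small n)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# One centre per training point: solve Phi w = y for the output weights.
Phi = [[phi(x, c) for c in xs] for x in xs]
w = solve(Phi, ys)

def rbf_predict(x):
    return sum(wi * phi(x, c) for wi, c in zip(w, xs))
```

    With distinct centres the Gaussian kernel matrix is positive definite, so the network reproduces the training points exactly; a practical forecaster would instead use fewer centres (e.g. from clustering) and regularised least squares to avoid overfitting noisy PM data.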