19 research outputs found

    Towards A Computational Intelligence Framework in Steel Product Quality and Cost Control

    Steel is a fundamental raw material for all industries. It is widely used in various fields, including construction, bridges, ships, containers, medical devices and cars. However, the production process of iron and steel is very complex, consisting of four stages: ironmaking, steelmaking, continuous casting and rolling. Controlling the quality of steel throughout the full manufacturing process is also extremely complicated, so quality control is considered a major challenge for the whole steel industry. This thesis studies quality control, taking the case of Nanjing Iron and Steel Group, and provides new approaches for quality analysis, management and control in the industry. At present, Nanjing Iron and Steel Group has established a quality management and control system that oversees many of the systems involved in steel manufacturing. It places high statistical demands on business professionals, resulting in limited use of the system. A large amount of quality data has been collected in each system. At present, all systems mainly focus on processing and analysing data after the manufacturing process, and product quality problems are mainly detected by sampling-based experimental methods. This approach cannot detect product quality issues, or predict hidden ones, in a timely manner. In the quality control system, the responsibilities and functions of the different information systems involved are intricate. Each information system is responsible only for storing the data of its corresponding functions. Hence, the data in each information system is relatively isolated, forming data islands. Iron and steel production belongs to the process industry, and the data in multiple information systems can be combined to analyse and predict product quality in depth and to provide early warning alerts. 
Therefore, it is necessary to introduce new product quality control methods in the steel industry. With the waves of Industry 4.0 and intelligent manufacturing, intelligent technology has also been introduced in the field of quality control to improve the competitiveness of iron and steel enterprises. Applying intelligent technology can generate accurate quality analysis and optimal prediction results based on the data distributed across the factory and inform online adjustment of the production process. This not only improves product quality control but is also beneficial in reducing product costs. Inspired by this, this thesis provides an in-depth discussion in three chapters: (1) how to use artificial intelligence algorithms to evaluate the quality grade of scrap steel used as a raw material is studied in Chapter 3; (2) the probability that longitudinal cracks occur on the surface of continuous casting slabs is studied in Chapter 4; (3) the prediction of the mechanical properties of finished steel plate is covered in Chapter 5. All three chapters serve as technical support for quality control in iron and steel production.
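The scrap-grading task described above can be illustrated with a minimal sketch. The thesis does not specify its features or model here, so the features (a bulk-density proxy and an impurity fraction) and the nearest-centroid classifier below are invented placeholders, standing in for whatever AI algorithm Chapter 3 actually uses.

```python
# Hypothetical sketch: grading scrap steel batches from simple features.
# Features and centroids are invented for illustration, not from the thesis.

def nearest_centroid_grade(sample, centroids):
    """Return the grade whose feature centroid is closest to `sample`."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda g: dist2(sample, centroids[g]))

# Invented centroids: (bulk density proxy, impurity fraction) per grade
centroids = {
    "grade_A": (7.6, 0.01),
    "grade_B": (7.0, 0.05),
    "grade_C": (6.2, 0.12),
}

print(nearest_centroid_grade((7.5, 0.02), centroids))  # closest to grade_A
```

A trained classifier over real chemical and physical measurements would replace the hand-picked centroids, but the decision structure, mapping a feature vector to a discrete grade, is the same.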

    Process Modeling in Pyrometallurgical Engineering

    The Special Issue presents almost 40 papers on recent research in the modeling of pyrometallurgical systems, including physical models, first-principles models, detailed CFD and DEM models, as well as statistical models and models based on machine learning. The models cover the whole production chain from raw materials processing through the reduction and conversion unit processes to ladle treatment, casting, and rolling. The papers illustrate how models can be used to shed light on complex and inaccessible processes characterized by high temperatures and hostile environments, in order to improve process performance, product quality, or yield, to reduce the demand for virgin raw materials, and to suppress harmful emissions.

    Energy consumption modelling using deep learning embedded semi-supervised learning

    Reduction of energy consumption in the steel industry is a global issue that governments are actively taking measures to address. A steel plant can manage its energy better if consumption can be modelled and predicted. Existing methods for energy consumption modelling rely on the quantity of labelled data; however, if labelled energy consumption data is deficient, the underlying modelling and prediction process tends to be difficult. The purpose of this study is to establish an energy value prediction model through a big-data-driven approach. Because labelled energy data is often limited and expensive to obtain, while unlabelled data is abundant in real-world industry, a semi-supervised learning approach, i.e., deep learning embedded semi-supervised learning (DLeSSL), is proposed to tackle the issue. Based on DLeSSL, unlabelled data can be labelled using a semi-supervised learning approach with an embedded deep learning technique, so as to expand the labelled data set. An experimental study using a large amount of furnace energy consumption data shows the merits of the proposed approach. The results reveal that the DLeSSL-based deep learning model outperforms both supervised deep learning and label-propagation-based deep learning when labelled data is limited. In addition, the effect of the sizes of the labelled and unlabelled data sets on performance is also reported.
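The core idea of expanding a scarce labelled set from abundant unlabelled data can be sketched as a pseudo-labelling loop: train on the labelled data, label the unlabelled points the model is confident about, then retrain on the expanded set. DLeSSL embeds a deep network in this loop; in the sketch below a 1-nearest-neighbour "model" on one-dimensional readings stands in for it, and the confidence measure is an invented placeholder.

```python
# Minimal pseudo-labelling sketch; the 1-NN "model" and 1/(1+distance)
# confidence are illustrative stand-ins, not the DLeSSL components.

def knn_label(x, labelled):
    """Label x by its nearest labelled neighbour; confidence shrinks with distance."""
    nearest = min(labelled, key=lambda p: abs(p[0] - x))
    return nearest[1], 1.0 / (1.0 + abs(nearest[0] - x))

def pseudo_label(labelled, unlabelled, threshold=0.5):
    """Expand the labelled set with confident predictions on unlabelled data."""
    expanded = list(labelled)
    for x in unlabelled:
        label, conf = knn_label(x, expanded)
        if conf >= threshold:          # only accept confident pseudo-labels
            expanded.append((x, label))
    return expanded

labelled = [(1.0, "low"), (9.0, "high")]   # scarce labelled energy readings
unlabelled = [1.5, 8.5, 5.0]               # abundant unlabelled readings
print(pseudo_label(labelled, unlabelled))
```

The ambiguous reading 5.0 is rejected by the confidence threshold, while 1.5 and 8.5 are absorbed into the labelled set, which is the behaviour the abstract describes: compensating for deficient labels without accepting unreliable ones.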

    Predictive model-based quality inspection using Machine Learning and Edge Cloud Computing

    The supply of defect-free, high-quality products is an important success factor for the long-term competitiveness of manufacturing companies. Despite the increasing challenges of rising product variety and complexity and the necessity of economic manufacturing, a comprehensive and reliable quality inspection is often indispensable. In consequence, high inspection volumes turn inspection processes into manufacturing bottlenecks. In this contribution, we investigate a new integrated solution for predictive model-based quality inspection in industrial manufacturing by utilizing Machine Learning techniques and Edge Cloud Computing technology. In contrast to state-of-the-art contributions, we propose a holistic approach comprising target-oriented data acquisition and processing, modelling and model deployment, as well as the technological implementation in the existing IT plant infrastructure. A real industrial use case in SMT manufacturing is presented to underline the procedure and benefits of the proposed method. The results show that by employing the proposed method, inspection volumes can be reduced significantly, generating economic advantages.
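The inspection-volume reduction rests on a simple decision rule: a predictive model scores each produced unit's defect risk, and only units above a threshold are routed to physical inspection. The sketch below illustrates that rule with invented probabilities and an invented threshold; it is not the authors' actual SMT pipeline.

```python
# Hedged sketch of risk-based inspection routing; values are placeholders.

def select_for_inspection(defect_probs, threshold=0.2):
    """Return indices of units whose predicted defect risk warrants inspection."""
    return [i for i, p in enumerate(defect_probs) if p >= threshold]

# Predicted defect probabilities for 6 produced units (hypothetical values)
probs = [0.01, 0.35, 0.05, 0.60, 0.02, 0.15]
picked = select_for_inspection(probs)
print(picked, f"inspection volume: {len(picked)}/{len(probs)}")
```

In practice the threshold trades escaped defects against inspection cost, which is why the paper's holistic approach covers model quality and deployment in the plant IT infrastructure, not just the scoring step.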

    Optimisation of the heat treatment of steel using neural networks.

    Heat treatments are used to develop the required mechanical properties in a range of alloy steels. The typical process involves a hardening stage (including a quench) and a tempering stage. The variation in the mechanical properties achieved is influenced by a large number of parameters, including tempering temperature, alloying elements added to the cast, quench media and product geometry, along with measurement and process errors. The project aim was to predict the mechanical properties, such as Ultimate Tensile Strength, Proof Stress, Impact Energy, Reduction of Area and Elongation, that would be obtained from the treatment for a wide range of steel types. The project initially investigated a number of data modelling techniques; the neural network technique was found to provide the best modelling accuracy, particularly when the data set of heat treatment examples was expanded to include an increased variety of examples. The total data collected through the project comprised over 6000 heat treatment examples drawn from 6 sites. Having defined a target modelling accuracy, a variety of modelling and data decomposition techniques were employed to cope with an uneven data distribution between variables, which encompassed nonlinearity and complex interactions. As the target accuracy had not been reached, the quality of the data set was brought into question, and a structured procedure for improving data quality was developed using a combination of existing and novel techniques. The stability of model predictions was then further improved through the use of an ensemble approach, where multiple networks contribute to each predicted data point. This technique also had the advantage of enabling the reliability of a given prediction to be indicated. Methods of extracting information from the model were then investigated, and a graphical user interface was developed to enable industrial evaluation of the modelling technique. 
This led to further improvements, enabling a user to be provided with an indication of prediction reliability, which is particularly important in an industrial setting. Application areas of the models developed were then demonstrated together with a genetic algorithm optimisation technique, which shows that automatic alloy design under optimal constraints can now be performed.
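The ensemble idea above, multiple networks contributing to each predicted point, with the spread of their answers indicating reliability, can be sketched directly. The "members" below are stand-in linear functions (invented coefficients, predicting e.g. UTS from tempering temperature); the real members would be trained neural networks.

```python
# Sketch of ensemble prediction with spread as a reliability indicator.
# Member models and their coefficients are invented placeholders.

def ensemble_predict(models, x):
    """Mean prediction plus spread (population std) as a reliability indicator."""
    preds = [m(x) for m in models]
    mean = sum(preds) / len(preds)
    var = sum((p - mean) ** 2 for p in preds) / len(preds)
    return mean, var ** 0.5

# Three stand-in "members" predicting a strength value from tempering temp (degC)
members = [
    lambda t: 900 - 0.50 * t,
    lambda t: 910 - 0.52 * t,
    lambda t: 890 - 0.48 * t,
]
mean, spread = ensemble_predict(members, 600)
print(round(mean, 1), round(spread, 2))  # agreement -> small spread -> reliable
```

When the members disagree strongly at some input, the spread grows, which is exactly the signal a plant engineer needs before trusting a prediction in an industrial situation.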

    Knowledge-driven Artificial Intelligence in Steelmaking: Towards Industry 4.0

    With the ongoing emergence of the Fourth Industrial Revolution, often referred to as Industry 4.0, new innovations, concepts, and standards are reshaping manufacturing processes and production, leading to intelligent cyber-physical systems and smart factories. Steel production is one important manufacturing process that is undergoing this digital transformation. Realising this vision in steel production comes with unique challenges, including the seamless interoperability between diverse and complex systems, the uniformity of heterogeneous data, and a need for standardised human-to-machine and machine-to-machine communication protocols. To address these challenges, international standards have been developed, and new technologies have been introduced and studied in both industry and academia. However, due to the vast quantity, scale, and heterogeneous nature of industrial data and systems, achieving interoperability among components within the context of Industry 4.0 remains a challenge, requiring formal knowledge representation capabilities to enhance the understanding of data and information. In response, semantic-based technologies have been proposed as a method to capture knowledge from data and resolve incompatibility conflicts within Industry 4.0 scenarios. We propose utilising fundamental Semantic Web concepts, such as ontologies and knowledge graphs, specifically to enhance semantic interoperability, improve data integration, and standardise data across heterogeneous systems within the context of steelmaking. Additionally, we investigate ongoing trends that involve the integration of Machine Learning (ML) techniques with semantic technologies, resulting in the creation of hybrid models. 
These models capitalise on the strengths derived from the intersection of these two AI approaches. Furthermore, we explore the need for continuous reasoning over data streams, presenting preliminary research that combines ML and semantic technologies in the context of data streams. In this thesis, we make four main contributions: (1) We discover that a clear understanding of semantic-based asset administration shells, an international standard within the RAMI 4.0 model, was lacking, and provide an extensive survey of semantic-based implementations of asset administration shells, focusing on literature that utilises semantic technologies to enhance the representation, integration, and exchange of information in an industrial setting. (2) The creation of an ontology, a semantic knowledge base, which specifically captures the cold rolling processes in steelmaking. We demonstrate use cases that leverage these semantic methodologies with real-world industrial data for data access, data integration, data querying, and condition-based maintenance purposes. (3) A framework demonstrating one approach for integrating machine learning models with semantic technologies to aid decision-making in the domain of steelmaking. We showcase a novel approach of applying random forest classification using rule-based reasoning, incorporating both meta-data and external domain expert knowledge into the model, resulting in improved knowledge-guided assistance for the human-in-the-loop during steelmaking processes. (4) The groundwork for a continuous data stream reasoning framework, where both domain expert knowledge and random forest classification can be dynamically applied to data streams on the fly. This approach opens up possibilities for real-time condition-based monitoring and real-time decision support for predictive maintenance applications. We demonstrate the adaptability of the framework in the context of dynamic steel production processes. 
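The hybrid pattern in contribution (3), a learned classifier whose output is moderated by rule-encoded domain expert knowledge, can be sketched as follows. The stand-in classifier, the feature names, and the single expert rule are all invented for illustration; the thesis's actual framework uses a trained random forest and semantically represented rules.

```python
# Hedged sketch: ML label proposed first, expert rules can override it.
# Classifier, thresholds, and rule are invented placeholders.

def classify(features):
    """Stand-in for a trained condition classifier (e.g. a random forest)."""
    return "alert" if features["vibration"] > 0.7 else "normal"

def apply_rules(features, ml_label, rules):
    """Expert rules take precedence over the ML label when they fire."""
    for condition, label in rules:
        if condition(features):
            return label
    return ml_label

# Invented expert rule: readings above a hard temperature limit are always
# an alert, regardless of what the learned model says.
rules = [(lambda f: f["temperature"] > 1200, "alert")]

readings = {"vibration": 0.3, "temperature": 1250}
print(apply_rules(readings, classify(readings), rules))  # rule overrides: alert
```

Keeping the rules separate from the model is what lets domain experts inject knowledge without retraining, the property the thesis exploits for knowledge-guided, human-in-the-loop assistance.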
Our contributions have been validated on real-world data sets, published in peer-reviewed conferences and journals, and developed in collaboration with domain experts from our industrial partners at Tata Steel.

    Rails Quality Data Modelling via Machine Learning-Based Paradigms


    Computational intelligence image processing for precision farming on-site nitrogen analysis in plants

    Nitrogen is one of the macronutrients essentially required by plants. To support precision farming, it is important to analyse nitrogen status in plants in order to prevent excessive fertilisation as well as to reduce production costs. Image-based analysis has been widely utilised to estimate nitrogen content in plants; such research, however, is commonly conducted in a controlled environment with artificial lighting systems. This thesis proposes three novel computational intelligence systems to evaluate nitrogen status in wheat plants by analysing plant images captured in the field and subject to variation in lighting conditions. In the first proposed method, a fusion of regularised neural networks (NN) has been employed to normalise plant images based on the RGB colours of the 24-patch Macbeth colour checker. The colour normalisation results are then optimised using a genetic algorithm (GA). The regularised neural network has also been effectively utilised to distinguish wheat leaves from other unwanted parts, giving improved results compared to the Otsu algorithm. Furthermore, several neural networks with different numbers of hidden-layer nodes are combined using committee machines and optimised by the GA to estimate nitrogen content. In the second proposed method, the regularised NN is replaced by a deep sparse extreme learning machine (DSELM). In general, the DSELM is as effective in the three research steps as the regularised NN of the first method; however, its learning speed is far faster than that of the regularised NN and the standard backpropagation multilayer perceptron (MLP). In the third proposed method, a novel approach has been developed to fine-tune the colour normalisation based on the nutrient estimation errors and to analyse the effect of genetic-algorithm-based global optimisation on the nitrogen estimation results. 
In this method, an ensemble of deep learning MLPs (DL-MLP) has been employed in the three research steps, i.e. colour normalisation, image segmentation and nitrogen estimation. The performance of the three proposed methods has been compared with the intrusive SPAD meter, and the results show that all the proposed methods are superior to the SPAD-based estimation. The nutrient estimation errors of the proposed methods are less than 3%, while the error of the renowned SPAD meter method is 8.48%. As a comparison, nitrogen predictions using other methods, i.e. the Kawashima greenness index and a PCA-based greenness index, are also calculated; their prediction errors are 9.84% and 9.20%, respectively. Funding: Indonesia Ministry of Research, Technology and Higher Education and Jenderal Soedirman University.
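The colour-normalisation step that all three methods share can be illustrated in its simplest form: fit a transform that maps the colour-checker patches as captured in the field onto their known reference values, then apply it to the whole image. The thesis uses regularised neural networks tuned by a GA for this; the least-squares linear version below, with toy patch values, only illustrates the normalisation idea, not the thesis method.

```python
# Illustrative linear colour normalisation against a colour checker.
# Patch values are invented; the thesis uses NN-based normalisation instead.
import numpy as np

def fit_colour_transform(captured, reference):
    """Least-squares 3x3 matrix M such that captured @ M ~= reference."""
    M, *_ = np.linalg.lstsq(captured, reference, rcond=None)
    return M

# Three toy "patches" (rows of RGB); the capture is the reference dimmed by half
reference = np.array([[200.0, 100.0, 50.0],
                      [ 40.0, 180.0, 90.0],
                      [ 10.0,  20.0, 30.0]])
captured = 0.5 * reference
M = fit_colour_transform(captured, reference)
print(np.allclose(captured @ M, reference))  # True: transform recovers reference
```

A linear map cannot absorb the strongly nonlinear field-lighting effects the thesis targets, which motivates replacing it with a neural network fused over many patches and optimised by the GA.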

    Developing ECO2: a performance based ecological and economic framework and tool for sustainability assessment of concrete

    The use of concrete is associated with immense negative environmental impacts. More than 50 billion tonnes of aggregates are extracted annually for use in concrete, which presents a high risk of depleting natural resources. Moreover, concrete has an embodied carbon footprint of 350 kg CO2-eq/m3 on average, of which 90% is attributable to the production of ordinary Portland cement (OPC). Although this is less than that of steel and most polymers per unit mass, the intensive use of concrete results in an alarming 7% share of global carbon emissions. Therefore, increasing interest is being directed towards producing sustainable concrete. Life cycle assessment (LCA) is a widely accepted tool to assess and compare the claimed environmental gains of these sustainable concrete types, while calculating the baseline cost of each of these mixes can suffice for economic comparisons. However, sustainability is a multifaceted concept, and in order to validate the sustainability of a concrete mix, multi-criteria sustainability frameworks are needed. A critical examination of the only two frameworks found in the literature that fit this description, MARS-SC and CONCRETop, showed the need to develop a new one that covers their gaps, which inspired the main contribution of this PhD project. A novel ECOnomic and ECOlogical assessment framework for concrete (hence the name ECO2, which also echoes the carbon dioxide formula) was created with the following distinguishing features: 1. The scope specified for the LCA study is Cradle-to-Grave, in order to account for the whole life cycle of concrete. The LCA inventory data, for which site-specific primary data is prioritised, therefore includes upstream data, such as the impact allocation from the previous processes from which the raw materials originated, and downstream data, such as the demolition and disposal impact of concrete. 2. 
The ECO2 framework considers the amount of carbon sequestration, the term used to describe how much carbon dioxide is absorbed by concrete from the environment. Accurate calculation of the carbon footprint of a concrete mix is vital for its absolute environmental impact assessment, but will in the near future also affect its economic impact when carbon taxation becomes normal practice. Aside from filling the technical gaps of existing sustainability assessment methods, the main contribution the ECO2 framework brings is a shift in philosophy regarding the inclusion of concrete performance in the process. In both reviewed frameworks (MARS-SC and CONCRETop), concrete performance is assessed as a separate pillar of sustainability, implying that higher performance is rewarded with a higher sustainability index value. Instead, the ECO2 framework brings forward a two-layered performance-based methodology that promotes resource efficiency. First, the user sets minimum requirements for workability and strength depending on the project specifications. The second layer correlates the expected service life of each qualifying concrete mix to the required service life of the concrete application within the project through a factor N. This factor, whose minimum value is 1, is then multiplied by the functional unit used for the LCA to ensure that the economic and ecological assessments are not only accurate but also truly reflective of sustainability. An MS Excel tool was also developed to self-validate the ECO2 framework, in what could be labelled a methodological contribution. Finally, three case studies were conducted using the newly developed ECO2 framework, as follows: 1. The first case study was experimental, using electric arc furnace slag as a precursor for alkali-activated concrete and comparing its ECO2 sustainability index to that of a basic alkali-activated concrete mix based on fly ash as a precursor. 
The case study showed that the deterioration in the mechanical properties of the novel alkali-activated slag concrete largely overshadows the ecological and economic merits of recycling the slag. 2. The second case study was analytical, using a database of more than 2500 data points to predict, and hence optimise, the functional, environmental and economic performance of blended cement concrete using the ECO2 framework. The mixes included varying combinations of five different types of SCMs, based on plain and reinforced concrete scenarios with different strength and service life requirements. 3. The final case study investigated an issue facing the UK green concrete market: the need to shut down all coal-operated electrical power plants by 2022 and the subsequent absence of fly ash. The case study used the ECO2 framework to compare importing fly ash from China or Germany with recycling locally stockpiled fly ash in the UK. The vital parameter in the comparison was the environmental and economic impact resulting from the transportation of fly ash from its source to the location of the concrete batch plant in the UK.
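The service-life factor N described above can be sketched numerically. The exact formula is not quoted in the abstract, so the reading below, N as the ratio of required to expected service life, floored at 1, and multiplied into the per-unit impact, is an assumption based on the description, not the thesis's definitive method.

```python
# Hedged sketch of the ECO2 service-life factor N (assumed formula:
# N = max(1, required_life / expected_life), scaling the functional unit).

def service_life_factor(required_life, expected_life):
    """N >= 1: how many functional units are needed to cover the required life."""
    return max(1.0, required_life / expected_life)

def scaled_impact(impact_per_unit, required_life, expected_life):
    """Impact of a mix scaled by the service-life factor N."""
    return impact_per_unit * service_life_factor(required_life, expected_life)

# Mix A: 350 kg CO2-eq/m3, lasts 50 years against a 100-year requirement
# Mix B: 400 kg CO2-eq/m3, lasts 120 years against the same requirement
print(scaled_impact(350.0, 100, 50))   # shorter-lived mix pays for replacement
print(scaled_impact(400.0, 100, 120))  # N floors at 1: no bonus for overshoot
```

Under this reading, the nominally "greener" Mix A ends up with the larger scaled impact, which captures the resource-efficiency philosophy the framework promotes: durability matters as much as per-cubic-metre footprint.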