33 research outputs found

    Rails Quality Data Modelling via Machine Learning-Based Paradigms


    A Literature Review of Fault Diagnosis Based on Ensemble Learning

    The accuracy of fault diagnosis is an important indicator of the reliability of key equipment systems. Ensemble learning integrates different weak learners into a stronger learner and has achieved remarkable results in the field of fault diagnosis. This paper reviews recent research on ensemble learning from both technical and field-application perspectives. It surveys 209 papers from 87 journals indexed in Web of Science and other academic resources, covering 78 different ensemble learning-based fault diagnosis methods, 18 public datasets and more than 20 different equipment systems. In detail, the paper summarizes the accuracy rates, fault classification types, fault datasets, data signals used, learners (traditional machine learning or deep learning-based learners) and ensemble learning methods (bagging, boosting, stacking and other ensemble models) of these fault diagnosis models. The paper uses diagnostic accuracy as the main evaluation metric, supplemented by generalization and imbalanced-data handling ability, to evaluate the performance of these ensemble learning methods. The discussion and evaluation of these methods provide valuable references for identifying and developing appropriate intelligent fault diagnosis models for various equipment. The paper also discusses the technical challenges, lessons learned from the review and future development directions in the field of ensemble learning-based fault diagnosis and intelligent maintenance.
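The bagging-and-voting idea at the heart of many of the surveyed methods can be sketched in a few lines. The example below is purely illustrative: the decision-stump weak learner, the synthetic vibration data and all thresholds are assumptions for demonstration, not taken from any reviewed paper.

```python
import random
from collections import Counter

random.seed(0)

# Synthetic vibration-amplitude readings: label 1 = faulty bearing, 0 = healthy.
data = [(random.gauss(0.2, 0.05), 0) for _ in range(50)] + \
       [(random.gauss(0.8, 0.05), 1) for _ in range(50)]
random.shuffle(data)

def train_stump(sample):
    """Weak learner: pick the threshold that best splits the sample."""
    best_thr, best_acc = 0.0, 0.0
    for thr in [i / 20 for i in range(1, 20)]:
        acc = sum((x > thr) == bool(y) for x, y in sample) / len(sample)
        if acc > best_acc:
            best_thr, best_acc = thr, acc
    return best_thr

# Bagging: each stump is trained on a bootstrap resample of the data.
stumps = [train_stump(random.choices(data, k=len(data))) for _ in range(15)]

def predict(x):
    votes = [int(x > thr) for thr in stumps]      # one vote per stump
    return Counter(votes).most_common(1)[0][0]    # majority vote

accuracy = sum(predict(x) == y for x, y in data) / len(data)
```

Boosting and stacking follow the same recipe but replace the uniform bootstrap with error-weighted resampling, or replace the majority vote with a learned meta-model.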

    Data-driven Soft Sensors in the Process Industry

    In the last two decades, Soft Sensors have established themselves as a valuable alternative to the traditional means for the acquisition of critical process variables, process monitoring and other tasks related to process control. This paper discusses characteristics of process industry data which are critical for the development of data-driven Soft Sensors. These characteristics are common to a large number of process industry fields, such as the chemical industry, bioprocess industry and steel industry. The focus of this work is on data-driven Soft Sensors because of their growing popularity, already demonstrated usefulness and huge, though not yet completely realised, potential. A comprehensive selection of case studies covering the three most important Soft Sensor application fields, a general introduction to the most popular Soft Sensor modelling techniques, and a discussion of some open issues in Soft Sensor development and maintenance and their possible solutions are the main contributions of this work.
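In its simplest form, a data-driven soft sensor is a regression model that infers a hard-to-measure quality variable from easy-to-measure process signals. The sketch below fits a linear inferential model by solving the normal equations directly; the process variables, the linear relationship and all numbers are assumptions for illustration, not drawn from the paper.

```python
import random

random.seed(1)

# Synthetic process data: a hard-to-measure quality variable is assumed to
# depend linearly on two easy-to-measure signals (temperature, flow).
n = 200
temp = [random.uniform(300, 400) for _ in range(n)]
flow = [random.uniform(1.0, 5.0) for _ in range(n)]
quality = [0.02 * t - 1.5 * f + 10 + random.gauss(0, 0.1)
           for t, f in zip(temp, flow)]

# Fit quality = a*temp + b*flow + c by solving the 3x3 normal equations
# (X^T X) w = X^T y with Gauss-Jordan elimination (no external libraries).
X = [[t, f, 1.0] for t, f in zip(temp, flow)]
XtX = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(3)]
       for i in range(3)]
Xty = [sum(X[k][i] * quality[k] for k in range(n)) for i in range(3)]

def solve3(A, rhs):
    M = [row[:] + [r] for row, r in zip(A, rhs)]  # augmented matrix
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                factor = M[r][col] / M[col][col]
                M[r] = [x - factor * y for x, y in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

a, b, c = solve3(XtX, Xty)

def soft_sensor(t, f):
    """Inferential estimate of the quality variable from online signals."""
    return a * t + b * f + c
```

Real deployments typically swap this least-squares fit for PLS, neural networks or other techniques discussed in the paper, but the inferential structure is the same.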

    Intelligent Feature Extraction, Data Fusion and Detection of Concrete Bridge Cracks: Current Development and Challenges

    As a common appearance defect of concrete bridges, cracks are important indices for bridge structural health assessment. Although there has been much research on crack identification, research on the evolution mechanism of bridge cracks is still far from practical application. In this paper, the state-of-the-art research on intelligent theories and methodologies for feature extraction, data fusion and crack detection based on data-driven approaches is comprehensively reviewed. The research is discussed from three aspects: the feature extraction level of the multimodal parameters of bridge cracks, the description level, and the diagnosis level of bridge crack damage states. We focus on previous research concerning the quantitative characterization of multimodal parameters of bridge cracks and their implementation in crack identification, while highlighting some of their major drawbacks. In addition, the current challenges and potential future research directions are discussed. (Comment: published in Intelligence & Robotics; copyright belongs to the authors.)

    Data mining for fault diagnosis in steel making process under industry 4.0

    The concept of Industry 4.0 (I4.0) refers to the intelligent networking of machines and processes in industry, which is enabled by cyber-physical systems (CPS) - a technology that utilises embedded networked systems to achieve intelligent control. CPS enable full traceability of production processes as well as comprehensive data assignments in real time. Through real-time communication and coordination between "manufacturing things", production systems, in the form of Cyber-Physical Production Systems (CPPS), can make intelligent decisions. Meanwhile, with the advent of I4.0, it is possible to collect heterogeneous manufacturing data across various facets for fault diagnosis by using industrial internet of things (IIoT) techniques. In this data-rich environment, the ability to diagnose and predict production failures provides manufacturing companies with a strategic advantage by reducing the number of unplanned production outages. This advantage is particularly desired in steel-making industries. As a continuous and compact manufacturing process, process downtime is a major concern for steel-making companies, since most operations must be conducted within a certain temperature range. In addition, steel-making consists of complex processes that involve physical, chemical and mechanical elements, emphasising the necessity for data-driven approaches to handle high-dimensionality problems. In a modern steel-making plant, various measurement devices are deployed throughout the manufacturing process with the advancement of I4.0 technologies, which facilitates data acquisition and storage. However, even though data-driven approaches are showing merit and being widely applied in the manufacturing context, how to build a deep learning model for fault prediction in the steel-making process that considers multiple contributing facets and their temporal characteristics has not been investigated.
Additionally, apart from the multitudinous data, it is also worthwhile to study how to represent and utilise the vast and scattered domain knowledge along the steel-making process for fault modelling. Moreover, the state of the art does not address how such accumulated domain knowledge and its semantics can be harnessed to facilitate the fusion of multi-sourced data in steel manufacturing. The purpose of this thesis is therefore to pave the way for fault diagnosis in steel-making processes using data mining under I4.0. This research is structured according to four themes. Firstly, differing from conventional data-driven research that focuses only on modelling based on numerical production data, a framework for data mining for fault diagnosis in steel-making based on multi-sourced data and knowledge is proposed. Five layers are designed in this framework: multi-sourced data and knowledge acquisition, data and knowledge processing, KG construction and graphical data transformation, KG-aided modelling for fault diagnosis, and decision support for steel manufacturing. Secondly, another purpose of this thesis is to propose a predictive, data-driven approach to model severe faults in the steel-making process, where faults usually have multi-faceted causes. Specifically, strip breakage in cold rolling is selected as the modelling target, since it is a typical production failure with serious consequences and multitudinous contributing factors. In actual steel-making practice, if such a failure can be modelled at a micro level with an adequate prediction window, a planned stop can be taken in advance instead of a passive fast stop, which often results in severe damage to equipment. Accordingly, a multi-faceted modelling approach with a sliding window strategy is proposed.
First, historical multivariate time-series data of a cold rolling process were extracted in a run-to-failure manner, and a sliding window strategy was adopted for data annotation. Second, breakage-centric features were identified from physics-based approaches, empirical knowledge and data-driven features. Finally, these features were used as inputs for strip breakage modelling using a Recurrent Neural Network (RNN). Experimental results have demonstrated the merits of the proposed approach. Thirdly, among the heterogeneous data surrounding multi-faceted concepts in steel-making, a significant amount consists of rich semantic information, such as technical documents and production logs generated through the process. There also exists vast domain knowledge regarding production failures in steel-making, which has a long history. In this context, proper semantic technologies are desired for the utilisation of semantic data and domain knowledge in steel-making. In recent studies, the Knowledge Graph (KG) displays powerful expressive ability and a high degree of modelling flexibility, making it a promising semantic network. However, building a reliable KG is usually time-consuming and labour-intensive, and a KG commonly needs to be refined or completed before use in industrial scenarios. Accordingly, a fault-centric KG construction approach is proposed based on hierarchy structure refinement and relation completion. Firstly, ontology design based on hierarchy structure refinement is conducted to improve reliability. Then, missing relations between pairs of entities are inferred based on existing knowledge in the KG, with the aim of increasing the number of edges to complete and refine the KG. Lastly, the KG is constructed by importing data into the ontology. An illustrative case study on strip breakage is conducted for validation.
Finally, multi-faceted modelling is often conducted based on multi-sourced data covering indispensable aspects, and information fusion is typically applied to cope with the high dimensionality and data heterogeneity. Besides its ability to support knowledge management and sharing, a KG can aggregate the relationships of features from multiple aspects through semantic associations, which can be exploited to facilitate information fusion for multi-faceted modelling with consideration of intra-facet relationships. In this case, process data is transformed into a stack of temporal graphs under the fault-centric KG backbone. Then, a Graph Convolutional Network (GCN) model is applied to extract temporal and attribute correlation features from the graphs, with a Temporal Convolutional Network (TCN) conducting conceptual modelling using these features. Experimental results derived using the proposed GCN-TCN approach reveal the impact of the KG-aided fusion approach. This thesis researches data mining in steel-making processes based on multi-sourced data and scattered domain knowledge, providing a feasibility study for achieving Industry 4.0 in steel-making, specifically in support of improving quality and reducing costs due to production failures.
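The run-to-failure annotation step described above can be illustrated with a small sketch. Window length, prediction horizon and the toy series are all assumptions for demonstration; the thesis's actual cold-rolling data and parameters are not reproduced here.

```python
# Hypothetical run-to-failure series: sensor readings ending at a
# strip-breakage event at the last index.
series = list(range(100))          # stand-in for multivariate readings
failure_idx = len(series) - 1

WINDOW = 10    # length of each sliding window
HORIZON = 15   # windows ending within HORIZON steps of failure -> label 1

# Slide a fixed-length window over the series; label each window by its
# distance from the failure event (breakage-imminent vs. normal).
samples = []
for start in range(0, len(series) - WINDOW + 1):
    end = start + WINDOW
    label = 1 if failure_idx - (end - 1) <= HORIZON else 0
    samples.append((series[start:end], label))

n_positive = sum(lbl for _, lbl in samples)
```

The resulting (window, label) pairs are exactly the shape of input an RNN-based classifier such as the one in the thesis would consume.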

    MachNet, a general deep learning architecture for predictive maintenance within the industry 4.0 paradigm

    In the Industry 4.0 era, a myriad of sensors of diverse nature (temperature, pressure, etc.) is spreading throughout the entire value chain of industries, being potentially exploitable for multiple purposes, such as Predictive Maintenance (PdM): the just-in-time maintenance of industrial assets, which results in reduced operating costs, increased operator safety, etc. Nowadays, industrial processes need to be highly configurable in order to proactively adapt their operation to diverse factors such as user needs, product updates or supply chain uncertainties. This limits current Industry 4.0 PdM solutions, which typically consist of ad-hoc developments intended for specific scenarios, i.e. they are designed to operate under certain conditions (configurations, employed sensors, etc.) and are unable to manage changes in their setup. This paper presents a general Deep Learning (DL) architecture, MachNet, which deals with such heterogeneity and is able to address PdM problems of a diverse nature. The modularity of the proposed architecture enables it to handle an arbitrary number of sensors of different types, also allowing the integration of prior information (age of assets, material type, etc.), which clearly affects performance and is often neglected. In practice, our architecture effortlessly adapts to the assets' specifications and to different PdM problems. That is, MachNet becomes an architectural template that can be instantiated for a given scenario. We tested our proposal in two different PdM-related problems: Health State (HS) and Remaining Useful Life (RUL) estimation, achieving in both cases comparable or superior performance to other state-of-the-art approaches, with the additional advantage of the generality that MachNet offers.
    Funding for open access charge: Universidad de Málaga / CBUA. Work partially supported by the grant program FPU17/04512 and the research project ARPEGGIO ([PID2020-117057GB-I00]), both funded by the Spanish Government, and the research project HOUNDBOT ([P20-01302]), financed by the Regional Government of Andalusia with support from the ERDF (European Regional Development Funds). The authors thank the Supercomputing and Bioinnovation Center (SCBI) of the University of Málaga for the provision of computational resources and technical support (www.scbi.uma.es/site), and the support of NVIDIA Corporation with the donation of the Titan X Pascal used for this research.
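The "architectural template" idea - per-sensor branches whose outputs are concatenated with prior asset information before a shared head - can be sketched abstractly. Everything below (function names, the toy branches, the averaging head) is a hypothetical stand-in for illustration, not the published MachNet model.

```python
# Hypothetical sketch of a modular, template-style architecture: one feature
# branch per sensor type, concatenated with prior asset information, then a
# shared head. All names and structure are assumptions, not the real MachNet.
def make_branch(scale):
    # stand-in for a learned per-sensor feature extractor
    return lambda readings: [r * scale for r in readings]

def modular_template(sensor_specs, head):
    """Build a model from per-sensor branch specs and a shared head."""
    branches = {name: make_branch(s) for name, s in sensor_specs.items()}

    def model(inputs, prior_info):
        features = []
        for name, branch in branches.items():
            features.extend(branch(inputs[name]))   # one branch per sensor
        features.extend(prior_info)                 # asset age, material, ...
        return head(features)

    return model

# Instantiate the template for a scenario with two sensor types.
model = modular_template({"temperature": 0.1, "pressure": 2.0},
                         head=lambda feats: sum(feats) / len(feats))
estimate = model({"temperature": [300.0], "pressure": [1.5]}, [5.0])
```

The point of the template is that adding a sensor type or a prior-information field changes only the specification dict, not the downstream head.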

    Towards A Computational Intelligence Framework in Steel Product Quality and Cost Control

    Steel is a fundamental raw material for all industries. It is widely used in various fields, including construction, bridges, ships, containers, medical devices and cars. However, the production process of iron and steel is very complex, consisting of four stages: ironmaking, steelmaking, continuous casting and rolling. It is also extremely complicated to control the quality of steel throughout the full manufacturing process. Therefore, the quality control of steel is considered a huge challenge for the whole steel industry. This thesis studies quality control, taking the case of Nanjing Iron and Steel Group, and provides new approaches for quality analysis, management and control in the industry. At present, Nanjing Iron and Steel Group has established a quality management and control system, which oversees many systems involved in steel manufacturing. It places high statistical demands on business professionals, resulting in limited use of the system. A large amount of quality data has been collected in each system. At present, all systems mainly pay attention to the processing and analysis of data after the manufacturing process, and product quality problems are mainly tested by sampling and experiment. This method cannot detect product quality issues or predict hidden quality issues in a timely manner. In the quality control system, the responsibilities and functions of the different information systems involved are intricate. Each information system is merely responsible for storing the data of its corresponding functions. Hence, the data in each information system is relatively isolated, forming data islands. Iron and steel production belongs to the process industry, and the data in multiple information systems can be combined to analyze and predict product quality in depth and provide early warning alerts.
Therefore, it is necessary to introduce new product quality control methods in the steel industry. With the waves of Industry 4.0 and intelligent manufacturing, intelligent technology has also been introduced in the field of quality control to improve the competitiveness of iron and steel enterprises. Applying intelligent technology can generate accurate quality analysis and optimal prediction results based on the data distributed across the factory and determine online adjustments to the production process. This not only improves product quality control but is also beneficial in reducing product costs. Inspired by this, this thesis provides an in-depth discussion in three chapters: (1) how to use artificial intelligence algorithms to evaluate the quality grade of scrap steel to be used as raw material is studied in chapter 3; (2) the probability that longitudinal cracks occur on the surface of continuous casting slabs is studied in chapter 4; (3) the prediction of the mechanical properties of finished steel plate is studied in chapter 5. All three chapters serve as technical support for quality control in iron and steel production.

    Identifying and Detecting Attacks in Industrial Control Systems

    The integrity of industrial control systems (ICS) found in utilities, oil and natural gas pipelines, manufacturing plants and transportation is critical to national wellbeing and security. Such systems depend on hundreds of field devices to manage and monitor a physical process. Previously, these devices were specific to ICS, but they are now being replaced by general-purpose computing technologies and, increasingly, are being augmented with Internet of Things (IoT) nodes. Whilst there are benefits to this approach in terms of cost and flexibility, it has attracted a wider community of adversaries. These include those with significant domain knowledge, such as those responsible for attacks on Iran's nuclear facilities, a steel mill in Germany, and Ukraine's power grid; however, non-specialist attackers are becoming increasingly interested in the physical damage it is possible to cause. At the same time, the approach increases the number and range of vulnerabilities to which ICS are subject; regrettably, conventional techniques for analysing such a large attack space are inadequate, a cause of major national concern. In this thesis we introduce a generalisable approach based on evolutionary multiobjective algorithms to assist in identifying vulnerabilities in complex heterogeneous ICS. This is both challenging and an area that currently lacks research. Our approach has been to review the security of currently deployed ICS, and then to make use of an internationally recognised ICS simulation testbed for experiments, assuming that the attacking community largely lacks specific ICS knowledge. Using the simulator, we identified vulnerabilities in individual components and then made use of these to generate attacks. A defence against these attacks, in the form of novel intrusion detection systems, was developed based on a range of machine learning models.
Finally, these defences were further subjected to attacks created using the evolutionary multiobjective algorithms, demonstrating, for the first time, the feasibility of creating sophisticated attacks against a well-protected adversary using automated mechanisms.
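A minimal statistical baseline conveys the intrusion-detection idea the thesis builds on: learn the normal operating envelope of a field device, then flag readings outside it. The synthetic data, the mean-plus-k-sigma rule and the threshold are all assumptions for illustration; the thesis's actual machine learning models are more sophisticated.

```python
import random
import statistics

random.seed(2)

# Normal operating data from a simulated field device (purely synthetic).
normal = [random.gauss(50.0, 2.0) for _ in range(500)]

# Learn the normal range, then flag readings outside mean +/- K*std.
mu = statistics.mean(normal)
sigma = statistics.pstdev(normal)
K = 4.0

def is_attack(reading):
    """Flag a sensor reading that falls outside the learned envelope."""
    return abs(reading - mu) > K * sigma

# A legitimate reading vs. a spoofed value injected by an attacker.
alerts = [is_attack(49.5), is_attack(75.0)]
```

The evolutionary attack-generation step can then be viewed as searching for inputs that damage the process while staying inside this envelope, which is what makes detector-aware attacks hard.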

    A Survey on Intelligent Optimization Approaches to Boiler Combustion Optimization

    This paper reviews research on boiler combustion optimization, an important direction in the field of energy saving and emission reduction. Many methods have been used for boiler combustion optimization, among which evolutionary computation (EC) techniques have recently gained much attention. However, existing research is insufficiently focused and has not been summarized systematically, which has slowed progress on boiler combustion optimization and hindered its application. This paper provides a comprehensive survey of intelligent optimization algorithms in boiler combustion optimization and summarizes the contributions of the different algorithms. Finally, this paper discusses new research challenges and outlines future research directions, which can guide boiler combustion optimization to improve energy efficiency and reduce pollutant emission concentrations.
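The EC approach the survey covers can be sketched with a toy single-objective example: evolve an operating setpoint (here, an excess-air ratio) against a surrogate cost model. The cost function, its optimum and the (mu + lambda) evolution strategy below are illustrative assumptions, not a method from any surveyed paper.

```python
import random

random.seed(3)

# Toy surrogate: assumed cost trades off heat loss against NOx emissions as
# a function of the excess-air ratio x; its minimum sits near x = 1.15.
def cost(x):
    return (x - 1.2) ** 2 + 0.1 * abs(x - 1.0)

# A minimal (mu + lambda) evolution strategy over the operating setpoint:
# mutate every survivor, then keep the best mu of parents plus offspring.
MU = 20
pop = [random.uniform(0.8, 2.0) for _ in range(MU)]
for _ in range(100):
    offspring = [max(0.8, min(2.0, p + random.gauss(0, 0.05))) for p in pop]
    pop = sorted(pop + offspring, key=cost)[:MU]    # elitist selection

best = pop[0]
```

Real boiler studies replace the surrogate with a plant model fitted to operating data and usually optimize efficiency and emissions as separate objectives, which is where the multiobjective EC variants in the survey come in.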