303 research outputs found

    Continuous maintenance and the future – Foundations and technological challenges

    High-value, long-life products require continuous maintenance throughout their life cycle to achieve the required performance at optimum through-life cost. This paper presents the foundations and technologies required to offer such a maintenance service. Component- and system-level degradation science, assessment and modelling, along with life-cycle ‘big data’ analytics, are the two most important knowledge and skill bases required for continuous maintenance. Advanced computing and visualisation technologies will improve the efficiency of maintenance and reduce the through-life cost of the product. The future of continuous maintenance within the Industry 4.0 context also encompasses the role of IoT, standards and cyber security.

    Towards the Internet of Smart Trains: A Review on Industrial IoT-Connected Railways

    Nowadays, the railway industry is in a position to exploit the opportunities created by the IIoT (Industrial Internet of Things) and its enabling communication technologies under the paradigm of the Internet of Trains. This review details the evolution of communication technologies since the deployment of GSM-R, describing the main alternatives and how railway requirements, specifications and recommendations have evolved over time. The advantages of the latest generation of broadband communication systems (e.g., LTE, 5G, IEEE 802.11ad) and the emergence of Wireless Sensor Networks (WSNs) for the railway environment are also explained, together with the strategic roadmap to ensure a smooth migration from GSM-R. Furthermore, this survey takes a holistic approach, identifying scenarios and architectures where railways could better leverage commercial IIoT capabilities. After reviewing the main industrial developments, short- and medium-term IIoT-enabled services for smart railways are evaluated. Then, the latest research on predictive maintenance, smart infrastructure, advanced monitoring of assets, video surveillance systems, railway operations, Passenger and Freight Information Systems (PIS/FIS), train control systems, safety assurance, signaling systems, cyber security and energy efficiency is analyzed. Overall, the aim of this article is to provide a detailed examination of the state of the art of the technologies and services that will revolutionize the railway industry and allow it to confront today's challenges.
    [Funding: Galicia. Consellería de Cultura, Educación e Ordenación Universitaria; ED431C 2016-045, ED341D R2016/012, ED431G/01. Agencia Estatal de Investigación (España); TEC2013-47141-C4-1-R, TEC2015-69648-REDC, TEC2016-75067-C4-1-]

    Data mining for fault diagnosis in steel making process under industry 4.0

    The concept of Industry 4.0 (I4.0) refers to the intelligent networking of machines and processes in industry, enabled by cyber-physical systems (CPS) - a technology that utilises embedded networked systems to achieve intelligent control. CPS enable full traceability of production processes as well as comprehensive data assignment in real-time. Through real-time communication and coordination between "manufacturing things", production systems, in the form of Cyber-Physical Production Systems (CPPS), can make intelligent decisions. Meanwhile, with the advent of I4.0, it is possible to collect heterogeneous manufacturing data across various facets for fault diagnosis by using industrial internet of things (IIoT) techniques. In this data-rich environment, the ability to diagnose and predict production failures provides manufacturing companies with a strategic advantage by reducing the number of unplanned production outages. This advantage is particularly desirable for the steel-making industry. Because steel-making is a consecutive and compact manufacturing process, downtime is a major concern for steel-making companies, since most operations must be conducted within a certain temperature range. In addition, steel-making consists of complex processes that involve physical, chemical, and mechanical elements, emphasising the need for data-driven approaches to handle high-dimensionality problems. In a modern steel-making plant, various measurement devices are deployed throughout the manufacturing process with the advancement of I4.0 technologies, which facilitates data acquisition and storage. However, even though data-driven approaches are showing merit and are widely applied in the manufacturing context, how to build a deep learning model for fault prediction in the steel-making process that considers multiple contributing facets and their temporal characteristics has not been investigated.
Additionally, apart from the multitudinous data, it is also worthwhile to study how to represent and utilise the vast, scattered domain knowledge distributed along the steel-making process for fault modelling. Moreover, the state of the art does not address how such accumulated domain knowledge and its semantics can be harnessed to facilitate the fusion of multi-sourced data in steel manufacturing. The purpose of this thesis is therefore to pave the way for fault diagnosis in steel-making processes using data mining under I4.0. The research is structured around four themes. Firstly, in contrast to conventional data-driven research that focuses only on modelling numerical production data, a framework for data mining for fault diagnosis in steel-making based on multi-sourced data and knowledge is proposed. The framework comprises five layers: multi-sourced data and knowledge acquisition; data and knowledge processing; KG construction and graphical data transformation; KG-aided modelling for fault diagnosis; and decision support for steel manufacturing. Secondly, this thesis proposes a predictive, data-driven approach to model severe faults in the steel-making process, where the faults usually have multi-faceted causes. Specifically, strip breakage in cold rolling is selected as the modelling target, since it is a typical production failure with serious consequences and multitudinous contributing factors. In actual steel-making practice, if such a failure can be modelled at a micro-level with an adequate prediction window, a planned stop can be taken in advance instead of a passive fast stop, which often results in severe damage to equipment. Accordingly, a multi-faceted modelling approach with a sliding window strategy is proposed.
First, historical multivariate time-series data of a cold rolling process were extracted in a run-to-failure manner, and a sliding window strategy was adopted for data annotation. Second, breakage-centric features were identified from physics-based approaches, empirical knowledge and data-driven features. Finally, these features were used as inputs for strip breakage modelling using a Recurrent Neural Network (RNN). Experimental results demonstrated the merits of the proposed approach. Thirdly, among the heterogeneous data surrounding multi-faceted concepts in steel-making, a significant amount consists of rich semantic information, such as technical documents and production logs generated through the process. There also exists vast domain knowledge regarding production failures in steel-making, which has a long history. In this context, appropriate semantic technologies are needed to utilise semantic data and domain knowledge in steel-making. In recent studies, the Knowledge Graph (KG) has displayed powerful expressive ability and a high degree of modelling flexibility, making it a promising semantic network. However, building a reliable KG is usually time-consuming and labour-intensive, and a KG commonly needs to be refined or completed before use in industrial scenarios. Accordingly, a fault-centric KG construction approach is proposed based on hierarchy structure refinement and relation completion. Firstly, ontology design based on hierarchy structure refinement is conducted to improve reliability. Then, missing relations between each pair of entities are inferred from the existing knowledge in the KG, with the aim of increasing the number of edges so as to complete and refine the KG. Lastly, the KG is constructed by importing data into the ontology. An illustrative case study on strip breakage is conducted for validation.
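The run-to-failure sliding-window annotation described above can be sketched as follows. This is a minimal, hypothetical illustration (the window size, stride, horizon and helper name are invented here), not the thesis's actual code:

```python
# Sketch of sliding-window annotation for a run-to-failure series:
# cut the series into fixed-length windows; windows ending within
# `horizon` steps of the failure are labelled positive, giving the
# model an advance-warning prediction target.

def annotate_windows(series, window, step, horizon):
    """Return (window, label) pairs from a run-to-failure series.

    series  : list of per-timestep feature vectors, ending at failure
    window  : number of timesteps per sample
    step    : stride between consecutive windows
    horizon : prediction window before failure labelled positive
    """
    failure_t = len(series)  # failure occurs at the last timestep
    samples = []
    for start in range(0, len(series) - window + 1, step):
        end = start + window
        label = 1 if failure_t - end <= horizon else 0
        samples.append((series[start:end], label))
    return samples

# Toy example: 10 timesteps of 2 features, window 4, stride 2, horizon 3
series = [[float(t), float(t) * 0.5] for t in range(10)]
samples = annotate_windows(series, window=4, step=2, horizon=3)
labels = [lab for _, lab in samples]  # → [0, 0, 1, 1]
```

The labelled windows would then be fed, as feature sequences, to a sequence model such as the RNN mentioned above.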
Finally, multi-faceted modelling is often conducted on multi-sourced data covering indispensable aspects, and information fusion is typically applied to cope with the high dimensionality and data heterogeneity. Besides its ability to support knowledge management and sharing, a KG can aggregate the relationships of features from multiple aspects through semantic associations, which can be exploited to facilitate information fusion for multi-faceted modelling while accounting for intra-facet relationships. Accordingly, process data are transformed into a stack of temporal graphs under the fault-centric KG backbone. Then, a Graph Convolutional Network (GCN) model is applied to extract temporal and attribute-correlation features from the graphs, with a Temporal Convolutional Network (TCN) conducting conceptual modelling using these features. Experimental results obtained with the proposed GCN-TCN approach reveal the impact of the KG-aided fusion approach. Overall, this thesis investigates data mining in steel-making processes based on multi-sourced data and scattered, distributed domain knowledge, providing a feasibility study for achieving Industry 4.0 in steel-making, specifically in support of improving quality and reducing the costs of production failures.
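A single graph-convolution propagation step of the kind a GCN applies to each temporal graph can be sketched as follows, using the standard symmetrically normalised form; the graph, features and weights below are toy placeholders, since the thesis's GCN-TCN architecture and parameters are not published here:

```python
# One GCN layer in the common normalised form:
#   H' = ReLU( D^{-1/2} (A + I) D^{-1/2} H W )
import numpy as np

def gcn_layer(A, H, W):
    """Propagate node features H over adjacency A with weights W."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)                   # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # symmetric normalisation
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Toy graph: 3 feature nodes linked in a chain, 2 input features each
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
rng = np.random.default_rng(0)
H = rng.random((3, 2))                      # node feature matrix
W = rng.random((2, 4))                      # layer weights (random here)
H_next = gcn_layer(A, H, W)                 # shape (3, 4)
```

Per-graph outputs like `H_next`, stacked over time, are the kind of sequence a TCN could then model.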

    Ontology-based augmented reality content-related techniques and their impact in knowledge capture and re-use within maintenance diagnosis

    This PhD thesis aims to study ontology-based AR content-related methods and their impact on knowledge transfer, capture and re-use for cost-effective integration of human knowledge into digital diagnostic systems. Industry 4.0 has revealed the importance of capturing and re-using maintainers' knowledge in diagnostic systems, so that satisfactory solutions can be provided in cases where those systems cannot (e.g. no-fault-found). Augmented Reality (AR) utilises content-related techniques to transfer knowledge to maintainers, improving the efficiency and effectiveness of diagnosis tasks. The academic literature has shown that AR can also be utilised for knowledge capture and re-use, but this has only been demonstrated in simple, step-by-step repair operations. In diagnosis research, ontology-based methods are applied to capture and re-use knowledge from unstructured and heterogeneous sources such as humans. Nevertheless, these methods have not made use of AR's potential to contextualise knowledge and thereby improve the efficiency and effectiveness of knowledge capture and re-use in diagnosis operations...[cont.]

    Quantitative Risk Analysis using Real-time Data and Change-point Analysis for Data-informed Risk Prediction

    Incidents in highly hazardous process industries (HHPI) are a major concern for various stakeholders due to their impact on human lives and the environment, and the potentially huge financial losses. Because process activities, locations and products are unique, risk analysis techniques applied in the HHPI have evolved over the years. Unfortunately, limitations of the various quantitative risk analysis (QRA) methods currently employed mean that alternative or improved methods are required. This research has developed one such method, called the Big Data QRA Method. The method relies entirely on big data techniques and real-time process data to identify the point at which process risk is imminent and to quantify the contribution of the other components interacting up to the time index of the risk. Unlike existing QRA methods, which are static and based on unvalidated assumptions and data from single case studies, the big data method is dynamic and can be applied to most process systems. This alternative method is my original contribution to science and the practice of risk analysis. The detailed procedure, provided in Chapter 9 of this thesis, applies multiple change-point analysis and other big data techniques: (a) time series analysis, (b) data exploration and compression techniques, (c) decision tree modelling, and (d) linear regression modelling. Since the distributional properties of process data can change over time, the big data approach was found to be more appropriate. Considering the unique conditions, activities and process systems in use within the HHPI, the dust fire and explosion incidents at the Imperial Sugar Factory and New England Wood Pellet LLC, both of which occurred in the USA, were found to be suitable case histories to guide the evaluation of data in this research. Data analysis was performed using open-source software packages in R Studio.
Based on the investigation, the multiple-change-point analysis packages strucchange and changepoint were found to be successful at detecting early signs of deteriorating conditions of components in process equipment, and hence the main process risk. One such process component is a bearing, which was suspected as the source of ignition that led to the dust fire and explosion at the Imperial Sugar Factory. As a result, this research applies the big data QRA method procedure to bearing vibration data to predict early deterioration of bearings and the final period in which a bearing's performance enters its last phase of deterioration to failure. Model-based identification of these periods indicates whether the condition of a mechanical part in process equipment at a particular moment represents an unacceptable risk. The procedure starts with the selection of process operation data based on the findings of an incident investigation report on the case history of a known process incident. As the defining components of risk, both the frequency and the consequences associated with the risk were obtained from the incident investigation reports. Acceptance criteria for the risk can be applied to the periods between the risks detected by the two change-point packages. The method was validated with two case study datasets to demonstrate its applicability as a QRA procedure, and then tested with two further case study datasets as examples of its application as a QRA method. The insight obtained from the validation and the applied examples led to the conclusion that big data techniques can be applied to real-time process data for risk assessment in the HHPI.
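The basic single change-point detection that such packages perform can be illustrated with a short Python sketch. The thesis itself uses R's strucchange and changepoint packages; the at-most-one-change-in-mean search below is only an analogue, and the signal values are invented:

```python
# At-most-one-change (AMOC) detection of a shift in mean: pick the
# split point that minimises the total within-segment squared error,
# analogous to a change-in-mean search in R's changepoint package.

def change_point_mean(x):
    """Return the index splitting x into two segments with minimal
    summed squared deviation from each segment's own mean."""
    best_tau, best_cost = None, float("inf")
    for tau in range(1, len(x)):
        left, right = x[:tau], x[tau:]
        cost = (sum((v - sum(left) / len(left)) ** 2 for v in left)
                + sum((v - sum(right) / len(right)) ** 2 for v in right))
        if cost < best_cost:
            best_tau, best_cost = tau, cost
    return best_tau

# Toy vibration-amplitude series: the mean shifts upward at index 5,
# mimicking a bearing entering a deteriorated regime
signal = [1.0, 1.1, 0.9, 1.0, 1.1, 4.0, 4.2, 3.9, 4.1, 4.0]
tau = change_point_mean(signal)  # → 5
```

For multiple change points, the R packages use more scalable searches (e.g. binary segmentation or PELT) rather than this exhaustive single-split scan.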