
    Plant-Wide Diagnosis: Cause-and-Effect Analysis Using Process Connectivity and Directionality Information

    Production plants used in modern process industry must produce products that meet stringent environmental, quality and profitability constraints. In such integrated plants, non-linearity and strong process dynamic interactions among process units complicate root-cause diagnosis of plant-wide disturbances because disturbances may propagate to units at some distance away from the primary source of the upset. Similarly, implemented advanced process control strategies, backup and recovery systems, use of recycle streams and heat integration may hamper detection and diagnostic efforts. It is important to track down the root-cause of a plant-wide disturbance because once corrective action is taken at the source, secondary propagated effects can be quickly eliminated with minimum effort and reduced down time with the resultant positive impact on process efficiency, productivity and profitability. In order to diagnose the root-cause of disturbances that manifest plant-wide, it is crucial to incorporate and utilize knowledge about the overall process topology or interrelated physical structure of the plant, such as is contained in Piping and Instrumentation Diagrams (P&IDs). Traditionally, process control engineers have intuitively referred to the physical structure of the plant by visual inspection and manual tracing of fault propagation paths within the process structures, such as the process drawings on printed P&IDs, in order to make logical conclusions based on the results from data-driven analysis. This manual approach, however, is prone to various sources of errors and can quickly become complicated in real processes. The aim of this thesis, therefore, is to establish innovative techniques for the electronic capture and manipulation of process schematic information from large plants such as refineries in order to provide an automated means of diagnosing plant-wide performance problems. 
This report also describes the design and implementation of a computer application program that integrates: (i) process connectivity and directionality information from intelligent P&IDs, (ii) results from data-driven cause-and-effect analysis of process measurements, and (iii) process know-how, to help process control engineers and plant operators gain process insight. This work explored intelligent P&IDs created with AVEVA® P&ID, a Computer Aided Design (CAD) tool, and exported as an ISO 15926-compliant, platform- and vendor-independent text-based XML description of the plant. The XML output was processed by a software tool, developed in the Microsoft® .NET environment in this research project, to computationally generate a connectivity matrix that shows plant items and their connections. The connectivity matrix produced can be exported to the Excel® spreadsheet application as a basis for other applications, and has served as a precursor to other research work. The final version of the developed software tool links statistical results of cause-and-effect analysis of process data with the connectivity matrix, using the connectivity information to simplify and gain insight into the cause-and-effect analysis. Process know-how and understanding are incorporated to generate logical conclusions. The thesis presents a case study of an atmospheric crude heating unit as an illustrative example to drive home key concepts, and also describes an industrial case study involving refinery operations. In the industrial case study, in addition to confirming the root-cause candidate, the developed software tool was tasked with determining the physical sequence of the fault propagation path within the plant. This was then compared with the hypothesis about the disturbance propagation sequence generated by a purely data-driven method. 
The results show a high degree of overlap, which helps to validate the statistical data-driven technique and to identify any spurious results from the data-driven multivariable analysis. This significantly increases control engineers' confidence in the data-driven methods used for root-cause diagnosis. The thesis concludes with a discussion of the approach and presents ideas for further development of the methods.
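The core step of the work above, turning an XML plant description into a connectivity matrix of items and their directed connections, can be sketched as follows. This is a minimal Python illustration using a hypothetical, much-simplified XML layout; the real ISO 15926 export from AVEVA® P&ID is far richer, and the original tool was built in the .NET environment rather than Python.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified plant description: items plus directed pipes.
PLANT_XML = """
<plant>
  <item id="P-101" type="pump"/>
  <item id="E-201" type="heater"/>
  <item id="C-301" type="column"/>
  <pipe from="P-101" to="E-201"/>
  <pipe from="E-201" to="C-301"/>
</plant>
"""

def connectivity_matrix(xml_text):
    """Return (item ids, directed adjacency matrix) for the plant.

    matrix[i][j] == 1 means material flows from item i to item j.
    """
    root = ET.fromstring(xml_text)
    items = [e.get("id") for e in root.findall("item")]
    index = {name: i for i, name in enumerate(items)}
    matrix = [[0] * len(items) for _ in items]
    for pipe in root.findall("pipe"):
        matrix[index[pipe.get("from")]][index[pipe.get("to")]] = 1
    return items, matrix

items, m = connectivity_matrix(PLANT_XML)
print(items)  # ['P-101', 'E-201', 'C-301']
print(m)      # [[0, 1, 0], [0, 0, 1], [0, 0, 0]]
```

The directed adjacency matrix is what allows fault propagation paths to be traced computationally rather than by manual inspection of printed P&IDs.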

    Damage identification in structural health monitoring: a brief review from its implementation to the use of data-driven applications

    The damage identification process provides relevant information about the current state of a structure under inspection, and it can be approached from two different points of view. The first approach uses data-driven algorithms, which are usually associated with the collection of data using sensors; the data are subsequently processed and analyzed. The second approach uses models to analyze information about the structure. In the latter case, the overall performance of the approach depends on the accuracy of the model and the information used to define it. Although both approaches are widely used, data-driven algorithms are preferred in most cases because they afford the ability to analyze data acquired from sensors and to provide a real-time solution for decision making; however, these approaches require high-performance processors due to their high computational cost. As a contribution to researchers working with data-driven algorithms and applications, this work presents a brief review of data-driven algorithms for damage identification in structural health-monitoring applications. This review covers damage detection, localization, classification, extension, and prognosis, as well as the development of smart structures. The literature is systematically reviewed according to the natural steps of a structural health-monitoring system. This review also includes information on the types of sensors used, as well as on the development of data-driven algorithms for damage identification.

    Smart Asset Management for Electric Utilities: Big Data and Future

    This paper discusses future challenges in terms of big data and new technologies. Utilities have been collecting data in large amounts, but it is hardly utilized because of its sheer volume and the uncertainty associated with it. Condition monitoring of assets collects large amounts of data during daily operations. The question arises: "How can information be extracted from such large volumes of data?" The concept of "rich data and poor information" is being challenged by big data analytics with the advent of machine learning techniques. Along with technological advancements like the Internet of Things (IoT), big data analytics will play an important role for electric utilities. In this paper, these challenges are addressed with pathways and guidelines to make current asset management practices smarter for the future. Comment: 13 pages, 3 figures, Proceedings of the 12th World Congress on Engineering Asset Management (WCEAM) 201

    Outline of a fault diagnosis system for a large-scale board machine

    Global competition forces process industries to continuously optimize plant operation. One of the latest trends for improving efficiency and plant availability is to set up fault diagnosis and maintenance systems for online industrial use. This paper presents a methodology for developing industrial fault detection and diagnosis (FDD) systems. Since model- or data-based diagnosis of all components cannot be achieved online on a large scale, the focus must be narrowed down to the components most likely responsible for abnormal process behavior. One of the key elements here is fault analysis. The paper also describes and briefly discusses other development phases, process decomposition, and the selection of FDD methods. The paper ends with an FDD case study of a large-scale industrial board machine, including a description of the fault analysis and the FDD algorithms for the resulting focus areas. Finally, the testing and validation results are presented and discussed.

    Intelligent Condition Monitoring of Industrial Plants: An Overview of Methodologies and Uncertainty Management Strategies

    Condition monitoring plays a significant role in the safety and reliability of modern industrial systems. Artificial intelligence (AI) approaches are gaining attention from academia and industry as a growing subject in industrial applications and as a powerful way of identifying faults. This paper provides an overview of intelligent condition monitoring and fault detection and diagnosis methods for industrial plants, with a focus on the open-source benchmark Tennessee Eastman Process (TEP). In this survey, the most popular and state-of-the-art deep learning (DL) and machine learning (ML) algorithms for industrial plant condition monitoring, fault detection, and diagnosis are summarized, and the advantages and disadvantages of each algorithm are studied. Challenges such as imbalanced data and unlabelled samples, and how deep learning models can handle them, are also covered. Finally, a comparison of the accuracies and specifications of different algorithms utilizing the TEP is conducted. This research will be beneficial for both researchers who are new to the field and experts, as it covers the literature on condition monitoring and state-of-the-art methods alongside the challenges and possible solutions to them.
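One of the challenges the survey names, imbalanced data (fault samples are typically far rarer than normal-operation samples), is commonly mitigated by resampling before training a classifier. A minimal sketch of naive random oversampling on hypothetical labelled data, not any specific method from the survey, might look like this:

```python
import random

def oversample_minority(samples, labels, seed=0):
    """Randomly duplicate minority-class samples until every class
    matches the size of the largest class."""
    rng = random.Random(seed)
    by_class = {}
    for x, y in zip(samples, labels):
        by_class.setdefault(y, []).append(x)
    target = max(len(xs) for xs in by_class.values())
    out_x, out_y = [], []
    for y, xs in by_class.items():
        # Keep all originals, then top up with random duplicates.
        picked = xs + [rng.choice(xs) for _ in range(target - len(xs))]
        out_x.extend(picked)
        out_y.extend([y] * target)
    return out_x, out_y

# Toy example: 4 normal samples, 1 fault sample.
X, y = [10, 11, 12, 13, 99], ["normal"] * 4 + ["fault"]
ox, oy = oversample_minority(X, y)
print(oy.count("normal"), oy.count("fault"))  # 4 4
```

More sophisticated alternatives (e.g. synthetic-sample generation) exist, but the balancing principle is the same.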

    Health monitoring of Gas turbine engines: Framework design and strategies


    Advances on Bilinear Modeling of Biochemical Batch Processes

    This thesis aims to study the implications of the statistical modeling approaches proposed for the bilinear modeling of batch processes, to develop new techniques to overcome some of the problems that have not yet been solved, and to apply them to data from biochemical processes. The study, discussion and development of the new methods revolve around the four steps of the modeling cycle, from the alignment, preprocessing and calibration of batch data to the monitoring of batch trajectories. Special attention is given to the problem of batch synchronization and its effect on the modeling from different angles. The manuscript has been divided into four blocks. First, a state-of-the-art review of latent structures-based models in continuous and batch processes and of traditional univariate and multivariate statistical process control systems is carried out. The second block of the thesis is devoted to the preprocessing of batch data, in particular the equalization and synchronization of batch trajectories. The first section addresses the problem of the lack of equalization in the variable trajectories. The different types of unequalization scenarios that practitioners might find in batch processes are discussed, and solutions to equalize batch data are introduced. In the second section, a theoretical study of the nature of batch processes and of the synchronization of batch trajectories as a prior step to bilinear modeling is carried out. The topics under discussion are i) whether the same synchronization approach must be applied to batch data in the presence of different types of asynchronisms, and ii) whether synchronization is always required even when the lengths of the variable trajectories are constant across batches. To answer these questions, a thorough study of the most common types of asynchronisms that may be found in batch data is done. 
Furthermore, two new synchronization techniques are proposed to solve the current problems in post-batch and real-time synchronization. To improve fault detection and classification, new unsupervised control charts and supervised fault classifiers based on the information generated by the batch synchronization are also proposed. In the third block of the manuscript, research is carried out on the parameter stability associated with the most widely used synchronization methods and with principal component analysis (PCA)-based batch multivariate statistical process control methods. The results of this study reveal that accuracy in batch synchronization has a profound impact on the stability of the PCA model parameters. The parameter stability is also closely related to the type of preprocessing performed on the batch data, and to the type of model and the unfolding used to transform the three-way data structure into a two-way one. The parameter stability, the source of variability remaining after preprocessing and the process dynamics should be balanced in such a way that the multivariate statistical models are accurate in fault detection and diagnosis and/or in online prediction. Finally, the fourth block introduces a user-friendly graphical interface, developed in Matlab, for batch process understanding and monitoring. To perform multivariate analysis, the latest developments in process chemometrics, including the methods proposed in this thesis, are implemented. González Martínez, J.M. (2015). Advances on Bilinear Modeling of Biochemical Batch Processes [doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/55684
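The unfolding step the abstract refers to, rearranging a three-way batch data structure (batches × variables × time) into a two-way matrix before fitting a bilinear model such as PCA, can be sketched in plain Python. This is an illustrative toy with made-up numbers, not the thesis code; the column ordering shown (all variables at time 1, then time 2, and so on) is one common convention for batch-wise unfolding.

```python
def batch_wise_unfold(data):
    """Rearrange data[i][j][k] (batch i, variable j, time k) into an
    I x (K*J) matrix: one row per batch, columns grouped by time instant."""
    I, J, K = len(data), len(data[0]), len(data[0][0])
    return [[data[i][j][k] for k in range(K) for j in range(J)]
            for i in range(I)]

def center_columns(X):
    """Subtract each column's mean across batches; for batch-wise unfolded
    data this removes the mean trajectory, leaving deviations from it."""
    n = len(X)
    means = [sum(row[c] for row in X) / n for c in range(len(X[0]))]
    return [[row[c] - means[c] for c in range(len(row))] for row in X]

# Toy data: 2 batches, 2 variables, 3 time instants.
data = [[[1, 2, 3], [4, 5, 6]],
        [[3, 4, 5], [6, 7, 8]]]
X = batch_wise_unfold(data)
print(X)                  # [[1, 4, 2, 5, 3, 6], [3, 6, 4, 7, 5, 8]]
print(center_columns(X))  # rows of -1.0s and 1.0s: deviations from the mean trajectory
```

A PCA model fitted to the centered matrix is the basis of the batch multivariate statistical process control charts the thesis studies; synchronization matters precisely because each column must refer to the same process event in every batch.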

    Explainable Predictive Maintenance

    Explainable Artificial Intelligence (XAI) fills the role of a critical interface fostering interactions between sophisticated intelligent systems and diverse individuals, including data scientists, domain experts, end-users, and more. It aids in deciphering the intricate internal mechanisms of "black box" Machine Learning (ML) models, rendering the reasons behind their decisions more understandable. However, current research in XAI primarily focuses on two aspects: ways to facilitate user trust, and ways to debug and refine the ML model. The majority of it falls short of recognising the diverse types of explanations needed in broader contexts, as different users and varied application areas necessitate solutions tailored to their specific needs. One such domain is Predictive Maintenance (PdM), a rapidly growing area of research under the Industry 4.0 & 5.0 umbrella. This position paper highlights the gap between existing XAI methodologies and the specific requirements for explanations within industrial applications, particularly in the Predictive Maintenance field. Despite explainability's crucial role, this subject remains a relatively under-explored area, making this paper a pioneering attempt to bring the relevant challenges to the research community's attention. We provide an overview of predictive maintenance tasks and accentuate the need for, and varying purposes of, corresponding explanations. We then list and describe the XAI techniques commonly employed in the literature, discussing their suitability for PdM tasks. Finally, to make the ideas and claims more concrete, we demonstrate XAI applied in four specific industrial use cases: commercial vehicles, metro trains, steel plants, and wind farms, spotlighting areas requiring further research. Comment: 51 pages, 9 figures