29 research outputs found
Enhanced Industrial Machinery Condition Monitoring Methodology based on Novelty Detection and Multi-Modal Analysis
This paper presents a condition-based monitoring methodology based on novelty detection applied to industrial machinery. The proposed approach includes both the classical classification of multiple a priori known scenarios and the innovative capability of detecting new operating modes not previously observed. The development of condition-based monitoring methodologies that can isolate unexpected scenarios is currently a trending topic, addressing the demanding requirements of future industrial process monitoring systems. First, the method temporally segments the available physical magnitudes and estimates a set of time-based statistical features. Then, a double feature-reduction stage based on Principal Component Analysis and Linear Discriminant Analysis is applied in order to optimize the classification and novelty detection performance. The subsequent combination of a feed-forward neural network and a One-Class Support Vector Machine allows the proper interpretation of known and unknown operating conditions. The effectiveness of this novel condition monitoring scheme has been verified by experimental results obtained from an automotive industry machine.
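A minimal sketch of a pipeline in the spirit described above: PCA followed by LDA for feature reduction, then a One-Class SVM for novelty detection. The synthetic features, component counts, and kernel parameters are illustrative assumptions, not the authors' actual configuration (the paper also employs a feed-forward neural network for classifying the known modes, omitted here).

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Synthetic time-based statistical features for three known operating modes.
X_known = rng.normal(size=(300, 20)) + np.repeat(np.eye(3, 20) * 5.0, 100, axis=0)
y_known = np.repeat([0, 1, 2], 100)

# Double feature reduction: PCA for decorrelation, then LDA for separability.
pca = PCA(n_components=10).fit(X_known)
lda = LinearDiscriminantAnalysis(n_components=2).fit(pca.transform(X_known), y_known)
Z_known = lda.transform(pca.transform(X_known))

# One-Class SVM models the region covered by all known operating modes.
novelty = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(Z_known)

# A sample far outside every known mode should be flagged as novel.
x_new = rng.normal(size=(1, 20))
x_new[0, 0] += 30.0  # extreme shift along a discriminative feature
z_new = lda.transform(pca.transform(x_new))
print(novelty.predict(z_new)[0])  # -1 marks a previously unseen operating mode
```

In deployment, samples flagged as novel would be buffered and used to retrain both stages once a new operating mode accumulates enough evidence.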
The machine abnormal degree detection method based on SVDD and negative selection mechanism
As is well known, fault samples are essential for fault diagnosis and anomaly detection, but in most cases they are difficult to obtain. The negative selection mechanism of the immune system, which can distinguish almost all nonself cells or molecules using only the self cells, inspires a solution to the problem of anomaly detection with only normal samples. In this paper, we introduce Support Vector Data Description (SVDD) and the negative selection mechanism to separate the state space of machines into self, non-self, and fault spaces. To estimate the abnormal level of machines, a function that calculates the abnormal degree is constructed, and the change of its sensitivity with the abnormal degree is also discussed. Finally, the Fisher Iris and ball-bearing fault datasets are used to verify the effectiveness of this method.
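The self/non-self separation and abnormal-degree scoring above can be sketched as follows. scikit-learn ships no SVDD class, so the One-Class SVM with an RBF kernel, which is equivalent to SVDD for that kernel, stands in; the sigmoid scaling of the degree function is our assumption, not the paper's formula.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import OneClassSVM

X, y = load_iris(return_X_y=True)
X_self = X[y == 0]       # "self" space: one class treated as normal
X_nonself = X[y != 0]    # everything else plays the role of non-self

# RBF-kernel One-Class SVM as an SVDD stand-in, fitted on self samples only.
model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(X_self)

def abnormal_degree(samples):
    # Map the signed boundary distance to (0, 1): near 0 inside the self
    # region, approaching 1 deep in non-self space (our scaling choice).
    d = model.decision_function(samples)
    return 1.0 / (1.0 + np.exp(4.0 * d))

print(abnormal_degree(X_self).mean() < abnormal_degree(X_nonself).mean())  # True
```

The sensitivity discussed in the paper corresponds to the slope of this score near the boundary, which the factor 4.0 controls in this sketch.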
Evaluation of Novelty Detection Methods for Condition Monitoring applied to an Electromechanical System
Dealing with industrial applications, the implementation of condition monitoring schemes must overcome a critical limitation: the lack of a priori information about fault patterns of the system under analysis. Indeed, classical diagnosis schemes generally output the membership probability of a measurement with regard to predefined operating scenarios. However, for uncharacterized systems, knowledge about faulty operating scenarios is limited and, consequently, the diagnosis performance is insufficient. In this context, the novelty detection framework plays an essential role in monitoring systems for which information about different operating scenarios is initially unavailable or restricted. The novelty detection approach begins with the assumption that only data corresponding to the healthy operation of the system under analysis are available. Thus, the challenge is to detect and learn additional scenarios during the operation of the system in order to complement the information obtained by the diagnosis scheme. This work has two main objectives: first, to present novelty detection as the current trend toward a new paradigm of industrial condition monitoring and, second, to introduce its applicability by analyzing different novelty detection strategies on a real industrial system based on rotating machinery.
A Predictive maintenance model for heterogeneous industrial refrigeration systems
The automatic assessment of the degradation state of industrial refrigeration systems is becoming increasingly important and plays a key role within predictive maintenance approaches. Lately, data-driven methods in particular have become the focus of research in this respect. As they rely only on historical data in the development phase, they offer great advantages in terms of flexibility and generalisability by circumventing the need for specific domain knowledge. While most scientific contributions employ methods emerging from the field of machine learning (ML), only very few consider their applicability across different heterogeneous systems. In fact, the majority of existing contributions in this field solely apply supervised ML models, which assume the availability of labelled fault data for each system respectively. However, this places restrictions on the overall applicability, as data labelling is mostly conducted by humans and therefore constitutes a non-negligible cost and time factor. Moreover, such methods assume that all considered fault types occurred in the past, a condition that may not always be satisfied.
Therefore, this dissertation proposes a predictive maintenance model for industrial refrigeration systems, especially addressing its transferability to different but related heterogeneous systems. In particular, it aims at solving a sub-problem known as condition-based maintenance (CBM) to automatically assess the system's state of degradation. To this end, the model not only estimates how far a possible malfunction has progressed, but also determines the fault type present. As described in greater detail throughout this dissertation, the proposed model also utilises techniques from the field of ML but bypasses the strict assumptions accompanying supervised ML. Accordingly, it assumes the data of the target system to be primarily unlabelled, while a few labelled samples are expected to be retrievable from the fault-free operational state, which can be obtained at low cost. Yet, to enable the model's intended functionality, it additionally employs data from only one fully labelled source dataset and thus allows the benefits of data-driven approaches to predictive maintenance to be further exploited.
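A minimal sketch of the setup outlined above, as we read it (our own construction, not the dissertation's actual model): a novelty detector fitted on the few labelled fault-free target samples flags degradation, while a classifier trained on the fully labelled source dataset suggests the fault type. All data and parameter values are synthetic assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)

# Labelled source system: class 0 = fault-free, classes 1 and 2 = fault types.
X_src = np.concatenate([rng.normal(loc=c * 3.0, size=(100, 5)) for c in range(3)])
y_src = np.repeat([0, 1, 2], 100)
fault_clf = RandomForestClassifier(random_state=0).fit(X_src, y_src)

# Target system: only a few labelled fault-free samples are available.
X_tgt_normal = rng.normal(loc=0.0, size=(30, 5))
detector = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(X_tgt_normal)

def assess(sample):
    sample = np.asarray(sample).reshape(1, -1)
    if detector.predict(sample)[0] == 1:
        return "fault-free"
    # Degradation detected: borrow the source classifier for the fault type.
    return f"fault type {fault_clf.predict(sample)[0]}"

print(assess(np.zeros(5)))        # a typical fault-free reading
print(assess(np.full(5, 8.0)))    # a strong deviation, attributed to a fault type
```

This ignores the distribution shift between source and target that the dissertation addresses explicitly; a full solution would align the two domains before reusing the source classifier.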
After the introduction, the dissertation at hand introduces the related concepts as well as the terms and definitions, and delimits this work from other fields of research. Furthermore, the scope of application is introduced and the latest scientific work is presented. This is followed by the explanation of the open research gap, from which the research questions are derived. The third chapter deals with the main principles of the model, including the mathematical notations and the individual concepts. It furthermore provides an overview of the variety of problems arising in this context and presents the associated solutions from a theoretical point of view. Subsequently, the data acquisition phase is described, addressing both the data collection procedure and the outcome of the test cases. In addition, the considered fault characteristics are presented and compared with the ones obtained from the related publicly available dataset. In essence, both datasets form the basis for the model validation, as discussed in the following chapter. This chapter further comprises the results obtained from the model, which are compared with the ones retrieved from several baseline models derived from the literature. This work then closes with a summary and the conclusions drawn from the model results. Lastly, an outlook on the presented dissertation is provided.
EDMON - Electronic Disease Surveillance and Monitoring Network: A Personalized Health Model-based Digital Infectious Disease Detection Mechanism using Self-Recorded Data from People with Type 1 Diabetes
Throughout time, we as a society have been tested by infectious disease outbreaks of different magnitudes, which often pose major public health challenges. To mitigate these challenges, research endeavors have focused on early detection mechanisms: identifying potential data sources, modes of data collection and transmission, and case and outbreak detection methods. Driven by the ubiquitous nature of smartphones and wearables, the current endeavor is targeted towards individualizing the surveillance effort through a personalized health model, where case detection is realized by exploiting self-collected physiological data from wearables and smartphones.
This dissertation aims to demonstrate the concept of a personalized health model as a case detector for outbreak detection by utilizing self-recorded data from people with type 1 diabetes. The results have shown that infection onset triggers substantial deviations, i.e. prolonged hyperglycemia despite higher insulin injections and lower carbohydrate consumption. Per the findings, key parameters such as blood glucose level, insulin, carbohydrate, and insulin-to-carbohydrate ratio are found to carry high discriminative power. A personalized health model devised with a one-class classifier and an unsupervised method using the selected parameters achieved promising detection performance. Experimental results show the superior performance of the one-class classifier, with models such as the one-class support vector machine, k-nearest neighbor, and k-means achieving better performance. Further, the results also revealed the effect of input parameters, data granularity, and sample size on model performance.
The presented results have practical significance for understanding the effect of infection episodes amongst people with type 1 diabetes, and for the potential of a personalized health model in outbreak detection settings. The added benefit of the personalized health model concept introduced in this dissertation lies in its usefulness beyond the surveillance purpose, i.e. to devise decision support tools and learning platforms for the patient to manage infection-induced crises.
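An illustrative sketch of such a personalized one-class case detector over the parameters named above (blood glucose, insulin, carbohydrate, and insulin-to-carbohydrate ratio); all numbers are synthetic assumptions, not EDMON's data or thresholds.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)

# Infection-free days for one person: [glucose mmol/L, insulin units, carbs g, ratio].
glucose = rng.normal(6.5, 0.8, 200)
insulin = rng.normal(40.0, 5.0, 200)
carbs = rng.normal(200.0, 25.0, 200)
X_healthy = np.column_stack([glucose, insulin, carbs, insulin / carbs])

# Personalized model: fit scaling and boundary on this person's normal days only.
scaler = StandardScaler().fit(X_healthy)
detector = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(
    scaler.transform(X_healthy))

# Infection-like day: prolonged hyperglycemia despite more insulin and
# fewer carbohydrates, mirroring the deviations reported above.
x_infection = np.array([[11.0, 55.0, 120.0, 55.0 / 120.0]])
print(detector.predict(scaler.transform(x_infection))[0])  # -1 flags a deviation
```

Aggregating such per-person flags across a population is what turns the personalized case detector into an outbreak detection signal.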
Development of a Pharmaceutical Tablet Authentication system using Spectroscopic Techniques in combination with Multivariate Chemometric Methods
The spread of falsified drugs is increasing worldwide. Currently, 10%-30% of drugs in the world are falsified. Unfortunately, the supply chain system of developing countries (from manufacturers to customers) is not well monitored and suffers the highest rates of fraudulent activity. Also, the rise of product procurement from the internet increases the chance of American consumers' exposure to poor-quality drugs. To combat this horrendous activity, surveillance of pharmaceutical materials is required in the supply chain system. Spectroscopic techniques (e.g., Near-Infrared and Raman spectroscopies) can be a potential solution to authenticate samples in different locations; they are non-destructive, safe, rapid, portable, and affordable. However, before deploying these techniques in the field, rigorous method development is required with the help of chemometrics. The chemometric tools to be used should declare the test sample as either the target class (authentic samples) or a non-target class (alternate class or falsified samples). One challenge of method development is the sensitivity of spectrometers toward unwanted variability, including moisture, batch-to-batch variability, raw material variability, etc. Initial research for this dissertation observed that the traditional chemometric methods, which are based on distribution assumptions, provided a high number of false negatives due to violation of those assumptions in the presence of unwanted variations. It was demonstrated that adding samples from different seasons to the calibration set created bimodal or multimodal distributions due to moisture variations, which violated the assumptions of the principal component analysis based soft independent modeling of class analogy (SIMCA) method. Hence, the support vector data description (SVDD) method, which has no distribution assumption, is proposed for use as a class modeling approach for authentication purposes.
Implementing the SVDD algorithm improved model performance by reducing false negatives relative to the traditional multivariate class modeling approach (i.e., SIMCA). In addition, while developing the SVDD method, this dissertation suggested using different non-target class samples, produced by competitor manufacturers or synthesized in the laboratory using design of experiments (DOE), as test or validation sets to decrease false positives. This work also evaluated several commercially available products in two local pharmacies using portable spectrometers to validate the practical usefulness of the proposed methods. To summarize, the research performed for this dissertation has demonstrated the value of critical prior knowledge regarding pharmaceutical products, pharmaceutical manufacturing/processing, analytical methodology, and advanced chemometric techniques in a unique way to develop a successful spectroscopic authentication system.
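The distribution-free class-modelling point above can be illustrated as follows: an SVDD-style model (scikit-learn's One-Class SVM with an RBF kernel, equivalent to SVDD for that kernel) fitted on a deliberately bimodal calibration set standing in for authentic spectra collected in two seasons. All "spectra" here are synthetic assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(3)

# Authentic "spectra" from a dry and a humid season form two clusters,
# the bimodal situation that breaks single-distribution assumptions.
dry = rng.normal(loc=0.0, scale=0.3, size=(100, 8))
humid = rng.normal(loc=1.5, scale=0.3, size=(100, 8))
calibration = np.vstack([dry, humid])

# SVDD stand-in: describes the calibration data without a distribution model.
svdd = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(calibration)

# Both seasonal modes are largely accepted (+1), limiting false negatives,
# while a sample far from either mode is rejected (-1) as non-target.
accept_rate = (svdd.predict(calibration) == 1).mean()
fake = np.full((1, 8), 6.0)
print(round(accept_rate, 2), svdd.predict(fake)[0])
```

A single-Gaussian class model centred between the two modes would instead reject many authentic samples from both seasons, which is the false-negative failure mode described above.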
Exploring variability in medical imaging
Although recent successes of deep learning and novel machine learning techniques have improved the performance of classification and (anomaly) detection in computer vision problems, the application of these methods in medical imaging pipelines remains a very challenging task. One of the main reasons for this is the amount of variability that is encountered and encapsulated in human anatomy and subsequently reflected in medical images. This fundamental factor impacts most stages in modern medical imaging processing pipelines.
The variability of human anatomy makes it virtually impossible to build large datasets for each disease with labels and annotations for fully supervised machine learning. An efficient way to cope with this is to try to learn only from normal samples, as such data are much easier to collect. A case study of such an automatic anomaly detection system based on normative learning is presented in this work. We present a framework for detecting fetal cardiac anomalies during ultrasound screening using generative models, which are trained using only normal/healthy subjects.
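The normative-learning idea above can be sketched with a simple linear stand-in: a PCA "autoencoder" fitted only on normal samples, where a high reconstruction error flags an anomaly. The actual framework uses generative models on ultrasound images; the synthetic data, model choice, and threshold below are our own assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)

# Normal samples live near a 2-D subspace of a 10-D feature space.
basis = rng.normal(size=(2, 10))
X_normal = rng.normal(size=(500, 2)) @ basis + rng.normal(scale=0.05, size=(500, 10))

# Fit the normative model on healthy data only.
model = PCA(n_components=2).fit(X_normal)

def reconstruction_error(x):
    # Distance between a sample and its projection onto the learned
    # "normal" subspace; large values indicate off-manifold samples.
    return np.linalg.norm(x - model.inverse_transform(model.transform(x)), axis=1)

# Threshold chosen from the normal training distribution (95th percentile).
threshold = np.quantile(reconstruction_error(X_normal), 0.95)

x_anomaly = rng.normal(size=(1, 10)) * 3.0  # a sample off the normal manifold
print(reconstruction_error(x_anomaly)[0] > threshold)  # True: flagged as anomalous
```

Swapping the PCA for a deep generative model (e.g. a variational autoencoder) keeps the same detection logic while capturing the nonlinear structure of real images.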
However, despite the significant improvement in automatic abnormality detection systems, clinical routine continues to rely exclusively on the contribution of overburdened medical experts to diagnose and localise abnormalities. Integrating human expert knowledge into the medical imaging processing pipeline entails uncertainty, which is mainly correlated with inter-observer variability. From the perspective of building an automated medical imaging system, it is still an open issue to what extent this kind of variability and the resulting uncertainty are introduced during the training of a model and how they affect the final performance of the task. Consequently, it is very important to explore the effect of inter-observer variability both on the reliable estimation of the model's uncertainty and on the model's performance in a specific machine learning task. A thorough investigation of this issue is presented in this work by leveraging automated estimates of machine learning model uncertainty, inter-observer variability, and segmentation task performance in lung CT scan images.
Finally, an overview of the existing anomaly detection methods in medical imaging is presented. This state-of-the-art survey includes both conventional pattern recognition methods and deep learning based methods, and it is one of the first literature surveys attempted in this specific research area.
A Review and Analysis of Automatic Optical Inspection and Quality Monitoring Methods in Electronics Industry
The electronics industry is one of the fastest evolving, most innovative, and most competitive industries. In order to meet the high consumption demands for electronic components, quality standards of the products must be well maintained. Automatic optical inspection (AOI) is one of the non-destructive techniques used in quality inspection of various products. This technique is considered robust and can replace human inspectors, who are subject to dullness and fatigue when performing inspection tasks. A fully automated optical inspection system consists of hardware and software setups. The hardware setup, which includes the image sensor and illumination settings, is responsible for acquiring the digital image, while the software part implements an inspection algorithm to extract features of the acquired images and classify them as defective or non-defective based on the user requirements. A sorting mechanism can then separate the defective products from the good ones. This article provides a comprehensive review of the various AOI systems used in the electronics, micro-electronics, and opto-electronics industries. In this review, the defects of commonly inspected electronic components, such as semiconductor wafers, flat panel displays, printed circuit boards, and light emitting diodes, are first explained. Hardware setups used in acquiring images are then discussed in terms of camera and lighting source selection and configuration. The inspection algorithms used for detecting defects in electronic components are discussed in terms of the preprocessing, feature extraction, and classification tools used for this purpose. Recent articles that used deep learning algorithms are also reviewed. The article concludes by highlighting current trends and possible future research directions.
Framework of the IQONIC Project; European Union's Horizon 2020 Research and Innovation Program
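The inspection-algorithm stages described above (preprocessing, feature extraction, classification) can be sketched on a toy grayscale image. The synthetic board image, deviation threshold, and area limit are all illustrative assumptions, not values from any reviewed system.

```python
import numpy as np

def inspect(image, defect_area_limit=20):
    # Preprocessing: normalise intensities to [0, 1].
    img = (image - image.min()) / (np.ptp(image) + 1e-9)
    # Feature extraction: area of pixels deviating strongly from the median.
    deviation = np.abs(img - np.median(img))
    defect_area = int((deviation > 0.5).sum())
    # Classification: rule-based accept/reject on the extracted feature.
    return "defective" if defect_area > defect_area_limit else "non-defective"

# Synthetic 64x64 board: uniform surface plus mild sensor noise.
good_board = 0.4 + np.random.default_rng(5).normal(0.0, 0.01, (64, 64))
bad_board = good_board.copy()
bad_board[10:20, 10:20] = 1.0  # simulated bright blemish, e.g. a solder splash

print(inspect(good_board), inspect(bad_board))
```

Real AOI systems replace the single intensity feature with texture, geometry, or learned features, but the acquire-extract-classify-sort flow is the same.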