64 research outputs found
Classifier Ensemble Feature Selection for Automatic Fault Diagnosis
"An efficient ensemble feature selection scheme applied for fault diagnosis is
proposed, based on three hypothesis:
a. A fault diagnosis system need not be restricted to a single feature
extraction model; on the contrary, it should use as many feature models as
possible, since the extracted features are potentially discriminative and the
pooled feature set is subsequently reduced by feature selection;
b. The feature selection process can be accelerated, without loss of classification
performance, by combining feature selection methods so that faster but
weaker methods first discard potentially non-discriminative features,
passing a smaller, filtered feature set to slower but stronger methods;
c. The optimal feature set for a multi-class problem may differ for each
pair of classes. Therefore, feature selection should be done using a one-versus-one
scheme, even when multi-class classifiers are used. However, since
the number of pairwise classifiers grows quadratically with the number of classes,
expensive techniques such as Error-Correcting Output Codes (ECOC) can have
a prohibitive computational cost for large datasets. Thus, a fast one-versus-one
approach must be used to alleviate this computational demand.
These three hypotheses are corroborated by experiments.
The main hypothesis of this work is that using these three approaches
together makes it possible to significantly improve the classification performance of a
classifier identifying conditions in industrial processes. Experiments have shown such
an improvement for the 1-NN classifier in the industrial processes used as case studies.
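The pairwise (one-versus-one) selection idea in hypothesis (c) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Fisher-score ranking, the function names, and the data layout are all assumptions made for the example.

```python
from itertools import combinations
from statistics import mean, pvariance

def fisher_score(xs_a, xs_b):
    # Univariate separability of one feature between two classes:
    # squared mean difference over summed variances (epsilon avoids /0).
    num = (mean(xs_a) - mean(xs_b)) ** 2
    den = pvariance(xs_a) + pvariance(xs_b) + 1e-12
    return num / den

def pairwise_select(data, k):
    """data: {class_label: list of samples, each a list of feature values}.
    For every pair of classes, keep the k highest-scoring feature indices;
    also return the union of all selected indices (the reduced global set)."""
    per_pair = {}
    n_feat = len(next(iter(data.values()))[0])
    for a, b in combinations(sorted(data), 2):
        scores = []
        for j in range(n_feat):
            xs_a = [s[j] for s in data[a]]
            xs_b = [s[j] for s in data[b]]
            scores.append((fisher_score(xs_a, xs_b), j))
        per_pair[(a, b)] = [j for _, j in sorted(scores, reverse=True)[:k]]
    union = sorted({j for feats in per_pair.values() for j in feats})
    return per_pair, union
```

The point of the sketch is that each class pair may keep a different feature subset, while the union gives the overall pool a multi-class classifier would see.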
A Machine Learning-based Distributed System for Fault Diagnosis with Scalable Detection Quality in Industrial IoT
In this paper, a methodology based on machine
learning for fault detection in continuous processes is presented.
It aims to monitor fully distributed scenarios, such as the
Tennessee Eastman Process, selected as the use case of this work,
where sensors are distributed throughout an industrial plant. A
hybrid feature selection approach based on filters and wrappers,
called Hybrid Fisher Wrapper method, is proposed to select the
most representative sensors to get the highest detection quality
for fault identification. The proposed methodology provides a
complete design space of solutions differing in the sensing effort,
the processing complexity, and the obtained detection quality.
It constitutes an alternative to the typical scheme in Industry
4.0, where multiple distributed sensor systems collect and send
data to a centralised cloud. Differently, the proposed technique
follows a distributed approach, in which processing can be done
eventually close to the sensors where data is generated, i.e., at
the edge of the Internet of Things. This approach overcomes
the bandwidth, privacy, and latency limitations that centralised
approaches may suffer. The experimental results show that
the proposed methodology provides Tennessee Eastman Process
fault detection solutions with state-of-the-art detection quality
figures. In terms of latency, the solutions obtained outperform the
implementation with the highest detection quality by a factor of 37.5,
while using 1.99 times fewer features on average. Moreover, the scalability
of the framework provides a design space in which the optimal
implementation can be chosen according to the application's needs.
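The two-stage filter-then-wrapper idea described above can be sketched as below. This is a hedged illustration, not the paper's Hybrid Fisher Wrapper method: the nearest-centroid learner, the leave-one-out scoring, and all names are assumptions for the example; the paper's actual classifier and search strategy may differ.

```python
from statistics import mean, pvariance

def fisher_rank(X0, X1):
    # Filter stage: rank features by Fisher score between the two classes.
    scores = []
    for j in range(len(X0[0])):
        a = [s[j] for s in X0]
        b = [s[j] for s in X1]
        f = (mean(a) - mean(b)) ** 2 / (pvariance(a) + pvariance(b) + 1e-12)
        scores.append((f, j))
    return [j for _, j in sorted(scores, reverse=True)]

def loo_accuracy(X, y, feats):
    # Leave-one-out accuracy of a nearest-centroid classifier restricted
    # to the candidate feature subset (stand-in for the wrapper's learner).
    correct = 0
    for i in range(len(X)):
        cents = {}
        for c in set(y):
            rows = [X[k] for k in range(len(X)) if k != i and y[k] == c]
            cents[c] = [sum(r[j] for r in rows) / len(rows) for j in feats]
        xi = [X[i][j] for j in feats]
        pred = min(cents, key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(xi, cents[c])))
        correct += pred == y[i]
    return correct / len(X)

def hybrid_select(X, y, m, k):
    # Stage 1 (fast filter): keep only the m best-ranked features.
    # Assumes binary labels 0/1 for the Fisher ranking.
    X0 = [x for x, c in zip(X, y) if c == 0]
    X1 = [x for x, c in zip(X, y) if c == 1]
    pool = fisher_rank(X0, X1)[:m]
    # Stage 2 (slow wrapper): greedy forward selection on the reduced pool.
    chosen = []
    while len(chosen) < k:
        best = max((j for j in pool if j not in chosen),
                   key=lambda j: loo_accuracy(X, y, chosen + [j]))
        chosen.append(best)
    return chosen
```

The design point is the cost split: the cheap filter prunes the sensor/feature pool so the expensive wrapper only searches the survivors, which is what yields the sensing-effort vs. detection-quality trade-off space described in the abstract.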
Deep Learning-Based Machinery Fault Diagnostics
This book offers a compilation for experts, scholars, and researchers to present the most recent advancements, from theoretical methods to the applications of sophisticated fault diagnosis techniques. The deep learning methods for analyzing and testing complex mechanical systems are of particular interest. Special attention is given to the representation and analysis of system information, operating condition monitoring, the establishment of technical standards, and scientific support of machinery fault diagnosis.
Data cleaning and knowledge discovery in process data
This dissertation presents several methods for overcoming Big Data challenges, with an emphasis on data cleaning and knowledge discovery in process data. Data cleaning and knowledge discovery are chosen as the main research area here due to their importance from both theoretical and practical points of view.
Theoretical background and recent developments of data cleaning methods are reviewed from four aspects: missing data imputation, outlier detection, noise removal and time delay estimation. Moreover, the impact of contaminated data on model performance and corresponding improvement obtained by data cleaning methods are analyzed through both simulated and industrial case studies. The results provide a starting point for further advanced methodology development.
It is hard to find a universally applicable method for data cleaning, since every data set may have its own distinctive features. Thus, available methods have to be customized so that the quality of the data set is guaranteed. An integrated data cleaning scheme is proposed, which incorporates model building and performance evaluation, to provide guidance in tuning the parameters of data cleaning methods and to prevent over-cleaning. A case study based on industrial data has been used to verify the feasibility and effectiveness of the proposed method, during which a partial least squares (PLS) model was built and three univariate data cleaning procedures were tested.
A time series Kalman filter (TSKF) is proposed that successfully handles outlier detection in dynamic systems, where normal process changes often mask the existence of outliers. The TSKF method combines a time series model fitting procedure with a modified Kalman filter to deal with additive outlier (AO) and innovational outlier (IO) detection problems in dynamic process data sets. A comparative analysis of TSKF and available methods is performed on simulated and real chemical plant data.
Root cause diagnosis of plant-wide oscillations, as a concrete example of data cleaning and knowledge discovery in process data, is provided. Plant-wide oscillations can negatively influence the overall control performance of the process, and the detection results are often affected by noise at different frequency ranges. To address this problem, an information transfer method combining the spectral envelope algorithm with spectral transfer entropy is proposed to detect and diagnose such oscillations within a specific frequency range, mitigating the effects of measurement noise. The feasibility and effectiveness of the proposed method are verified and compared with available methods through both simulated and industrial case studies.
Advanced Process Monitoring for Industry 4.0
This book reports recent advances on Process Monitoring (PM) to cope with the many challenges raised by the new production systems, sensors and “extreme data” conditions that emerged with Industry 4.0. Concepts such as digital-twins and deep learning are brought to the PM arena, pushing forward the capabilities of existing methodologies to handle more complex scenarios. The evolution of classical paradigms such as Latent Variable modeling, Six Sigma and FMEA are also covered. Applications span a wide range of domains such as microelectronics, semiconductors, chemicals, materials, agriculture, as well as the monitoring of rotating equipment, combustion systems and membrane separation processes.
Earth Resources. A continuing bibliography with indexes, issue 34, July 1982
This bibliography lists 567 reports, articles, and other documents introduced into the NASA Scientific and Technical Information System between April 1 and June 30, 1982. Emphasis is placed on the use of remote sensing and geophysical instrumentation in spacecraft and aircraft to survey and inventory natural resources and urban areas. Subject matter is grouped according to agriculture and forestry, environmental changes and cultural resources, geodesy and cartography, geology and mineral resources, hydrology and water management, data processing and distribution systems, instrumentation and sensors, and economic analysis.
Technology 2001: The Second National Technology Transfer Conference and Exposition, volume 1
Papers from the technical sessions of the Technology 2001 Conference and Exposition are presented. The technical sessions featured discussions of advanced manufacturing, artificial intelligence, biotechnology, computer graphics and simulation, communications, data and information management, electronics, electro-optics, environmental technology, life sciences, materials science, medical advances, robotics, software engineering, and test and measurement.
Graduate Catalog, 2000-2002
Marshall University Graduate Course Catalog for the 2000-2002 academic years.
State of New Hampshire. Reports, 1909-1910, volume IV.- Biennial
Sometimes issued both annually and biennially; each volume contains the reports of various departments of the government of the state of New Hampshire; includes attorneys general's opinions.