Application of Computational Intelligence Techniques to Process Industry Problems
In the last two decades there has been great progress in the computational
intelligence research field. The fruits of this research are powerful techniques
for pattern recognition, data mining, data modelling, etc.
These techniques achieve high performance on traditional data sets such as those in the UCI
machine learning repository. Unfortunately, such data sources usually represent
clean data, free of the outliers, missing values, feature collinearity and similar
problems common to real-life industrial data. The presence of faulty data samples can have
very harmful effects on models: if present during the training of the
models, they can cause sub-optimal performance of the trained model or, in the worst
case, destroy the knowledge the model has learnt so far. For these reasons the application
of current modelling techniques to industrial problems has developed into a research
field of its own. Based on a discussion of the properties and issues of the data and of the
state-of-the-art modelling techniques in the process industry, this paper presents a novel
unified approach to the development of predictive models in the process industry.
Data-driven Soft Sensors in the Process Industry
In the last two decades, Soft Sensors have established themselves as a valuable alternative to traditional means for the acquisition of critical process variables, process monitoring and other tasks related to process control. This paper discusses characteristics of process industry data which are critical for the development of data-driven Soft Sensors. These characteristics are common to a large number of process industry fields, such as the chemical industry, bioprocess industry and steel industry. The focus of this work is on data-driven Soft Sensors because of their growing popularity, already demonstrated usefulness and huge, though not yet completely realised, potential. The main contributions of this work are a comprehensive selection of case studies covering the three most important Soft Sensor application fields, a general introduction to the most popular Soft Sensor modelling techniques, and a discussion of some open issues in Soft Sensor development and maintenance together with their possible solutions.
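As a toy illustration of the data-driven idea discussed above, a soft sensor can be as simple as a least-squares model that infers a hard-to-measure quality variable from an easily measured process variable. The variable names and data below are invented for illustration; real soft sensors typically use many inputs and far more elaborate models:

```python
from statistics import mean

def fit_linear_soft_sensor(x, y):
    """Fit y ~ a*x + b by least squares: a minimal data-driven soft sensor
    inferring a hard-to-measure quality variable y from an easily
    measured process variable x."""
    mx, my = mean(x), mean(y)
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    b = my - a * mx
    return lambda x_new: a * x_new + b

# Hypothetical historical data: reactor temperature vs. lab-assayed concentration.
temps = [150.0, 155.0, 160.0, 165.0, 170.0]
concs = [2.0, 2.5, 3.0, 3.5, 4.0]
sensor = fit_linear_soft_sensor(temps, concs)
print(sensor(162.0))  # predicted concentration at 162.0 degrees -> 3.2
```

Once fitted on historical data, the model replaces slow or expensive laboratory assays with an instantaneous prediction from routinely logged measurements.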
A survey of outlier detection methodologies
Outlier detection has been used for centuries to detect and, where appropriate, remove anomalous observations from data. Outliers arise from mechanical faults, changes in system behaviour, fraudulent behaviour, human error, instrument error or simply natural deviations in populations. Their detection can identify system faults and fraud before they escalate with potentially catastrophic consequences. It can also identify errors and remove their contaminating effect on the data set, thereby purifying the data for processing. The original outlier detection methods were arbitrary, but principled and systematic techniques are now used, drawn from the full gamut of Computer Science and Statistics. In this paper, we present a survey of contemporary techniques for outlier detection. We identify their respective motivations and distinguish their advantages and disadvantages in a comparative review.
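One of the classical statistical approaches such a survey covers can be sketched in a few lines. The version below assumes roughly normally distributed inliers and a user-chosen z-score threshold; both the threshold and the sample data are invented for illustration:

```python
from statistics import mean, stdev

def zscore_outliers(samples, threshold=3.0):
    """Flag observations whose z-score exceeds the threshold.

    A simple parametric approach: it assumes the inliers are roughly
    normally distributed around the sample mean, and is vulnerable to
    masking when extreme outliers inflate the standard deviation.
    """
    mu = mean(samples)
    sigma = stdev(samples)
    if sigma == 0:
        return []
    return [x for x in samples if abs(x - mu) / sigma > threshold]

data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 42.0]  # 42.0 mimics a sensor glitch
print(zscore_outliers(data, threshold=2.0))  # [42.0]
```

More robust variants replace the mean and standard deviation with the median and median absolute deviation, which the glitchy sample cannot distort as easily.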
AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments
This report considers the application of Artificial Intelligence (AI) techniques to
the problem of misuse detection and misuse localisation within telecommunications
environments. A broad survey of techniques is provided, covering inter alia
rule-based systems, model-based systems, case-based reasoning, pattern matching,
clustering and feature extraction, artificial neural networks, genetic algorithms,
artificial immune systems, agent-based systems, data mining and a variety of hybrid
approaches. The report then considers the central issue of event correlation, which
is at the heart of many misuse detection and localisation systems. The notion of
being able to infer misuse by the correlation of individual temporally distributed
events within a multiple-data-stream environment is explored, and a range of techniques
covering model-based approaches, `programmed' AI and machine-learning
paradigms is surveyed. It is found that, in general, correlation is best achieved via rule-based approaches,
but that these suffer from a number of drawbacks, such as the difficulty of
developing and maintaining an appropriate knowledge base, and the lack of ability
to generalise from known misuses to new unseen misuses. Two distinct approaches
are evident. One attempts to encode knowledge of known misuses, typically within
rules, and use this to screen events. This approach cannot generally detect misuses
for which it has not been programmed, i.e. it is prone to issuing false negatives.
The other attempts to `learn' the features of event patterns that constitute normal
behaviour, and, by observing patterns that do not match expected behaviour, detect
when a misuse has occurred. This approach is prone to issuing false positives,
i.e. inferring misuse from innocent patterns of behaviour that the system was not
trained to recognise. Contemporary approaches are seen to favour hybridisation,
often combining detection or localisation mechanisms for both abnormal and normal
behaviour, the former to capture known cases of misuse, the latter to capture
unknown cases. In some systems, these mechanisms even work together to update
each other to increase detection rates and lower false positive rates. It is concluded
that hybridisation offers the most promising future direction, but that a rule or state
based component is likely to remain, being the most natural approach to the correlation
of complex events. The challenge, then, is to mitigate the weaknesses of
canonical programmed systems such that learning, generalisation and adaptation
are more readily facilitated.
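The two approaches contrasted above, and their hybridisation, can be caricatured in a few lines of code. The rules, event fields and thresholds below are invented placeholders, not taken from any surveyed system:

```python
from statistics import mean, stdev

# Signature component: rules encoding known misuses (prone to false negatives,
# since unprogrammed misuses slip through).
KNOWN_MISUSE_RULES = [
    lambda ev: ev["call_duration"] > 7200,           # implausibly long call
    lambda ev: ev["destination"].startswith("900"),  # premium-rate pattern
]

class HybridDetector:
    """Sketch of a hybrid misuse detector.

    Combines a programmed rule base (catches known misuses) with a
    baseline learnt from normal traffic (flags departures from expected
    behaviour, at the cost of some false positives).
    """

    def __init__(self, threshold=3.0):
        self.threshold = threshold
        self.mu = self.sigma = None

    def fit(self, normal_events):
        """Learn a per-feature baseline from misuse-free traffic."""
        durations = [e["call_duration"] for e in normal_events]
        self.mu, self.sigma = mean(durations), stdev(durations)

    def classify(self, event):
        if any(rule(event) for rule in KNOWN_MISUSE_RULES):
            return "known-misuse"            # signature component fires
        if abs(event["call_duration"] - self.mu) / self.sigma > self.threshold:
            return "anomalous"               # departs from learnt baseline
        return "normal"

normal = [{"call_duration": d, "destination": "555"}
          for d in (60, 120, 90, 150, 80, 110)]
det = HybridDetector()
det.fit(normal)
print(det.classify({"call_duration": 600, "destination": "555"}))  # anomalous
```

A production system would correlate many features across multiple event streams; the point here is only how the rule base and the learnt baseline cover each other's blind spots.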
Crack detection in a rotating shaft using artificial neural networks and PSD characterisation
Peer-reviewed postprint
Diesel engine fuel injection monitoring using acoustic measurements and independent component analysis
Air-borne acoustic condition monitoring is a promising technique because of its non-intrusive nature and the rich information contained within the acoustic signals, which carry contributions from all sources. However, background noise contamination, interference and the number of internal combustion engine (ICE) vibro-acoustic sources preclude the straightforward extraction of condition information using this technique. As a result, lower-energy events, such as fuel injection, are buried within higher-energy events and/or corrupted by background noise.
This work firstly investigates the characteristics of diesel engine air-borne acoustic signals and the benefits of joint time-frequency domain analysis. Secondly, the air-borne acoustic signals in the vicinity of the injector head were recorded using three microphones around the fuel injector (120° apart from each other), and an Independent Component Analysis (ICA) based scheme was developed to decompose these acoustic signals. The fuel injection process characteristics were thus revealed in the time-frequency domain using the Wigner-Ville distribution (WVD) technique. Consequently, the energy levels around the injection period, between 11 and 5 degrees before top dead centre and in the frequency band 9 to 15 kHz, are calculated. The developed technique was validated by simulated signals and empirical measurements at different injection pressure levels from 250 to 210 bar in steps of 10 bar. The recovered energy levels in the tested conditions were found to be affected by the injector pressure settings.
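The band-energy step described above can be sketched without the full ICA/WVD machinery. The fragment below uses the Goertzel algorithm as a simple stand-in for estimating signal energy in the 9-15 kHz band; the sampling rate, probe-frequency grid and test tone are invented for illustration, not taken from the paper's experimental setup:

```python
import math

def goertzel_power(samples, fs, f):
    """Power of `samples` at a single frequency f (Hz), sampling rate fs,
    computed with the Goertzel algorithm."""
    n = len(samples)
    k = round(n * f / fs)                 # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2  # second-order recursion
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

def band_energy(samples, fs, f_lo, f_hi, step=1000):
    """Approximate energy in [f_lo, f_hi] Hz by summing Goertzel powers
    on a grid of probe frequencies."""
    return sum(goertzel_power(samples, fs, f)
               for f in range(f_lo, f_hi + 1, step))

# Synthetic check: a 12 kHz tone should concentrate energy in 9-15 kHz.
fs = 96_000  # assumed sampling rate (Hz)
tone = [math.sin(2 * math.pi * 12_000 * i / fs) for i in range(960)]
print(band_energy(tone, fs, 9_000, 15_000) > band_energy(tone, fs, 2_000, 8_000))  # True
```

In the paper's setting, the same in-band energy measure would be evaluated only over the crank-angle window between 11 and 5 degrees before top dead centre, which is what the WVD's joint time-frequency localisation provides.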