38 research outputs found

    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided, covering inter alia rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of being able to infer misuse by the correlation of individual temporally distributed events within a multiple data stream environment is explored, and a range of techniques is surveyed, covering model-based approaches, `programmed' AI and machine learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base, and the lack of ability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and use this to screen events. This approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to `learn' the features of event patterns that constitute normal behaviour and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred. This approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise.
Contemporary approaches are seen to favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour, the former to capture known cases of misuse, the latter to capture unknown cases. In some systems, these mechanisms even update each other, to increase detection rates and lower false positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule- or state-based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems such that learning, generalisation and adaptation are more readily facilitated.
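The trade-off described above can be made concrete with a toy sketch. All event names, the signature set and the "learned" normal profile below are invented for illustration and are not taken from the report: the rule-based detector misses any misuse it was not programmed with, while the anomaly detector also flags innocent events outside its training.

```python
# Toy contrast of the two detection paradigms; all data here is invented.

def rule_based_detect(events, known_bad):
    """Flag events that match a knowledge base of known misuse signatures.
    Anything not encoded slips through (false negatives)."""
    return [e for e in events if e in known_bad]

def anomaly_detect(events, normal_profile):
    """Flag events outside the learned profile of normal behaviour.
    Innocent-but-unseen events are also flagged (false positives)."""
    return [e for e in events if e not in normal_profile]

stream = ["login", "login", "port_scan", "backup", "login_burst"]
known_bad = {"port_scan"}             # encoded misuse signatures
normal_profile = {"login", "backup"}  # learned from training data

print(rule_based_detect(stream, known_bad))    # ['port_scan']
print(anomaly_detect(stream, normal_profile))  # ['port_scan', 'login_burst']
```

A hybrid system, in the spirit the survey favours, would run both detectors and use each one's output to refine the other's knowledge base or behavioural profile.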

    Prognostic Algorithms for Condition Monitoring and Remaining Useful Life Estimation

    To enable the benefits of a truly condition-based maintenance philosophy to be realised, robust, accurate and reliable algorithms, which provide maintenance personnel with the necessary information to make informed maintenance decisions, will be key. This thesis focuses on the development of such algorithms, with particular attention to semiconductor manufacturing and wind turbines. An introduction to condition-based maintenance is presented which reviews different types of maintenance philosophies and describes the potential benefits which a condition-based maintenance philosophy will deliver to operators of critical plant and machinery. The issues and challenges involved in developing condition-based maintenance solutions are discussed and a review of previous approaches and techniques in fault diagnostics and prognostics is presented. The development of a condition monitoring system for dry vacuum pumps used in semiconductor manufacturing is presented. A notable feature is that upstream process measurements from the wafer processing chamber were incorporated in the development of a solution. In general, semiconductor manufacturers do not make such information available, and this study identifies the benefits of information sharing in the development of condition monitoring solutions within the semiconductor manufacturing domain. The developed solution provides maintenance personnel with the ability to identify, quantify, track and predict the remaining useful life of pumps suffering from degradation caused by pumping large volumes of corrosive fluorine gas. A comprehensive condition monitoring solution for thermal abatement systems is also presented. As part of this work, a multiple model particle filtering algorithm for prognostics is developed and tested.
The capabilities of the proposed prognostic solution for addressing the uncertainty challenges in predicting the remaining useful life of abatement systems, subject to uncertain future operating loads and conditions, are demonstrated. Finally, a condition monitoring algorithm for the main bearing on large utility-scale wind turbines is developed. The developed solution exploits data collected by the onboard supervisory control and data acquisition (SCADA) systems in wind turbines. As a result, it can be integrated into existing monitoring systems at no additional cost. The potential for the application of the multiple model particle filtering algorithm to wind turbine prognostics is also demonstrated.
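As a rough sketch of the particle-filtering idea behind such prognostic algorithms, the toy below tracks a scalar degradation index with a single-model bootstrap filter and reads the remaining useful life off the particle cloud as the expected number of steps to a failure threshold. The degradation model, noise levels and threshold are all invented for illustration; the thesis's multiple model filter additionally switches between candidate degradation modes.

```python
import math
import random

random.seed(0)

N = 500
THRESHOLD = 10.0   # assumed failure level of the degradation index

# Each particle carries a degradation state x and an unknown wear rate.
particles = [{"x": 0.0, "rate": random.uniform(0.05, 0.3)} for _ in range(N)]

def step(particles, measurement, meas_std=0.2):
    """Propagate, weight by the measurement likelihood, and resample."""
    for p in particles:
        p["x"] += p["rate"] + random.gauss(0.0, 0.02)   # degradation model
    weights = [math.exp(-(p["x"] - measurement) ** 2 / (2 * meas_std ** 2))
               for p in particles]
    total = sum(weights) or 1e-300
    weights = [w / total for w in weights]
    # Copy resampled particles so duplicates evolve independently afterwards.
    return [dict(p) for p in
            random.choices(particles, weights=weights, k=len(particles))]

def remaining_useful_life(particles):
    """Expected number of steps until the threshold is crossed."""
    return sum(max(0.0, (THRESHOLD - p["x"]) / p["rate"])
               for p in particles) / len(particles)

# Feed in measurements from a component degrading at roughly 0.1 per step.
for t in range(1, 21):
    particles = step(particles, measurement=0.1 * t)

rul = remaining_useful_life(particles)
print(round(rul))  # in the vicinity of (10 - 2) / 0.1 = 80 steps
```

The RUL estimate comes with the full particle spread, which is what lets such filters express the uncertainty in future operating loads that the abstract highlights.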

    Strategies for non-uniform rate sampling in digital control theory

    This thesis is about digital control theory and presents an account of methods for enabling and analysing intentional non-uniform sampling in discrete compensators. Most conventional control algorithms encounter numerical problems when data is collected at sampling rates substantially higher than the dynamics of the equivalent continuous-time operation being implemented. This is of particular relevance in applications of digital control, in which high sample rates are routinely dictated by the system stability requirements rather than by the signal processing needs. Considerable recent progress in reducing the sample frequency requirements has been made through the use of non-uniform sampling schemes, so-called alias-free signal processing. The approach permits the simplification of complex systems and consequently enhances the numerical conditioning of implementation algorithms that would otherwise require very high uniform sample rates. Such means of signal representation and analysis presents a variety of options and is therefore being researched and practised in a number of areas in communications. However, the control communities have not yet investigated the use of intentional non-uniform sampling, and hence the aim of this research project is to investigate the effectiveness of such sampling regimes and to exploit their benefits. Digital control systems exhibit bandwidth limitations enforced by their closed-loop frequency requirements, the calculation delays in the control algorithm and the interfacing conversion times. These limitations introduce additional phase lags within the control loop that demand very high sample rates. Since non-uniform sampling reduces the sample frequency requirements of digital processing, it offers the prospect of achieving a higher control bandwidth without resorting to very high uniform sample rates.
The concept, to the author's knowledge, has not formally been studied, and very few definite answers exist in the control literature regarding the associated analysis techniques. The key contributions presented in this thesis include the development and analysis of a control algorithm designed to accommodate intentional non-uniform sample frequencies. In addition, implementation aspects are presented on an 8-bit microcontroller and an FPGA board. The work begins by establishing a brief historical perspective on the use of non-uniform sampling and its role in digital processing. The study is then applied to the problem of digital control design, and applications are discussed further. This is followed by consideration of the implementation aspects on standard hardware.
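One building block of such a compensator can be sketched as follows, under my own simplifying assumption (not a construction taken from the thesis): if the difference equation is written in terms of the measured inter-sample interval T_k, a PI controller whose integral accumulates error multiplied by the actual elapsed time stays consistent however irregularly the samples arrive.

```python
class PIController:
    """PI compensator whose integral term scales with the measured
    inter-sample interval, so it remains valid under non-uniform sampling.
    Gains and time stamps below are illustrative only."""

    def __init__(self, kp, ki):
        self.kp = kp
        self.ki = ki
        self.acc = 0.0      # accumulated integral of the error
        self.t_prev = None  # time stamp of the previous sample

    def update(self, t, error):
        if self.t_prev is not None:
            self.acc += error * (t - self.t_prev)  # rectangle rule over actual T_k
        self.t_prev = t
        return self.kp * error + self.ki * self.acc

ctrl = PIController(kp=2.0, ki=1.0)
# Samples arrive at t = 0.0, 0.1 and 0.4 s -- intentionally non-uniform.
outputs = [ctrl.update(t, 1.0) for t in (0.0, 0.1, 0.4)]
print(outputs[-1])  # kp*1 + ki*0.4, i.e. about 2.4, regardless of the spacing
```

Because the accumulator depends only on time stamps, the same code runs unchanged whether the sampler is uniform or deliberately sparse, which is the property the thesis exploits.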

    Developing a Holonomic iROV as a Tool for Kelp Bed Mapping


    Watermarking techniques for genuine fingerprint authentication.

    Fingerprints have been used to authenticate people remotely and allow them access to a system. However, fingerprint-capture sensors are easily deceived using fake fingerprint features lifted from a glass surface. Fake fingerprints, which attackers can obtain easily, can cheat the system, and this issue remains a challenge in fingerprint-based authentication systems. Thus, a mechanism that can validate the originality of fingerprint samples is desired. Watermarking techniques have been used to enhance the fingerprint-based authentication process; however, none of them has been found to satisfy genuine person verification requirements. This thesis focuses on improving the verification of the genuine fingerprint owner using watermarking techniques. Four research issues are addressed to achieve the main aim of this thesis. The first research task was to embed a watermark into fingerprint images collected from different angles. In verification systems, an acquired fingerprint image is compared with another image, which was stored in the database at the time of enrolment. The displacements and rotations of fingerprint images collected from different angles lead to different sets of minutiae. In this case, the fingerprint-based authentication system operates on the ‘close enough’ matching principle between samples and template, and genuine samples can be rejected erroneously. The process of embedding watermarks into fingerprint samples could make this worse by adding spurious minutiae or corrupting correct minutiae. Therefore, a watermarking method for fingerprint images collected from different angles is proposed. Second, embedding a high watermark payload into a fingerprint image while preserving the fingerprint features is challenging. In this scenario, embedding multiple watermarks that can be used with the fingerprint to authenticate the person is proposed.
In the developed multi-watermark scheme, two watermark images of high payload are embedded into fingerprints without significantly affecting the minutiae. Third, the robustness of the watermarking approach against image processing operations is important. The implemented fingerprint watermarking algorithms were proposed to verify the origin of the fingerprint image; however, they are vulnerable to several modes of image operation that can affect the security level of the authentication system. The embedded watermarks, and the fingerprint features that are subsequently used for authentication purposes, can be damaged. Therefore, the current study has evaluated in detail the robustness of the proposed watermarking methods against the most common image operations. Fourth, mobile biometrics are expected to link the genuine user to a claimed identity in ubiquitous applications, which is a great challenge. Touch-based sensors for capturing fingerprints have been incorporated into mobile phones for user identity authentication. However, the cracking of the iPhone 5S sensor with a fake fingerprint is a warning that biometrics are only a representation of a person, and are not secure. To make things worse, the ubiquity of mobile devices leaves much room for adversaries to clone, impersonate or fabricate fake biometric identities and/or mobile devices to defraud systems. Therefore, the integration of multiple identifiers for both the capturing device and its owner into one unique entity is proposed.
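As a minimal illustration of the kind of embedding involved, the sketch below uses a generic least-significant-bit scheme on a toy pixel list. This is an expository assumption, not one of the thesis's proposed methods; its relevance is that LSB changes alter each pixel by at most 1, which is why careful embedding can leave minutiae largely intact.

```python
# Generic LSB watermarking on a flat list of grayscale intensities (toy data).

def embed(pixels, bits):
    """Overwrite the least significant bit of the first len(bits) pixels."""
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b
    return out

def extract(pixels, n):
    """Recover an n-bit watermark from the pixel LSBs."""
    return [p & 1 for p in pixels[:n]]

fingerprint = [120, 121, 130, 99, 200, 57]  # invented pixel intensities
mark = [1, 0, 1, 1]
stamped = embed(fingerprint, mark)

print(extract(stamped, len(mark)))                            # [1, 0, 1, 1]
print(max(abs(a - b) for a, b in zip(fingerprint, stamped)))  # 1
```

Plain LSB embedding is fragile: the image operations evaluated in the thesis (compression, filtering, and so on) would destroy it, which is precisely why the robustness study above is needed.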

    A survey of the application of soft computing to investment and financial trading
