
    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided, covering inter alia rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of being able to infer misuse by the correlation of individual temporally distributed events within a multiple data stream environment is explored, and a range of techniques is surveyed, covering model-based approaches, 'programmed' AI and machine learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base, and the lack of ability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and use this to screen events. This approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to 'learn' the features of event patterns that constitute normal behaviour, and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred. This approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise. Contemporary approaches are seen to favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour, the former to capture known cases of misuse, the latter to capture unknown cases. In some systems, these mechanisms even work together to update each other to increase detection rates and lower false positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule- or state-based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems such that learning, generalisation and adaptation are more readily facilitated.
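
    The contrast the report draws between programmed (rule-based) screening of known misuses and learned models of normal behaviour can be illustrated with a short sketch. The Python fragment below is purely illustrative and is not taken from the report: the event fields, rules and thresholds are invented, and the hybrid simply raises an alert when either mechanism fires.

        # Hypothetical sketch of a hybrid misuse-detection pipeline: a rule-based
        # screen for known misuse patterns plus a simple statistical model of
        # "normal" behaviour that flags unseen anomalies. All fields, rules and
        # thresholds are illustrative only.
        from dataclasses import dataclass
        from statistics import mean, stdev

        @dataclass
        class CallEvent:
            caller: str
            destination: str
            duration_s: float
            hour: int  # hour of day, 0-23

        # Approach 1: encode knowledge of known misuses as rules (may miss new misuses).
        def rule_screen(event: CallEvent) -> bool:
            """Return True if the event matches a known misuse signature."""
            premium_rate = event.destination.startswith("1900")
            long_night_call = event.hour in range(0, 5) and event.duration_s > 3600
            return premium_rate or long_night_call

        # Approach 2: learn the features of normal behaviour (may raise false positives).
        class DurationAnomalyDetector:
            """Flags events whose duration deviates strongly from the training data."""
            def fit(self, normal_events):
                durations = [e.duration_s for e in normal_events]
                self.mu, self.sigma = mean(durations), stdev(durations)

            def is_anomalous(self, event: CallEvent, z_threshold: float = 3.0) -> bool:
                return abs(event.duration_s - self.mu) > z_threshold * self.sigma

        # Hybrid: either mechanism can raise an alert.
        def detect(event: CallEvent, detector: DurationAnomalyDetector) -> bool:
            return rule_screen(event) or detector.is_anomalous(event)

        if __name__ == "__main__":
            normal = [CallEvent("A", "0207", 120 + i, 14) for i in range(50)]
            detector = DurationAnomalyDetector()
            detector.fit(normal)
            suspicious = CallEvent("B", "1900-PREMIUM", 5400, 2)
            print(detect(suspicious, detector))  # True: caught by both mechanisms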

    Data-driven Soft Sensors in the Process Industry

    In the last two decades, Soft Sensors have established themselves as a valuable alternative to the traditional means for the acquisition of critical process variables, process monitoring and other tasks related to process control. This paper discusses characteristics of process industry data which are critical for the development of data-driven Soft Sensors. These characteristics are common to a large number of process industry fields, such as the chemical industry, bioprocess industry and steel industry. The focus of this work is on data-driven Soft Sensors because of their growing popularity, already demonstrated usefulness and huge, though not yet completely realised, potential. The main contributions of this work are a comprehensive selection of case studies covering the three most important Soft Sensor application fields, a general introduction to the most popular Soft Sensor modelling techniques, and a discussion of some open issues in Soft Sensor development and maintenance and their possible solutions.
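
    As a concrete, purely illustrative example of the data-driven Soft Sensors the paper surveys, the sketch below fits a partial least squares model, one commonly used soft-sensor modelling technique, to predict a hard-to-measure quality variable from easily measured process variables. The data are synthetic and the variable roles are assumed for demonstration.

        # Minimal sketch of a data-driven soft sensor (assumed setup): predict a
        # hard-to-measure quality variable from easily measured process variables
        # using partial least squares. Synthetic data; variable names are illustrative.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.metrics import r2_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n_samples = 500

        # Easily measured process variables: temperature, pressure, flow, level.
        X = rng.normal(size=(n_samples, 4))
        # Hard-to-measure target (e.g. product concentration), a noisy function of X.
        y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] + 0.1 * rng.normal(size=n_samples)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        soft_sensor = PLSRegression(n_components=3)
        soft_sensor.fit(X_train, y_train)

        y_pred = soft_sensor.predict(X_test).ravel()
        print(f"R^2 on held-out data: {r2_score(y_test, y_pred):.3f}")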

    Damage identification in structural health monitoring: a brief review from its implementation to the use of data-driven applications

    The damage identification process provides relevant information about the current state of a structure under inspection, and it can be approached from two different points of view. The first approach uses data-driven algorithms, which are usually associated with the collection of data using sensors; the data are subsequently processed and analyzed. The second approach uses models to analyze information about the structure. In the latter case, the overall performance of the approach is associated with the accuracy of the model and the information used to define it. Although both approaches are widely used, data-driven algorithms are preferred in most cases because they can analyze data acquired from sensors and provide a real-time solution for decision making; however, they require high-performance processors due to their high computational cost. As a contribution to researchers working with data-driven algorithms and applications, this work presents a brief review of data-driven algorithms for damage identification in structural health-monitoring applications. The review covers damage detection, localization, classification, extension, and prognosis, as well as the development of smart structures. The literature is systematically reviewed according to the natural steps of a structural health-monitoring system. The review also includes information on the types of sensors used as well as on the development of data-driven algorithms for damage identification.
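
    To make the data-driven damage-detection step concrete, the sketch below shows one common pattern, novelty detection against a healthy baseline; it is an illustration under assumed feature choices, not a method from the review. Vibration features of the undamaged structure define a statistical baseline, and new measurements whose Mahalanobis distance from that baseline exceeds a threshold are flagged as possible damage.

        # Illustrative data-driven damage-detection step: learn a statistical
        # baseline from vibration features of the healthy structure, then flag
        # measurements that deviate strongly from it. Feature values and the
        # threshold choice are assumptions for demonstration.
        import numpy as np

        rng = np.random.default_rng(1)

        # Baseline features from the undamaged structure (e.g. natural frequencies, Hz).
        healthy = rng.normal(loc=[12.0, 35.0, 78.0], scale=[0.1, 0.3, 0.5], size=(200, 3))

        mu = healthy.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(healthy, rowvar=False))

        def mahalanobis(x):
            d = x - mu
            return float(np.sqrt(d @ cov_inv @ d))

        # Threshold taken from the healthy data, e.g. the 99th percentile of baseline distances.
        threshold = np.percentile([mahalanobis(x) for x in healthy], 99)

        # A drop in natural frequencies often indicates stiffness loss (possible damage).
        new_measurement = np.array([11.4, 34.0, 76.5])
        print("Damage suspected:", mahalanobis(new_measurement) > threshold)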

    Understanding Learned Models by Identifying Important Features at the Right Resolution

    In many application domains, it is important to characterize how complex learned models make their decisions across the distribution of instances. One way to do this is to identify the features and interactions among them that contribute to a model's predictive accuracy. We present a model-agnostic approach to this task that makes the following specific contributions. Our approach (i) tests feature groups, in addition to base features, and tries to determine the level of resolution at which important features can be determined, (ii) uses hypothesis testing to rigorously assess the effect of each feature on the model's loss, (iii) employs a hierarchical approach to control the false discovery rate when testing feature groups and individual base features for importance, and (iv) uses hypothesis testing to identify important interactions among features and feature groups. We evaluate our approach by analyzing random forest and LSTM neural network models learned in two challenging biomedical applications. Comment: First two authors contributed equally to this work. Accepted for presentation at the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19).
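
    The following sketch is a simplified stand-in for the kind of test the abstract describes, not the authors' algorithm: it permutes a group of feature columns and uses a one-sided permutation test to ask whether shuffling the group increases a trained model's loss. The hierarchical, FDR-controlling part of the paper's procedure is omitted, and the dataset, model and chosen feature group are arbitrary.

        # Simplified sketch (not the paper's algorithm): test whether a group of
        # features matters to a trained model by permuting the group's columns and
        # checking whether the model's loss increases more than chance would allow.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import log_loss
        from sklearn.model_selection import train_test_split

        def group_permutation_pvalue(model, X, y, group, n_permutations=200, seed=0):
            """One-sided permutation p-value: does shuffling `group` raise the loss?"""
            rng = np.random.default_rng(seed)
            base_loss = log_loss(y, model.predict_proba(X))
            no_increase = 0
            for _ in range(n_permutations):
                X_perm = X.copy()
                rows = rng.permutation(len(X_perm))
                X_perm[:, group] = X_perm[rows][:, group]  # shuffle the group jointly
                if log_loss(y, model.predict_proba(X_perm)) <= base_loss:
                    no_increase += 1
            return (1 + no_increase) / (1 + n_permutations)

        if __name__ == "__main__":
            # With shuffle=False the first 3 columns are the informative features.
            X, y = make_classification(n_samples=600, n_features=8, n_informative=3,
                                       shuffle=False, random_state=0)
            X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
            model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
            # Hypothetical feature group: the first three columns tested together.
            print("p-value for group {0,1,2}:",
                  group_permutation_pvalue(model, X_te, y_te, [0, 1, 2]))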

    Systems biology in inflammatory bowel diseases

    Purpose of review: Ulcerative colitis (UC) and Crohn’s disease (CD) are the two predominant types of inflammatory bowel disease (IBD), affecting over 1.4 million individuals in the US. IBD results from complex interactions between pathogenic components, including genetic and epigenetic factors, the immune response and the microbiome, through an unknown sequence of events. The purpose of this review is to describe a systems biology approach to IBD as a novel and exciting methodology aimed at developing novel IBD therapeutics based on the integration of molecular and cellular "omics" data. Recent findings: Recent evidence suggests the presence of genetic, epigenetic, transcriptomic, proteomic and metabolomic alterations in IBD patients. Furthermore, several studies have shown that different cell types, including fibroblasts and epithelial, immune and endothelial cells, together with the intestinal microbiota, are involved in IBD pathogenesis. Novel computational methodologies have been developed with the aim of integrating high-throughput molecular data. Summary: A systems biology approach could potentially identify the central regulators (hubs) in the IBD interactome and improve our understanding of the molecular mechanisms involved in IBD pathogenesis. Future IBD therapeutics should be developed on the basis of targeting the central hubs in the IBD network.
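
    One elementary step such an interactome analysis might take, shown here purely as an illustration, is ranking the nodes of a molecular interaction network by centrality to nominate candidate hubs. The toy network below is invented for demonstration and is not an actual IBD interactome.

        # Illustrative sketch only: rank nodes of a toy interaction network by
        # centrality to nominate candidate "hubs". The edges below are invented
        # for demonstration, not taken from a real IBD interactome.
        import networkx as nx

        G = nx.Graph()
        G.add_edges_from([
            ("TNF", "NFKB1"), ("TNF", "IL6"), ("NFKB1", "IL6"),
            ("NFKB1", "STAT3"), ("IL6", "STAT3"), ("STAT3", "IL10"),
            ("NOD2", "NFKB1"), ("NOD2", "ATG16L1"),
        ])

        # Degree and betweenness centrality serve as simple hub scores.
        degree = nx.degree_centrality(G)
        betweenness = nx.betweenness_centrality(G)

        hubs = sorted(G.nodes, key=lambda n: (degree[n], betweenness[n]), reverse=True)
        for node in hubs[:3]:
            print(f"{node}: degree={degree[node]:.2f}, betweenness={betweenness[node]:.2f}")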

    Tuberculosis diagnostics and biomarkers: needs, challenges, recent advances, and opportunities

    Tuberculosis is unique among the major infectious diseases in that it lacks accurate rapid point-of-care diagnostic tests. Failure to control the spread of tuberculosis is largely due to our inability to detect and treat all infectious cases of pulmonary tuberculosis in a timely fashion, allowing continued Mycobacterium tuberculosis transmission within communities. Currently recommended gold-standard diagnostic tests for tuberculosis are laboratory based, and multiple investigations may be necessary over a period of weeks or months before a diagnosis is made. Several new diagnostic tests have recently become available for detecting active tuberculosis disease, screening for latent M. tuberculosis infection, and identifying drug-resistant strains of M. tuberculosis. However, progress toward a robust point-of-care test has been limited, and novel biomarker discovery remains challenging. In the absence of effective prevention strategies, high rates of early case detection and subsequent cure are required for global tuberculosis control. Early case detection is dependent on test accuracy, accessibility, cost, and complexity, but also depends on the political will and funder investment to deliver optimal, sustainable care to those worst affected by the tuberculosis and human immunodeficiency virus epidemics. This review highlights unanswered questions, challenges, recent advances, unresolved operational and technical issues, needs, and opportunities related to tuberculosis diagnostics.