
    Resilient Adaptive Control of Uncertain Time-Delay Systems


    Optimal reference sequence selection for genome assembly using minimum description length principle

    Reference-assisted assembly uses a reference sequence, as a model, to assist in the assembly of a novel genome. The standard method for identifying the best reference sequence counts the number of reads that align to each candidate reference and chooses the reference with the highest count. This article explores the use of the minimum description length (MDL) principle and two of its variants, two-part MDL and sophisticated MDL, in identifying the optimal reference sequence for genome assembly. Comparing the proposed MDL-based scheme with the standard method, the article concludes that "counting the number of reads of the novel genome present in the reference sequence" is not a sufficient condition. The proposed MDL scheme therefore subsumes the standard method of counting the reads that align to the reference sequence, and additionally examines the model itself, the reference sequence, when identifying the optimal reference. The proposed MDL-based scheme not only provides a sufficient criterion for identifying the optimal reference sequence for genome assembly, but also improves the reference sequence so that it becomes more suitable for the assembly of the novel genome.
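As an illustration of the two-part MDL idea described above, the sketch below scores each candidate reference by L(model) + L(data | model), using zlib-compressed length as a crude stand-in for description length. All function names and the compression-based approximation are illustrative assumptions, not the article's implementation.

```python
import zlib

def description_length(reference: bytes, reads: list[bytes]) -> int:
    # Two-part MDL: L(model) + L(data | model), with zlib-compressed
    # length as a crude stand-in for description length.
    model_cost = len(zlib.compress(reference))
    # Approximate L(data | model) by the extra bytes needed to encode
    # the reads once the reference is already in the compressor's view.
    joint_cost = len(zlib.compress(reference + b"".join(reads)))
    data_cost = joint_cost - model_cost
    return model_cost + data_cost

def best_reference(candidates: dict[str, bytes], reads: list[bytes]) -> str:
    # Pick the reference that minimizes the total description length,
    # rather than merely counting the reads that align to it.
    return min(candidates, key=lambda n: description_length(candidates[n], reads))
```

Because the reads back-reference a matching reference during compression, a reference that actually explains the reads yields a smaller conditional cost even when read counts alone would not discriminate.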

    A novel leak detection approach in water distribution networks

    © 2021 IEEE. This paper proposes a novel leak monitoring framework that aims to improve the operation of water distribution networks (WDN). To that end, an online statistical hypothesis test based leak detection method is proposed. The main advantages of the developed method are, first, that it copes with the high computational time required for detecting leaks and, second, that it updates the KPCA model according to the dynamic changes of the process, so that it can be applied to massive, online datasets. Simulation results obtained from simulated WDN data demonstrate the effectiveness of the proposed technique. Peer reviewed. Postprint (author's final draft).
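As a toy sketch of the online hypothesis-test idea (not the paper's KPCA-based method), the following flags a leak when the mean of the flow residuals over a trailing window exceeds a Gaussian threshold. The window size, threshold, and noise model are illustrative assumptions.

```python
import numpy as np

def leak_alarms(residuals, window=50, z=2.326, sigma=1.0):
    # Flag sample t when the mean residual over the trailing window
    # exceeds the one-sided Gaussian threshold z * sigma / sqrt(window)
    # (z = 2.326 corresponds to roughly a 1% false alarm rate per test).
    r = np.asarray(residuals, dtype=float)
    alarms = np.zeros(len(r), dtype=bool)
    threshold = z * sigma / np.sqrt(window)
    for t in range(window, len(r)):
        alarms[t] = r[t - window:t].mean() > threshold
    return alarms
```

A persistent leak biases the residual mean upward, so the windowed statistic crosses the threshold shortly after the leak onset while behaving like a fixed-rate false alarm test beforehand.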

    Fault Detection of Single and Interval Valued Data Using Statistical Process Monitoring Techniques

    Principal component analysis (PCA) is a linear data analysis technique widely used for fault detection and isolation, data modeling, and noise filtration. PCA may be combined with statistical hypothesis testing methods, such as the generalized likelihood ratio (GLR) technique, in order to detect faults. GLR works by using maximum likelihood estimation (MLE) to maximize the detection rate for a fixed false alarm rate. The benchmark Tennessee Eastman Process (TEP) is used to examine the performance of the different techniques, and the results show that for processes that experience shifts in the mean and/or variance, the best performance is achieved by independently monitoring the mean and variance using two separate GLR charts, rather than simultaneously monitoring them using a single chart. Moreover, single-valued data can be aggregated into interval form in order to provide a more robust model with improved fault detection performance using PCA and GLR. The TEP example is used once more to demonstrate the effectiveness of using interval-valued data over single-valued data.
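The two separate GLR charts described above can be sketched as follows for Gaussian data: one statistic tests for a shift in the mean (variance known), the other for a change in variance (mean known). The formulas are the standard -2 log likelihood ratios; `sigma0` and the window contents are illustrative.

```python
import numpy as np

def glr_mean(x, sigma0=1.0):
    # -2 log likelihood ratio for a shift in the mean (variance known).
    x = np.asarray(x, dtype=float)
    return len(x) * (x.mean() / sigma0) ** 2

def glr_var(x, sigma0=1.0):
    # -2 log likelihood ratio for a change in variance (mean known, zero).
    x = np.asarray(x, dtype=float)
    ratio = np.mean(x ** 2) / sigma0 ** 2
    return len(x) * (ratio - np.log(ratio) - 1.0)
```

Each statistic is compared against its own control limit, so a mean shift and a variance change each raise an alarm on the chart that is sensitive to it, rather than being diluted in a single combined statistic.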

    Process Monitoring Using Data-Based Fault Detection Techniques: Comparative Studies

    Data-based monitoring methods are often utilized to carry out fault detection (FD) when process models are not available. Partial least squares (PLS) and principal component analysis (PCA) are two basic types of multivariate FD methods; however, both can only be used to monitor linear processes. Among the extended data-based methods for nonlinear processes, kernel PCA (KPCA) and kernel PLS (KPLS) are the most well known and widely adopted. KPCA and KPLS models have several advantages, since they do not require nonlinear optimization; only the solution of an eigenvalue problem is required. They also provide a better understanding of what kind of nonlinear features are extracted: the number of principal components (PCs) in the feature space is fixed a priori by selecting the appropriate kernel function. The objective of this work is therefore to use KPCA and KPLS techniques to monitor nonlinear data. The improved FD performance of KPCA and KPLS is illustrated through two simulated examples, one using synthetic data and the other using simulated continuously stirred tank reactor (CSTR) data. The results demonstrate that both KPCA and KPLS provide better detection than their linear versions.
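To illustrate the point that KPCA training reduces to an eigenvalue problem, the following minimal sketch eigendecomposes the double-centered Gram matrix of an RBF kernel and forms a T^2-type monitoring statistic from the retained scores. The kernel choice, `gamma`, and this particular T^2 form are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca_t2(X_train, X_test, n_comp=2, gamma=0.5):
    # KPCA training is just an eigenvalue problem on the double-centered
    # Gram matrix; T^2 sums the squared scores scaled by their variances.
    n = len(X_train)
    K = rbf_kernel(X_train, X_train, gamma)
    J = np.eye(n) - np.full((n, n), 1.0 / n)
    w, V = np.linalg.eigh(J @ K @ J)
    order = np.argsort(w)[::-1][:n_comp]
    w, V = w[order], V[:, order]
    # Center the test kernel rows consistently with the training centering.
    Kt = rbf_kernel(np.atleast_2d(X_test), X_train, gamma)
    Kt_c = (Kt - np.full((len(Kt), n), 1.0 / n) @ K) @ J
    scores = Kt_c @ (V / np.sqrt(w))
    return (scores ** 2 / (w / n)).sum(axis=1)
```

No nonlinear optimization appears anywhere: the nonlinearity enters only through the kernel function, and the number of retained components is fixed up front, exactly as the abstract notes.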

    Online statistical hypothesis test for leak detection in water distribution networks

    This paper aims at improving the operation of water distribution networks (WDN) by developing a leak monitoring framework. To that end, an online statistical hypothesis test based leak detection method is proposed. The developed technique, the so-called exponentially weighted online reduced kernel generalized likelihood ratio test (EW-ORKGLRT), is designed so that the modeling phase is performed using a reduced kernel principal component analysis (KPCA) model, which copes with the high computational cost. The computed model is then fed to the EW-ORKGLRT chart for leak detection. The proposed approach extends the ORKGLRT method by applying exponential weights to the residuals in the moving window, which may further enhance leak detection performance by detecting small and moderate leaks. The developed method's main advantages are, first, dealing with the high computational time required for detecting leaks and, second, updating the KPCA model according to the dynamic changes of the process. The developed method's performance is evaluated and compared to conventional techniques using simulated WDN data. The selected performance criteria are the detection rate, false alarm rate, and CPU time. Peer reviewed. Postprint (author's final draft).
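The idea of exponentially weighting the residuals in a moving window before forming a GLR-type statistic can be sketched as below. This is a simplified scalar analogue of the exponentially weighted chart, not the EW-ORKGLRT implementation itself; the window length, forgetting factor `lam`, and noise model are illustrative.

```python
import numpy as np

def ew_glr(residuals, window=30, lam=0.2, sigma0=1.0):
    # GLR-type statistic on an exponentially weighted moving window:
    # newer residuals get larger weights (1 - lam)^age, and the weighted
    # mean is tested against the no-leak hypothesis (zero mean).
    r = np.asarray(residuals, dtype=float)
    weights = (1.0 - lam) ** np.arange(window)[::-1]   # oldest -> newest
    weights /= weights.sum()
    n_eff = 1.0 / np.sum(weights ** 2)                 # effective sample size
    stats = np.zeros(len(r))
    for t in range(window, len(r)):
        m = weights @ r[t - window:t]
        stats[t] = n_eff * (m / sigma0) ** 2           # -2 log LR, Gaussian noise
    return stats
```

Because recent residuals dominate the weighted mean, the statistic reacts faster to small and moderate shifts than an unweighted window of the same length.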

    Gauge theories: A case study of how mathematics relates to the world.

    The goal of this thesis is to investigate the relation between mathematics and physics and the role this relation plays in what physics does best, that is, in scientific explanations. The case of gauge theories, which are highly mathematical, is used as an extended case study of how mathematics relates to physics and to the world, and these relations are examined from both a historical and a philosophical perspective. Gauge theories originated from an idea of Weyl which turned out to be wrong, or in other words, empirically inadequate. That original idea underwent a dramatic metamorphosis that turned the awkward caterpillar into a beautiful butterfly called gauge theories, which were very successful and dominated theoretical physics during the second half of the twentieth century. The only leftovers from Weyl's faux pas were the very name of the theories and the question of how it is possible for something as wrong as his original idea to result in a theory so relevant to the world. We argue that it is thanks to a very dynamic and dialectic relation between mathematicians and physicists, both theoretical and experimental, that the resulting theory turned out to be so successful. From a more philosophical perspective, we take the view that the relation between mathematics and physics has, in general, a structuralist character, and we recognize that what we call ambiguity of representation of the third type lies at the heart of gauge theories. Our claim is that it is precisely this type of ambiguity of representation, and the non-physical entities that it inevitably introduces, which explain the physical facts. However, the non-physical entities should be attributed a non-causal status in order to provide valid and legitimate scientific explanations. The fibre bundle formulation of gauge theories is considered to be the unique formulation that allows for this shift, and the Aharonov-Bohm effect, which is examined within the fibre bundle context, provides a narrower yet very fruitful case study.