5 research outputs found

    Improved Shewhart Chart Using Multiscale Representation

    Most univariate process monitoring techniques operate under three main assumptions: that the process residuals being evaluated are Gaussian, independent, and contain only a moderate level of noise. The performance of the conventional Shewhart chart, for example, is adversely affected when these assumptions are violated. Multiscale wavelet-based representation is a powerful data analysis tool that can help better satisfy these assumptions, i.e., decorrelate autocorrelated data, separate noise from features, and transform the data to follow a Gaussian distribution more closely at multiple scales. This research focused on developing an algorithm that extends the conventional Shewhart chart using multiscale representation to enhance its performance. On simulated synthetic data, the developed multiscale Shewhart chart showed improved performance (lower missed detection and false alarm rates) compared to the conventional Shewhart chart. The multiscale Shewhart chart was also applied to two real-world applications, simulated distillation column data and genomic copy number data, to illustrate its advantage over the conventional chart for process monitoring.
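    The idea can be sketched as follows. This is a minimal illustration, not the thesis's exact algorithm: it assumes a Haar wavelet, three decomposition levels, and classical 3-sigma limits, and it flags a time span whenever any wavelet coefficient covering it violates its per-scale limits.

```python
import numpy as np

def shewhart_limits(x, k=3.0):
    """Classical Shewhart chart: flag points outside mean +/- k*sigma."""
    mu, sigma = x.mean(), x.std(ddof=1)
    return np.abs(x - mu) > k * sigma

def haar_step(x):
    """One level of the Haar wavelet transform: approximation and detail."""
    x = x[: len(x) // 2 * 2]                 # truncate to even length
    a = (x[0::2] + x[1::2]) / np.sqrt(2)     # approximations (smoothed trend)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)     # details (noise + sharp features)
    return a, d

def multiscale_shewhart(x, levels=3, k=3.0):
    """Apply Shewhart limits to the detail coefficients at every scale and to
    the final approximation; a sample is flagged if any scale flags it."""
    x = np.asarray(x, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    a, stride = x, 1
    for _ in range(levels):
        a, d = haar_step(a)
        stride *= 2
        for i, bad in enumerate(shewhart_limits(d, k)):
            if bad:                          # map coefficient back to its time span
                flags[i * stride : (i + 1) * stride] = True
    for i, bad in enumerate(shewhart_limits(a, k)):
        if bad:
            flags[i * stride : (i + 1) * stride] = True
    return flags
```

    Because noise energy is spread across the detail scales while a fault concentrates in a few coefficients, a short mean shift that a single-scale chart may miss shows up clearly in the approximation or boundary detail coefficients.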

    Fault Detection of Single and Interval Valued Data Using Statistical Process Monitoring Techniques

    Principal component analysis (PCA) is a linear data analysis technique widely used for fault detection and isolation, data modeling, and noise filtration. PCA may be combined with statistical hypothesis testing methods, such as the generalized likelihood ratio (GLR) technique, in order to detect faults. GLR uses maximum likelihood estimation (MLE) to maximize the detection rate for a fixed false alarm rate. The benchmark Tennessee Eastman Process (TEP) is used to examine the performance of the different techniques, and the results show that for processes that experience shifts in the mean and/or variance, the best performance is achieved by independently monitoring the mean and variance using two separate GLR charts, rather than simultaneously monitoring them using a single chart. Moreover, single-valued data can be aggregated into interval form in order to provide a more robust model with improved fault detection performance using PCA and GLR. The TEP example is used once more in order to demonstrate the effectiveness of using interval-valued data over single-valued data.
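    A minimal sketch of a GLR chart for a mean shift in Gaussian residuals (such as the residuals a PCA model would produce) is shown below. The fixed window length and the assumption of a known, constant variance are simplifications for illustration; they are not the study's exact design.

```python
import numpy as np

def glr_mean_chart(residuals, window=10, sigma=1.0):
    """GLR statistic for a shift in the mean of N(0, sigma^2) residuals.
    Under H0 the mean is 0; the MLE of the shift over a moving window is
    the window average xbar, giving T = window * xbar^2 / (2 * sigma^2).
    Large T indicates the shifted-mean hypothesis fits much better."""
    r = np.asarray(residuals, dtype=float)
    stats = np.full(len(r), np.nan)          # nan until a full window exists
    for k in range(window - 1, len(r)):
        xbar = r[k - window + 1 : k + 1].mean()
        stats[k] = window * xbar**2 / (2 * sigma**2)
    return stats
```

    Under the no-fault hypothesis, 2T follows a chi-squared distribution with one degree of freedom, which is how a detection threshold for a chosen false alarm rate would be set.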

    Process Monitoring Using Data-Based Fault Detection Techniques: Comparative Studies

    Data-based monitoring methods are often utilized to carry out fault detection (FD) when process models are not available. Partial least squares (PLS) and principal component analysis (PCA) are two basic multivariate FD methods; however, both can only be used to monitor linear processes. Several nonlinear extensions of these methods have therefore been developed; among them, kernel PCA (KPCA) and kernel PLS (KPLS) are the most well-known and widely adopted. KPCA and KPLS models have several advantages: they do not require nonlinear optimization, since only the solution of an eigenvalue problem is required, and they provide a better understanding of what kind of nonlinear features are extracted, because the number of principal components (PCs) in the feature space is fixed a priori by selecting an appropriate kernel function. The objective of this work is therefore to use KPCA and KPLS techniques to monitor nonlinear data. The improved FD performance of KPCA and KPLS is illustrated through two simulated examples, one using synthetic data and the other using simulated continuously stirred tank reactor (CSTR) data. The results demonstrate that both KPCA and KPLS provide better detection than their linear counterparts.
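    The "only an eigenvalue problem" point can be seen in a bare-bones KPCA sketch: with a Gaussian (RBF) kernel, training reduces to centring the kernel matrix and taking its top eigenvectors. The kernel choice, `gamma` value, and the omitted centring of new-sample kernels are simplifying assumptions of this illustration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample matrices X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca_fit(X, n_components=2, gamma=1.0):
    """Kernel PCA: eigendecomposition of the centred kernel matrix.
    No nonlinear optimization is needed -- just one eigenvalue problem."""
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    J = np.eye(n) - np.ones((n, n)) / n      # centring matrix
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)          # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    alphas = vecs / np.sqrt(np.clip(vals, 1e-12, None))  # normalise in feature space
    return alphas, vals

def kpca_scores(X_train, X_new, alphas, gamma=1.0):
    """Project new samples onto the kernel principal components
    (centring of the new kernel rows is omitted in this sketch)."""
    return rbf_kernel(X_new, X_train, gamma) @ alphas
```

    Monitoring statistics such as T2 and SPE are then computed from these scores, exactly as in linear PCA, which is what makes the kernel extension attractive for nonlinear processes.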

    Novel Data-Based and Model-Based Algorithms for Process Monitoring and Equipment Degradation Tracking

    Process monitoring is a critical component of many industries, required in order to maintain product quality and enhance process safety, thereby increasing economic benefits. Process monitoring methods provide a means of determining whether a process is operating as expected or is experiencing faulty or aberrant conditions, e.g., process drifts or disturbances that disrupt the operation, which can result in plant shutdowns and economic losses due to downtime and maintenance. Process monitoring methods can be broadly categorized into qualitative model-based, quantitative model-based, and data-based methods. A primary objective of this work is to enhance the performance of monitoring algorithms by integrating the advantages of various data-driven and model-based methods. Data-based fault detection methods such as principal component analysis (PCA) and its extensions will be integrated with composite hypothesis tests, such as generalized likelihood ratio (GLR) charts, in order to obtain superior fault detection performance compared to conventional methods. The applicability of the developed fault detection algorithms will be examined using different illustrative examples, such as the Tennessee Eastman (TE) process. Monitoring process drifts and equipment degradation is another area of concern in process industries. Therefore, a second objective of this work is to develop an algorithm capable of detecting drifts in processes and equipment degradation, even when operating under control, by utilizing state estimation methods that can determine when a process is operating under sub-par conditions. The developed algorithm will be applied to an illustrative example of a heat exchanger, using both simulated synthetic and experimental data, to demonstrate its simplicity and practical applicability. This should enable the process engineer to make better executive decisions regarding the running of the plant.
    Pipeline flow and leak detection, specifically in subsea pipelines, is another important issue that needs to be addressed. A third objective of this work is therefore to design and develop an experimental setup to collect different sensor measurements and to utilize different fault detection and classification algorithms in order to study pipeline flow behavior.
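    The state-estimation idea behind drift tracking can be illustrated with a scalar Kalman filter in which the degrading quantity (e.g., a heat-exchanger fouling parameter) is modelled as a random walk observed in noise. The model, the noise variances `q` and `r`, and the random-walk assumption are illustrative choices, not the specific algorithm developed in the thesis.

```python
import numpy as np

def kalman_drift_track(y, q=1e-4, r=0.1):
    """Scalar Kalman filter tracking a slowly drifting process parameter:
        state:       x_k = x_{k-1} + w_k,   w_k ~ N(0, q)
        measurement: y_k = x_k + v_k,       v_k ~ N(0, r)
    Returns the filtered state estimates; a sustained move of the estimate
    away from its nominal value indicates degradation even while the
    measured outputs stay within control limits."""
    x, p = 0.0, 1.0                      # initial state estimate and variance
    estimates = []
    for yk in y:
        p = p + q                        # predict: variance grows by q
        k = p / (p + r)                  # Kalman gain
        x = x + k * (yk - x)             # update with the innovation
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)
```

    Because the filter averages out measurement noise while following slow changes, the estimated state exposes a gradual ramp that raw measurements hide.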

    SARS-CoV-2 vaccination modelling for safe surgery to save lives: data from an international prospective cohort study

    Background Preoperative SARS-CoV-2 vaccination could support safer elective surgery. Vaccine numbers are limited, so this study aimed to inform their prioritization by modelling. Methods The primary outcome was the number needed to vaccinate (NNV) to prevent one COVID-19-related death in 1 year. NNVs were based on postoperative SARS-CoV-2 rates and mortality in an international cohort study (surgical patients), and on community SARS-CoV-2 incidence and case fatality data (general population). NNV estimates were stratified by age (18-49, 50-69, 70 or more years) and type of surgery. Best- and worst-case scenarios were used to describe uncertainty. Results NNVs were more favourable in surgical patients than in the general population. The most favourable NNVs were in patients aged 70 years or more needing cancer surgery (351; best case 196, worst case 816) or non-cancer surgery (733; best case 407, worst case 1664). Both exceeded the NNV in the general population (1840; best case 1196, worst case 3066). NNVs for surgical patients remained favourable at a range of SARS-CoV-2 incidence rates in sensitivity analysis modelling. Globally, prioritizing preoperative vaccination of patients needing elective surgery ahead of the general population could prevent an additional 58 687 (best case 115 007, worst case 20 177) COVID-19-related deaths in 1 year. Conclusion As global roll-out of SARS-CoV-2 vaccination proceeds, patients needing elective surgery should be prioritized ahead of the general population. The aim of this study was to inform vaccination prioritization by modelling the impact of vaccination on elective inpatient surgery. The study found that patients aged at least 70 years needing elective surgery should be prioritized alongside other high-risk groups during early vaccination programmes. Once vaccines are rolled out to younger populations, prioritizing surgical patients is advantageous.
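    The arithmetic behind an NNV figure is simply the reciprocal of the absolute risk reduction. The sketch below shows that relationship only; the risk and efficacy numbers in it are made up for illustration and are not the study's inputs, which came from cohort and community data.

```python
def number_needed_to_vaccinate(baseline_mortality_risk, vaccine_efficacy):
    """NNV to prevent one death = 1 / absolute risk reduction, where the
    absolute risk reduction is the baseline 1-year risk of a COVID-19-related
    death multiplied by the vaccine's efficacy against death."""
    absolute_risk_reduction = baseline_mortality_risk * vaccine_efficacy
    return 1.0 / absolute_risk_reduction

# Hypothetical numbers for illustration only: a 0.5% one-year baseline risk
# and 90% efficacy give an NNV of about 222.
print(round(number_needed_to_vaccinate(0.005, 0.9)))
```

    A lower NNV means fewer doses per death averted, which is why the higher postoperative mortality risk of elderly surgical patients makes their NNVs more favourable than the general population's.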