
    Design and Application of Risk Adjusted Cumulative Sum (RACUSUM) for Online Strength Monitoring of Ready Mixed Concrete

    The Cumulative Sum (CUSUM) procedure is an effective statistical process control tool for monitoring the quality of ready mixed concrete (RMC) during production. Online quality monitoring refers to monitoring concrete quality at the RMC plant during the production process. In this paper, we design and apply a new CUSUM procedure for the RMC industry that accounts for the risks associated with RMC production, termed the Risk Adjusted CUSUM (RACUSUM). The 28-day characteristic cube compressive strengths of various grades of concrete, along with detailed information on the production process and its associated risks, were collected from operational RMC plants in and around Ahmedabad and Delhi (India). The risks are quantified using a likelihood-based scoring method. Finally, a risk-adjusted CUSUM model is developed by imposing the weighted score of the estimated risks on the conventional CUSUM plot. This model is a more effective and realistic tool for monitoring the strength of RMC.
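
    The abstract does not give the exact weighting formula, so the following is only a minimal sketch of one plausible reading of "imposing the weighted score of the estimated risks on the conventional CUSUM plot": a one-sided tabular CUSUM for strength drops in which each batch's deviation is inflated by its likelihood-based risk score. The function name, the `risk_weight` parameter, and all numbers are illustrative assumptions, not values from the paper.

```python
import numpy as np

def racusum(strengths, target, k=0.5, risk_scores=None, risk_weight=0.1):
    """Sketch of a risk-adjusted lower CUSUM for concrete cube strengths.

    strengths   : observed 28-day cube strengths, one per batch (MPa)
    target      : target mean strength for the concrete grade
    k           : reference (allowance) value of a conventional CUSUM
    risk_scores : per-batch likelihood-based risk scores (assumption:
                  higher score = riskier batch); None -> plain CUSUM
    risk_weight : how strongly risk inflates the plotted deviation
                  (hypothetical parameter, not from the paper)
    """
    strengths = np.asarray(strengths, dtype=float)
    if risk_scores is None:
        risk_scores = np.zeros_like(strengths)
    c_minus = np.zeros(len(strengths))  # lower CUSUM: detects strength drops
    s = 0.0
    for i, (x, r) in enumerate(zip(strengths, risk_scores)):
        # Impose the weighted risk score on the conventional deviation.
        dev = (target - x) + risk_weight * r
        s = max(0.0, s + dev - k)
        c_minus[i] = s
    return c_minus

# Hypothetical example: 10 batches of one grade, two flagged as risky.
rng = np.random.default_rng(0)
x = rng.normal(38.0, 2.0, 10)  # simulated cube strengths in MPa
risk = np.array([0, 0, 2, 0, 0, 3, 0, 0, 0, 0], dtype=float)
print(racusum(x, target=38.0, risk_scores=risk))
```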

    A Binary Control Chart to Detect Small Jumps

    The classic Np chart gives a signal if the number of successes in a sequence of independent binary variables exceeds a control limit. Motivated by engineering applications in industrial image processing and, to some extent, financial statistics, we study a simple modification of this chart which uses only the most recent observations. Our aim is to construct a control chart for detecting a shift of unknown size, allowing for an unknown distribution of the error terms. Simulation studies indicate that the proposed chart is superior in terms of out-of-control average run length when one is interested in detecting very small shifts. We provide a (functional) central limit theorem under a change-point model with local alternatives which explains this unexpected and interesting behavior. Since real observations are often not independent, the question arises whether these results still hold true in the dependent case. Indeed, our asymptotic results hold under the fairly general condition that the observations form a martingale difference array. This enlarges the applicability of our results considerably: firstly, to a large class of time series models, and, secondly, to locally dependent image data, as we demonstrate by an example.
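
    A minimal sketch of the modified chart as described: monitor only the most recent observations and signal when the number of successes in that window exceeds a control limit. The window size and limit below are illustrative assumptions; in practice the limit would be calibrated to a target in-control average run length.

```python
import random
from collections import deque

def moving_np_chart(binary_stream, window=50, limit=35):
    """Modified Np chart sketch: signal when the number of successes among
    the most recent `window` binary observations exceeds `limit`.
    Returns the index of the first alarm, or None if no alarm occurs."""
    recent = deque(maxlen=window)
    for t, x in enumerate(binary_stream):
        recent.append(x)
        if len(recent) == window and sum(recent) > limit:
            return t
    return None

# Hypothetical usage: binarized image residuals (exceedance = 1), with a
# small upward jump in the success probability halfway through the stream.
random.seed(1)
pre = [int(random.random() < 0.5) for _ in range(200)]   # in control
post = [int(random.random() < 0.8) for _ in range(200)]  # small jump
print(moving_np_chart(pre + post))
```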

    Improving Sensitivity of the DEWMA Chart with Exact ARL Solution under the Trend AR(p) Model and Its Applications

    The double exponentially weighted moving average (DEWMA) chart is a vital analytical control chart for tracking the quality of a process, and its sensitivity to process changes is evaluated using the average run length (ARL). The aim of this study is to derive an explicit formula for the ARL of the DEWMA chart under an autoregressive-with-trend model whose residuals are exponential white noise. The proposed method was compared with the ARL derived using the numerical integral equation (NIE) approach, and the explicit ARL formula decreased the computing time. The sensitivity of the DEWMA chart for the AR(p)-with-trend model was investigated by varying the exponential parameters relevant to various circumstances. The chart was also compared with the EWMA and CUSUM charts in terms of the ARL, standard deviation of the run length (SDRL), and median run length (MRL). The results indicate that the DEWMA chart has the highest performance and, when the shift size was small, high sensitivity for detecting process changes. Digital currency data are used to demonstrate the efficacy of the proposed method; the results are consistent with the simulated data. DOI: 10.28991/ESJ-2023-07-06-03
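
    The explicit ARL formula itself is specific to the paper, but the DEWMA recursion and the Monte Carlo run-length estimate it replaces are standard. Below is a minimal sketch, assuming an AR(p)-with-trend process with exponential white noise as described; the limit h, the AR coefficient, and the trend value are illustrative, not the paper's.

```python
import numpy as np

def dewma_run_length(lam, h, phi, trend, scale, n_max=10_000, rng=None):
    """Run length of a one-sided upper DEWMA chart on an AR(p)-with-trend
    process whose white noise is exponential with mean `scale`. All
    parameter values are illustrative assumptions; the paper derives the
    ARL for this setting in closed form rather than by simulation."""
    rng = rng or np.random.default_rng()
    x = np.full(len(phi), scale)   # crude start-up values
    z = d = scale                  # start the chart at the noise mean
    for t in range(1, n_max + 1):
        eps = rng.exponential(scale)
        x_t = trend * t + float(np.dot(phi, x)) + eps
        x = np.concatenate(([x_t], x[:-1]))
        z = lam * x_t + (1 - lam) * z  # first smoothing pass
        d = lam * z + (1 - lam) * d    # second smoothing pass
        if d > h:                      # alarm
            return t
    return n_max

rng = np.random.default_rng(0)
arl = np.mean([dewma_run_length(lam=0.1, h=3.0, phi=(0.2,), trend=0.001,
                                scale=1.0, rng=rng) for _ in range(500)])
print(arl)
```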

    Integrating Multiobjective Optimization With The Six Sigma Methodology For Online Process Control

    Over the past two decades, the Define-Measure-Analyze-Improve-Control (DMAIC) framework of the Six Sigma methodology and a host of statistical tools have been brought to bear on process improvement efforts in today's businesses. However, a major challenge of implementing the Six Sigma methodology is maintaining the process improvements and providing real-time performance feedback and control after solutions are implemented, especially in the presence of multiple process performance objectives. The consideration of a multiplicity of objectives in business and process improvement is commonplace and, quite frankly, necessary. However, balancing the collection of objectives is challenging, as the objectives are inextricably linked and oftentimes in conflict. Previous studies have reported varied success in enhancing the Six Sigma methodology by integrating optimization methods in order to reduce variability. These studies focus their enhancements primarily within the Improve phase of the Six Sigma methodology, optimizing a single objective. The current research and practice of using the Six Sigma methodology and optimization methods do little to address real-time feedback and control for online process control in the case of multiple objectives. This research proposes an innovative integrated Six Sigma multiobjective optimization (SSMO) approach for online process control. It integrates the Six Sigma DMAIC framework with a nature-inspired optimization procedure that iteratively perturbs a set of decision variables providing feedback to the online process, eventually converging to a set of tradeoff process configurations that improves and maintains process stability. For proof of concept, the approach is applied to a general business process model – a well-known inventory management model – that is formally defined and specifies various process costs as objective functions. The proposed SSMO approach and the business process model are programmed and incorporated into a software platform. Computational experiments are performed using both three sigma (3σ)-based and six sigma (6σ)-based process control, and the results reveal that the proposed SSMO approach performs far better than the traditional approaches in improving the stability of the process. This research investigation shows that the benefits of enhancing the Six Sigma method for multiobjective optimization and for online process control are immense.
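
    A minimal sketch of the core loop described above, under stated assumptions: the decision variables of a toy two-objective inventory model are repeatedly perturbed and a Pareto archive of non-dominated tradeoff configurations is maintained. The paper uses a nature-inspired optimizer and a formally defined inventory model; the cost functions and plain random perturbation here are stand-ins to keep the sketch short.

```python
import random

def objectives(q, r):
    """Toy stand-in for the inventory model's cost objectives
    (holding cost vs. expected stockout cost); purely illustrative."""
    holding = 0.5 * q + 0.1 * r
    stockout = 100.0 / (q + 1) + 50.0 / (r + 1)
    return holding, stockout

def dominates(a, b):
    """Pareto dominance for minimization: a is no worse everywhere
    and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def perturb_search(iters=2_000, seed=0):
    """Iteratively perturb the decision variables (order quantity q,
    reorder point r) and keep an archive of non-dominated configurations."""
    rng = random.Random(seed)
    archive = []                  # list of ((q, r), objective values)
    q, r = 50.0, 20.0             # arbitrary starting configuration
    for _ in range(iters):
        q2 = max(0.0, q + rng.gauss(0, 2))
        r2 = max(0.0, r + rng.gauss(0, 2))
        f2 = objectives(q2, r2)
        if not any(dominates(f, f2) for _, f in archive):
            archive = [(x, f) for x, f in archive if not dominates(f2, f)]
            archive.append(((q2, r2), f2))
            q, r = q2, r2         # continue search from the accepted point
    return archive

front = perturb_search()
print(len(front), "non-dominated configurations")
```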

    Time-Interval Analysis for Radiation Monitoring

    On-line radiation monitoring is essential to the U.S. Department of Energy (DOE) Environmental Management Science Program for assessing the impact of contaminated media at DOE sites. The goal of on-line radiation monitoring is to quickly detect small or abrupt changes in activity levels in the presence of a significant ambient background. The focus of this research is on developing effective statistical algorithms for on-line monitoring based on time-interval data (the time difference between two consecutive radiation pulses). Compared to the more commonly used count data, which are registered over a fixed count time, time-interval data have the potential to reduce the sampling time required to obtain statistically sufficient information to detect changes in radiation levels. This dissertation is organized into three sections based on three statistical methods: the sequential probability ratio test (SPRT), Bayesian statistics, and the cumulative sum (CUSUM) control chart. In each section, time-interval analysis based on one of the three statistical methods was investigated and compared to conventional analyses based on count data in terms of average run length (ARL, the average time to detect a change in radiation levels) and detection probability, with both experimental and simulated data. The experimental data were acquired with a DGF-4C (XIA, Inc.) system in list mode. Simulated data were obtained by using Monte Carlo techniques to draw random samples from a Poisson process. Statistical algorithms were developed using the statistical software package R and the programming functions built into the data analysis environment IGOR Pro 4.03. Overall, the results showed that statistical analyses based on time-interval data provided detection probabilities similar to or higher than those of analyses based on count data, and were able to make a quicker detection with fewer pulses at relatively high radiation levels. To increase the detection probability and further reduce the time needed to detect a change in radiation levels, modifications or adjustments were proposed for each of the three chosen statistical methods. Adjusting the preset background level in the SPRT test could reduce the average time to detect a source by 50%. The enhanced reset modification and moving prior modification proposed for the Bayesian analysis of time intervals resulted in a higher detection probability than the unmodified Bayesian analysis, and were independent of the amount of background data registered before a radioactive source was present. The robust CUSUM control chart coupled with a modified runs rule further reduced the ARL in response to changes in radiation levels while keeping the false positive rate at a required level, e.g., about 40% shorter than the standard time-interval CUSUM control chart at 10.0 cps relative to a background count rate of 2.0 cps. The developed statistical algorithms for time-interval data analysis demonstrate the feasibility and versatility of on-line radiation monitoring. The special properties of time-interval information provide an alternative for low-level radiation monitoring. These findings establish an important base for future on-line monitoring applications in which time-interval data are registered.
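
    A minimal sketch of the time-interval CUSUM idea: since the inter-pulse times of a Poisson process are exponential, each interval can be scored with the exponential log-likelihood ratio of an elevated count rate against the background rate, and a one-sided CUSUM signals when the cumulative score exceeds a decision limit. The rates 2.0 cps and 10.0 cps come from the abstract; treating 10.0 cps as the alternative rate and the limit h are illustrative assumptions.

```python
import numpy as np

def interval_cusum(intervals, lam0=2.0, lam1=10.0, h=5.0):
    """CUSUM on time-interval data: each inter-pulse time t is scored with
    the exponential log-likelihood ratio log(lam1/lam0) - (lam1-lam0)*t for
    an elevated rate lam1 (cps) against the background rate lam0 (cps).
    Returns the number of pulses until the alarm, or None."""
    s = 0.0
    llr_const = np.log(lam1 / lam0)
    for i, t in enumerate(intervals):
        s = max(0.0, s + llr_const - (lam1 - lam0) * t)
        if s > h:
            return i
    return None

# Simulated data: exponential intervals from background, then with a source.
rng = np.random.default_rng(0)
bg = rng.exponential(1 / 2.0, 200)    # 2.0 cps background
src = rng.exponential(1 / 10.0, 200)  # source raises the rate to 10.0 cps
print(interval_cusum(np.concatenate([bg, src])))
```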

    ARL Evaluation of a DEWMA Control Chart for Autocorrelated Data: A Case Study on Prices of Major Industrial Commodities

    The double exponentially weighted moving average (DEWMA) control chart, an extension of the EWMA control chart, is a useful statistical process control tool for detecting small shifts in the mean of processes with either independent or autocorrelated observations. In this study, we derived explicit formulas to compute the average run length (ARL) for a moving average process of order q (MA(q)) with exponential white noise running on a DEWMA control chart and verified their accuracy by comparison with the numerical integral equation (NIE) method; the results of both were in good agreement with the actual ARL. To investigate the efficiency of the proposed procedure, a performance comparison between the DEWMA control chart and the standard and modified EWMA control charts was conducted to determine which provided the smallest out-of-control ARL in several scenarios involving MA(q) processes. The DEWMA control chart provided the lowest out-of-control ARL in all cases of varying the exponential smoothing parameter and shift size. To illustrate the efficacy of the proposed methodology, the approach was applied to datasets of the prices of several major industrial commodities in Thailand. The findings show that the DEWMA procedure performed well in almost all of the scenarios tested. DOI: 10.28991/ESJ-2023-07-05-020
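
    The abstract's comparison can be illustrated with a simulation sketch: run an upper one-sided EWMA and DEWMA on the same MA(1)-with-exponential-noise process and estimate the out-of-control ARL for a small shift in the noise mean. The MA coefficient, control limits, and shift size are illustrative assumptions; the paper instead evaluates the ARL exactly from explicit formulas.

```python
import numpy as np

def run_length(chart, lam, h, theta=0.3, scale=1.0, shift=0.0,
               n_max=10_000, rng=None):
    """Run length of an upper one-sided EWMA or DEWMA chart on an MA(1)
    process x_t = eps_t + theta * eps_{t-1} with exponential white noise.
    theta, the limits h, and the shift size are illustrative assumptions;
    the paper computes the exact ARL for MA(q) from explicit formulas."""
    rng = rng or np.random.default_rng()
    eps_prev = rng.exponential(scale)
    z = d = scale * (1 + theta)  # start both charts at the in-control mean
    for t in range(1, n_max + 1):
        eps = rng.exponential(scale * (1 + shift))  # shifted noise mean
        x = eps + theta * eps_prev
        eps_prev = eps
        z = lam * x + (1 - lam) * z
        d = lam * z + (1 - lam) * d
        if (d if chart == "dewma" else z) > h:
            return t
    return n_max

rng = np.random.default_rng(1)
# In practice each limit h is calibrated so both charts share the same
# in-control ARL; the values here are ad hoc for the sketch.
for chart, h in [("ewma", 1.9), ("dewma", 1.7)]:
    arl1 = np.mean([run_length(chart, lam=0.1, h=h, shift=0.3, rng=rng)
                    for _ in range(500)])
    print(chart, round(float(arl1), 1))
```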