25 research outputs found

    A Theoretical Foundation for the Development of Process Capability Indices and Process Parameters Optimization under Truncated and Censoring Schemes

    Process capability indices (PCIs) provide a measure of the output of an in-control process that conforms to a set of specification limits. These measures, which assume that process output is approximately normally distributed, are intended for measuring process capability for manufacturing systems. After inspection, however, non-conforming units that fail to meet the specification limits are typically scrapped; hence, the actual distribution of shipped products that customers perceive is truncated. In this research, a set of customer-perceived PCIs is developed, based on the truncated normal distribution, as an extension of traditional manufacturer-based indices. Comparative studies and numerical examples reveal considerable differences between the traditional PCIs and the proposed PCIs. The comparison results suggest using the proposed PCIs for capability analyses when non-conforming products are scrapped prior to shipping to customers. Confidence interval approximations for the proposed PCIs are also developed, and a simulation technique is implemented to compare the proposed PCIs with their traditional counterparts across multiple performance scenarios. Robust parameter design (RPD), a systematic method for determining the optimum operating conditions that achieve quality improvement goals, is also studied within the realm of censored data. Data censoring occurs in time-oriented observations when some observations are unmeasurable outside a predetermined study period. The underlying conceptual basis of current RPD studies is random sampling from a normal distribution, assuming that all the data points are uncensored. However, censoring schemes are widely implemented in lifetime testing, survival analysis, and reliability studies. As such, this study develops detailed guidelines for a new RPD method that accounts for type I right-censoring.
The response functions are developed using nonparametric methods, including the Kaplan-Meier estimator, Greenwood's formula, and the Cox proportional hazards regression method. Various response-surface-based robust parameter design optimization models are proposed and demonstrated through a numerical example. Further, a process capability index for type I right-censored data is also developed using the nonparametric methods, for assessing the performance of a product based on its lifetime.
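The customer-perceived idea can be illustrated with a minimal sketch (not the exact indices developed in this work): compute a Cpk-style index from the moments of the normal distribution truncated to the specification limits, using assumed parameter values.

```python
import math

# Sketch: capability of the shipped (truncated) distribution when
# out-of-spec units are scrapped before shipping. Illustrative only;
# not the exact customer-perceived indices developed in the work above.
def phi(x):   # standard normal pdf
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def Phi(x):   # standard normal cdf
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

mu, sigma = 10.0, 1.0     # in-control process parameters (assumed)
LSL, USL = 8.0, 12.0      # specification limits (assumed)

# Traditional (manufacturer-based) Cpk from the untruncated normal
cpk = min(USL - mu, mu - LSL) / (3 * sigma)

# Mean and sd of what customers actually see: N(mu, sigma) truncated
# to [LSL, USL] (standard truncated-normal moment formulas)
a, b = (LSL - mu) / sigma, (USL - mu) / sigma
Z = Phi(b) - Phi(a)
m = (phi(a) - phi(b)) / Z                       # standardized mean
v = 1 + (a * phi(a) - b * phi(b)) / Z - m * m   # standardized variance
mu_t, sigma_t = mu + sigma * m, sigma * math.sqrt(v)

# The same Cpk formula applied to the truncated moments
cpk_trunc = min(USL - mu_t, mu_t - LSL) / (3 * sigma_t)
print(round(cpk, 3), round(cpk_trunc, 3))
```

Truncation shrinks the variance of the shipped distribution, so the customer-perceived index here exceeds the traditional one; the abstract reports that such differences can be considerable.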

    Statistical Quality Control with the qcr Package

    The R package qcr for Statistical Quality Control (SQC) is introduced and described. It includes a comprehensive set of univariate and multivariate SQC tools that completes and extends the SQC techniques available in R. Apart from integrating different R packages devoted to SQC (qcc, MSQC), qcr provides nonparametric tools that are highly useful when the Gaussian assumption is not met. The package computes standard univariate control charts: individual measurements, x̄, S, R, p, np, c, u, EWMA, and CUSUM. In addition, it includes functions to perform multivariate control charts such as Hotelling T², MEWMA and MCUSUM. As representative features, multivariate nonparametric alternatives based on data depth are implemented in this package: r, Q and S control charts. The qcr library also estimates the most complete set of capability indices, from the first to the fourth generation, covering the nonparametric alternatives, and produces the corresponding graphical capability analysis outputs, including process capability plots. Moreover, Phase I and II control charts for functional data are included. The work of Salvador Naya, Javier Tarrío-Saavedra, Miguel Flores and Rubén Fernández-Casal has been supported by MINECO grant MTM2017-82724-R, and by the Xunta de Galicia (Grupos de Referencia Competitiva ED431C-2020-14 and Centro de Investigación del Sistema universitario de Galicia ED431G 2019/01), all of them through the ERDF. The research of Miguel Flores has been partially supported by Grant PII-DM-002-2016 of Escuela Politécnica Nacional of Ecuador. In addition, the research of Javier Tarrío-Saavedra has also been funded by the eCOAR project (PC18/03) of CITIC.
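qcr itself is an R package, but the quantities an x̄/R chart routine reports can be illustrated with a short sketch. The data, subgroup size, and constant A2 = 0.577 (standard for subgroups of n = 5) below are textbook values, not taken from the package.

```python
# Sketch of what an x-bar/R chart function computes internally
# (illustrative only; qcr's own implementation is in R).
subgroups = [[9.8, 10.1, 10.0, 9.9, 10.2],
             [10.3, 9.7, 10.0, 10.1, 9.9],
             [9.9, 10.0, 10.2, 9.8, 10.1]]

A2 = 0.577  # Shewhart chart constant for subgroup size n = 5

xbars = [sum(g) / len(g) for g in subgroups]        # subgroup means
ranges = [max(g) - min(g) for g in subgroups]       # subgroup ranges
xbarbar = sum(xbars) / len(xbars)                   # grand mean (centre line)
rbar = sum(ranges) / len(ranges)                    # mean range

UCL = xbarbar + A2 * rbar   # upper control limit
LCL = xbarbar - A2 * rbar   # lower control limit
out_of_control = [x for x in xbars if not LCL <= x <= UCL]
print(round(LCL, 4), round(UCL, 4), out_of_control)
```

A point falling in `out_of_control` is the signal that the process must be investigated.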

    Capability Testing Based on Cpm with Multiple Samples

    Numerous process capability indices have been proposed in the manufacturing industry to provide unitless measures of process performance, which are effective tools for quality improvement and assurance. Most existing methods for capability testing are based on traditional frequentist approaches. Recently, Bayesian approaches have been proposed for testing the capability indices Cp and Cpm, but restricted to cases with one single sample. In this paper, we consider estimating and testing the capability index Cpm based on multiple samples, and propose accordingly a Bayesian procedure for testing Cpm. Based on this procedure, we develop a simple but practical method for practitioners to use in determining whether their manufacturing processes are capable of reproducing products satisfying the preset capability requirement: a process is deemed capable if all the points in the credible interval are greater than the pre-specified capability level. To make the proposed Bayesian approach practical for in-plant applications, we tabulate the minimum values of C*(p) for which the posterior probability p reaches various desirable confidence levels.
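A plug-in estimate of Cpm pooled over multiple samples can be sketched as follows; the Bayesian credible-interval procedure itself is not reproduced here, and the sample values and limits are invented for illustration.

```python
import math

# Plug-in estimate of Cpm pooled over multiple samples.
# Cpm penalizes deviation from the target T, not just variance:
# Cpm = (USL - LSL) / (6 * sqrt(E[(X - T)^2]))
def cpm_estimate(samples, LSL, USL, target):
    data = [x for s in samples for x in s]            # pool the samples
    n = len(data)
    # mean squared deviation from the target
    tau2 = sum((x - target) ** 2 for x in data) / n
    return (USL - LSL) / (6 * math.sqrt(tau2))

samples = [[9.9, 10.1, 10.0], [10.2, 9.8, 10.0], [10.1, 9.9, 10.0]]
print(round(cpm_estimate(samples, LSL=9.4, USL=10.6, target=10.0), 4))
```

A process centred on the target with small spread yields a high Cpm; shifting the pooled mean away from the target lowers it even if the variance is unchanged.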

    Statistical process control by quantile approach.

    Most quality control and quality improvement procedures involve making assumptions about the distributional form of the data used; usually that the data are normally distributed. However, it is commonplace to find processes that generate non-normally distributed data; e.g., Weibull, logistic or mixture data are increasingly encountered. Any method that seeks to avoid transforming non-normal data requires techniques for identifying the appropriate distribution, and even when the appropriate distribution is known, such methods are often intractable to implement. This research is concerned with statistical process control (SPC), which can be applied to variable and attribute data. The objective of SPC is to control a process in an ideal situation with respect to a particular product specification. One of the several measurement tools of SPC is the control chart, and this research is mainly concerned with control charts for process monitoring and quality improvement. Control charts are a useful monitoring technique when a source of variability is present, providing a signal that the process must be investigated. In general, Shewhart control charts assume that the data follow a normal distribution; hence, most SPC techniques have been derived and constructed using a concept of quality that depends on the normal distribution. In reality, data sets such as chemical process data and lifetime data are often not normal, so a control chart constructed for x̄ or R under a normality assumption will give inaccurate results when the data are non-normal. Schilling and Nelson (1976) investigated, under the central limit theorem, the effect of non-normality on x̄ charts and concluded that non-normality is usually not a problem for subgroup sizes of four or more.
However, for smaller subgroup sizes, and especially for individual measurements, non-normality can be a serious problem. The literature review indicates that there are real problems in dealing with statistical process control for non-normal and mixture distributions. This thesis provides a quantile approach for non-normal distributions, used to construct a median rankit control chart. The quantile approach is also used to calculate the process capability index, the average run length (ARL), multivariate control charts and control charts for mixture distributions in non-normal situations. This methodology can be easily adopted by practitioners of statistical process control.
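The quantile idea can be sketched for an assumed Weibull process: rather than placing control limits at mu +/- 3 sigma, place them at the 0.135% and 99.865% quantiles of the fitted distribution (the same tail probabilities the normal 3-sigma limits correspond to). The Weibull parameters below are invented.

```python
import math

# Quantile-based control limits for a non-normal (Weibull) process.
# Parameters are assumed known; in practice they would be fitted.
shape, scale = 1.5, 2.0

def weibull_quantile(p, k, lam):
    # Inverse CDF of the Weibull distribution: Q(p) = lam * (-ln(1-p))^(1/k)
    return lam * (-math.log(1 - p)) ** (1 / k)

LCL = weibull_quantile(0.00135, shape, scale)   # lower control limit
CL = weibull_quantile(0.5, shape, scale)        # median as centre line
UCL = weibull_quantile(0.99865, shape, scale)   # upper control limit
print(round(LCL, 4), round(CL, 4), round(UCL, 4))
```

Note the limits are asymmetric about the centre line, reflecting the skew of the Weibull distribution; symmetric mu +/- 3 sigma limits would misstate both tail probabilities.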

    Combining Capability Indices and Control Charts in the Process and Analytical Method Control Strategy

    Different control charts, in combination with the process capability indices Cp, Cpm and Cpk, were evaluated as part of the control strategy, since both are key elements in determining whether a method or process is reliable for its purpose. All these aspects were analyzed using real data from unitary processes and analytical methods. The traditional x-chart and moving range chart confirmed that both the analytical method and the process are in control and stable; therefore, the process capability indices could be computed. We applied different criteria to establish the specification limits (i.e., analyst/customer requirements) for fixed method or process performance (i.e., process or method requirements). The unitary process does not satisfy the minimum capability requirements for the Cp and Cpk indices when the specification limits and control limits are equal in breadth. Therefore, the process needs to be revised; in particular, greater control of the process variation is necessary. For the analytical method, the Cpm and Cpk indices were computed, and the results were similar in both cases. For example, if the specification limits are set at ±3% of the target value, the method is considered “satisfactory” (1.22 < Cpm < 1.50) and no further stringent precision control is required.

    Development and application of process capability indices

    In order to measure the performance of manufacturing processes, several process capability indices have been proposed. A process capability index (PCI) is a unitless number used to measure the ability of a process to continuously produce products that meet customer specifications. These indices have helped practitioners understand and improve their production systems, but no single index can fully measure the performance of any observed process. Each index has its own drawbacks, which can be complemented by using others. Advantages of commonly used indices in assessing different aspects of process performance are highlighted. Quality cost is also a function of the shift in mean, shift in variance and shift in yield. A hybrid is developed that complements the strengths of these individual indices and provides the smallest set of indices that gives the practitioner detailed information on the shift in mean or variance, the location of the mean, yield and potential capability. It is validated that while no single index can fully assess and measure the performance of a univariate normal process, the optimal set of indices selected by the proposed hybrid can simultaneously provide precise information on the shift in mean or variance, the location of the mean, yield and potential capability. A simulation study increased the process variability by 100% and then reduced it by 50%; the optimal set detected each shift. The asymmetric ratio was able to detect both the 10% decrease and the 20% increase in µ, but did not change significantly with a 50% decrease or a 100% increase in σ, meaning it was not sensitive to shifts in σ. The implementation of the hybrid provides the quality practitioner, or a computer-aided manufacturing system, with a guideline on the prioritised tasks needed to improve process capability and reduce the cost of poor quality.
The author extended the proposed hybrids to fully measure the performance of a process with multiple quality characteristics that follow a normal distribution and are correlated. Furthermore, for multivariate normal processes with correlated quality characteristics, process capability analysis is not complete without fault diagnostics: the identification and ranking of the quality characteristics responsible for poor multivariate process performance. Quality practitioners wish to identify and rank the quality characteristics responsible for poor performance in order to prioritise resources for process quality improvement tasks, thereby speeding up the process and minimising quality costs. To date, none of the existing commonly used source identification approaches can classify whether the process behaviour is caused by a shift in mean or a change in variance. The author has proposed a source identification algorithm based on mean and variance impact factors to address this shortcoming. Furthermore, the author developed a novel fault diagnostic hybrid based on the proposed optimal set selection algorithm, principal component analysis, machine learning, and the proposed impact factors. The novelty of this hybrid is that it can carry out a full multivariate process capability analysis and provides a robust tool to precisely identify and rank the quality characteristics responsible for shifts in mean, variance and yield. The fault diagnostic hybrid can guide practitioners to identify and prioritise the quality characteristics responsible for poor process performance, thereby reducing quality costs by effectively speeding up multivariate process improvement tasks. Simulated scenarios were generated to increase/decrease some components of the mean vector (µ2/µ4) and to increase/reduce the variability of some components (σ1 reduced to close to zero; σ6 multiplied by 100%).
The hybrid ranked X2 and X6 as the variables contributing most to the poor process performance, and X1 and X4 as the major contributors to process yield. There is a great challenge in carrying out process capability analysis and fault diagnostics on a high-dimensional multivariate non-normal process, with multiple correlated quality characteristics, in a timely manner. The author has developed a multivariate non-normal fault diagnostic hybrid capable of assessing performance and performing fault diagnostics on multivariate non-normal processes. The proposed hybrid first utilizes the Geometric Distance (GD) approach to reduce the dimensionality of the correlated data into a smaller number of independent GD variables, which can be assessed using univariate process capability indices. This is followed by fitting Burr XII distributions to the independent GD variables; the fitted distributions are used to estimate both yield and multivariate process capability in a time-efficient way. Finally, a machine learning approach is deployed to carry out fault diagnostics by identifying and ranking the correlated quality characteristics responsible for the poor performance of the least-performing GD variable. The results show that the proposed hybrid is robust in estimating both yield and multivariate process capability, carrying out fault diagnostics beyond the GD variables, and identifying the original characteristics responsible for poor performance. The novelty of the proposed non-normal fault diagnostic hybrid is that it considers the quality characteristics related to the least-performing GD variable, instead of investigating all the quality characteristics of the multivariate non-normal process. The efficacy of the proposed hybrid is assessed through real manufacturing examples and simulated scenarios.
Variables X1, X2 and X3 were shifted away from the target by 25%, 15% and 35%, respectively, and the hybrid identified variable X3 as contributing the most to the corresponding geometric distance variable's poor performance.
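A minimal sketch of the Geometric Distance reduction described above, with invented observations, group membership and targets; the Burr XII fitting and machine-learning steps are omitted.

```python
import math

# Geometric Distance (GD) reduction: a group of correlated quality
# characteristics is collapsed into one GD variable, the distance of
# each unit from the group's target vector. The GD variable can then
# be assessed with univariate process capability indices.
observations = [
    [10.1, 5.2, 20.3],   # one row per unit: (X1, X2, X3), values invented
    [9.8, 4.9, 19.7],
    [10.2, 5.1, 20.1],
]
targets = [10.0, 5.0, 20.0]      # target vector for the group
group = [0, 1, 2]                # indices of the correlated characteristics

gd = [math.sqrt(sum((row[i] - targets[i]) ** 2 for i in group))
      for row in observations]
print([round(g, 4) for g in gd])  # one GD value per unit
```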

    Method of Discrete Orthogonal Basis Restoration

    A method is described for utilizing a discrete orthogonal basis to restore signal systems, such as radio or sound waves, and/or image systems, such as photographs or medical images, that become distorted while being acquired, transmitted and/or received. The signal or image systems are of the linear type and may be represented by the equation [B][o] = [i], wherein [o] is the original signal or image, [i] is the degraded signal or image and [B] is the system transfer function matrix. The method involves estimating a signal-to-noise ratio for the restored signal or image. Next, a set of orthogonal basis functions is selected to provide a stable inverse solution based upon the estimated signal-to-noise ratio. This is followed by removing time- and/or spatially-varying distortions in the restored system and obtaining an appropriate inverse solution vector.
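The inverse-solution step can be sketched in miniature: expand the degraded observation [i] in an orthogonal basis of [B] and invert only the components whose eigenvalues clear a threshold tied to the estimated signal-to-noise ratio. The matrix, basis, and threshold below are invented for illustration and are not the patent's actual procedure.

```python
import math

# Regularized inverse of [B][o] = [i] via an orthogonal expansion.
# B is taken symmetric so its eigenvectors form an orthonormal basis;
# the small-eigenvalue component is dropped to keep the inverse stable.
B_eigvals = [2.0, 0.001]                 # eigenvalues of B (one near-singular)
B_eigvecs = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
             [1 / math.sqrt(2), -1 / math.sqrt(2)]]   # orthonormal basis
i_signal = [1.0, 0.5]                    # degraded observation
snr_threshold = 0.01                     # cutoff from the estimated SNR

o_restored = [0.0, 0.0]
for lam, v in zip(B_eigvals, B_eigvecs):
    coeff = sum(vi * ii for vi, ii in zip(v, i_signal))  # project i onto v
    if lam > snr_threshold:              # keep only stable components
        for k in range(len(o_restored)):
            o_restored[k] += (coeff / lam) * v[k]
print(o_restored)
```

Dividing by the 0.001 eigenvalue would amplify noise a thousandfold; dropping that component trades a small bias for a stable restoration, which is the point of choosing the basis set from the estimated signal-to-noise ratio.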

    Modeling and Optimization of Stochastic Process Parameters in Complex Engineering Systems

    For quality engineering researchers and practitioners, a wide range of statistical tools and techniques is available for use in the manufacturing industry. The goal in applying these tools has always been to improve or optimize a product or process in terms of efficiency, production cost, or product quality. While tremendous progress has been made in the design of quality optimization models, there remains a significant gap between existing research and the needs of the industrial community. Contemporary manufacturing processes are inherently more complex: they may involve multiple stages of production or require the assessment of multiple quality characteristics. New and emerging fields, such as nanoelectronics and molecular biometrics, demand degrees of precision and estimation that are not attainable with current tools and measures. And since most researchers focus on a specific type of characteristic or a given set of conditions, there are many critical industrial processes for which models are not applicable. Thus, the objective of this research is to improve existing techniques by expanding not only their range of applicability but also their ability to model a given process realistically. Several quality models are proposed that seek greater precision in the estimation of process parameters and the removal of assumptions that limit their breadth and scope. An extension is made to examine the effectiveness of these models in both non-standard conditions and areas that have not been previously investigated. Following an in-depth literature review, various quality models are proposed, and numerical examples are used to validate the methodologies.

    Quality prediction in manufacturing system design.

    Manufacturing system design can significantly affect the resulting product quality level. Therefore, the early prediction of product quality, as affected by manufacturing system configuration decisions, can enhance a manufacturer's competitiveness by achieving higher quality levels at lower costs in a responsive manner. In this research, a conceptual framework is proposed for the proactive assessment of product quality in terms of the manufacturing system configuration parameters. A new comprehensive model for comparing different system configurations based on quality is developed using the Analytic Hierarchy Process. In addition, a hierarchical fuzzy inference system is developed to model the ill-defined relation between manufacturing system design parameters and the resulting product quality. This model is capable of mapping the considered manufacturing system configuration parameters into a Configuration Capability Indicator (CCI), expressed in terms of a sigma capability level, which can be compared to the benchmark Six Sigma capability. The developed models have been applied to several case studies (Test Parts ANC-90 and ANC-101, Cylinder Head Part Family, Gearbox Housing, Rack Bar Machining, and Siemens Jeep Intake Manifold) with different configuration scenarios for illustration and verification. The results demonstrate the capabilities of the CCI in comparing different system configurations from a quality point of view and in supporting decision-making during the early stages of manufacturing system development. The application of the developed models emphasized that high quality levels can be achieved by investigating all improvement opportunities, and it is recommended that efforts be directed first to designing the system with high defect prevention capability.
This can be achieved by using highly capable processes, implementing mistake-proofing techniques, and minimizing variability due to parallel processing and variation stack-up. Considering the relationship between quality and complexity, it is concluded that the CCI represents the time-independent real complexity of a system configuration. Furthermore, it is demonstrated that product complexity adversely affects the resulting product quality. Therefore, high product quality levels can be achieved not only by using highly capable system configurations, but also by minimizing product complexity during the design stage. Dept. of Industrial and Manufacturing Systems Engineering. Paper copy at Leddy Library: Theses & Major Papers - Basement, West Bldg. / Call Number: Thesis2006 .N33. Source: Dissertation Abstracts International, Volume: 67-07, Section: B, page: 4035. Thesis (Ph.D.)--University of Windsor (Canada), 2006.
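The CCI above is expressed as a sigma capability level. A common convention (assumed here, not stated in the abstract) converts a defect rate to a sigma level using the standard 1.5-sigma long-term shift: sigma = Phi^-1(1 - DPMO/10^6) + 1.5.

```python
import math

# Defect rate (defects per million opportunities) to sigma capability
# level, with the conventional 1.5-sigma long-term shift. The inverse
# normal CDF is found by bisection on math.erf.
def norm_ppf(p, lo=-10.0, hi=10.0):
    for _ in range(200):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def sigma_level(dpmo):
    return norm_ppf(1 - dpmo / 1e6) + 1.5

print(round(sigma_level(3.4), 2))   # 3.4 DPMO is the Six Sigma benchmark
```

Under this convention, 3.4 DPMO maps to a level of about 6.0, the Six Sigma benchmark against which the abstract compares the CCI.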