203 research outputs found

    Cumulative sum quality control charts design and applications

    Includes bibliographical references (pages 165-169). Classical statistical process control charts are essential in statistical control exercises and have therefore received constant attention for quality improvement. However, establishing control charts requires large-sample data (say, no fewer than 1,000 data points). On the other hand, the small-sample-based Grey System Theory approach is well established and applied in many areas: social, economic, industrial, military, and scientific research fields. In this research, the short-term trend curve given by the GM(1,1) model is merged into the Shewhart and two-sided CUSUM control charts to establish the Grey Predictive Shewhart control chart and the Grey Predictive CUSUM control chart. The GM(2,1) model is also briefly examined to check how accurate it is compared with the GM(1,1) model in control charts. Industrial process data collected from the TBF Packaging Machine Company in Taiwan were analyzed with these new developments as an illustrative example of grey quality control charts.
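
    To make the grey-prediction idea concrete, here is a minimal Python sketch (not the thesis's exact construction): it fits a GM(1,1) model to a short series by least squares and runs a tabular two-sided CUSUM on the one-step-ahead prediction errors. The toy data and the design constants (k_slack, h_limit) are illustrative assumptions.

```python
# Minimal sketch, assuming a short positive series: GM(1,1) fitted by least squares,
# then a two-sided tabular CUSUM on the prediction errors. Not the thesis's exact chart.
import numpy as np

def gm11_fit_predict(x0):
    """Fit GM(1,1) to series x0 and return in-sample one-step-ahead predictions."""
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                          # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])               # background values z1(k), k = 2..n
    B = np.column_stack([-z1, np.ones(n - 1)])  # model x0(k) = -a*z1(k) + b
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(n)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a      # predicted AGO series
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])    # restored (inverse AGO) predictions
    return x0_hat

def two_sided_cusum(e, k_slack, h_limit):
    """Tabular two-sided CUSUM on residuals e; returns an out-of-control flag per point."""
    c_plus, c_minus, alarms = 0.0, 0.0, []
    for ei in e:
        c_plus = max(0.0, c_plus + ei - k_slack)
        c_minus = max(0.0, c_minus - ei - k_slack)
        alarms.append(c_plus > h_limit or c_minus > h_limit)
    return alarms

data = [10.2, 10.5, 10.4, 10.9, 11.1, 11.3, 11.8, 12.0]     # toy small-sample series
resid = np.array(data) - gm11_fit_predict(data)
sigma = resid.std(ddof=1)
# Conventional CUSUM constants (k = 0.5*sigma, h = 4*sigma) used here as an assumption.
print(two_sided_cusum(resid, 0.5 * sigma, 4.0 * sigma))
```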

    Studies on chain sampling schemes in quality and reliability engineering

    Ph.D. (Doctor of Philosophy)

    Determining continual improvement process methods within quality management systems

    Institutionalized standards require organizations to actively define and implement quality management systems, which include active participation in continual improvement efforts. Interpretations and practices vary regarding implementation methodology. Traditional views of quality do not integrate the technical disciplines into a defined science that would support a standardized approach to implementing continual improvement. To optimize improvement efforts, a conceptual hypothesis is proposed to integrate quality by combining and coordinating the implementation efforts of engineering, control, assurance, improvement, and costs. The purpose of this thesis is to establish a roadmap to assist in choosing effective quality improvement methodologies and toolsets that help enhance customer satisfaction, which is desirable as part of a total quality management philosophy. Research is warranted to consolidate these bodies of knowledge into an extended science that establishes standardized practices in the area of quality improvement.

    Diagnosis of Machining Conditions Based on Logical Analysis of Data

    RÉSUMÉ (translated from French): A key element of an unattended, automated machining system is the development of reliable and robust monitoring and control systems. Several mathematical and statistical models describing the relationship between the independent variables and the dependent machining variables have been suggested in the literature, from the Taylor model up to the most sophisticated regression models. None of these models is dynamic, in the sense that their parameters do not change over time. Models based on artificial intelligence have solved many problems in this field, but the search continues. In this thesis, I propose the application of an approach called Logical Analysis of Data (LAD) to predict the outcome of a machining process. This approach has demonstrated good performance and additional capabilities when compared with traditional design of experiments or with mathematical and statistical modelling; in this thesis it is also compared with the well-known artificial neural network method. It is based on exploiting data captured by sensors and extracting useful information from those data. LAD is used to determine the best machining conditions, detect tool wear, identify the optimal replacement time for machining tools, and monitor and control machining processes. Since sensor and information technologies are both expanding rapidly and continuously, an analysis tool such as LAD is expected to help blaze a trail toward improving machining processes with state-of-the-art techniques and significantly reducing their cost. The results of my work could have an important impact on the optimization of these processes.
    ABSTRACT: A key issue for an unattended and automated machining system is the development of reliable and robust monitoring and controlling systems. Research in artificial-intelligence-based monitoring of machining systems covers several issues and has solved many problems, but the search continues for a robust technique that does not depend on a statistical learning background and does not have ambiguous procedures. In this thesis, I propose the application of an approach called Logical Analysis of Data (LAD), which is based on the exploitation of data captured by sensors and the extraction of useful information from these data. LAD is used for determining the best machining conditions, detecting tool wear, identifying the optimal replacement time for machining tools, and monitoring and controlling machining processes. LAD has demonstrated good performance and additional capabilities when compared with the well-known statistical technique, the Proportional Hazards Model (PHM), and the well-known machine learning technique, the Artificial Neural Network (ANN). Since sensor and information technologies are both expanding rapidly and continuously, it is expected that an analysis tool such as LAD will help blaze a new trail in machining processes by using state-of-the-art techniques in order to significantly reduce the cost of machining processes.
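
    As a rough illustration of the LAD idea described above (not the thesis's implementation), the sketch below binarizes two hypothetical sensor features at fixed cut-points and keeps the small conjunctions of binary conditions ("patterns") that cover only worn-tool observations. The feature names, cut-points, and data are made up.

```python
# Toy sketch of LAD-style pattern generation: binarize features, then keep
# conjunctions that cover worn-tool observations and no healthy-tool observations.
from itertools import combinations

# Hypothetical observations: (spindle_power, vibration_rms, tool_worn?)
data = [
    (2.1, 0.30, False), (2.3, 0.35, False), (2.2, 0.33, False),
    (3.1, 0.55, True),  (3.4, 0.60, True),  (2.9, 0.52, True),
]
cutpoints = {"power>2.6": lambda r: r[0] > 2.6, "vib>0.45": lambda r: r[1] > 0.45}

def binarize(row):
    """Map a raw observation to its binary attributes at the chosen cut-points."""
    return {name: test(row) for name, test in cutpoints.items()}

patterns = []
names = list(cutpoints)
for size in (1, 2):
    for combo in combinations(names, size):
        covers_pos = any(r[2] and all(binarize(r)[c] for c in combo) for r in data)
        covers_neg = any(not r[2] and all(binarize(r)[c] for c in combo) for r in data)
        if covers_pos and not covers_neg:          # a "pure" positive pattern
            patterns.append(combo)

print("positive (worn-tool) patterns:", patterns)
```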

    Robust control charts via winsorized and trimmed estimators

    In process control, cumulative sum (CUSUM), exponentially weighted moving average (EWMA), and synthetic control charts are developed to detect small and moderate shifts. Small shifts, which are hard to detect, can be costly to process control if left undetected for a long period. These control charts are not reliable under non-normality because their design structure is based on the sample mean, which is sensitive to outliers, a common cause of non-normality. To circumvent the problem, this study applied robust location estimators in the design structure of the control charts instead of the sample mean. For this purpose, four robust estimators were chosen, namely the 20% trimmed mean, the median, the modified one-step M-estimator (MOM), and the winsorized MOM (WMOM). The proposed charts were tested under several conditions, including different sample sizes, shift sizes, and types of non-normal distributions represented by the g-and-h distribution. Random variates for each distribution were obtained using SAS RANNOR before being transformed to the desired type of distribution. Robustness and detection ability of the charts were gauged through the average run length (ARL) via a simulation study. Validation of the charts' performance on real data, specifically on potential diabetic patients at Universiti Utara Malaysia, shows that the robust EWMA and robust CUSUM charts outperform the standard charts, which concurs with the results of the simulation study. Even though the robust synthetic chart is not among the best choices, as it cannot detect small shifts as quickly as CUSUM or EWMA, its performance is much better than that of the standard chart under non-normality. This study reveals that all the proposed robust charts fare better than the standard charts under non-normality and are comparable with the latter under normality. The most robust among the investigated charts are the EWMA control charts based on MOM and WMOM. These robust charts can quickly detect small shifts regardless of distributional shape and work well with small sample sizes. These characteristics suit industrial needs in process monitoring.
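
    A minimal sketch of the approach described above, assuming subgrouped data: an EWMA chart whose plotting statistic is a 20% trimmed subgroup mean rather than the ordinary mean. The chart constants (lambda, L), the heavy-tailed toy data, and the way the in-control standard deviation of the statistic is estimated are illustrative assumptions, not the thesis's settings.

```python
# Hedged sketch: EWMA chart of 20%-trimmed subgroup means under heavy-tailed data.
import numpy as np

def trimmed_mean(x, trim=0.20):
    """Symmetric trimmed mean: drop the lowest and highest `trim` fraction, average the rest."""
    x = np.sort(np.asarray(x, dtype=float))
    g = int(np.floor(trim * len(x)))
    return x[g:len(x) - g].mean()

def ewma_signals(stats, mu0, sigma0, lam=0.2, L=3.0):
    """EWMA of a stream of subgroup statistics; flag points beyond the time-varying limits."""
    z, flags = mu0, []
    for t, s in enumerate(stats, start=1):
        z = lam * s + (1 - lam) * z
        half_width = L * sigma0 * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        flags.append(abs(z - mu0) > half_width)
    return flags

rng = np.random.default_rng(1)
subgroups = 5.0 + 0.5 * rng.standard_t(df=3, size=(30, 10))   # heavy-tailed subgroups of size 10
subgroups[20:] += 0.4                                          # small sustained mean shift
stats = [trimmed_mean(g) for g in subgroups]
sigma_stat = np.std(stats[:20], ddof=1)   # in-control sigma of the statistic (simple assumption)
print(ewma_signals(stats, mu0=5.0, sigma0=sigma_stat))
```

    Swapping trimmed_mean for the median or an M-estimator changes only the plotting statistic; the chart logic stays the same.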

    Process capability assessment for univariate and multivariate non-normal correlated quality characteristics

    In today's competitive business and industrial environment, it is becoming more crucial than ever to precisely assess process losses due to non-compliance with customer specifications. To assess these losses, industry makes extensive use of process capability indices (PCIs) for performance evaluation of processes. Determining the performance capability of a stable process using the standard process capability indices requires that the underlying quality characteristic data follow a normal distribution. However, it is an undisputed fact that real processes very often produce non-normal quality characteristic data, and these quality characteristics are also very often correlated with each other. For such non-normal and correlated multivariate quality characteristics, applying standard capability measures with conventional methods can lead to erroneous results. The research undertaken in this PhD thesis presents several capability assessment methods to estimate process performance more precisely and accurately, based on univariate as well as multivariate quality characteristics. The proposed methods also take into account the correlation, variance and covariance, as well as the non-normality, of the quality characteristic data. A comprehensive review of existing univariate and multivariate PCI estimation methods is provided. We propose fitting Burr XII distributions to continuous, positively skewed data. The proportion of nonconformance (PNC) for process measurements is then obtained from the Burr XII distribution, rather than through the traditional practice of fitting different distributions to the real data. The maximum likelihood method is deployed to improve the accuracy of the PCI based on the Burr XII distribution, and different numerical methods, such as evolutionary and simulated annealing algorithms, are deployed to estimate the parameters of the fitted Burr XII distribution. We also introduce a new transformation method, called the Best Root Transformation approach, to transform non-normal data to normal data and then apply the traditional PCI method to estimate the proportion of non-conforming data. Another approach introduced in this thesis is to deploy the Burr XII cumulative distribution function for PCI estimation using the Cumulative Distribution Function (CDF) technique. This is in contrast to the approach adopted in the research literature, i.e., fitting the best density function from known distributions to non-normal data for PCI estimation. The proposed CDF technique has also been extended to estimate process capability for bivariate non-normal quality characteristic data. A new multivariate capability index based on the Generalized Covariance Distance (GCD) is proposed. This novel approach reduces the dimension of multivariate data by transforming correlated variables into a univariate one through a metric function, and it evaluates process capability for correlated non-normal multivariate quality characteristics. Unlike the Geometric Distance approach, the GCD approach takes into account the scaling effect of the variance-covariance matrix and produces a covariance distance variable based on the Mahalanobis distance. Another novelty introduced in this research is to approximate the distribution of these distances by a Burr XII distribution and then estimate its parameters using a numerical search algorithm. It is demonstrated that the proportion of nonconformance (PNC) obtained with the proposed method is very close to the actual PNC value.
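
    A hedged sketch of the Burr XII route to the proportion of nonconformance (not the thesis's estimator): fit a Burr XII distribution by maximum likelihood with SciPy, read the PNC from its CDF at assumed specification limits, and form a percentile-based capability ratio. The simulated data, the limits, and the decision to pin the location parameter at zero are assumptions.

```python
# Hedged sketch: ML-fitted Burr XII, PNC from the CDF, and a percentile-based index.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = stats.lognorm.rvs(s=0.35, scale=2.0, size=500, random_state=rng)  # skewed process data (toy)

LSL, USL = 0.8, 4.5                                    # assumed specification limits
c, d, loc, scale = stats.burr12.fit(x, floc=0)         # maximum likelihood fit, location pinned at 0
fitted = stats.burr12(c, d, loc=loc, scale=scale)

pnc = fitted.cdf(LSL) + fitted.sf(USL)                 # proportion outside the spec limits
p_lo, p_med, p_hi = fitted.ppf([0.00135, 0.5, 0.99865])
cp_percentile = (USL - LSL) / (p_hi - p_lo)            # Clements-style percentile capability ratio

print(f"PNC = {pnc:.5f}, percentile-based Cp = {cp_percentile:.3f}")
```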

    Process Capability Calculations with Nonnormal Data in the Medical Device Manufacturing Industry

    U.S. Food and Drug Administration (FDA) recalls of medical devices are at historically high levels despite efforts by manufacturers to meet stringent agency requirements to ensure quality and patient safety. One factor in the release of potentially dangerous devices might be the interpretation of nonnormal test data by statistically unsophisticated engineers. The purpose of this study was to test the hypothesis that testing by lot provides a better indicator of true process behavior than process capability indices (PCIs) calculated from the mixed lots that often occur in a typical production situation. The foundations of this research lie in the prior work of Bertalanffy, Kane, Shewhart, and Taylor. The research questions examined whether lot traceability allows decomposition of the combined distribution to permit more accurate calculation of the PCIs used to monitor medical device production. The study was semi-experimental, using simulated data; although the simulated data were random, the design was quasi-experimental because the simulated data were controlled through parameter selection. The results indicate that decomposition does not increase the accuracy of the PCI. The conclusion is that a systems approach using the PCI, additional statistical tools, and expert knowledge could yield more accurate results than decomposition alone. More accurate results could ensure the production of safer medical devices by correctly identifying noncapable processes (i.e., processes that may not produce the required results), while also preventing needless waste of resources and delays in potentially life-saving technology reaching patients in cases where processes are evaluated as noncapable when they are actually capable.
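
    The lot-decomposition question can be illustrated with made-up numbers: the sketch below compares a conventional Cpk computed from a mixture of two lots with the Cpk of each lot computed separately after using lot traceability to split the data. The specification limits and lot parameters are assumptions chosen only to show the mechanics.

```python
# Illustrative sketch: mixed-lot Cpk versus per-lot Cpk after decomposition by lot.
import numpy as np

def cpk(x, lsl, usl):
    """Conventional Cpk from the sample mean and standard deviation."""
    mu, s = np.mean(x), np.std(x, ddof=1)
    return min(usl - mu, mu - lsl) / (3 * s)

rng = np.random.default_rng(42)
LSL, USL = 9.0, 11.0
lot_a = rng.normal(9.8, 0.15, 200)      # two lots with slightly different centering
lot_b = rng.normal(10.2, 0.15, 200)
mixed = np.concatenate([lot_a, lot_b])  # what a mixed-lot sample would look like

print("mixed-lot Cpk :", round(cpk(mixed, LSL, USL), 2))
print("lot A Cpk     :", round(cpk(lot_a, LSL, USL), 2))
print("lot B Cpk     :", round(cpk(lot_b, LSL, USL), 2))
```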