14 research outputs found

    Developing Acceptance Sampling Plans based on Incapability Index Cpp

    A Theoretical Foundation for the Development of Process Capability Indices and Process Parameters Optimization under Truncated and Censoring Schemes

    Process capability indices (PCIs) measure the extent to which the output of an in-control process conforms to a set of specification limits. These measures, which assume that process output is approximately normally distributed, are intended for assessing the process capability of manufacturing systems. After inspection, however, non-conforming products are typically scrapped when units fail to meet the specification limits; hence, the actual distribution of shipped products that customers perceive is truncated. In this research, a set of customer-perceived PCIs is developed based on the truncated normal distribution, as an extension of traditional manufacturer-based indices. Comparative studies and numerical examples reveal considerable differences between the traditional and the proposed PCIs. The comparison results suggest using the proposed PCIs for capability analyses when non-conforming products are scrapped prior to shipping to customers. Confidence interval approximations for the proposed PCIs are also developed, and a simulation technique is implemented to compare the proposed PCIs with their traditional counterparts across multiple performance scenarios. Robust parameter design (RPD), a systematic method for determining the optimum operating conditions that achieve quality improvement goals, is also studied in the context of censored data. Data censoring occurs in time-oriented observations when some data are unmeasurable outside a predetermined study period. The conceptual basis of current RPD studies is random sampling from a normal distribution, assuming that all data points are uncensored; however, censoring schemes are widely used in lifetime testing, survival analysis, and reliability studies. This study therefore develops detailed guidelines for a new RPD method that accounts for type I right-censoring. The response functions are developed using nonparametric methods, including the Kaplan-Meier estimator, Greenwood's formula, and the Cox proportional hazards regression method. Various response-surface-based robust parameter design optimization models are proposed and demonstrated through a numerical example. Further, a process capability index for type I right-censored data is developed using the nonparametric methods to assess the performance of a product based on its lifetime.
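The customer-perceived idea can be sketched by recomputing the classical indices from the moments of a normal distribution truncated at the specification limits. This is an illustrative construction only, not the paper's exact index definitions; function and variable names are invented here.

```python
from math import sqrt
from statistics import NormalDist

def cp_cpk(mu, sigma, lsl, usl):
    """Traditional (manufacturer-based) capability indices."""
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

def truncated_moments(mu, sigma, lsl, usl):
    """Mean and standard deviation of N(mu, sigma^2) truncated to [lsl, usl],
    i.e. the distribution customers perceive after out-of-spec units are scrapped."""
    nd = NormalDist()
    a, b = (lsl - mu) / sigma, (usl - mu) / sigma
    z = nd.cdf(b) - nd.cdf(a)          # probability of being in spec
    pa, pb = nd.pdf(a), nd.pdf(b)
    mean = mu + sigma * (pa - pb) / z
    var = sigma ** 2 * (1 + (a * pa - b * pb) / z - ((pa - pb) / z) ** 2)
    return mean, sqrt(var)

# One plausible customer-perceived analogue: plug the truncated moments
# into the classical formulas.
mu, sigma, lsl, usl = 10.0, 1.0, 7.0, 13.0
t_mean, t_sd = truncated_moments(mu, sigma, lsl, usl)
cp_cust, cpk_cust = cp_cpk(t_mean, t_sd, lsl, usl)
```

Because truncation removes the tails, the perceived standard deviation is smaller than the process sigma, so the customer-perceived indices come out larger than the manufacturer-based ones in this symmetric example.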

    Process Capability Calculations with Nonnormal Data in the Medical Device Manufacturing Industry

    U.S. Food and Drug Administration (FDA) recalls of medical devices are at historically high levels despite efforts by manufacturers to meet stringent agency requirements intended to ensure quality and patient safety. A factor in the release of potentially dangerous devices might be the interpretation of nonnormal test data by statistically unsophisticated engineers. The purpose of this study was to test the hypothesis that testing by lot provides a better indicator of true process behavior than process capability indices (PCIs) calculated from the mixed lots that often occur in a typical production situation. The foundations of this research were in the prior work of Bertalanffy, Kane, Shewhart, and Taylor. The research questions examined whether lot traceability allows the decomposition of the combination distribution into its component lots, permitting more accurate calculation of the PCIs used to monitor medical device production. The study used a quasi-experimental design with simulated data: although the simulated data were random, the design was quasi-experimental because the data were controlled through parameter selection. The results indicate that decomposition does not increase the accuracy of the PCI. The conclusion is that a systems approach using the PCI, additional statistical tools, and expert knowledge could yield more accurate results than decomposition alone. More accurate results could ensure the production of safer medical devices by correctly identifying noncapable processes (i.e., processes that may not produce required results), while also preventing needless waste of resources and delays in potentially life-saving technologies reaching patients in cases where processes evaluate as noncapable when they are actually capable.
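The central question above, whether pooling lots with shifted means distorts the capability estimate, can be sketched with a small simulation. This is a generic illustration, not the study's actual design; the lot parameters and sample sizes are invented.

```python
import random
from statistics import mean, stdev

def cpk(data, lsl, usl):
    """Sample Cpk: distance from the mean to the nearer spec limit,
    in units of three sample standard deviations."""
    m, s = mean(data), stdev(data)
    return min(usl - m, m - lsl) / (3 * s)

random.seed(1)
lsl, usl = 9.0, 11.0
# Two lots from the same process, but with a mean shift between them.
lot_a = [random.gauss(9.7, 0.2) for _ in range(200)]
lot_b = [random.gauss(10.3, 0.2) for _ in range(200)]

# Pooling the lots inflates the estimated spread and deflates Cpk
# relative to the per-lot values -- the "combination distribution" effect.
pooled = cpk(lot_a + lot_b, lsl, usl)
per_lot = [cpk(lot_a, lsl, usl), cpk(lot_b, lsl, usl)]
```

Here the pooled index understates each lot's capability; the study's finding is that undoing this mixing (decomposition) does not by itself yield a more accurate PCI.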

    Statistical process control by quantile approach.

    Most quality control and quality improvement procedures make assumptions about the distributional form of the data they use, usually that the data are normally distributed. It is commonplace, however, to find processes that generate non-normally distributed data; Weibull, logistic, and mixture data are increasingly encountered. Any method that seeks to avoid transforming non-normal data requires techniques for identifying the appropriate distributions, and even where the appropriate distributions are known, such methods are often intractable to implement. This research is concerned with statistical process control (SPC), which can be applied to both variable and attribute data. The objective of SPC is to control a process, ideally with respect to a particular product specification. One of the principal tools of SPC is the control chart, and this research is mainly concerned with control charts for process monitoring and quality improvement: when a source of variability is present, a control chart provides a signal that the process must be investigated. In general, Shewhart control charts assume that the data follow a normal distribution, and most SPC techniques have accordingly been derived and constructed around the normal distribution. In reality, data sets such as chemical process data and lifetime data are often not normal, so a control chart for x̄ or R constructed under a normality assumption will give inaccurate results when the data are non-normal. Schilling and Nelson (1976) investigated, via the central limit theorem, the effect of non-normality on such charts and concluded that non-normality is usually not a problem for subgroup sizes of four or more. However, for smaller subgroup sizes, and especially for individual measurements, non-normality can be a serious problem. The literature review indicates that there are real problems in applying statistical process control to non-normal and mixture distributions. This thesis provides a quantile approach to dealing with non-normal distributions, used to construct a median rankit control chart. The quantile approach is also used to calculate the process capability index, the average run length (ARL), multivariate control charts, and control charts for mixture distributions in non-normal situations. This methodology can be easily adopted by practitioners of statistical process control.
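The quantile idea can be sketched directly: instead of mu ± 3 sigma limits, take the empirical quantiles of the observed (possibly non-normal) data at the tail probabilities a 3-sigma chart implies (0.00135 and 0.99865). This is a generic sketch of quantile-based limits under that assumption, not the thesis's median rankit construction.

```python
import random

def empirical_quantile(data, p):
    """Linear-interpolation sample quantile (one common definition)."""
    xs = sorted(data)
    idx = p * (len(xs) - 1)
    lo = int(idx)
    frac = idx - lo
    if lo + 1 < len(xs):
        return xs[lo] * (1 - frac) + xs[lo + 1] * frac
    return xs[lo]

def quantile_limits(data, alpha=0.0027):
    """Control limits matching the tail probabilities of a 3-sigma chart,
    but read off the data's own distribution rather than a normal fit."""
    return (empirical_quantile(data, alpha / 2),
            empirical_quantile(data, 0.5),          # centre line: the median
            empirical_quantile(data, 1 - alpha / 2))

random.seed(7)
# Skewed, non-normal process data (Weibull), where mu +/- 3 sigma misleads.
obs = [random.weibullvariate(1.0, 1.5) for _ in range(5000)]
lcl, centre, ucl = quantile_limits(obs)
```

Unlike symmetric 3-sigma limits, these limits follow the skew of the data, so the lower limit cannot fall below the physical minimum of a lifetime-type variable.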

    A New Fuzzy Method for Assessing Six Sigma Measures

    Six Sigma provides several measures of performance characteristics related to a process. In most traditional methods, exact (crisp) estimation is used to assess these measures and apply them in practice. In this paper, a new algorithm based on Buckley's estimation approach is introduced to estimate several of these measures, including Defects per Million Opportunities (DPMO), Defects per Opportunity (DPO), Defects per Unit (DPU), and Yield. The algorithm uses a family of confidence intervals to estimate the measures, and its final results are triangular-shaped fuzzy numbers. Finally, since DPMO, one of the most useful measures in Six Sigma, should be consistent with customer needs, the paper introduces a new fuzzy method to check this consistency by comparing the estimated DPMO with the fuzzy customer need. Numerical examples are given to show the performance of the method.
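The crisp point estimates underlying the measures named above follow from their standard definitions; the paper's fuzzy algorithm itself is not reproduced here, and the Poisson-based yield formula used below is one common convention among several.

```python
from math import exp

def six_sigma_measures(defects, units, opportunities_per_unit):
    """Crisp point estimates of the standard Six Sigma defect measures."""
    opportunities = units * opportunities_per_unit
    dpo = defects / opportunities        # Defects per Opportunity
    dpmo = dpo * 1_000_000               # Defects per Million Opportunities
    dpu = defects / units                # Defects per Unit
    # First-time yield under a Poisson defect-count model: P(zero defects).
    fty = exp(-dpu)
    return {"DPO": dpo, "DPMO": dpmo, "DPU": dpu, "Yield": fty}

# Hypothetical inspection data: 15 defects over 1000 units,
# 5 defect opportunities per unit.
m = six_sigma_measures(defects=15, units=1000, opportunities_per_unit=5)
```

A fuzzy version of the kind the paper describes would replace each crisp estimate with a family of confidence intervals stacked into a triangular-shaped fuzzy number.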

    Products and Services

    Today’s global economy offers more opportunities, but is also more complex and competitive than ever before. This has led to a wide range of research activity in many fields of interest, especially in the so-called high-tech sectors. This book is the result of widespread research and development activity by researchers worldwide, covering development activities in general as well as various aspects of the practical application of knowledge.

    A study assessing the viability of using Fused Filament Fabrication (FFF) Additive Manufacturing (AM) technology to manufacture customised Class I medical devices

    Additive manufacturing (AM) is becoming an increasingly common manufacturing method for medical devices due to the benefits of advanced customisation, improved fit and opportunities for innovation. However, many AM medical devices remain inaccessible due to the high cost of hardware and consumables and the substantial infrastructure required for operation. Fused filament fabrication (FFF) is a highly accessible AM technique due to its open-source nature, which has led to an extensive market of affordable desktop 3D printers. In this work, FFF is demonstrated as a potentially viable technique for fabricating low-risk medical devices through two case studies presented in this thesis: a customised daily living aid and a range of medical devices produced in response to the COVID-19 pandemic. Although the potential of the technology has been demonstrated, research on the practical suitability of FFF for medical applications remains limited, with much of the work in the field focussing on proof-of-concept applications that do not explore the requirements for integrating the technology into daily clinical practice. This thesis investigates the fundamental requirements for the FFF AM technique to be used for Class I medical device applications in three identified use cases: non-specialist, research and industrial use. In keeping with the ambition for FFF to provide accessible solutions, mid-range hardware aimed at professional printing applications was selected to carry out this work, encompassing the activities present in each of the three identified use cases. A methodology was presented to determine the repeatability and reproducibility of FFF across the three use cases, which revealed varying process capability between the X-, Y- and Z-printing directions for individual machines, and significant variation between multiple machines of the same make and model. The repeatability and reproducibility of the FFF technique was identified as a key limitation to the widespread adoption of FFF technology for specialist and industrial use. The smallest tolerance achieved with a professional desktop FFF printer was 0.3 mm in the X- and Y-directions and 0.4 mm in the Z-direction. Additional variable factors were studied, including the condition of filament with respect to its storage environment and duration of storage, the influence of the different colours and pigments present in filament, and the use of an air management add-on unit intended to enhance the hardware. The glass transition temperature of Tough PLA remained largely unaffected by the variable storage conditions, although filament submerged in water showed a value around 1.4 °C lower than that of ambiently stored filament. The mechanical properties of printed parts were influenced by filament colour, with white filament producing parts with greater elongation and tensile strength than the other colours studied. Dimensional accuracy in the Z-printing direction was affected by air management: samples produced with air management measured above the nominal value, and those produced without it measured below. This thesis is the first known work to explore the suitability of FFF technology for Class I medical devices from the perspective of both specialist and non-specialist users. The key barriers to widespread adoption were identified as the repeatability and reproducibility of the technique and the influence of variable factors on the process and part performance. This exploration continually referenced the medical device regulations, and consideration was given to how the experimental work can be applied to real-world Class I medical device manufacturing.
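The repeatability-versus-reproducibility comparison described above can be sketched as a simple variance-components split: within-machine scatter (repeatability) against scatter between machine means (reproducibility). This is a simplified illustration with invented measurements, not the thesis's methodology or a full ANOVA Gauge R&R study.

```python
from statistics import mean, pvariance

def r_and_r(measurements):
    """measurements: {machine_id: [repeat measurements of one feature, mm]}.
    Returns (repeatability_sd, reproducibility_sd), both in mm."""
    # Repeatability: average within-machine variance across machines.
    within = mean(pvariance(runs) for runs in measurements.values())
    # Reproducibility: variance of the per-machine means.
    between = pvariance([mean(runs) for runs in measurements.values()])
    return within ** 0.5, between ** 0.5

# Hypothetical X-direction measurements of a nominal 10.00 mm feature
# on three printers of the same make and model.
data = {
    "printer_1": [10.02, 10.05, 9.98, 10.01],
    "printer_2": [10.21, 10.18, 10.24, 10.19],
    "printer_3": [9.86, 9.90, 9.84, 9.88],
}
repeat_sd, reprod_sd = r_and_r(data)
```

In this invented example the between-machine spread dominates the within-machine spread, mirroring the thesis's observation of significant variation between machines of the same make and model.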

    Modeling and Optimization of Stochastic Process Parameters in Complex Engineering Systems

    For quality engineering researchers and practitioners, a wide range of statistical tools and techniques is available for use in the manufacturing industry. The goal in applying these tools has always been to improve or optimize a product or process in terms of efficiency, production cost, or product quality. While tremendous progress has been made in the design of quality optimization models, there remains a significant gap between existing research and the needs of the industrial community. Contemporary manufacturing processes are inherently more complex - they may involve multiple stages of production or require the assessment of multiple quality characteristics. New and emerging fields, such as nanoelectronics and molecular biometrics, demand degrees of precision and estimation that are not attainable with current tools and measures. And since most researchers focus on a specific type of characteristic or a given set of conditions, there are many critical industrial processes to which existing models do not apply. Thus, the objective of this research is to improve existing techniques by expanding not only their range of applicability but also their ability to model a given process realistically. Several quality models are proposed that seek greater precision in the estimation of process parameters and the removal of assumptions that limit their breadth and scope. An extension is made to examine the effectiveness of these models both in non-standard conditions and in areas that have not been previously investigated. Following an in-depth literature review, various quality models are proposed, and numerical examples are used to validate the methodologies.