1,607 research outputs found

    Acceptance sampling plan for multiple manufacturing lines using EWMA process capability index

    The problem of developing a product acceptance procedure for multiple characteristics has attracted the attention of quality assurance practitioners. Due to high consumer demand, it may not be possible to deliver the quantity ordered on time using a process based on a single manufacturing line, so factories often manufacture a product on multiple lines and combine the output. In this manuscript, we present the design of an acceptance sampling plan for products from multiple independent manufacturing lines using an exponentially weighted moving average (EWMA) statistic of the process capability index. The plan parameters, such as the sample size and the acceptance number, are determined so as to satisfy both the producer's and the consumer's risks. The efficiency of the proposed plan is compared with that of the existing sampling plan. Tables are given for industrial use and explained with the help of industrial examples. We conclude that the use of the proposed plan in these industries minimizes the cost and time of inspection, since a smaller sample size means a lower inspection cost. Extending the proposed plan to non-normal distributions is left as future research, as is the determination of sampling plans using a cost model. © 2017 The Japan Society of Mechanical Engineers.
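    As a rough illustration of the core statistic, the sketch below (Python, not from the paper) smooths successive capability estimates with an EWMA and applies a hypothetical acceptance constant; the names cpk, lam, and k_a and their values are assumptions, and the real plan parameters would come from solving the producer's- and consumer's-risk conditions.

```python
import numpy as np

def cpk(sample, lsl, usl):
    """Point estimate of the capability index C_pk from one subgroup."""
    mu, sigma = sample.mean(), sample.std(ddof=1)
    return min(usl - mu, mu - lsl) / (3 * sigma)

def ewma_cpk(subgroups, lsl, usl, lam=0.2):
    """EWMA-smoothed C_pk sequence: EWMA_t = lam*Cpk_t + (1 - lam)*EWMA_{t-1},
    seeded with the first estimate (lam is an assumed smoothing constant)."""
    ewma, out = None, []
    for sg in subgroups:
        c = cpk(np.asarray(sg, dtype=float), lsl, usl)
        ewma = c if ewma is None else lam * c + (1 - lam) * ewma
        out.append(ewma)
    return out

# Hypothetical lot-sentencing rule: accept when the EWMA statistic reaches
# an acceptance constant k_a chosen to satisfy both risk points.
def accept_lot(ewma_stat, k_a=1.33):
    return ewma_stat >= k_a
```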

    Developing Acceptance Sampling Plans based on Incapability Index Cpp


    A Theoretical Foundation for the Development of Process Capability Indices and Process Parameters Optimization under Truncated and Censoring Schemes

    Process capability indices (PCIs) provide a measure of the output of an in-control process that conforms to a set of specification limits. These measures, which assume that process output is approximately normally distributed, are intended for measuring process capability in manufacturing systems. After inspection, however, non-conforming products are typically scrapped when units fail to meet the specification limits; hence, the actual distribution of shipped products that customers perceive is truncated. In this research, a set of customer-perceived PCIs is developed based on the truncated normal distribution, as an extension of traditional manufacturer-based indices. Comparative studies and numerical examples reveal considerable differences between the traditional PCIs and the proposed PCIs. The comparison results suggest using the proposed PCIs for capability analyses when non-conforming products are scrapped prior to shipping to customers. Confidence interval approximations for the proposed PCIs are also developed. A simulation technique is implemented to compare the proposed PCIs with their traditional counterparts across multiple performance scenarios. Robust parameter design (RPD), a systematic method for determining the optimum operating conditions that achieve quality improvement goals, is also studied within the realm of censored data. Data censoring occurs in time-oriented observations when some data are unmeasurable outside a predetermined study period. The underlying conceptual basis of current RPD studies is random sampling from a normal distribution, assuming that all the data points are uncensored. However, censoring schemes are widely implemented in lifetime testing, survival analysis, and reliability studies. As such, this study develops detailed guidelines for a new RPD method that considers type I right-censoring concepts. The response functions are developed using nonparametric methods, including the Kaplan-Meier estimator, Greenwood's formula, and the Cox proportional hazards regression method. Various response-surface-based robust parameter design optimization models are proposed and demonstrated through a numerical example. Further, a process capability index for type I right-censored data using the nonparametric methods is also developed for assessing the performance of a product based on its lifetime.
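    To make the truncation effect concrete, here is a minimal sketch (not from the dissertation) that plugs the moments of a normal distribution truncated to the specification limits into the usual C_pk formula; the function name and the example numbers are illustrative assumptions.

```python
from scipy.stats import truncnorm

def truncated_cpk(mu, sigma, lsl, usl):
    """Customer-perceived capability sketch: shipped output is the normal
    process truncated to [lsl, usl] after non-conforming units are scrapped."""
    a, b = (lsl - mu) / sigma, (usl - mu) / sigma  # standardized truncation bounds
    dist = truncnorm(a, b, loc=mu, scale=sigma)
    mu_t, sigma_t = dist.mean(), dist.std()        # moments of the truncated distribution
    return min(usl - mu_t, mu_t - lsl) / (3 * sigma_t)

# A marginally capable process looks more capable after truncation, since the
# shipped distribution carries no mass outside the limits.
print(truncated_cpk(mu=10.0, sigma=1.0, lsl=7.0, usl=13.0))
```

    Comparing this value with the untruncated C_pk of 1.0 for the same parameters shows the direction of the differences the abstract describes.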

    Systems Engineering

    The book "Systems Engineering: Practice and Theory" is a collection of articles written by developers and researches from all around the globe. Mostly they present methodologies for separate Systems Engineering processes; others consider issues of adjacent knowledge areas and sub-areas that significantly contribute to systems development, operation, and maintenance. Case studies include aircraft, spacecrafts, and space systems development, post-analysis of data collected during operation of large systems etc. Important issues related to "bottlenecks" of Systems Engineering, such as complexity, reliability, and safety of different kinds of systems, creation, operation and maintenance of services, system-human communication, and management tasks done during system projects are addressed in the collection. This book is for people who are interested in the modern state of the Systems Engineering knowledge area and for systems engineers involved in different activities of the area. Some articles may be a valuable source for university lecturers and students; most of case studies can be directly used in Systems Engineering courses as illustrative materials

    Doctor of Philosophy

    In order to ensure a high production yield of semiconductor devices, it is desirable to characterize intermediate progress towards the final product by using metrology tools to acquire relevant measurements after each sequential processing step. The metrology data are commonly used in feedback and feed-forward loops of Run-to-Run (R2R) controllers to improve process capability and optimize recipes from lot to lot or batch to batch. In this dissertation, we focus on two related issues. First, we propose a novel non-threaded R2R controller that utilizes all available metrology measurements, even when the data were acquired during prior runs that differed in their contexts from the current fabrication thread. The developed controller is the first known implementation of a non-threaded R2R control strategy successfully deployed in a high-volume production semiconductor fab. Its introduction improved process capability by 8% compared with traditional threaded R2R control and significantly reduced out-of-control (OOC) events at one of the most critical steps in NAND memory manufacturing. The second contribution demonstrates the value of developing virtual metrology (VM) estimators using the insight gained from multiphysics models. Unlike traditional statistical regression techniques, which lead to linear models that depend on a linear combination of the available measurements, we develop VM models whose structure, and the functional interdependence between their input and output variables, are determined from the insight provided by the multiphysics describing the operation of the processing step for which the VM system is being developed. We demonstrate this approach for three different processes, and describe the superior performance of the developed VM systems after their first-of-a-kind deployment in a high-volume semiconductor manufacturing environment.
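    For context, the sketch below shows a conventional threaded EWMA R2R controller, the baseline the dissertation improves upon, not the non-threaded controller itself; the linear model y = beta * recipe + disturbance and all parameter values are assumptions for illustration.

```python
class EwmaR2RController:
    """Textbook threaded EWMA run-to-run controller sketch, assuming a
    linear process model y = beta * recipe + disturbance."""
    def __init__(self, beta, target, lam=0.3, d0=0.0):
        self.beta, self.target, self.lam, self.d = beta, target, lam, d0

    def next_recipe(self):
        # Pick the recipe that would hit the target given the current
        # disturbance estimate.
        return (self.target - self.d) / self.beta

    def update(self, recipe, measurement):
        # EWMA update of the disturbance estimate from post-process metrology.
        resid = measurement - self.beta * recipe
        self.d = self.lam * resid + (1 - self.lam) * self.d

ctrl = EwmaR2RController(beta=1.0, target=50.0)
r = ctrl.next_recipe()
ctrl.update(r, measurement=50.8)  # feed this run's metrology back into the loop
```

    A non-threaded scheme, as the abstract describes, would instead draw on measurements from runs in differing contexts rather than keeping one disturbance estimate per fabrication thread.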

    A Variable Acceptance Sampling Plan under Neutrosophic Statistical Interval Method

    The acceptance sampling plan plays an important role in maintaining the high quality of a product. The variable control chart, using classical statistics, helps in making acceptance or rejection decisions about a submitted lot of the product. Furthermore, a sampling plan using classical statistics assumes that complete or determinate information is available about the lot. However, in some situations, data may be ambiguous, vague, imprecise, and incomplete or indeterminate; in such cases, neutrosophic statistics can be applied to guide the experimenters. In this paper, we propose a new variable sampling plan using the neutrosophic interval statistical method. The neutrosophic operating characteristic (NOC) is derived using the neutrosophic normal distribution. An optimization solution is also presented for the proposed plan under the neutrosophic interval method. The effectiveness of the proposed plan is compared with the plan under classical statistics. Tables are presented for practical use, and a real example is given to explain the neutrosophic fuzzy variable sampling plan in industry.
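    As a sketch of how an interval-valued (neutrosophic) operating characteristic might be evaluated, the code below (not from the paper) applies a standard normal-approximation OC for a variables plan at the endpoints of indeterminate plan parameters; the formula choice and the parameter values are assumptions.

```python
from math import sqrt
from scipy.stats import norm

def oc_variables(p, n, k):
    """Classical normal-approximation OC for a variables plan that accepts
    when (USL - xbar) / s >= k, at incoming fraction defective p."""
    z_p = norm.ppf(1 - p)
    return norm.cdf((z_p - k) * sqrt(n / (1 + k * k / 2)))

def neutrosophic_oc(p, n_iv, k_iv):
    """Interval OC sketch: with indeterminate parameters n in [n_L, n_U] and
    k in [k_L, k_U], report the range of acceptance probabilities."""
    vals = [oc_variables(p, n, k) for n in n_iv for k in k_iv]
    return min(vals), max(vals)

# Acceptance probability range at p = 1% defective for an indeterminate plan.
print(neutrosophic_oc(0.01, n_iv=(20, 25), k_iv=(2.0, 2.2)))
```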

    A multi-dimensional spectral description of ocean variability with applications

    Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy at the Massachusetts Institute of Technology and the Woods Hole Oceanographic Institution, February 2013. Efforts to monitor the ocean for signs of climate change are hampered by ever-present noise, in the form of stochastic ocean variability, and detailed knowledge of the character of this noise is necessary for estimating the significance of apparent trends. Typically, uncertainty estimates are made by a variety of ad hoc methods, often based on numerical model results or the variability of the data set being analyzed. We provide a systematic approach based on the four-dimensional frequency-wavenumber spectrum of low-frequency ocean variability. This thesis presents an empirical model of the spectrum of ocean variability for periods between about 20 days and 15 years and wavelengths of about 200–10,000 km, and describes applications to ocean circulation trend detection, observing system design, and satellite data processing. The horizontal wavenumber-frequency part of the model spectrum is based on satellite altimetry, current meter data, moored temperature records, and shipboard ADCP data. The spectrum is dominated by motions along a "nondispersive line". The observations considered are consistent with a universal ω⁻² power law at the high end of the frequency range, but inconsistent with a universal wavenumber power law. The model spectrum is globally varying and accounts for changes in dominant phase speed, period, and wavelength with location. The vertical structure of the model spectrum is based on numerical model results, current meter data, and theoretical considerations. We find that the vertical structure of kinetic energy is surface intensified relative to the simplest theoretical predictions. We present a theory for the interaction of linear Rossby waves with rough topography; rough topography can explain both the observed phase speeds and vertical structure of variability. The improved description of low-frequency ocean variability presented here will serve as a useful tool for future oceanographic studies. This research was supported by NASA under grants NNG06GC28G and NNX08AR33G.
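    The ω⁻² claim can be pictured with a quick spectral-slope check; the sketch below (not from the thesis) fits a power law to a band of the periodogram in log-log space, using a synthetic random walk, whose spectrum follows roughly ω⁻² at low frequencies. The band fractions are assumed values.

```python
import numpy as np

def spectral_slope(x, dt, fmin_frac=0.01, fmax_frac=0.1):
    """Least-squares power-law slope of the periodogram between fmin_frac and
    fmax_frac of the Nyquist frequency (roughly -2 for an omega^-2 process)."""
    freqs = np.fft.rfftfreq(len(x), d=dt)[1:]   # drop the zero frequency
    power = np.abs(np.fft.rfft(x))[1:] ** 2
    nyq = 0.5 / dt
    band = (freqs >= fmin_frac * nyq) & (freqs <= fmax_frac * nyq)
    slope, _ = np.polyfit(np.log(freqs[band]), np.log(power[band]), 1)
    return slope

# A random walk (integrated white noise) is a classic omega^-2 example.
rng = np.random.default_rng(0)
print(spectral_slope(np.cumsum(rng.standard_normal(4096)), dt=1.0))
```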

    Simulation and Control of Univariate and Multivariate Set-Up Dominant Process

    This thesis explores the use of statistically valid process improvement tools in low-volume applications. It sets out the following research questions: How can the Six Sigma Measure and Analyse phases of a chronic quality problem be statistically validated in a low-volume process? How can a statistically valid approach for process control be implemented in a low-volume process? And how can this tool be extended to fit multivariate processes, and can the calculation of control parameter adjustments be automated? In answer, the thesis presents an enhanced PROcess VAriation Diagnosis Tool (PROVADT) method, driving a Six Sigma improvement project through the Measure and Analyse phases. PROVADT provides a structured sampling plan to perform a Multi-Vari study, Isoplot, Gage R&R and Provisional Process Capability in as few as twenty samples and eighty measurements, making the technique suited to low-volume applications. The enhanced PROVADT method provides a Gage R&R without confounded variation sources, as was the case in the original method, and its practical application was demonstrated through two case studies. Process control tools for low-volume, high-variety manufacturing applications were also developed. An adjustable traffic-light chart, with control limits linked to tolerance and simple decision rules, was used for monitoring univariate processes. This tool, the Set-Up Process Algorithm (SUPA), uses probability theory to provide 98% confidence that the process is operating at a pre-specified minimum level of Cp in as few as five samples. SUPA was extended to deal with high-complexity applications, resulting in multivariate SUPA (mSUPA). mSUPA maintains SUPA's principles but presents information about multiple process features on one chart rather than on multiple univariate charts. To supplement the mSUPA tool, a theoretical method for calculating the optimal process adjustment when a multivariate process is off-target was introduced, combining discrete-event simulation and numerical optimisation to calculate adjustments.
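    A hypothetical flavor of a tolerance-linked traffic-light chart is sketched below; this is not the actual SUPA decision rule, only an illustration of zones derived from the tolerance and a minimum Cp, with green_frac and the example limits as assumed values.

```python
def supa_style_zones(lsl, usl, cp_min=1.33, green_frac=0.5):
    """Toy traffic-light classifier: the amber boundary is the +/-3-sigma
    spread of a centered process running exactly at cp_min; green is an
    assumed inner fraction of the half-tolerance."""
    mid, half = (lsl + usl) / 2.0, (usl - lsl) / 2.0
    green = green_frac * half     # inner "all clear" zone (assumed fraction)
    amber = half / cp_min         # 3-sigma limit at the minimum capability
    def classify(x):
        d = abs(x - mid)
        return "green" if d <= green else ("amber" if d <= amber else "red")
    return classify

classify = supa_style_zones(lsl=9.0, usl=11.0)
print([classify(v) for v in (10.0, 10.6, 11.2)])  # ['green', 'amber', 'red']
```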

    Near Real-Time Optimal Prediction of Adverse Events in Aviation Data

    The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we demonstrate how to recast the anomaly prediction problem as a level-crossing prediction problem, for which an elegant and optimal, yet untested, solution exists under certain technical constraints and only when the appropriate modeling assumptions are made. As such, we thoroughly investigate how robust this solution is to violations of those modeling assumptions, and show how they affect final performance. Finally, the predictive capability of the method is assessed by quantitative means, using both validation and test data containing anomalies or adverse events from real aviation data sets that domain experts have previously identified as operationally significant. It is shown that the proposed formulation yields a lower false alarm rate on average than competing methods based on similarly advanced concepts, and a higher correct detection rate than a standard exceedance-based method commonly used for prediction.
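    One way to picture a level-crossing predictor is sketched below; this is not the paper's optimal predictor, but a simple proxy under an assumed AR(1) model that alarms when the forecast probability of exceeding the level at some step in the horizon reaches a threshold. The parameters phi, sigma, and p_alarm are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def level_crossing_alarm(x0, level, horizon=5, phi=0.9, sigma=1.0, p_alarm=0.05):
    """Alarm if, under x_{t+1} = phi * x_t + N(0, sigma^2), the marginal
    probability of exceeding `level` at any step within `horizon` reaches
    p_alarm (a crude stand-in for the true crossing probability)."""
    mean, var = x0, 0.0
    for _ in range(horizon):
        mean = phi * mean                      # forecast mean decays toward zero
        var = phi * phi * var + sigma * sigma  # forecast variance accumulates
        if 1.0 - norm.cdf(level, loc=mean, scale=np.sqrt(var)) >= p_alarm:
            return True
    return False

print(level_crossing_alarm(x0=2.0, level=4.0))  # True for this state and level
```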

    Design of Variables Sampling Plan in Agro-Allied Industry for Packed Yam Flour in-view of International Regulatory Standards

    To stay competitive in today's global market, manufacturers must ensure that products released to consumers meet international regulations and standards, and this can only be done by putting in place sampling plans that guarantee the release of quality lots into the market. Hence, the objective of this paper is to design a variables sampling plan for the release of packed yam flour in view of international regulations on the net content of packaged goods. Probability plots, operating characteristic curves, the average outgoing quality (AOQ), the average outgoing quality limit (AOQL), and the average total inspection (ATI) were used to evaluate the fitness of the sampling plan, using the Minitab 2021 statistical software package. The packing process net weight was found to be normally distributed, with a p-value of 0.075 and a process standard deviation of 2.16. A comparative analysis of sample size and sampling plan measures such as the AOQ, AOQL, and ATI, in view of best practice, was decisive in selecting a sampling plan with a sample size of 31 packs per lot as the most economical plan for lot sentencing. A practical demonstration of this sampling plan's usage is also presented. This sampling plan improves the net content of the packed product released into the market in view of international regulatory laws.
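    The AOQ, AOQL, and ATI measures mentioned above can be computed in a few lines; the sketch below (not the paper's Minitab analysis) uses a standard normal-approximation OC for a variables plan under rectifying inspection, with the acceptance constant k and lot size N as assumed values and n = 31 taken from the selected plan.

```python
import numpy as np
from scipy.stats import norm

def rectifying_measures(n, k, N, p_grid=np.linspace(1e-4, 0.1, 400)):
    """AOQL and ATI sketch: rejected lots are 100% screened, so outgoing
    quality is AOQ(p) = p * Pa(p) * (N - n) / N and the expected total
    inspection per lot is ATI(p) = n + (1 - Pa(p)) * (N - n)."""
    pa = norm.cdf((norm.ppf(1 - p_grid) - k) * np.sqrt(n / (1 + k * k / 2)))
    aoq = p_grid * pa * (N - n) / N
    ati = n + (1 - pa) * (N - n)
    return aoq.max(), p_grid, ati  # AOQL plus the curves for plotting

aoql, p_grid, ati = rectifying_measures(n=31, k=2.0, N=500)
print(round(aoql, 4))  # worst-case average outgoing quality over all incoming p
```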