
    Acceptance sampling plan for multiple manufacturing lines using EWMA process capability index

    The problem of developing a product acceptance determination procedure for multiple quality characteristics has attracted quality assurance practitioners. When consumer demand is high, it may not be possible to deliver the ordered quantity on time from a single manufacturing line, so factories manufacture the product on multiple lines and combine the output. In this manuscript, we present the design of an acceptance sampling plan for products from multiple independent manufacturing lines using an exponentially weighted moving average (EWMA) statistic of the process capability index. The plan parameters, such as the sample size and the acceptance number, are determined by satisfying both the producer's and the consumer's risks. The efficiency of the proposed plan is compared with that of the existing sampling plan. Tables are given for industrial use and explained with the help of industrial examples. We conclude that the use of the proposed plan in these industries minimizes the cost and time of inspection, since a smaller sample size means a lower inspection cost. Extending the proposed plan to non-normal distributions, and determining sampling plans using a cost model, are interesting areas for future research. © 2017 The Japan Society of Mechanical Engineers.
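    The core of the plan is an EWMA smoothing of successive per-lot estimates of the capability index. A minimal Python sketch of that recursion; the specification limits, lot statistics, smoothing constant, and critical acceptance value are illustrative assumptions, not the paper's parameters:

```python
# Sketch: EWMA smoothing of per-lot C_pk estimates, as in EWMA-based
# acceptance sampling. All numbers below are illustrative.
def cpk(mean, std, lsl, usl):
    """Classical process capability index for a normal process."""
    return min(usl - mean, mean - lsl) / (3 * std)

def ewma(values, lam=0.2, start=None):
    """EWMA recursion: z_t = lam * x_t + (1 - lam) * z_{t-1}."""
    z = values[0] if start is None else start
    out = []
    for x in values:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

# Hypothetical per-lot (mean, std) estimates against specs [9.0, 11.0]
cpk_lots = [cpk(m, s, 9.0, 11.0) for m, s in
            [(10.0, 0.25), (10.1, 0.30), (9.9, 0.28)]]
ewma_cpk = ewma(cpk_lots, lam=0.2)
# Accept the current lot if the smoothed index exceeds a critical value k
accept = ewma_cpk[-1] >= 1.0
```

    Because the EWMA pools information across lots, the smoothed index is less noisy than a single-lot estimate, which is what lets the plan reach a decision with a smaller sample size.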

    Method Development and Validation of Total Mercury Content in Effluent Wastewater by Cold Vapor Atomic Fluorescence Spectroscopy

    The presence and concentration of ambient mercury contamination in the natural environment and in workplaces will continue to be closely monitored and regulated, as it poses serious risks to human health. Ongoing quantitative analysis has already become a routine part of industrial chemical plants' in-process and end-stage testing. Mercury contamination in waste generated by these chemical processes can present substantial operational hurdles, as compliance must be demonstrated by treatment, accurate measurement, and timely reporting of waste materials against stringently low limits before release into natural bodies of water or the municipal water supply. An accurate and reliable low-level method for the chemical detection and quantitation of total mercury content in effluent wastewater from an industrial chemical plant has been validated and deemed suitable for its intended use. The method validation parameters included an assessment of selectivity, linearity (range), accuracy, precision, and robustness. Acceptable system suitability and sample results for all experiments were demonstrated according to current acceptable practices and limits laid out in a pre-determined validation plan.
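    Of the listed validation parameters, linearity is the most readily illustrated. A minimal sketch of a least-squares calibration of instrument response against standard concentration; the concentrations, responses, and the 0.995 acceptance limit are invented for illustration and are not the validated method's actual figures:

```python
# Hedged sketch: ordinary least-squares linearity check for a calibration
# curve, as in a linearity (range) assessment during method validation.
def linear_fit(x, y):
    """Fit y = a + b*x by least squares; return (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot

conc = [0.5, 1.0, 2.0, 5.0, 10.0]            # hypothetical Hg standards, ng/L
resp = [52.0, 101.0, 205.0, 498.0, 1003.0]   # hypothetical fluorescence counts
a, b, r2 = linear_fit(conc, resp)
# A validation plan typically sets a minimum acceptable r2, e.g. 0.995
linearity_ok = r2 >= 0.995
```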

    An integrated process for forming manufacturing technology acquisition decisions

    The right manufacturing technology at the right time can enable an organisation to produce products that are cheaper, better, and made faster than those of the competition. Paradoxically, the wrong technology, or even the right technology poorly implemented, can be disastrous. The decision process through which practitioners acquire manufacturing technologies can significantly impact their eventual capabilities and performance. This complete process has unfortunately received limited attention in previous studies. Therefore, the work presented in this paper has investigated leading research and industrial practices to create a formal and rational decision process, and then evaluated this through an extended and in-depth case study of a manufacturing technology acquisition. An analysis of previous literature, industrial practices, and the resulting decision process are all presented in this paper.

    Conformability analysis for the control of quality costs in electronic systems

    The variations embodied in the production of electronic systems can cause a system to fail to conform to its specification with respect to Critical to Quality features. As a consequence of such failures, the system's manufacturer may incur significant quality costs, ranging from simple warranty returns up to legal liabilities. It can be difficult both to determine the probability that a system will fail to meet its specification and to estimate the associated cost of failure. This thesis presents the Electronic Conformability Analysis (eCA) technique, a novel methodology and supporting tool set for the assessment and control of quality costs associated with electronic systems. The technique addresses the three main elements of production affecting quality costs associated with electronic systems: functionality, manufacturability, and testability. Electronic Conformability Analysis combines statistical performance exploration with process capability indices, a modified form of Failure Modes and Effects Analysis, and a cost mapping procedure. The technique allows the quality costs associated with design- and manufacture-induced failures to be assessed, and the effectiveness of test strategies in reducing these costs to be determined. Through this analysis of costs, the technique allows the potential trade-offs between these costs and those associated with design and process modifications to be explored. In support of the Electronic Conformability Analysis technique, a number of new analysis tools have been developed. These tools enable the methodology to cope with the specific difficulties associated with the analysis of electronic systems. The technique has been applied to a number of analogue and mixed-signal, safety-critical circuits from automotive systems. These case studies have included several different levels of system complexity, ranging from relatively simple transistor circuits to highly complex mechatronic systems. They have shown that the technique is effective in a commercial design and manufacturing environment.
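    The statistical performance exploration step can be illustrated with a Monte Carlo tolerance analysis. The sketch below assumes a hypothetical resistor-divider circuit with a Critical to Quality specification on its gain; the tolerances, spec limits, and per-unit failure cost are invented for illustration and are not taken from the thesis:

```python
# Minimal Monte Carlo sketch of conformability analysis: propagate component
# tolerances through a circuit function, estimate the probability of violating
# a Critical to Quality spec, and map that probability to an expected cost.
import random

def divider_gain(r1, r2):
    """Voltage gain of a simple resistive divider."""
    return r2 / (r1 + r2)

def nonconformance_cost(n=100_000, cost_per_failure=12.0, seed=1):
    """Estimate P(gain outside spec) and the expected failure cost per unit."""
    random.seed(seed)
    lo, hi = 0.495, 0.505              # CTQ spec on the gain (illustrative)
    fails = 0
    for _ in range(n):
        r1 = random.gauss(10_000, 50)  # 10 kOhm nominal, ~0.5% sigma
        r2 = random.gauss(10_000, 50)
        if not lo <= divider_gain(r1, r2) <= hi:
            fails += 1
    p_fail = fails / n
    return p_fail, p_fail * cost_per_failure

p, expected_cost = nonconformance_cost(n=20_000)
```

    Comparing this expected cost against the cost of tightening component tolerances, or of adding a test step, is the kind of trade-off the cost mapping procedure makes explicit.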

    Design of Variables Sampling Plan in Agro-Allied Industry for Packed Yam Flour in-view of International Regulatory Standards

    To stay competitive in today's global market, manufacturers must ensure that products released to consumers meet international regulations and standards, and this can only be done by putting in place sampling plans that guarantee the release of quality lots of products into the market. Hence, the objective of this paper is to design a variables sampling plan for the release of packed yam flour in view of international regulations on the net content of packaged goods. Probability plots, operating characteristic curves, the average outgoing quality (AOQ), the average outgoing quality limit (AOQL), and the average total inspection (ATI) were useful measures for evaluating the fitness of the sampling plan, using the Minitab 2021 statistical software package. The packing process net weight was found to be normally distributed, with a p-value of 0.075 and a process standard deviation of 2.16. A comparative analysis of sample size and of sampling plan measures such as the AOQ, AOQL, and ATI, in view of best practice, was decisive in selecting a sampling plan with a sample size of 31 packs per lot as the most economic plan for lot sentencing. A practical demonstration of this sampling plan's usage was also showcased. This sampling plan elevates and improves the net content of packed product released into the market in view of international regulatory laws.
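    The acceptance decision in a known-sigma variables plan, and the operating characteristic curve behind it, can be sketched as follows. Only the sample size n = 31 and the standard deviation 2.16 echo the paper; the lower specification limit, acceptability constant k, and lot means are hypothetical:

```python
# Sketch of OC-curve points for a known-sigma variables sampling plan with
# a lower specification limit: accept the lot when (xbar - LSL) / sigma >= k.
from math import erf, sqrt

def phi_cdf(z):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def prob_accept(mu, sigma, lsl, n, k):
    """P(accept a lot with mean mu): xbar ~ N(mu, sigma/sqrt(n)),
    accepted iff xbar >= LSL + k * sigma."""
    return phi_cdf((mu - lsl - k * sigma) * sqrt(n) / sigma)

# n = 31 and sigma = 2.16 echo the paper; LSL = 993 g and k = 2.0 do not.
pa_good = prob_accept(1000.0, 2.16, 993.0, 31, 2.0)  # capable process
pa_poor = prob_accept(996.0, 2.16, 993.0, 31, 2.0)   # mean too close to LSL
```

    Plotting `prob_accept` over a range of lot means traces the OC curve used to balance the producer's and consumer's risks.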

    Selection of an alternative production part approval process to improve weapon systems production readiness

    This thesis examined Department of Defense (DOD) weapons system production approval practices. Current practices result in poor weapons system production outcomes that reduce fleet readiness in DOD weapons systems acquisition. The Government Accountability Office (GAO) has reported concerns that a lack of manufacturing knowledge at production start is causal to poor production outcomes. A comparison of DOD practices against non-DOD industrial production approval processes, addressing causality and improvement opportunity, provided new insight not found in acquisition research. An analysis of alternatives identified best practices to improve production capability and readiness. Key findings revealed that the automotive production approval process followed industry best practices that fully addressed the problems identified by the GAO. Non-DOD industries used a more prescriptive Quality Management System (QMS) that enabled more disciplined manufacturing development and demonstration of production capability prior to production commitment. Commercial surveys in the literature confirmed the benefits of the automotive prescriptive QMS. The more successful QMS approach can be applied to DOD acquisition practices, reducing costs and improving fleet readiness.
    http://archive.org/details/selectionofnlter1094556139
    Civilian, Department of the Navy
    Approved for public release; distribution is unlimited

    A penalty function method for constrained molecular dynamics

    We propose a penalty-function method for constrained molecular dynamics simulation by defining a quadratic penalty function for the constraints. The simulation with such a method can be carried out with a conventional, unconstrained solver, with the penalty parameter increased in an appropriate manner as the simulation proceeds. More specifically, we scale the constraints with their force constants when forming the penalty terms. The resulting force function can then be viewed as a smooth continuation of the original force field as the penalty parameter increases. The penalty function method is easy to implement and costs less than a Lagrange multiplier method, which requires the solution of a nonlinear system of equations in every time step. We first implemented a penalty function method in CHARMM and applied it to the protein Bovine Pancreatic Trypsin Inhibitor (BPTI). We compared the simulation results with Verlet and Shake, and found that the penalty function method had high correlations with Shake and outperformed Verlet. In particular, the RMSD fluctuations of backbone and non-backbone atoms and the velocity autocorrelations of Cα atoms of the protein calculated by the penalty function method agreed well with those computed by Shake. We also tested the method on a group of argon clusters constrained with a set of inter-atomic distances in their global energy minimum states. The results showed that the method was able to impose the constraints effectively, and that the clusters tended to converge to their energy minima more rapidly than when not confined by the constraints.
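    The method can be illustrated on a toy system. The sketch below applies a scaled quadratic penalty for a single distance constraint between two 1-D unit-mass particles and integrates with plain velocity Verlet, increasing the penalty parameter as the run proceeds; the force constant, schedule, damping, and step size are illustrative choices, not CHARMM settings:

```python
# Quadratic penalty for the constraint |x2 - x1| = d0: the penalty term
# U_pen = (mu / 2) * K * (d - d0)^2 is added to the potential, so the
# unconstrained integrator sees an extra restoring force of mu * K * (d - d0).
def penalty_force(x1, x2, d0, k_force, mu):
    """Equal-and-opposite forces on the pair from the scaled penalty term."""
    d = abs(x2 - x1)
    f = mu * k_force * (d - d0)       # restoring-force magnitude
    sign = 1.0 if x2 > x1 else -1.0
    return f * sign, -f * sign        # pulls the pair back toward d = d0

def simulate(steps=2000, dt=0.0005, d0=1.0, k_force=100.0):
    """Velocity Verlet with only the penalty force; mu grows during the run."""
    x = [0.0, 1.3]                    # start with the constraint violated
    v = [0.0, 0.0]
    mu = 1.0
    f = penalty_force(x[0], x[1], d0, k_force, mu)
    for step in range(steps):
        if step % 500 == 0:
            mu *= 4.0                 # stiffen the penalty as the run proceeds
        v = [v[i] + 0.5 * dt * f[i] for i in range(2)]   # half kick (mass = 1)
        x = [x[i] + dt * v[i] for i in range(2)]         # drift
        f = penalty_force(x[0], x[1], d0, k_force, mu)
        v = [v[i] + 0.5 * dt * f[i] for i in range(2)]   # second half kick
        v = [0.995 * vi for vi in v]  # mild damping so the pair settles at d0
    return abs(x[1] - x[0])

final_distance = simulate()           # approaches d0 as mu increases
```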

    A Theoretical Foundation for the Development of Process Capability Indices and Process Parameters Optimization under Truncated and Censoring Schemes

    Process capability indices (PCIs) provide a measure of the output of an in-control process that conforms to a set of specification limits. These measures, which assume that process output is approximately normally distributed, are intended for measuring process capability in manufacturing systems. After inspection, however, non-conforming products are typically scrapped when units fail to meet the specification limits; hence, the actual distribution of shipped products that customers perceive is truncated. In this research, a set of customer-perceived PCIs based on the truncated normal distribution is developed as an extension of traditional manufacturer-based indices. Comparative studies and numerical examples reveal considerable differences between the traditional PCIs and the proposed PCIs. The comparison results suggest using the proposed PCIs for capability analyses when non-conforming products are scrapped prior to shipping to customers. Confidence interval approximations for the proposed PCIs are also developed, and a simulation technique is implemented to compare the proposed PCIs with their traditional counterparts across multiple performance scenarios. Robust parameter design (RPD), a systematic method for determining the optimum operating conditions that achieve quality improvement goals, is also studied within the realm of censored data. Data censoring occurs in time-oriented observations when some data are unmeasurable outside a predetermined study period. The underlying conceptual basis of current RPD studies is random sampling from a normal distribution, assuming that all the data points are uncensored. However, censoring schemes are widely implemented in lifetime testing, survival analysis, and reliability studies. As such, this study develops detailed guidelines for a new RPD method that considers type I right censoring. The response functions are developed using nonparametric methods, including the Kaplan-Meier estimator, Greenwood's formula, and the Cox proportional hazards regression method. Various response-surface-based robust parameter design optimization models are proposed and demonstrated through a numerical example. Further, a process capability index for type I right-censored data using the nonparametric methods is also developed for assessing the performance of a product based on its lifetime.
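    The central idea of the customer-perceived indices can be sketched from the standard truncated-normal moments: once units outside the specification limits are scrapped, the shipped distribution is the normal truncated to [LSL, USL], and the index is recomputed from its mean and standard deviation. The C_pk-style definition and the numbers below are illustrative, not the dissertation's exact indices:

```python
# Sketch: capability index recomputed on the truncated normal that customers
# actually receive after inspection scraps out-of-spec units.
from math import erf, exp, pi, sqrt

def norm_pdf(z):
    return exp(-z * z / 2) / sqrt(2 * pi)

def norm_cdf(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

def truncated_moments(mu, sigma, lsl, usl):
    """Mean and std of a Normal(mu, sigma) truncated to [lsl, usl]."""
    a, b = (lsl - mu) / sigma, (usl - mu) / sigma
    z = norm_cdf(b) - norm_cdf(a)                     # acceptance probability
    mean = mu + sigma * (norm_pdf(a) - norm_pdf(b)) / z
    var = sigma ** 2 * (1 + (a * norm_pdf(a) - b * norm_pdf(b)) / z
                        - ((norm_pdf(a) - norm_pdf(b)) / z) ** 2)
    return mean, sqrt(var)

def cpk(mean, std, lsl, usl):
    """Classical C_pk."""
    return min(usl - mean, mean - lsl) / (3 * std)

mu, sigma, lsl, usl = 10.0, 1.0, 8.0, 12.0            # illustrative process
mt, st = truncated_moments(mu, sigma, lsl, usl)
manufacturer_index = cpk(mu, sigma, lsl, usl)         # before scrapping
perceived_index = cpk(mt, st, lsl, usl)               # after scrapping
```

    Since truncation shrinks the standard deviation, the customer-perceived index exceeds the manufacturer-based one here, which is the kind of difference the comparative studies quantify.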

    The Future of the Past: Revisionism and Vietnam

    The first issue of Vietnam Generation: A Journal of Recent History and Contemporary Issues, edited by Kali Tal

    Products and Services

    Today’s global economy offers more opportunities, but is also more complex and competitive than ever before. This fact leads to a wide range of research activity in different fields of interest, especially in the so-called high-tech sectors. This book is a result of widespread research and development activity from many researchers worldwide, covering the aspects of development activities in general, as well as various aspects of the practical application of knowledge