47 research outputs found

    Developing Acceptance Sampling Plans based on Incapability Index Cpp


    Acceptance sampling plan for multiple manufacturing lines using EWMA process capability index

    The problem of developing a product acceptance determination procedure for multiple characteristics has attracted quality assurance practitioners. Because of high consumer demand, it may not be possible to deliver the ordered quantity on time using a single manufacturing line, so in factories the product is manufactured on multiple manufacturing lines and the output is combined. In this manuscript, we present the design of an acceptance sampling plan for products from multiple independent manufacturing lines using an exponentially weighted moving average (EWMA) statistic of the process capability index. The plan parameters, such as the sample size and the acceptance number, are determined by satisfying both the producer's and the consumer's risks. The efficiency of the proposed plan is discussed relative to the existing sampling plan. Tables are given for industrial use and explained with the help of industrial examples. We conclude that the use of the proposed plan in these industries minimizes the cost and time of inspection, since a smaller sample size means a lower inspection cost. Extending the proposed plan to non-normal distributions, and designing the sampling plan using a cost model, are topics for future research. © 2017 The Japan Society of Mechanical Engineers.
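
    The abstract does not give the plan's formulas, so the short Python sketch below is only an assumption-laden illustration of the core idea: a lot is sentenced by smoothing the estimated process capability index with an EWMA and comparing it with an acceptance constant. The estimator cpk_hat, the smoothing weight lam and the acceptance constant k_a are placeholders, not the authors' values.

        import numpy as np

        def cpk_hat(x, lsl, usl):
            # Point estimate of the capability index Cpk from a sample x of
            # measurements, given lower and upper specification limits.
            mu, sigma = np.mean(x), np.std(x, ddof=1)
            return min(usl - mu, mu - lsl) / (3.0 * sigma)

        def ewma_lot_decision(lot_sample, prev_ewma, lsl, usl, lam=0.2, k_a=1.33):
            # Update the EWMA of the estimated capability index with the current
            # lot's estimate and accept the lot when it reaches the acceptance
            # constant k_a.  In the paper, lam, k_a and the sample size would be
            # chosen so that the producer's and consumer's risks are both met.
            z = lam * cpk_hat(lot_sample, lsl, usl) + (1.0 - lam) * prev_ewma
            return ("accept" if z >= k_a else "reject"), z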

    Sampling Plan Using Process Loss Index Using Multiple Dependent State Sampling Under Neutrosophic Statistics

    This paper presents the design of a sampling plan using process loss considerations for multiple dependent state sampling under neutrosophic statistics. The operating characteristics under the neutrosophic statistical interval method (NSIM) are developed to find the neutrosophic plan parameters of the proposed sampling plan. A non-linear optimization under NSIM is used to find the optimal neutrosophic plan parameters under the given conditions. The advantages of the proposed sampling plan over existing sampling plans are discussed. A real example containing some uncertain observations is given for illustration.
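
    The abstract does not spell out the process-loss operating characteristic or the multiple dependent state structure, so the Python sketch below shows only the generic two-point (producer's and consumer's risk) search that such non-linear plan-parameter optimizations follow, with a plain binomial single-sampling OC function standing in for the paper's OC under NSIM.

        from math import comb

        def oc_single_sampling(p, n, c):
            # Binomial operating characteristic: probability of accepting a lot
            # whose nonconforming fraction is p under an (n, c) single-sampling plan.
            return sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

        def two_point_design(aql, ltpd, alpha=0.05, beta=0.10, n_max=500):
            # Smallest (n, c) that meets both risk points:
            # Pa(AQL) >= 1 - alpha (producer) and Pa(LTPD) <= beta (consumer).
            for n in range(1, n_max + 1):
                for c in range(n + 1):
                    if (oc_single_sampling(aql, n, c) >= 1 - alpha
                            and oc_single_sampling(ltpd, n, c) <= beta):
                        return n, c
            return None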

    Contributions in statistical process control for high quality products

    Ph.D. thesis (Doctor of Philosophy)

    Bell-Touchard-G Family of Distributions: Applications to Quality Control and Actuarial Data

    In this article, we develop a new statistical model named the generalized complementary exponentiated Bell-Touchard model. The exponential model is taken as a special baseline model with a configurable failure rate function. The proposed model is based on several features of zero-truncated Bell numbers and Touchard polynomials that can address complementary risk matters. A linear representation of the density of the proposed model is provided, which can be used to obtain numerous important properties of the special model. The well-known actuarial metrics, value at risk and expected shortfall, are formulated, computed and graphically illustrated for the sub-model. The maximum likelihood approach is used to estimate the parameters. Furthermore, we design a group acceptance sampling plan for the proposed model by using the median as the quality parameter for truncated life tests. Three real data applications are offered for the complementary exponentiated Bell-Touchard exponential model. The analysis of two failure-time data sets and a comparative study yielded optimized results for the group acceptance sampling plan under the proposed model. The application to insurance claim data also provided the best results and showed that the proposed model has a heavier tail.
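
    For illustration only, the Python sketch below gives the standard acceptance probability of a group acceptance sampling plan for a truncated life test; in the paper, the per-item failure probability p would be obtained from the fitted Bell-Touchard model at the truncation time, with the median as the quality parameter, whereas here p, the number of groups g, the group size r and the acceptance number c are simply assumed inputs.

        from math import comb

        def group_plan_pa(p, g, r, c):
            # Lot acceptance probability for a group plan: g groups of r items are
            # placed on a truncated life test and the lot is accepted when every
            # group records at most c failures; p is the probability that one item
            # fails before the truncation time.
            per_group = sum(comb(r, i) * p**i * (1 - p)**(r - i) for i in range(c + 1))
            return per_group ** g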

    Mass Production Processes

    It is always hard to set up manufacturing systems to produce large quantities of standardized parts. Controlling these mass production lines requires deep knowledge, extensive experience, and the appropriate tools. The use of modern methods and techniques to produce a large quantity of products within productive manufacturing processes improves manufacturing costs and product quality. To serve these purposes, this book reflects on advanced manufacturing systems for different alloys in production, together with related components and automation technologies. Additionally, it focuses on mass production processes designed according to Industry 4.0, considering different kinds of quality and improvement work in mass production systems for highly productive and sustainable manufacturing. This book may be of interest to researchers, industrial employees, or any other partners working towards better-quality manufacturing at any stage of the mass production process.

    A study of modelling and monitoring time-between-events with control charts

    Ph.D. thesis (Doctor of Philosophy)

    The design of a data model (DM) for managing durability index (DI) results for national road infrastructure

    As part of an R 1.14 billion, 64-month concrete construction mega-project which began in May 2013, the Mt Edgecombe Interchange, comprising two incrementally launched bridges, the longer at 948 metres and the other at 440 metres, joining uMhlanga and the N2 North, demands adequate systems to measure durability compliance. Construction contracts of this nature generate thousands of test results that must be assessed for variability, outliers and compliance for quality assurance, in line with current performance-based specifications such as those contained in COTO (2018a; 2018b), derived from COLTO (1998), which require judgement based on statistical principles. Since the inception of Durability Index (DI) performance-based specifications in 2008, over 12000 DI test results or determinations have accumulated within a repository at the University of Cape Town. As such, the performance-based approach in South Africa is now a decade into maturity, and considerable amounts of actual site data are collected daily; these are significant for refining the DI values in performance-based specifications, for the long-term monitoring of Reinforced Concrete (RC) structures in a full-scale environment, and for other research and development (R&D) initiatives. Data modelling can be defined as the process of designing a data model (DM) for data to be stored in a database. Commonly, a DM is classified into three main types: a conceptual DM defines what the system contains; a logical DM defines how the system should be implemented regardless of the Database Management System (DBMS); and a physical DM describes how the system will be implemented using a specific DBMS. The main objective of this study is to design a data model (DM) that is essentially a conceptual and logical representation of the physical database required to ensure durability compliance for RC structures. Database design principles are needed to execute a good database design and guide the entire process. Duplicate information or redundant data consumes unnecessary storage and increases the probability of errors and inconsistencies. Therefore, subdividing the data within the conceptual data model (DM) into distinct groups or topics, which are broken down further into subject-based tables, helps eliminate redundant data. The data contained within the database must be correct and complete; incorrect or incomplete information will result in reports containing mistakes, and any decisions based on the data will be misinformed. Therefore, the database must support and ensure the accuracy and integrity of the information as well as accommodate data processing and reporting requirements. An explanation and critique of the current durability specification is also presented, since information is required on how to join the database tables to create meaningful output. The conceptual data model (DM) established the basic concepts and the scope for the physical database by designing a modular structure or general layout for the database. This process established the entities or data objects (distinct groups), their attributes (properties of distinct groups) and their relationships (dependencies of association between groups). The logical database design phase is divided into two main steps. In the first step, a data model (DM) is created to ensure minimal redundancy and the capability to support user transactions.
The output of this step is a logical data model (DM), which is a complete and accurate representation of the topics to be supported by the database. In the second step, the Entity Relationship Diagram (ERD) is mapped to a set of tables. The structure of each table is checked using normalization. Normalization is an effective means of ensuring that the tables are structurally consistent and logical, with minimal redundancy. The tables were also checked to ensure that they are capable of supporting the required transactions, and the required integrity constraints on the database were defined. The logical data model (DM) then added extra information to the conceptual data model (DM) elements by defining the database tables, or basic information, required for the physical database. This process established the structure of the data elements, set the relationships between them and provided the foundation for the physical database. A prototype of the designed data model (DM), founded on 53 basic information database tables, is presented. The breakdown of database tables for the six modules is split according to references (1), concrete composition (13), execution (4), environment (7), specimens (2) and material tests (26). Correlations between different input parameters were identified, which added further information to the logical data model (DM) elements by strengthening the relations between the topics. The extraction of information or output parameters according to specification limits was conducted by analysing data from five different projects, which served as input for a total of 1054 DI test results or 4216 determinations. The results were used to conduct parametric studies on the DI values which predominantly affect concrete durability in RC structures. Lastly, a method is proposed using joint probability density functions of Durability Index (DI) test results and the achieved cover depth to calculate the probability that both random variables are out of specification limits.
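
As a rough illustration of the final step, the Python sketch below estimates the probability that both random variables are out of specification by Monte Carlo under an assumed bivariate normal joint density; the distributional form, the direction of DI non-compliance and all parameter values are assumptions for illustration, not the study's fitted model.

        import numpy as np

        def prob_both_out_of_spec(mean, cov, di_limit, cover_min, n=200_000, seed=0):
            # Monte Carlo estimate of P(DI out of spec and cover depth below its
            # required minimum), assuming the DI result and the achieved cover
            # depth follow a bivariate normal joint density; "DI > di_limit" is
            # used as the non-compliance direction purely for illustration, since
            # the direction depends on the specific DI test.
            rng = np.random.default_rng(seed)
            di, cover = rng.multivariate_normal(mean, cov, size=n).T
            return float(np.mean((di > di_limit) & (cover < cover_min)))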

    Control of quality of performance at a primary stock point in the U.S. Navy's supply system.

    http://www.archive.org/details/controlofquality00carr
    U.S. Navy (U.S.N.) author