
    Computational Procedure of Performance Assessment of Lifetime Index of Products for the Weibull Distribution with the Progressive First-Failure-Censored Sampling Plan

    Process capability analysis has been widely applied in the field of quality control to monitor the performance of industrial processes. In practice, the lifetime performance index C_L is a popular means to assess the performance and potential of a process, where L is the lower specification limit. This study applies large-sample theory to construct a maximum likelihood estimator (MLE) of C_L under the Weibull distribution with the progressive first-failure-censored sampling plan. The MLE of C_L is then used to develop a new hypothesis testing procedure under the condition of known L.
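    To illustrate the plug-in idea behind this kind of estimator, the minimal Python sketch below fits a Weibull model to (for simplicity, complete rather than progressively first-failure-censored) lifetime data and evaluates a lifetime performance index of the form C_L = (mu - L)/sigma. The index definition, the specification limit, and the simulated data are assumptions made for illustration, not the paper's exact procedure.

```python
# A minimal sketch (not the paper's procedure): plug-in estimation of a
# lifetime performance index from complete Weibull samples. The definition
# C_L = (mu - L) / sigma, the limit L, and the data are assumptions.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
L = 0.5                                   # hypothetical lower specification limit
lifetimes = weibull_min.rvs(c=2.0, scale=1.5, size=200, random_state=rng)

# MLE of the Weibull shape/scale (location fixed at zero for lifetimes).
shape_hat, _, scale_hat = weibull_min.fit(lifetimes, floc=0)

mu_hat = weibull_min.mean(c=shape_hat, scale=scale_hat)
sigma_hat = weibull_min.std(c=shape_hat, scale=scale_hat)
C_L_hat = (mu_hat - L) / sigma_hat        # plug-in estimate via invariance of the MLE

print(f"shape={shape_hat:.3f}, scale={scale_hat:.3f}, C_L estimate={C_L_hat:.3f}")
```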

    Data Analysis and Experimental Design for Accelerated Life Testing with Heterogeneous Group Effects

    In accelerated life tests (ALTs), complete randomization is hardly achievable because of economic and engineering constraints. Typical experimental protocols such as subsampling or random blocks in ALTs result in a grouped structure, which leads to correlated lifetime observations. In this dissertation, a generalized linear mixed model (GLMM) approach is proposed to analyze ALT data and to find the optimal ALT design while accounting for heterogeneous group effects. Two types of ALTs are demonstrated for data analysis. First, constant-stress ALT (CSALT) data with a Weibull failure time distribution are modeled by a GLMM. The marginal likelihood of the observations is approximated by the quadrature rule, and the maximum likelihood (ML) estimation method is applied in an iterative fashion to estimate the unknown parameters, including the variance component of the random effect. Second, step-stress ALT (SSALT) data with random group effects are analyzed in a similar manner, but with an assumption of exponentially distributed failure times in each stress step. Two parameter estimation methods, from the frequentist and Bayesian points of view, are applied and compared with other traditional models through a simulation study and a real example of heterogeneous SSALT data. The proposed random effect model shows superiority in terms of reducing bias and variance in the estimation of the life-stress relationship. The GLMM approach is particularly useful for the optimal experimental design of ALTs while taking the random group effects into account. Specifically, planning ALTs under a nested design structure with random test chamber effects is studied. A greedy two-phased approach shows that different test chamber assignments to stress conditions substantially impact the estimation of the unknown parameters. Then, the D-optimal test plan with two test chambers is constructed by applying the quasi-likelihood approach. Lastly, the optimal ALT planning is extended to the case of multiple sources of random effects, so that the crossed design structure is also considered along with the nested structure.
    Doctoral Dissertation, Industrial Engineering
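    To make the quadrature step concrete, here is a hedged, self-contained Python sketch of the core computation described above: approximating the marginal likelihood of group-correlated exponential ALT data by Gauss-Hermite quadrature over a random group intercept and maximizing it numerically. The model form, parameter names, stress levels, and simulated data are illustrative assumptions, simpler than the Weibull GLMM used in the dissertation.

```python
# Gauss-Hermite approximation of the marginal likelihood for grouped
# exponential ALT data with a random group (test chamber) intercept.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate grouped constant-stress data: each group shares a stress level
# and a random intercept u_j ~ N(0, sigma^2).
n_groups, n_per_group = 8, 15
stress = np.repeat(np.linspace(0.0, 1.0, n_groups), n_per_group)
group = np.repeat(np.arange(n_groups), n_per_group)
beta0_true, beta1_true, sigma_true = 2.0, -1.5, 0.4
u_true = rng.normal(0.0, sigma_true, n_groups)
mean_life = np.exp(beta0_true + beta1_true * stress + u_true[group])
times = rng.exponential(mean_life)

nodes, weights = np.polynomial.hermite.hermgauss(25)   # Gauss-Hermite rule

def neg_log_marginal_likelihood(params):
    beta0, beta1, log_sigma = params
    sigma = np.exp(log_sigma)
    total = 0.0
    for j in range(n_groups):
        t_j, s_j = times[group == j], stress[group == j][0]
        u = np.sqrt(2.0) * sigma * nodes                 # quadrature nodes for u_j
        theta = np.exp(beta0 + beta1 * s_j + u)          # mean life at each node
        loglik_at_nodes = np.sum(
            -np.log(theta)[:, None] - t_j[None, :] / theta[:, None], axis=1)
        total += np.log(np.sum(weights * np.exp(loglik_at_nodes)) / np.sqrt(np.pi))
    return -total

fit = minimize(neg_log_marginal_likelihood, x0=np.array([1.0, 0.0, np.log(0.5)]),
               method="Nelder-Mead")
print("beta0, beta1, sigma:", fit.x[0], fit.x[1], np.exp(fit.x[2]))
```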

    Vol. 15, No. 2 (Full Issue)


    Vol. 13, No. 1 (Full Issue)


    A Theoretical Foundation for the Development of Process Capability Indices and Process Parameters Optimization under Truncated and Censoring Schemes

    Process capability indices (PCIs) provide a measure of the output of an in-control process that conforms to a set of specification limits. These measures, which assume that process output is approximately normally distributed, are intended for measuring process capability for manufacturing systems. After implementing inspections, however, non-conforming products are typically scrapped when units fail to meet the specification limits; hence, after inspections, the actual distribution of shipped products that customers perceive is truncated. In this research, a set of customer-perceived PCIs based on the truncated normal distribution is developed as an extension of traditional manufacturer-based indices. Comparative studies and numerical examples reveal considerable differences between the traditional PCIs and the proposed PCIs. The comparison results suggest using the proposed PCIs for capability analyses when non-conforming products are scrapped prior to shipping to customers. Confidence interval approximations for the proposed PCIs are also developed, and a simulation technique is implemented to compare the proposed PCIs with their traditional counterparts across multiple performance scenarios.

    Robust parameter design (RPD), a systematic method for determining the optimum operating conditions that achieve quality improvement goals, is also studied within the realm of censored data. Data censoring occurs in time-oriented observations when some data are unmeasurable outside a predetermined study period. The underlying conceptual basis of current RPD studies is random sampling from a normal distribution, assuming that all the data points are uncensored. However, censoring schemes are widely implemented in lifetime testing, survival analysis, and reliability studies. As such, this study develops detailed guidelines for a new RPD method that incorporates type I right-censoring concepts. The response functions are developed using nonparametric methods, including the Kaplan-Meier estimator, Greenwood's formula, and the Cox proportional hazards regression method. Various response-surface-based robust parameter design optimization models are proposed and demonstrated through a numerical example. Further, a process capability index for type I right-censored data using nonparametric methods is also developed for assessing the performance of a product based on its lifetime.

    Contributions to planning and analysis of accelerated testing


    Modeling Reliability Growth in Accelerated Stress Testing

    Qualitative accelerated test methods improve system reliability by identifying and removing initial design flaws. However, schedule and cost constraints often preclude sufficient testing to generate a meaningful reliability estimate from the data obtained in these tests. In this dissertation, a modified accelerated life test is proposed to assess the likelihood of attaining a reliability requirement based on tests of early system prototypes. Assuming each prototype contains an unknown number of independent competing failure modes whose respective times to occurrence are governed by a distinct Weibull law, the observed failure data from this qualitative test are shown to follow a poly-Weibull distribution. However, using an agent-based Monte Carlo simulation, it is shown that for typical products subjected to qualitative testing, the failure observations result from a homogeneous subset of the total number of latent failure modes, and the failure data can be adequately modeled with a Weibull distribution. Thus, the projected system reliability after implementing corrective action to remove one or more failure modes can be estimated using established quantitative accelerated test data analysis methods. Our results suggest that significant cost and time savings may be realized by using the proposed method to signal the need to reassess a product's design or reallocate test resources, avoiding unnecessary maintenance or redesigns. Further, the proposed approach allows a significant reduction in the test time and sample size required to estimate the risk of meeting a reliability requirement relative to current quantitative accelerated life test techniques. Additional contributions include a numerical and analytical procedure for obtaining the maximum likelihood parameter estimates and observed Fisher information matrix components for the generalized poly-Weibull distribution. Using this procedure, we show that the poly-Weibull distribution outperforms the best-fit modified Weibull alternatives in the literature with respect to their fit of reference data sets for which the hazard rate functions are non-monotone.
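    As a hedged sketch of the competing-risks structure named above, the Python snippet below writes down a two-component poly-Weibull model: the system fails at the minimum of two independent Weibull failure-mode times, so S(t) = exp(-(t/eta1)^b1 - (t/eta2)^b2) and the density follows from the hazard. The simulated data and the simple Nelder-Mead fit are illustrative assumptions; the dissertation's estimation procedure for the generalized poly-Weibull distribution is more involved.

```python
# Two-component poly-Weibull: log-density via hazard and cumulative hazard,
# with a basic numerical MLE on simulated competing-risks data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)

def poly_weibull_logpdf(t, params):
    b1, e1, b2, e2 = params
    h = (b1 / e1) * (t / e1) ** (b1 - 1) + (b2 / e2) * (t / e2) ** (b2 - 1)  # hazard
    H = (t / e1) ** b1 + (t / e2) ** b2                                      # cumulative hazard
    return np.log(h) - H

# Simulate: the observed time is the minimum of the two latent mode times.
t1 = rng.weibull(1.5, 500) * 10.0
t2 = rng.weibull(3.0, 500) * 15.0
obs = np.minimum(t1, t2)

def nll(log_params):
    return -np.sum(poly_weibull_logpdf(obs, np.exp(log_params)))

fit = minimize(nll, x0=np.log([1.0, 8.0, 2.0, 12.0]), method="Nelder-Mead")
print("shape1, scale1, shape2, scale2:", np.exp(fit.x))
```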

    Contributions to the Analysis of Multistate and Degradation Data

    Traditional methods in survival, reliability, actuarial science, risk, and other event-history applications are based on the analysis of time-to-occurrence of some event of interest, generically called "failure". In the presence of high degrees of censoring, however, it is difficult to make inference about the underlying failure distribution using failure time data. Moreover, such data are not very useful in predicting failures of specific systems, a problem of interest when dealing with expensive or critical systems. As an alternative, there is an increasing trend towards collecting and analyzing richer types of data related to the states and performance of the systems or subjects under study. These include data on multistate and degradation processes. This dissertation makes several contributions to the analysis of multistate and degradation data.

    The first part of the dissertation deals with parametric inference for multistate processes with panel data. These data involve interval, right, and left censoring, which arise naturally because the processes are not observed continuously. Most of the literature in this area deals with Markov models, for which inference with censored data can be handled without too much difficulty. The dissertation considers progressive semi-Markov models and develops methods and algorithms for general parametric inference, using a combination of Markov chain Monte Carlo techniques and stochastic approximation methods. A second topic deals with the comparison of the traditional method and the process method for inference about the time-to-failure distribution in the presence of multistate data. Here, time-to-failure is the time when the process enters an absorbing state. There is limited literature in this area. The gains in both estimation and prediction efficiency are quantified for various parametric models of interest.

    The second part of the dissertation deals with the analysis of data on continuous measures of performance and degradation with missing data. In this case, time-to-failure is the time at which the degradation measure exceeds a certain threshold or the performance level goes below some threshold. Inference problems about the mean and variance of the degradation, and the imputation of missing values, are studied under different settings.
    Ph.D., Statistics, University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/86286/1/yangcn_1.pd
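    For context, the sketch below writes out the panel-data likelihood for a progressive three-state Markov process (healthy, degraded, failed), the simpler setting that the dissertation's semi-Markov methods generalize: states are observed only at inspection times, so each contribution is a transition probability P(s) = expm(Q s). The generator, inspection times, and observed states are illustrative assumptions.

```python
# Panel-data log-likelihood for a progressive three-state Markov process.
import numpy as np
from scipy.linalg import expm

def generator(lmbda, mu):
    # Progressive model: state 0 -> 1 at rate lmbda, state 1 -> 2 at rate mu,
    # state 2 absorbing.
    return np.array([[-lmbda, lmbda, 0.0],
                     [0.0,    -mu,   mu ],
                     [0.0,    0.0,   0.0]])

def log_likelihood(params, inspection_times, observed_states):
    lmbda, mu = params
    Q = generator(lmbda, mu)
    ll = 0.0
    for times, states in zip(inspection_times, observed_states):
        for k in range(1, len(times)):
            P = expm(Q * (times[k] - times[k - 1]))      # transition probabilities
            ll += np.log(P[states[k - 1], states[k]])
    return ll

# Two hypothetical subjects inspected at irregular times.
inspection_times = [[0.0, 1.0, 2.5, 4.0], [0.0, 2.0, 3.0]]
observed_states = [[0, 0, 1, 2], [0, 1, 1]]
print(log_likelihood((0.6, 0.4), inspection_times, observed_states))
```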

    Probabilistic Models for Life Cycle Management of Energy Infrastructure Systems

    The degradation of aging energy infrastructure systems has the potential to increase the risk of failure, resulting in power outages and costly unplanned maintenance work. Therefore, the development of scientific and cost-effective life cycle management (LCM) strategies has become increasingly important for maintaining energy infrastructure. Since the degradation of aging equipment is an uncertain process that depends on many factors, a risk-based approach is required to account for the effect of various uncertainties in LCM. This thesis presents probabilistic models to support risk-based life cycle management of energy infrastructure systems. In addition to the uncertainty in the degradation process, the inspection data collected by the energy industry are often censored and truncated, which makes it difficult to estimate the lifetime probability distribution of the equipment. The thesis presents modern statistical techniques for quantifying the uncertainties associated with inspection data and for estimating lifetime distributions in a consistent manner. Age-based and sequential inspection-based replacement models are proposed for the maintenance of components in a large distribution network. A probabilistic lifetime model that accounts for the effect of imperfect preventive maintenance of a component is developed, and its impact on maintenance optimization is illustrated. The thesis also presents a stochastic model for the pitting corrosion process in steam generators (SG), a serious form of degradation in the SG tubing of some nuclear generating stations. The model is applied to estimate the number of tubes requiring plugging and the probability of tube leakage in an operating period. The application and benefits of the model are illustrated in the context of managing the life cycle of a steam generator.
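    As a minimal sketch of the age-based replacement trade-off underlying such LCM models, the snippet below chooses the preventive replacement age T that minimizes the classical long-run cost rate C(T) = (c_p R(T) + c_f F(T)) / integral_0^T R(t) dt for a Weibull lifetime. The costs, Weibull parameters, and policy form are illustrative assumptions, not the thesis's specific models.

```python
# Age-based replacement: minimize the long-run expected cost per unit time.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar
from scipy.stats import weibull_min

shape, scale = 2.5, 10.0          # hypothetical equipment lifetime (years)
c_p, c_f = 1.0, 10.0              # preventive vs. failure replacement cost

def cost_rate(T):
    R = lambda t: weibull_min.sf(t, shape, scale=scale)   # reliability R(t)
    expected_cycle_cost = c_p * R(T) + c_f * (1.0 - R(T))
    expected_cycle_length, _ = quad(R, 0.0, T)
    return expected_cycle_cost / expected_cycle_length

opt = minimize_scalar(cost_rate, bounds=(0.5, scale * 3), method="bounded")
print(f"optimal replacement age ~ {opt.x:.2f} years, cost rate {opt.fun:.4f}")
```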