1,318 research outputs found

    Threshold Regression for Survival Analysis: Modeling Event Times by a Stochastic Process Reaching a Boundary

Many researchers have investigated first hitting times as models for survival data. First hitting times arise naturally in many types of stochastic processes, ranging from Wiener processes to Markov chains. In a survival context, the state of the underlying process represents the strength of an item or the health of an individual. The item fails, or the individual experiences a clinical endpoint, when the process reaches an adverse threshold state for the first time. The time scale can be calendar time or some other operational measure of degradation or disease progression. In many applications the process is latent (i.e., unobservable). Threshold regression refers to first-hitting-time models with regression structures that accommodate covariate data. The parameters of the process, the threshold state, and the time scale may all depend on the covariates. This paper reviews aspects of this topic and discusses fruitful avenues for future research.

Comment: Published at http://dx.doi.org/10.1214/088342306000000330 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
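To make the first-hitting-time construction concrete, here is a minimal simulation sketch in Python (all parameter values are hypothetical, not from the paper): latent health follows a Wiener process with negative drift, and failure occurs the first time the path crosses the zero boundary. With drift mu < 0 the hit is certain and the hitting time follows an inverse Gaussian law with mean x0/|mu|, which the simulation should roughly reproduce.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_fht(x0, mu, sigma, dt=0.05, t_max=100.0):
    """First time a Wiener process with drift mu, started at x0 > 0,
    crosses the failure boundary at 0 (Euler discretization)."""
    n_steps = int(t_max / dt)
    steps = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
    path = x0 + np.cumsum(steps)
    hit = np.nonzero(path <= 0.0)[0]
    return (hit[0] + 1) * dt if hit.size else np.inf  # inf = no hit (censored)

# Hypothetical unit: health starts at 10, drifts down at 0.5 per time unit.
x0, mu, sigma = 10.0, -0.5, 1.0
times = np.array([simulate_fht(x0, mu, sigma) for _ in range(500)])
hits = times[np.isfinite(times)]

# With mu < 0 the FHT is inverse Gaussian with mean x0/|mu| = 20
# and shape parameter (x0/sigma)^2.
print(f"empirical mean FHT: {hits.mean():.2f}  (theory: {x0 / abs(mu):.2f})")
```

The Euler scheme only checks for crossings at grid points, so it slightly overstates hitting times; a finer dt (or an exact inverse Gaussian sampler) tightens the match.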

    A practical contribution to quantitative accelerated testing of multi-failure mode products under multiple stresses

Setting up an accelerated testing (AT) program raises several concerns and uncertainties about the resulting reliability estimate, which may deviate from actual in-service performance. This thesis presents the preparatory and auxiliary tools needed before testing, and proposes technical approaches and analyses for implementing accelerated tests for reliability estimation, product comparison, identification of critical failure modes, and verification of reliability improvement (after a design change). Any accelerated testing program must be examined economically, and the correspondence between the failure modes observed in testing and those seen in service must be verified. Random variables in the in-service usage profile, together with the failure times observed in accelerated tests, are sources of uncertainty in the reliability estimate that must be resolved numerically. Most accelerated degradation test programs have been carried out for qualitative purposes and comparison analysis, so the concept of accelerated degradation testing must be extended and generalized to products subject to multiple failure modes, with or without dependence among the failure modes. When samples of a product, new or used, are available, a partial-aging method is proposed to reduce the test time considerably.
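The thesis abstract does not name a particular life-stress model, but the standard single-stress building block for quantitative AT programs of this kind is the Arrhenius acceleration factor. A minimal sketch, with purely illustrative temperatures and activation energy:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(t_use_c, t_test_c, ea_ev):
    """Acceleration factor between a test temperature and the use
    temperature under the Arrhenius life-stress model."""
    t_use, t_test = t_use_c + 273.15, t_test_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_test))

# Hypothetical test: 85 C chamber vs. 40 C service, activation energy 0.7 eV.
af = arrhenius_af(40.0, 85.0, 0.7)
print(f"acceleration factor: {af:.1f}")
print(f"1000 h at 85 C ~ {1000 * af:.0f} h at 40 C")
```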

    Modeling Reliability Growth in Accelerated Stress Testing

Qualitative accelerated test methods improve system reliability by identifying and removing initial design flaws. However, schedule and cost constraints often preclude sufficient testing to generate a meaningful reliability estimate from the data obtained in these tests. In this dissertation a modified accelerated life test is proposed to assess the likelihood of attaining a reliability requirement based on tests of early system prototypes. Assuming each prototype contains an unknown number of independent competing failure modes whose respective times to occurrence are governed by distinct Weibull laws, the observed failure data from this qualitative test are shown to follow a poly-Weibull distribution. However, using an agent-based Monte Carlo simulation, it is shown that for typical products subjected to qualitative testing, the failure observations result from a homogeneous subset of the total number of latent failure modes, and the failure data can be adequately modeled with a Weibull distribution. Thus, the projected system reliability after implementing corrective action to remove one or more failure modes can be estimated using established quantitative accelerated test data analysis methods. Our results suggest that significant cost and time savings may be realized by using the proposed method to signal the need to reassess a product's design or reallocate test resources, avoiding unnecessary maintenance or redesigns. Further, the proposed approach allows a significant reduction in the test time and sample size required to estimate the risk of meeting a reliability requirement relative to current quantitative accelerated life test techniques. Additional contributions include numerical and analytical procedures for obtaining the maximum likelihood parameter estimates and observed Fisher information matrix components for the generalized poly-Weibull distribution. Using this procedure, we show that the poly-Weibull distribution outperforms the best-fit modified Weibull alternatives in the literature with respect to fit on reference data sets whose hazard rate functions are non-monotone.
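A minimal Monte Carlo sketch of the competing-failure-modes setup described above (the mode parameters are hypothetical, and this is a plain simulation rather than the dissertation's agent-based one): each unit's observed lifetime is the minimum over independent Weibull failure modes, and a single Weibull is then fit to the pooled minima.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)

# Hypothetical latent failure modes: (shape, scale) per mode.
modes = [(1.5, 300.0), (2.0, 450.0), (0.9, 800.0)]

n = 500
draws = np.column_stack([
    scale * rng.weibull(shape, size=n) for shape, scale in modes
])
t_obs = draws.min(axis=1)      # observed failure time: first mode to occur
cause = draws.argmin(axis=1)   # index of the mode that failed first

# How often each latent mode is actually the observed cause:
print("observed cause shares:", np.bincount(cause, minlength=len(modes)) / n)

# Fit a single Weibull to the pooled minima (location fixed at 0).
shape_hat, _, scale_hat = weibull_min.fit(t_obs, floc=0)
print(f"single-Weibull fit: shape={shape_hat:.2f}, scale={scale_hat:.1f}")
```

If one mode dominates the minima, the single-Weibull fit is typically adequate, which is the effect the dissertation exploits.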

    A hierarchical Bayesian regression framework for enabling online reliability estimation and condition-based maintenance through accelerated testing

Thanks to advances in the Internet of Things (IoT), Condition-based Maintenance (CBM) has progressively become one of the most renowned strategies for mitigating the risk arising from failures. Within any CBM framework, non-linear correlation among data and the variability of condition-monitoring data sources are among the main reasons that estimating Reliability Indicators (RIs) is complex. Indeed, most classic approaches fail to fully consider these aspects. This work presents a novel methodology that employs Accelerated Life Testing (ALT) as multiple sources of data to define the impact of relevant PVs on RIs and, subsequently, to plan maintenance actions through online reliability estimation. For this purpose, a Generalized Linear Model (GLM) is exploited to model the relationship between PVs and an RI, while a Hierarchical Bayesian Regression (HBR) is implemented to estimate the parameters of the GLM. The HBR can deal with the aforementioned uncertainties, allowing a better explanation of the correlations among PVs. As a case study, we consider a numerical example with five distinct operating conditions for the ALT. The developed methodology provides asset managers with a solid tool to estimate reliability online and to plan maintenance actions as soon as a given condition is reached.
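A minimal sketch of the hierarchical-regression idea using PyMC (assumed library; the data, the normal log-lifetime likelihood, and the five-condition structure are illustrative stand-ins for the paper's GLM of PVs versus an RI): condition-level intercepts are tied together by a shared hierarchical prior, and a common life-stress slope is estimated.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(3)

# Hypothetical ALT data: 5 operating conditions (stress levels), a few
# log-lifetimes per condition; none of these numbers come from the paper.
stress = np.repeat([1.0, 1.5, 2.0, 2.5, 3.0], 8)
group = np.repeat(np.arange(5), 8)
true_alpha = rng.normal(6.0, 0.3, size=5)     # condition-level intercepts
log_t = true_alpha[group] - 1.2 * stress + rng.normal(0, 0.25, stress.size)

with pm.Model() as model:
    # Hierarchical prior ties the condition intercepts together.
    mu_a = pm.Normal("mu_a", 0.0, 10.0)
    tau_a = pm.HalfNormal("tau_a", 1.0)
    alpha = pm.Normal("alpha", mu_a, tau_a, shape=5)
    beta = pm.Normal("beta", 0.0, 5.0)        # life-stress slope
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("obs", alpha[group] + beta * stress, sigma, observed=log_t)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=3)

print("posterior mean slope:", idata.posterior["beta"].mean().item())
```

The pooling through mu_a and tau_a is what lets sparse conditions borrow strength from the others, the usual motivation for an HBR over independent per-condition fits.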

    Statistical Degradation Models for Electronics

With the increasing presence of electronics in modern systems and everyday products, system reliability is inextricably dependent on the reliability of the electronics. We develop reliability models for failure-time prediction under small failure-time samples and information on individual degradation history. The development of the model extends the work of Whitmore et al. (1998) to incorporate two new data structures common to reliability testing. Reliability models traditionally use lifetime information to evaluate the reliability of a device or system. To analyze small failure-time samples within dynamic environments where failure mechanisms are unknown, there is a need for models that make use of auxiliary reliability information. In this thesis we present models suitable for reliability data where degradation variables are latent and can be tracked by related observable variables we call markers. We provide an engineering justification for our model and develop parametric and predictive inference equations for a data structure that includes terminal observations of the degradation variable and longitudinal marker measurements. We compare maximum likelihood estimation and prediction results with those obtained by Whitmore et al. (1998) and show improvement in inference under small sample sizes. We introduce modeling of variable failure thresholds within the framework of bivariate degradation models and discuss ways of incorporating covariates. In the second part of the thesis we investigate anomaly detection through a Bayesian support vector machine and discuss its place in degradation modeling. We compute posterior class probabilities for time-indexed covariate observations, which we use as measures of degradation. Lastly, we present a multistate model for a recurrent event process and failure times. We compute the expected time to failure using counting process theory and investigate the effect of the event process on the expected failure-time estimates.
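A minimal sketch of the latent-degradation-plus-marker structure (hypothetical parameters, in the spirit of the bivariate Wiener model of Whitmore et al. 1998): the unobservable degradation path and an observable marker share correlated increments, and failure occurs when the latent path first reaches a threshold.

```python
import numpy as np

rng = np.random.default_rng(11)

dt, n_steps, rho = 0.1, 2000, 0.8
mu = np.array([0.4, 0.3])                      # drifts: (latent degradation, marker)
cov = np.array([[1.0, rho], [rho, 1.0]]) * dt  # correlated increments

def one_unit(threshold=15.0):
    inc = rng.multivariate_normal(mu * dt, cov, size=n_steps)
    path = np.cumsum(inc, axis=0)
    crossed = np.nonzero(path[:, 0] >= threshold)[0]  # latent hits boundary
    fail_t = (crossed[0] + 1) * dt if crossed.size else np.inf
    return fail_t, inc

fail_t, inc = one_unit()
print(f"failure time: {fail_t:.1f}")
print("increment correlation (recovers rho):",
      np.corrcoef(inc[:, 0], inc[:, 1])[0, 1].round(2))
```

Because only the marker column would be observed in practice, the correlation rho is what carries information about the latent path, which is exactly why markers help under small failure-time samples.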

    Two-stage maximum likelihood estimation procedure for parallel constant-stress accelerated degradation tests

The parallel constant-stress accelerated degradation test (PCSADT) is a popular method used to assess the reliability of highly reliable products in a timely manner. Although the maximum likelihood (ML) method is commonly used to estimate the PCSADT parameters, explicit forms of the ML estimators and their corresponding Fisher information matrix are usually difficult to obtain. In this article, we propose a two-stage ML (TSML) estimation procedure for a time-transformed model. In the proposed procedure, all the TSML estimators not only have explicit expressions but also possess consistency and asymptotic normality; hence the method is tractable for reliability engineers. Furthermore, the TSML estimators can provide constructive information about the unknown accelerated-relationship law. The proposed method is applied to analyze light-emitting diode data, and the performance of our estimation procedure is compared with the ML method via simulations.
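The paper's TSML estimators are specific to its time-transformed model, but the two-stage logic can be sketched generically (hypothetical linear-degradation data and simple least squares, not the paper's estimators): stage 1 estimates a degradation rate at each stress level separately; stage 2 regresses the log-rates on stress to recover the acceleration relationship.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical constant-stress ADT data: linear degradation whose rate
# follows a log-linear (e.g., Arrhenius-type) law in the stress variable.
stresses = np.array([1.0, 1.4, 1.8, 2.2])   # transformed stress levels
t = np.linspace(0, 10, 21)
true_rates = np.exp(-1.0 + 1.1 * stresses)

# Stage 1: estimate the degradation rate separately at each stress level.
rate_hat = []
for r in true_rates:
    y = r * t + rng.normal(0, 0.2, t.size)          # one unit's noisy path
    rate_hat.append(np.sum(t * y) / np.sum(t * t))  # LS slope through origin
rate_hat = np.array(rate_hat)

# Stage 2: regress log-rates on stress to recover the acceleration law.
A = np.column_stack([np.ones_like(stresses), stresses])
coef, *_ = np.linalg.lstsq(A, np.log(rate_hat), rcond=None)
print(f"intercept={coef[0]:.2f}, slope={coef[1]:.2f}  (truth: -1.00, 1.10)")
```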

    New methods for modeling accelerated life test data

An accelerated life test (ALT) is often used to obtain timely information for highly reliable items. The increased use of ALTs has resulted in nontraditional reliability data which cannot be analyzed with standard statistical methodologies. I propose new methods for analyzing ALT data for studies with (1) two independent populations, (2) paired samples, and (3) limited failure populations (LFPs). The Weibull distribution, which can accommodate a variety of failure rates, is assumed for the models I develop. For case (1), a parametric hypothesis test, a Bayesian analysis, and a test using partial likelihood are proposed and discussed. For paired samples, I show that there is no exact test for the equality of the survival distributions; thus, several tests are investigated using a simulation study of their Type I errors. A Bayesian approach that allows for the comparison and estimation of the failure rates is also considered. For computation, Markov chain Monte Carlo (MCMC) methods are implemented using BUGS. Certain types of devices (such as integrated circuits) that are operated at normal use conditions are at risk of failure because of inherent manufacturing faults (latent risk factors). A small proportion of defective units, p, may fail over time under normal operating conditions; for the non-defective units, the probability of failing under normal conditions during their technological lifetime is zero. Meeker ([29], [31]) called a population of such units a limited failure population (LFP). I propose a new model for LFPs in which the number of latent risk factors, and the times at which they become fatal, depend on the stress level. This model allows a fraction of the population to be free of latent risks. For analyzing this model, I propose a classical as well as a Bayesian approach, the latter being very useful when an engineer has expert knowledge of the manufacturing process. In all cases, a real data set is analyzed to demonstrate the procedures.
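A minimal sketch of the LFP likelihood (hypothetical parameters; a single stress level, whereas the dissertation lets the latent-risk structure depend on stress): a fraction p of units is defective with Weibull failure times, the remainder cannot fail, and the mixture is fit by maximum likelihood with censoring at the test horizon.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(13)

# Hypothetical LFP data: only a fraction p of units carries a latent defect
# and can fail; the rest survive the test horizon with certainty.
p_true, shape, scale, t_c, n = 0.15, 1.8, 500.0, 1000.0, 2000
defective = rng.random(n) < p_true
t = np.where(defective, scale * rng.weibull(shape, n), np.inf)
obs = np.minimum(t, t_c)   # observation: failure time or censoring at t_c
failed = t <= t_c

def neg_loglik(theta):
    # Unconstrained parametrization: logit(p), log(shape), log(scale).
    p = 1.0 / (1.0 + np.exp(-theta[0]))
    k, lam = np.exp(theta[1]), np.exp(theta[2])
    ll_fail = np.log(p) + weibull_min.logpdf(obs[failed], k, scale=lam)
    surv_cens = 1.0 - p * weibull_min.cdf(t_c, k, scale=lam)
    return -(ll_fail.sum() + (~failed).sum() * np.log(surv_cens))

res = minimize(neg_loglik, x0=[0.0, 0.0, np.log(300.0)], method="Nelder-Mead")
p_hat = 1.0 / (1.0 + np.exp(-res.x[0]))
print(f"p_hat={p_hat:.3f} (truth {p_true}), shape={np.exp(res.x[1]):.2f}, "
      f"scale={np.exp(res.x[2]):.0f}")
```

The censored term 1 - p*F(t_c) is what distinguishes an LFP from an ordinary censored Weibull fit: a unit that survives may be either a lucky defective or a unit that can never fail.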

    Data Analysis and Experimental Design for Accelerated Life Testing with Heterogeneous Group Effects

In accelerated life tests (ALTs), complete randomization is hardly achievable because of economic and engineering constraints. Typical experimental protocols such as subsampling or random blocks in ALTs result in a grouped structure, which leads to correlated lifetime observations. In this dissertation, a generalized linear mixed model (GLMM) approach is proposed to analyze ALT data and to find the optimal ALT design while accounting for heterogeneous group effects. Two types of ALTs are demonstrated for data analysis. First, constant-stress ALT (CSALT) data with a Weibull failure-time distribution are modeled by a GLMM. The marginal likelihood of the observations is approximated by a quadrature rule, and the maximum likelihood (ML) estimation method is applied iteratively to estimate the unknown parameters, including the variance component of the random effect. Second, step-stress ALT (SSALT) data with random group effects are analyzed in a similar manner, but under the assumption of exponentially distributed failure times in each stress step. Two parameter estimation methods, frequentist and Bayesian, are applied and compared with traditional models through a simulation study and a real example of heterogeneous SSALT data. The proposed random-effect model shows superiority in terms of reducing bias and variance in the estimation of the life-stress relationship. The GLMM approach is particularly useful for the optimal experimental design of ALTs that take random group effects into account. Specifically, planning ALTs under a nested design structure with random test-chamber effects is studied. A greedy two-phase approach shows that different test-chamber assignments to stress conditions substantially impact the estimation of the unknown parameters. The D-optimal test plan with two test chambers is then constructed by applying the quasi-likelihood approach. Lastly, the optimal ALT planning is extended to the case of multiple sources of random effects, so that the crossed design structure is considered along with the nested structure.
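A minimal sketch of the quadrature step mentioned above (hypothetical data; a simple Weibull model with a normal random effect on the log-scale, not the dissertation's full GLMM machinery): the group random effect is integrated out of each block's likelihood with Gauss-Hermite quadrature.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(17)

# Hypothetical grouped CSALT data: units tested in blocks (e.g., chambers),
# each block shifting the Weibull log-scale by a random effect b ~ N(0, tau^2).
n_groups, n_per, k_shape, eta, tau = 6, 10, 1.6, np.log(400.0), 0.4
b = rng.normal(0, tau, n_groups)
data = [weibull_min.rvs(k_shape, scale=np.exp(eta + bg), size=n_per,
                        random_state=rng) for bg in b]

# Gauss-Hermite nodes/weights; with b = sqrt(2)*tau*x, the normal-mixture
# integral becomes (1/sqrt(pi)) * sum_i w_i * L(data | b_i).
nodes, weights = np.polynomial.hermite.hermgauss(20)

def marginal_loglik(k, eta, tau):
    ll = 0.0
    for t in data:
        inner = np.array([
            weibull_min.logpdf(t, k, scale=np.exp(eta + np.sqrt(2)*tau*x)).sum()
            for x in nodes
        ])
        ll += np.log(np.sum(weights * np.exp(inner)) / np.sqrt(np.pi))
    return ll

print(f"marginal log-likelihood at the truth: "
      f"{marginal_loglik(k_shape, eta, tau):.1f}")
```

In a full analysis this marginal log-likelihood would be handed to an optimizer over (k, eta, tau), which is the iterative ML step the abstract describes.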