
    Contributions to accelerated reliability testing

    A thesis submitted to the Faculty of Science, University of the Witwatersrand, Johannesburg, in fulfilment of the requirements for the degree of Doctor of Philosophy. Johannesburg, December 2014.

    Industrial units cannot operate without failure forever. When the operation of a unit deviates from industrial standards, it is considered to have failed. The time from the moment a unit enters service until it fails is its lifetime; within reliability, and often in life data analysis in general, lifetime is the event of interest. For highly reliable units, accelerated life testing is required to obtain lifetime data quickly. Accelerated tests are considered in which failure is not instantaneous but the end point of an underlying degradation process. Failure during testing occurs either when the performance of the unit falls to a specified threshold at which the unit no longer meets industrial specifications although it retains some residual functionality (degraded failure), or when performance decreases to a critical level at which the unit cannot perform its function to any degree (critical failure). This problem formulation satisfies the random signs property, a notable competing risks formulation originally developed in maintenance studies and extended here to accelerated testing. Since degraded and critical failures are linked through the degradation process, the open problem of modelling dependent competing risks is discussed. A copula model is assumed, and expert opinion is used to estimate the copula. Observed degraded and critical failure times are interpreted as the times at which the degradation process first crosses the corresponding failure thresholds and are therefore postulated to follow inverse Gaussian distributions. Based on the estimated copula, a use-level unit lifetime distribution is extrapolated from the test data. Reliability metrics derived from the extrapolated use-level lifetime distribution are found to differ slightly under different degrees of stochastic dependence between the risks. Consequently, the degree of dependence between the risks that is considered realistic to admit is an important factor when estimating the use-level unit lifetime distribution from test data.

    Keywords: Lifetime; Accelerated testing; Competing risks; Copula; First passage time
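
    As a rough illustration of the modelling idea, the sketch below simulates dependent degraded and critical failure times with inverse Gaussian marginals linked by a Gaussian copula. The marginal parameters, the copula family, and the correlation value are illustrative assumptions, not quantities estimated in the thesis.

```python
# Illustrative sketch: dependent competing risks with inverse Gaussian marginals
# linked by a Gaussian copula. All parameter values are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2014)
n = 10_000

# Gaussian copula: draw correlated standard normals and map them to uniforms.
rho = 0.6                                     # assumed dependence between the risks
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
u = stats.norm.cdf(z)                         # component-wise probability transforms

# Inverse Gaussian marginals for the two first-passage (failure) times.
t_degraded = stats.invgauss.ppf(u[:, 0], mu=0.5, scale=100.0)
t_critical = stats.invgauss.ppf(u[:, 1], mu=0.8, scale=120.0)

# Competing risks: the unit fails at the earlier of the two times.
lifetime = np.minimum(t_degraded, t_critical)
cause = np.where(t_degraded <= t_critical, "degraded", "critical")

print(f"mean lifetime: {lifetime.mean():.1f}")
print(f"share of degraded failures: {np.mean(cause == 'degraded'):.2f}")
```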

    Maximum likelihood estimation of exponential distribution under Type-II censoring from imprecise data

    Statistical analysis of lifetime distributions under a Type-II censoring scheme is usually based on precise lifetime data. However, some collected lifetime data may be imprecise and represented in the form of fuzzy numbers. This paper deals with the estimation of the exponential mean parameter under a Type-II censoring scheme when the lifetime observations are fuzzy and are assumed to be related to an underlying crisp realization of a random sample. The maximum likelihood estimate of the unknown parameter is obtained using the EM algorithm. In addition, a new numerical method for parameter estimation is provided. The construction of confidence intervals for the mean parameter via the parametric bootstrap method is discussed. Monte Carlo simulations are performed to investigate the performance of the different methods, and an illustrative example is included.

    Keywords: Type-II censoring, Imprecise lifetime, Exponential distribution, Maximum likelihood estimation, Bootstrap confidence interval
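
    For the crisp-data special case (ignoring the fuzziness of the observations), the sketch below illustrates the closed-form MLE of the exponential mean under Type-II censoring together with a parametric bootstrap percentile interval. The sample size, censoring number, and true mean are made-up values.

```python
# Sketch: MLE of the exponential mean under Type-II censoring (crisp data)
# and a parametric bootstrap percentile confidence interval.
import numpy as np

rng = np.random.default_rng(7)
n, r, theta_true = 30, 20, 100.0              # assumed design and true mean

def type2_sample(theta, n, r, rng):
    """Return the r smallest of n exponential(theta) lifetimes."""
    return np.sort(rng.exponential(theta, size=n))[:r]

def mle_type2(x, n):
    """Closed-form MLE: (sum of observed failures + (n - r) * largest observed) / r."""
    r = x.size
    return (x.sum() + (n - r) * x[-1]) / r

x = type2_sample(theta_true, n, r, rng)
theta_hat = mle_type2(x, n)

# Parametric bootstrap: resample Type-II censored data from Exp(theta_hat).
boot = np.array([mle_type2(type2_sample(theta_hat, n, r, rng), n)
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])

print(f"theta_hat = {theta_hat:.1f}, 95% bootstrap CI = ({lo:.1f}, {hi:.1f})")
```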

    One-Shot Device Testing Data Analysis under Logistic-Exponential Lifetimes with an Application to SEER Gallbladder Cancer Data

    In the literature, the reliability analysis of one-shot devices is usually carried out under accelerated life testing in the presence of various stress factors. One-shot device analysis can be extended to the biomedical field, where the survival time of a patient with a given disease is subject to stress factors such as environmental stress, co-morbidity, and disease severity. This work is concerned with one-shot device data analysis and applies it to SEER gallbladder cancer data. The two-parameter logistic-exponential distribution is used as the lifetime distribution. For robust parameter estimation, weighted minimum density power divergence estimators (WMDPDE) are obtained along with the conventional maximum likelihood estimators (MLE). The asymptotic behaviour of the WMDPDE and of a robust test statistic based on the density power divergence measure are also studied. The performance of the estimators is evaluated through extensive simulation experiments, and the developments are then applied to the SEER gallbladder cancer data. Because it is important to know exactly when to inspect the one-shot devices put to the test, a search for optimum inspection times is performed. This optimization minimizes a cost function that strikes a trade-off between estimation precision and experimental cost, and the search is carried out with the population-based heuristic optimization method known as the Genetic Algorithm.
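
    As a minimal sketch of the likelihood side of one-shot device data (binary pass/fail outcomes at fixed inspection times), the code below fits a logistic-exponential lifetime by maximum likelihood. The CDF parameterisation F(t) = (e^(lambda*t) - 1)^kappa / (1 + (e^(lambda*t) - 1)^kappa) and all data values are assumptions for illustration; the paper's WMDPDE machinery is not reproduced here.

```python
# Sketch: maximum likelihood estimation for one-shot device data under an
# assumed logistic-exponential lifetime. Inspection times and counts are made up.
import numpy as np
from scipy.optimize import minimize

tau = np.array([5.0, 10.0, 20.0, 40.0])       # assumed inspection times
n_i = np.array([50, 50, 50, 50])              # devices tested at each time
x_i = np.array([4, 11, 27, 44])               # devices found failed (illustrative)

def cdf(t, lam, kappa):
    """Assumed logistic-exponential CDF."""
    g = np.expm1(lam * t) ** kappa
    return g / (1.0 + g)

def neg_loglik(par):
    lam, kappa = np.exp(par)                  # optimise on the log scale (> 0)
    p = np.clip(cdf(tau, lam, kappa), 1e-12, 1 - 1e-12)
    return -np.sum(x_i * np.log(p) + (n_i - x_i) * np.log1p(-p))

fit = minimize(neg_loglik, x0=np.log([0.05, 1.0]), method="Nelder-Mead")
lam_hat, kappa_hat = np.exp(fit.x)
print(f"lambda_hat = {lam_hat:.4f}, kappa_hat = {kappa_hat:.3f}")
```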

    Dynamic regression models and their applications in survival and reliability analysis (Les modèles de régression dynamique et leurs applications en analyse de survie et fiabilité)

    This thesis explores dynamic regression models and the associated statistical inference for survival and reliability data analysis. The dynamic regression models considered include parametric proportional hazards and accelerated failure time models with possibly time-dependent covariates. The following problems are addressed.

    First, a generalized chi-squared test statistic Y_n^2 is presented that is well suited to survival and reliability data analysis in three cases: complete data, right-censored data, and right-censored data with covariates. The theory and the practical use of the Y_n^2 statistic in survival and reliability data analysis are described in detail. Next, flexible parametric models are considered, and their statistical significance is assessed using the Y_n^2 and log-likelihood test statistics. These parametric models include accelerated failure time (AFT) and proportional hazards (PH) models based on the Hypertabastic distribution. The two models are proposed to investigate the distribution of survival and reliability data in comparison with other parametric models. Simulation studies were designed to demonstrate the asymptotic normality of the maximum likelihood estimators of the Hypertabastic parameters and to validate the asymptotic properties of the Y_n^2 test statistic for the Hypertabastic distribution when the right-censoring probability equals 0% and 20%.

    In the last chapter, the two parametric models are applied to three real-life data sets. The first is the data set of Freireich et al. comparing two treatment groups, with additional information on the log white blood cell count, used to test the ability of a therapy to prolong the remission times of acute leukaemia patients; the Hypertabastic AFT model proved to be an accurate model for this data set. The second is the brain tumour study of malignant glioma patients given by Sauerbrei & Schumacher; the best model was the Hypertabastic PH model with five significant covariates added. The third application concerns the data set of Semenova & Bitukov on the survival times of multiple myeloma patients. No exact model is proposed for this data set because the survival curves intersect; fitting a different dynamic model, such as the Simple Cross-Effect model, is therefore suggested for these data.
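
    To make the AFT side of this machinery concrete, the sketch below fits a Weibull accelerated failure time model with one fixed covariate to right-censored data by maximum likelihood. Weibull is used here as a simple stand-in for the Hypertabastic distribution of the thesis, and the simulated data and parameter values are illustrative assumptions.

```python
# Sketch: maximum likelihood fit of a Weibull AFT model with one fixed covariate
# and independent right censoring. Weibull is a stand-in for the Hypertabastic
# distribution; all simulated values are assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
n = 300
x = rng.binomial(1, 0.5, size=n)              # treatment-group indicator
b0, b1, shape = 3.0, 0.7, 1.5                 # "true" values used for simulation
scale = np.exp(b0 + b1 * x)                   # AFT: the covariate rescales time
t = scale * rng.weibull(shape, size=n)
c = rng.exponential(60.0, size=n)             # independent right-censoring times
obs = np.minimum(t, c)
delta = (t <= c).astype(float)                # 1 = failure observed, 0 = censored

def neg_loglik(par):
    beta0, beta1, log_k = par
    k = np.exp(log_k)
    lam = np.exp(beta0 + beta1 * x)
    z = obs / lam
    log_f = np.log(k / lam) + (k - 1) * np.log(z) - z**k   # Weibull log-density
    log_S = -(z**k)                                        # Weibull log-survival
    return -np.sum(delta * log_f + (1 - delta) * log_S)

fit = minimize(neg_loglik, x0=np.array([1.0, 0.0, 0.0]), method="Nelder-Mead")
beta0_hat, beta1_hat, k_hat = fit.x[0], fit.x[1], np.exp(fit.x[2])
print(f"beta0 = {beta0_hat:.2f}, beta1 = {beta1_hat:.2f}, shape = {k_hat:.2f}")
```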

    Symmetric and Asymmetric Distributions

    In recent years, advances in computer software have substantially increased the number of scientific publications that introduce new probabilistic modelling frameworks, including continuous and discrete approaches and univariate and multivariate models. Many of these theoretical and applied statistical works concern distributions that break the symmetry of the normal distribution and of other symmetric models, mainly using Azzalini's scheme. This strategy takes a symmetric distribution as a baseline and adds an extra parameter to the parent model to control the skewness of the new family of probability distributions. The most widespread and popular example is the one based on the normal distribution, which produces the skew-normal distribution. This Special Issue on symmetric and asymmetric distributions presents works related to this topic, as well as theoretical and applied proposals with connections to and implications for it. Immediate applications of this line of work arise in areas such as economics, environmental sciences, biometrics, engineering, and health. The Special Issue comprises nine works that follow this methodology, derived through a simple process while retaining the rigor that the subject deserves. Readers of this Issue will surely find future lines of work that will enable them to achieve fruitful research results.
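
    As a minimal illustration of Azzalini's scheme, the sketch below builds the skew-normal density 2*phi(x)*Phi(alpha*x) directly from the standard normal baseline and checks it against scipy's built-in skew-normal family; the skewness value alpha is arbitrary.

```python
# Sketch of Azzalini's scheme: a symmetric baseline density phi is skewed by an
# extra parameter alpha through f(x) = 2 * phi(x) * Phi(alpha * x).
import numpy as np
from scipy.stats import norm, skewnorm

alpha = 4.0                                   # arbitrary skewness parameter
x = np.linspace(-4, 4, 9)

# Skew-normal density built directly from the normal baseline ...
f_manual = 2.0 * norm.pdf(x) * norm.cdf(alpha * x)
# ... and the same density from scipy's ready-made skew-normal family.
f_scipy = skewnorm.pdf(x, a=alpha)

print(np.allclose(f_manual, f_scipy))         # True: the two constructions agree
```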

    Contributions to planning and analysis of accelerated testing

    Ph.D. (Doctor of Philosophy)

    Optimal Experimental Planning of Reliability Experiments Based on Coherent Systems

    In industrial engineering and manufacturing, assessing the reliability of a product or system is an important topic. Life-testing and reliability experiments are commonly used reliability assessment methods to gain sound knowledge about product or system lifetime distributions. Usually, a sample of items of interest is subjected to stresses and environmental conditions that characterize the normal operating conditions. During the life test, successive times to failure are recorded and lifetime data are collected. Life-testing is useful in many industrial environments, including the automobile, materials, telecommunications, and electronics industries. Different kinds of life-testing experiments can be applied for different purposes; for instance, accelerated life tests (ALTs) and censored life tests are commonly used to acquire information in reliability and life-testing experiments in the presence of time and resource limitations. Statistical inference based on the data obtained from a life test and the effective planning of a life-testing experiment subject to constraints are two problems of interest to statisticians. The experimental design problem for a life test has long been studied; however, experimental planning in which the experimental units are assembled into systems for the life test has not. In this thesis, we study the optimal experimental planning problem in multiple-stress-level life-testing experiments and progressively Type-II censored life-testing experiments when the test units can be put into coherent systems for the experiment. Based on the notion of the system signature, a tool in structural reliability used to represent the structure of a coherent system, and under different experimental settings, models, and assumptions, we derive the maximum likelihood estimators of the model parameters and the expected Fisher information matrix. We then use the expected Fisher information matrix to obtain the asymptotic variance-covariance matrix of the maximum likelihood estimators when n-component coherent systems are used in the life-testing experiment. Based on different optimality criteria, such as D-optimality, A-optimality, and V-optimality, we obtain the optimal experimental plans under different settings. Numerical and Monte Carlo simulation studies demonstrate the advantages and disadvantages of using systems in life-testing experiments.
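
    To make the optimality criteria concrete, the sketch below compares two hypothetical expected Fisher information matrices, standing in for two candidate test plans, under the D- and A-optimality criteria; the matrices are made up and do not come from the thesis.

```python
# Sketch: ranking two hypothetical test plans by D- and A-optimality, where each
# plan is summarised by a (made-up) expected Fisher information matrix.
import numpy as np

plans = {
    "plan_1": np.array([[8.0, 1.5], [1.5, 4.0]]),
    "plan_2": np.array([[6.0, 0.5], [0.5, 6.0]]),
}

for name, info in plans.items():
    cov = np.linalg.inv(info)                 # asymptotic variance-covariance matrix
    d_crit = np.linalg.det(cov)               # D-optimality: minimise det of covariance
    a_crit = np.trace(cov)                    # A-optimality: minimise total variance
    print(f"{name}: det(cov) = {d_crit:.4f}, trace(cov) = {a_crit:.4f}")

# The D-optimal plan minimises the generalised variance det(I^-1); the A-optimal
# plan minimises the sum of the asymptotic variances of the parameter estimates.
```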

    Optimal allocation of simple step-stress model with Weibull distributed lifetimes under Type-I censoring

    Lo, Kwok Yuen. Thesis (M.Phil.), Chinese University of Hong Kong, 2010. Includes bibliographical references (leaves 52-53). Abstracts in English and Chinese.
    Contents:
    Chapter 1 Introduction: 1.1 Background; 1.2 Scope of the thesis.
    Chapter 2 Lifetime Model: 2.1 Introduction; 2.2 Weibull Distribution; 2.3 Step-Stress Experiment.
    Chapter 3 Maximum Likelihood Estimation of Model Parameters: 3.1 Introduction; 3.2 Maximum Likelihood Estimation; 3.3 Fisher Information Matrix; 3.4 Numerical Methods improving Newton's method (3.4.1 Initial values; 3.4.2 Fisher-Scoring method).
    Chapter 4 Optimal Experimental Design: 4.1 Introduction; 4.2 Optimal Criteria; 4.3 Optimal Stress-Changing-Time Proportion (4.3.1 Optimal proportion versus the shape parameter β; 4.3.2 Optimal proportion versus the parameters a0, a1; 4.3.3 Optimal proportion versus the initial stress level x1; 4.3.4 Optimal proportion versus the censoring time t2); 4.4 Sensitivity Analysis (4.4.1 Effects of the shape parameter β; 4.4.2 Effects of the parameters a0, a1).
    Chapter 5 Concluding Remarks and Further Research.
    Appendix A Simulation Algorithm for a Weibull Type-I Censored Simple Step-Stress Model.
    Appendix B Expected Values of the Fisher Information Matrix.
    Appendix C Derivation of P(A1, A2).
    Bibliography.
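
    Appendix A of the thesis describes a simulation algorithm for this model; the sketch below shows one common way to simulate a Weibull simple step-stress sample under a cumulative exposure model with Type-I censoring. The parameterisation and all numerical values are assumptions for illustration, not necessarily the thesis's exact formulation.

```python
# Sketch: simulating a simple step-stress experiment with Weibull lifetimes under
# a cumulative exposure model and Type-I censoring. The scale parameters, the
# stress-change time tau1, and the censoring time t2 are made-up values.
import numpy as np

rng = np.random.default_rng(2010)
n = 200                                       # number of test units
beta = 1.8                                    # common Weibull shape parameter
theta1, theta2 = 40.0, 15.0                   # scales at the low and high stress
tau1, t2 = 20.0, 45.0                         # stress-change time and censoring time

def step_stress_lifetime(u):
    """Invert the cumulative-exposure CDF for one uniform draw u."""
    w = (-np.log1p(-u)) ** (1.0 / beta)       # equals t/theta1 before the change point
    if w * theta1 <= tau1:                    # failure occurs under the first stress
        return theta1 * w
    return theta2 * (w - tau1 / theta1) + tau1  # shifted time under the second stress

t = np.array([step_stress_lifetime(u) for u in rng.uniform(size=n)])
obs = np.minimum(t, t2)                       # Type-I censoring at time t2
delta = (t <= t2).astype(int)                 # failure indicator

n1 = np.sum((obs <= tau1) & (delta == 1))     # failures at the first stress level
n2 = np.sum((obs > tau1) & (delta == 1))      # failures at the second stress level
print(f"failures at stress 1: {n1}, at stress 2: {n2}, censored: {n - n1 - n2}")
```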