
    Component Reliability Estimation From Partially Masked and Censored System Life Data Under Competing Risks

    This research presents new approaches to the estimation of component reliability distribution parameters from partially masked and/or censored system life data. Such data are common in continuous production environments. The methods were tested on Monte Carlo simulated data and compared to the only alternative suggested in the literature; that alternative failed to converge on many masked datasets. The new methods produce accurate parameter estimates, particularly at low masking levels, and show little bias. The first method ignores masking and treats masked failures as censored observations. It works well when at least two known-cause failures of each component type have been observed, and it is particularly useful for datasets of any size with a small fraction of masked observations, providing quick and accurate estimates. The second method performs well when the number of masked observations is small but forms a significant portion of the dataset, and/or when the assumption of independent masking does not hold. The third method provides accurate estimates when the dataset is small but contains a large fraction of masked observations and independent masking can be assumed. The latter two methods also indicate which component most likely caused each masked system failure, albeit at the price of considerable computation time. The methods were implemented in user-friendly software that can be applied to simulated or real-life data, and an application to real-life industrial data is presented. This research shows that masked system life data can be used effectively to estimate component life distribution parameters when such data form a large portion of the dataset and few known failures exist. It also demonstrates that a small fraction of masked data in a dataset can safely be treated as censored observations without much effect on the accuracy of the resulting estimates. These results are important as masked system life data are becoming more prevalent in industrial production environments. The results are expected to be useful in continuous manufacturing environments, e.g. in the petrochemical industry, and should also interest the electronics and automotive industries, where masked observations are common.
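
    As a hedged illustration of the first method described above, the sketch below fits a Weibull life distribution for one component by maximum likelihood, treating masked system failures as right-censored observations. All data and names (e.g. weibull_neg_loglik) are invented for illustration; this is not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_neg_loglik(params, times, failed):
    """Negative log-likelihood for right-censored Weibull data.

    times  : observed times (failure or censoring)
    failed : 1 if the unit failed from the component of interest,
             0 if censored (masked failures are treated as censored)
    """
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    z = times / scale
    log_f = np.log(shape / scale) + (shape - 1) * np.log(z) - z**shape
    log_s = -z**shape  # log survival function for censored units
    return -(failed * log_f + (1 - failed) * log_s).sum()

# Synthetic example: ~30% of system failures are masked and therefore
# enter the likelihood as censored observations.
rng = np.random.default_rng(1)
times = rng.weibull(2.0, 50) * 100.0           # true shape 2, scale 100
failed = (rng.random(50) > 0.3).astype(float)

res = minimize(weibull_neg_loglik, x0=[1.0, np.median(times)],
               args=(times, failed), method="Nelder-Mead")
shape_hat, scale_hat = res.x
print(f"shape ~ {shape_hat:.2f}, scale ~ {scale_hat:.2f}")
```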

    New Developments in Planning Accelerated Life Tests

    Accelerated life tests (ALTs) are often used to make timely assessments of the lifetime distribution of materials and components. The goal of many ALTs is estimation of a quantile of a log-location-scale failure-time distribution. Much of the previous work on planning accelerated life tests has focused on deriving test-planning methods under a specific log-location-scale distribution. This thesis presents a new approach for computing approximate large-sample variances of maximum likelihood estimators of a quantile of a general log-location-scale distribution with censoring and time-varying stress, based on a cumulative exposure model. It also presents a strategy for developing useful test plans with a small number of test units. We provide an approach for finding optimum step-stress accelerated life test plans using the large-sample approximate variance of the maximum likelihood estimator of a quantile of the failure-time distribution at use conditions. In Chapter 2, we show that this approach allows for multi-step stress changes and censoring for general log-location-scale distributions. As an application, the optimum variance is studied as a function of the shape parameter for both the Weibull and lognormal distributions. Graphical comparisons among test plans using step-up, step-down, and constant-stress patterns are also presented. The results show that, depending on the values of the model parameters and the quantile of interest, each of the three test plans can be preferable in terms of optimum variance. In Chapter 3, using sample data from a published paper describing optimum ramp-stress test plans, we show that our approach and the one used in the previous work give the same variance-covariance matrix of the quantile estimator. Then, as an application, we extend the previous work to a new optimum ramp-stress test plan obtained by simultaneously adjusting the ramp rate and the lower start level of stress. We find that the new optimum test plan can have a smaller variance than the optimum ramp-stress test plan previously obtained by adjusting only the ramp rate. We also compare optimum ramp-stress test plans with the more commonly used constant-stress accelerated life test plans. Previous work on planning accelerated life tests has been based on large-sample approximations to evaluate test plan properties. In Chapter 4, we use more accurate simulation methods to investigate the properties of accelerated life tests with small sample sizes, where large-sample approximations might not be expected to be adequate. These properties include the simulated bias and variance for quantiles of the failure-time distribution at use conditions. We focus on using these methods to find practical compromise test plans that use three levels of stress. We also study the effects of not having any failures at test conditions and of using incorrect planning values. We note that the large-sample approximate variance is far from adequate when the probability of zero failures at certain test conditions is not negligible. We suggest a strategy for developing useful test plans with a small number of test units while meeting constraints on the estimation precision and on the probability of zero failures at one or more of the test stress levels.
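
    The simulation-based evaluation described for Chapter 4 can be sketched as follows: for a candidate constant-stress plan, repeatedly simulate Type-I censored data, refit by maximum likelihood, and record the bias and variance of the estimated quantile at use conditions. The lognormal model, linear life-stress relationship, and all planning values below are assumptions made for illustration, not the thesis's actual plans.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_loglik(theta, x, t, censored, t_c):
    """Lognormal ALT with mu(x) = b0 + b1*x and Type-I censoring at t_c."""
    b0, b1, log_sigma = theta
    sigma = np.exp(log_sigma)
    mu = b0 + b1 * x
    z = (np.log(t) - mu) / sigma
    z_c = (np.log(t_c) - mu) / sigma
    ll = np.where(censored,
                  norm.logsf(z_c),                           # survived past t_c
                  norm.logpdf(z) - np.log(sigma) - np.log(t))  # exact failure
    return -ll.sum()

def simulate_plan(levels, n_per_level, b0, b1, sigma, t_c, x_use,
                  p=0.1, n_sim=500, seed=0):
    """Simulated bias and variance of the ML estimate of the log p-quantile
    at use stress x_use for a constant-stress test plan."""
    rng = np.random.default_rng(seed)
    x = np.repeat(levels, n_per_level)
    zp = norm.ppf(p)
    true_q = b0 + b1 * x_use + sigma * zp
    estimates = []
    for _ in range(n_sim):
        t = np.exp(b0 + b1 * x + sigma * rng.standard_normal(x.size))
        censored = t > t_c
        t = np.minimum(t, t_c)
        res = minimize(neg_loglik, x0=[b0, b1, np.log(sigma)],
                       args=(x, t, censored, t_c), method="Nelder-Mead")
        eb0, eb1, els = res.x
        estimates.append(eb0 + eb1 * x_use + np.exp(els) * zp)
    est = np.array(estimates)
    return est.mean() - true_q, est.var()

# Hypothetical compromise plan with three stress levels.
bias, var = simulate_plan(levels=np.array([0.5, 0.75, 1.0]),
                          n_per_level=[10, 10, 10],
                          b0=8.0, b1=-4.0, sigma=0.7,
                          t_c=300.0, x_use=0.2)
print(f"simulated bias = {bias:.3f}, simulated variance = {var:.3f}")
```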

    Weighted Rank Regression with Dummy Variables for Analyzing Accelerated Life Testing Data

    In this article, we propose a new rank regression model to extrapolate product lifetimes at the normal operating environment from accelerated testing data. A weighted least squares method is used to compensate for nonconstant error variance in the regression model, and a group of dummy variables is incorporated to check model adequacy. We have also developed customized software for quick and easy implementation of the method, so that reliability engineers can readily apply it. Simulation studies show that, under light censoring, the proposed method performs comparatively well in predicting lifetimes even with small sample sizes. With its computational ease and graphical presentation, the proposed method is expected to become popular among reliability engineers.
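
    A minimal sketch of the general idea follows, assuming complete (uncensored) data, Benard's median ranks on a Weibull probability scale, an illustrative weighting scheme, and a single level dummy; it is not the authors' exact formulation.

```python
import numpy as np
import statsmodels.api as sm

def median_ranks(n):
    """Benard's approximation to median ranks for n ordered failure times."""
    i = np.arange(1, n + 1)
    return (i - 0.3) / (n + 0.4)

# Complete failure-time samples at three stress levels (invented data).
samples = {1.0: [105., 134., 160., 193., 240.],
           1.5: [70., 88., 104., 126., 155.],
           2.0: [48., 61., 73., 89., 110.]}

rows = []
for stress, times in samples.items():
    F = median_ranks(len(times))
    y = np.log(-np.log(1 - F))               # Weibull plotting position
    for ti, yi in zip(np.sort(times), y):
        # Dummy marks the middle stress level only: if it is significant
        # beyond the continuous stress term, the life-stress model is suspect.
        rows.append((yi, np.log(ti), np.log(stress), float(stress == 1.5)))
data = np.array(rows)

y, X = data[:, 0], sm.add_constant(data[:, 1:])  # const, ln t, ln stress, dummy
# Illustrative weights: extreme order statistics get less weight because
# their plotting positions are more variable.
w = 1.0 / (np.abs(y) + 1.0)
fit = sm.WLS(y, X, weights=w).fit()
print(fit.params)
print(fit.pvalues[3])  # large p-value for the dummy supports model adequacy
```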

    Strategy for Planning Accelerated Life Tests with Small Sample Sizes

    Previous work on planning accelerated life tests has been based on large-sample approximations to evaluate test plan properties. In this paper, we use more accurate simulation methods to investigate the properties of accelerated life tests with small sample sizes, where large-sample approximations might not be expected to be adequate. These properties include the simulated s-bias and variance for quantiles of the failure-time distribution at use conditions. We focus on using these methods to find practical compromise test plans that use three levels of stress. We also study the effects of not having any failures at test conditions and of using incorrect planning values. We note that the large-sample approximate variance is far from adequate when the probability of zero failures at certain test conditions is not negligible. We suggest a strategy for developing useful test plans with a small number of test units while meeting constraints on the estimation precision and on the probability of zero failures at one or more of the test stress levels.
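
    The zero-failure constraint mentioned above has a simple closed form under Type-I censoring: the probability that none of the n units at a stress level fails before the censoring time t_c is (1 − F(t_c))^n. The sketch below evaluates it for a hypothetical three-level plan under an assumed lognormal life model; all planning values are illustrative.

```python
import numpy as np
from scipy.stats import norm

def prob_zero_failures(n, x, t_c, b0, b1, sigma):
    """P(no failures among n units at stress x before censoring time t_c)
    under a lognormal life model with mu(x) = b0 + b1*x."""
    p_fail = norm.cdf((np.log(t_c) - (b0 + b1 * x)) / sigma)
    return (1.0 - p_fail) ** n

# Candidate three-level plan: the constraint typically binds at the
# lowest (hardest-to-fail) stress level. Planning values are assumed.
for x, n in [(0.4, 15), (0.7, 10), (1.0, 5)]:
    p0 = prob_zero_failures(n, x, t_c=300.0, b0=8.0, b1=-4.0, sigma=0.7)
    print(f"stress {x:.1f}: P(zero failures) = {p0:.3g}")
```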

    Nonparametric Estimation of a Distribution Subject to a Stochastic Precedence Constraint

    For any two random variables X and Y with distributions F and G defined on [0,∞), X is said to stochastically precede Y if P(X≤Y) ≥ 1/2. For independent X and Y, stochastic precedence (denoted X ≤sp Y) is equivalent to E[G(X–)] ≤ 1/2. The applicability of stochastic precedence in various statistical contexts, including reliability modeling, tests for distributional equality versus various alternatives, and the relative performance of comparable tolerance bounds, is discussed. The problem of estimating the underlying distribution(s) of experimental data under the assumption that they obey a stochastic precedence (sp) constraint is treated in detail. Two estimation approaches, one based on data shrinkage and the other involving data translation, are used to construct estimators that conform to the sp constraint, and each is shown to lead to a root-n-consistent estimator of the underlying distribution. The asymptotic behavior of each estimator is fully characterized. Conditions are given under which each estimator is asymptotically equivalent to the corresponding empirical distribution function or, in the case of right censoring, the Kaplan–Meier estimator. In the complementary cases, evidence is presented, both analytically and via simulation, that the new estimators tend to outperform the empirical distribution function when sample sizes are sufficiently large.
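
    For independent samples, the defining probability P(X≤Y) can be estimated directly by a U-statistic, giving a quick empirical check of the sp constraint before fitting a constrained estimator. A minimal sketch under assumed exponential data:

```python
import numpy as np

def prob_precedence(x, y):
    """U-statistic estimate of P(X <= Y) from independent samples."""
    return (np.asarray(x)[:, None] <= np.asarray(y)[None, :]).mean()

rng = np.random.default_rng(7)
x = rng.exponential(1.0, 200)   # lifetimes with mean 1
y = rng.exponential(1.5, 200)   # mean 1.5, so X tends to fail first

p_hat = prob_precedence(x, y)   # true value here is 0.6
print(f"P(X <= Y) ~ {p_hat:.3f}; X <=sp Y plausible: {p_hat >= 0.5}")
```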

    Optimum multiple- and single-stress accelerated life tests

    We consider the optimal design of accelerated life tests (ALTs) with Type II censored data. It is assumed that the time-to-failure, or a transformation of it, follows a location-scale Gumbel distribution whose location parameter has the form μ(x) = β₀f₀(x) + β₁f₁(x) + … + βₖfₖ(x), where the βⱼ's are unknown and the fⱼ's are known functions of the stresses x. The scale parameter σ of the distribution is assumed to be independent of the stresses x. We estimate the parameters in the model using a linear model in which the dependent variables are observed order statistics at the points of the design. We give general formulae for the best linear unbiased estimator (BLUE), Ŷₚ, of the 100p-th percentile, Yₚ, of the Gumbel distribution at the design stress x_D, and a general expression for Var(Ŷₚ) is derived. For designs in k+1 points, we use asymptotic theory to express Var(Ŷₚ) as a function of the number of allocated units and the proportion of censoring at each of the k+1 stress levels in the design. The primary objective is to find designs that minimize Var(Ŷₚ). However, this minimization is complicated by the lack of closed-form expressions for the expected values of the order statistics and for the variance-covariance matrix of the error term in the linear model. Thus, we propose to restrict the optimization to designs that minimize one component of Var(Ŷₚ); in practical situations, this component is usually the dominant term of Var(Ŷₚ). We prove that the proposed minimization is equivalent to a problem of optimal extrapolation under a linear model with uncorrelated errors. In the case of a single accelerating stress, we present a characterization of optimal designs in k+1 points when the functions f₀, …, fₖ form a T-system on a finite interval [a, b]. In the case of two or more accelerating stresses and a uniform proportion of censoring at each point in the design, we present optimal designs for diverse forms of the regression function μ(x). In particular, these forms include: regressions in two variables with first-order terms and the cross product, polynomials in r variables of degree at most s, and polynomials in three variables with first-order terms and second-order cross products.
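
    The reduction to optimal extrapolation under a linear model with uncorrelated errors can be made concrete with a toy calculation: under ordinary least squares, Var(ŷ(x_D)) = σ²[1/n + (x_D − x̄)²/Sxx], and one can sweep the allocation between two stress levels to minimize this variance at the design stress. This generic sketch stands in for, and greatly simplifies, the dissertation's Gumbel/BLUE machinery.

```python
import numpy as np

def extrapolation_variance(x_levels, n_alloc, x_design, sigma2=1.0):
    """Variance of the OLS-predicted response at x_design for a design
    putting n_alloc[i] uncorrelated observations at x_levels[i]."""
    x = np.repeat(x_levels, n_alloc)
    xbar = x.mean()
    sxx = ((x - xbar) ** 2).sum()
    return sigma2 * (1.0 / x.size + (x_design - xbar) ** 2 / sxx)

# Two stress levels inside the usable range, extrapolating to the design
# (use) stress x_D = 0; sweep the allocation to the low-stress level.
levels = np.array([0.5, 1.0])
best = min((extrapolation_variance(levels, [k, 20 - k], 0.0), k)
           for k in range(1, 20))
print(f"min variance {best[0]:.4f} with {best[1]} units at x = 0.5")
```

    As expected for extrapolation designs, the optimum places more units at the stress level nearest the design stress.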

    Probabilistic load flow in systems with high wind power penetration

    This paper proposes a method for solving probabilistic load flow that takes into account the uncertainties of wind generation as well as those of load and conventional generation. The method combines cumulant, point-estimate, and convolution techniques, and a Cornish–Fisher expansion series is used to recover the cumulative distribution functions (CDFs). The method is especially suited to estimating active power flows through lines.
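
    The Cornish–Fisher step can be sketched compactly: given the first four cumulants of a line flow (obtained in the paper's method by combining cumulants of load, wind, and generation rather than by sampling), the expansion maps standard normal quantiles to approximate quantiles of the flow. The cumulant values below are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

def cornish_fisher_quantile(p, k1, k2, k3, k4):
    """Approximate p-quantile from the first four cumulants via the
    fourth-order Cornish-Fisher expansion."""
    z = norm.ppf(p)
    g1 = k3 / k2 ** 1.5          # skewness
    g2 = k4 / k2 ** 2            # excess kurtosis
    w = (z
         + (z**2 - 1) * g1 / 6
         + (z**3 - 3*z) * g2 / 24
         - (2*z**3 - 5*z) * g1**2 / 36)
    return k1 + np.sqrt(k2) * w

# Hypothetical cumulants of a line's active power flow (MW).
k1, k2, k3, k4 = 120.0, 36.0, 40.0, 150.0
for p in (0.05, 0.5, 0.95):
    q = cornish_fisher_quantile(p, k1, k2, k3, k4)
    print(f"P{int(p * 100):02d} ~ {q:.1f} MW")
```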

    Class modelling by soft independent modelling of class analogy: why, when, how? A tutorial

    This article contains a comprehensive tutorial on classification by means of Soft Independent Modelling of Class Analogy (SIMCA). The tutorial was conceived in an attempt to offer pragmatic guidelines for a sensible and correct use of this tool, as well as answers to three basic questions: "why employ SIMCA?", "when to employ SIMCA?" and "how to employ (or not to employ) SIMCA?". With this purpose in mind, the following points are addressed: i) the mathematical and statistical fundamentals of the SIMCA approach are presented; ii) distinct variants of the original SIMCA algorithm are thoroughly described and compared in two different case studies; iii) a flowchart outlining how to fine-tune the parameters of a SIMCA model for achieving optimal performance is provided; iv) figures of merit and graphical tools for SIMCA model assessment are illustrated; and v) computational details and rational suggestions about SIMCA model validation are given. Moreover, a novel Matlab toolbox, which encompasses routines and functions for running and contrasting all the aforementioned SIMCA versions, is also made available.
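
    A bare-bones sketch of the core SIMCA idea, one PCA model per class plus a residual-distance acceptance rule, is given below (in Python rather than the article's Matlab). The empirical cutoff is just one of several choices such tutorials compare; class names, data, and thresholds are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

class SimpleSIMCA:
    """One-class SIMCA sketch: a PCA model of the target class plus an
    acceptance rule based on the orthogonal residual distance Q."""

    def __init__(self, n_components=2, alpha=0.05):
        self.n_components = n_components
        self.alpha = alpha

    def fit(self, X):
        self.pca_ = PCA(self.n_components).fit(X)
        # Empirical cutoff at the (1 - alpha) quantile of training Q.
        self.cutoff_ = np.quantile(self._q(X), 1 - self.alpha)
        return self

    def _q(self, X):
        # Squared orthogonal distance to the class PCA subspace.
        X_hat = self.pca_.inverse_transform(self.pca_.transform(X))
        return ((X - X_hat) ** 2).sum(axis=1)

    def predict(self, X):
        # True = accepted as a member of the modelled class.
        return self._q(X) <= self.cutoff_

rng = np.random.default_rng(3)
target = rng.normal(0.0, 1.0, (100, 6))   # training class
aliens = rng.normal(4.0, 1.0, (20, 6))    # clearly different samples

model = SimpleSIMCA(n_components=2).fit(target)
print(model.predict(target).mean())  # ~0.95 accepted
print(model.predict(aliens).mean())  # ~0 accepted
```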

    Statistical Analysis of Composite Spectra

    We consider nearest-neighbor spacing distributions of composite ensembles of levels. These are obtained by combining independently unfolded sequences of levels, each containing only a few levels. Two problems arise in the spectral analysis of such data. The first lies in fitting the nearest-neighbor spacing distribution to the histogram of level spacings obtained from the data; we show that the method of Bayesian inference is superior to this procedure. The second problem occurs when one unfolds such short sequences: the unfolding procedure generically leads to an overestimate of the chaoticity parameter. This trend is absent in the presence of long-range level correlations. Thus, composite ensembles of levels from a system with long-range spectral stiffness yield reliable information about the chaotic behavior of the system.
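
    A small sketch of the Bayesian-inference idea for the first problem: evaluate the posterior of the chaoticity parameter on a grid directly from the unbinned spacings, avoiding the histogram fit altogether. Assuming the Brody form of the spacing distribution (one common parameterization of chaoticity, not necessarily the paper's exact choice) and synthetic Wigner (GOE-like) spacings:

```python
import numpy as np
from scipy.special import gamma as Gamma

def brody_logpdf(s, q):
    """Brody spacing distribution with unit mean spacing;
    q = 0 gives Poisson, q = 1 gives the Wigner (GOE) surmise."""
    alpha = Gamma((q + 2) / (q + 1)) ** (q + 1)
    return np.log((q + 1) * alpha) + q * np.log(s) - alpha * s ** (q + 1)

def posterior_on_grid(spacings, q_grid):
    """Normalized posterior for q under a flat prior, on a grid."""
    logp = np.array([brody_logpdf(spacings, q).sum() for q in q_grid])
    p = np.exp(logp - logp.max())
    return p / np.trapz(p, q_grid)

# Toy data: Wigner-distributed spacings via inverse-CDF sampling.
rng = np.random.default_rng(0)
u = 1.0 - rng.random(300)                       # in (0, 1]
spacings = np.sqrt(-4.0 * np.log(u) / np.pi)

q_grid = np.linspace(0.01, 1.2, 240)
post = posterior_on_grid(spacings, q_grid)
print(f"posterior mode of q ~ {q_grid[post.argmax()]:.2f} (expect ~1 for GOE)")
```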