
    Using the q-Weibull Distribution for Reliability Engineering Modeling and Applications

    Modeling and improving system reliability require selecting appropriate probability distributions to describe the uncertainty in failure times. The q-Weibull distribution, which is based on the Tsallis non-extensive entropy, is a generalization of the Weibull distribution in the context of non-extensive statistical mechanics. The q-Weibull distribution can describe complex systems with long-range interactions and long-term memory, can model various hazard-rate behaviors, including unimodal, bathtub-shaped, monotonic, and constant, and can reproduce both short- and long-tailed distributions. Despite its flexibility, the q-Weibull has not been widely used in reliability applications, partly because parameter estimation is challenging. This research develops and tests an adaptive hybrid artificial bee colony approach for estimating the parameters of a q-Weibull distribution, and demonstrates that the q-Weibull distribution outperforms the Weibull distribution in characterizing lifetime data with a non-monotonic hazard rate.

    Moreover, in terms of system reliability, the q-Weibull distribution can model dependent series systems and can be modified to model dependent parallel systems. This research proposes using the q-Weibull distribution to directly model the failure time of a series system composed of dependent components described by a Clayton copula, discusses the connection between the q-Weibull distribution and the Clayton copula, and shows the equivalence in their parameters. This dissertation also proposes a Nonhomogeneous Poisson Process (NHPP) with a q-Weibull as the underlying time-to-first-failure (TTFF) distribution to model the minimal repair process of a series system composed of multiple dependent components. The proposed NHPP q-Weibull model has the advantage of fewer parameters with smaller uncertainty when used as an approximation to the Clayton copula approach, which in turn needs more information on the assumed underlying distributions of the components and the exact component cause of each system failure. Finally, this dissertation proposes a q-Fréchet distribution, the dual of the q-Weibull distribution, to model a parallel system whose dependent component failure times are modeled by a Clayton copula. The q-Weibull and q-Fréchet distributions are successfully applied to predict series and parallel system failures, respectively, using data characterized by non-monotonic hazard rates.
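The abstract highlights the q-Weibull's ability to produce unimodal, bathtub-shaped, monotonic, and constant hazard rates. A minimal Python sketch of the standard q-Weibull parametrization (density, survival, and hazard built from the Tsallis q-exponential) illustrates this; the function names are illustrative and the code is not taken from the dissertation:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential e_q(x) = [1 + (1-q)x]^(1/(1-q)); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def qweibull_pdf(t, q, beta, eta):
    """f(t) = (2-q) (beta/eta) (t/eta)^(beta-1) e_q(-(t/eta)^beta), valid for q < 2."""
    u = (t / eta) ** beta
    return (2.0 - q) * (beta / eta) * (t / eta) ** (beta - 1.0) * q_exp(-u, q)

def qweibull_sf(t, q, beta, eta):
    """Survival/reliability: R(t) = [e_q(-(t/eta)^beta)]^(2-q)."""
    return q_exp(-((t / eta) ** beta), q) ** (2.0 - q)

def qweibull_hazard(t, q, beta, eta):
    """Hazard rate h(t) = f(t) / R(t); unimodal for 1 < q < 2 with beta > 1."""
    return qweibull_pdf(t, q, beta, eta) / qweibull_sf(t, q, beta, eta)
```

Setting q = 1 recovers the ordinary Weibull, while 1 < q < 2 gives the heavy-tailed regime; with beta > 1 in that range the hazard rises and then falls, the non-monotonic shape the abstract refers to.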

    Managing Well Integrity using Reliability Based Models


    Techniques for the Fast Simulation of Models of Highly Dependable Systems

    With the ever-increasing complexity and requirements of highly dependable systems, their evaluation during design and operation is becoming more crucial. Realistic models of such systems are often not amenable to analysis using conventional analytic or numerical methods, so analysts and designers turn to simulation to evaluate them. However, accurate estimation of dependability measures requires that the simulation frequently observes system failures, which are rare events in highly dependable systems; this renders ordinary simulation impractical for evaluating such systems. To overcome this problem, simulation techniques based on importance sampling have been developed and are very effective in certain settings. When importance sampling works well, simulation run lengths can be reduced by several orders of magnitude when estimating transient as well as steady-state dependability measures. This paper reviews some of the importance-sampling techniques developed in recent years to estimate dependability measures efficiently in Markov and non-Markov models of highly dependable systems.
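The abstract's core point, that ordinary simulation almost never observes a rare failure while importance sampling reweights a biased sampler to hit it often, can be shown on a deliberately simple toy problem (the tail probability of an exponential lifetime, not one of the Markov dependability models the paper reviews; the sampler choice 1/a is an illustrative assumption):

```python
import math
import random

def crude_mc(a, n, rng):
    """Crude Monte Carlo for P(X > a), X ~ Exp(1): almost always returns 0 for large a."""
    return sum(1 for _ in range(n) if rng.expovariate(1.0) > a) / n

def importance_sampling(a, n, rng):
    """Sample from the tilted density g = Exp(rate 1/a), which puts ~37% of its
    mass beyond a, and reweight each hit by the likelihood ratio w(x) = f(x)/g(x)."""
    lam = 1.0 / a
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(lam)
        if x > a:
            total += math.exp(-(1.0 - lam) * x) / lam  # w(x) for f = Exp(1), g = Exp(lam)
    return total / n

rng = random.Random(0)
true_p = math.exp(-20.0)                    # ~2.06e-9: far too rare for crude simulation
est = importance_sampling(20.0, 200_000, rng)
```

With the same budget, the crude estimator returns exactly zero, while the reweighted estimator lands within a few percent of the true 2.06e-9, the "orders of magnitude" run-length reduction the abstract describes.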

    Parameter Estimates of General Failure Rate Model: A Bayesian Approach

    The failure rate function plays an important role in studying lifetime distributions in reliability theory and life testing models. A study of the general failure rate model r(t) = a + bt^(θ-1), under the squared error loss function with a and b taken as independent exponential random variables, has been analyzed in the literature. In this article, we consider a and b not necessarily independent. Estimates of the parameters a and b under the squared error, LINEX, and entropy loss functions are obtained here.
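For context, the lifetime distribution implied by this failure rate model follows from integrating the hazard — a standard identity, not a result of the paper:

```latex
H(t) = \int_0^t r(s)\,ds
     = \int_0^t \left(a + b\,s^{\theta-1}\right) ds
     = a\,t + \frac{b}{\theta}\,t^{\theta},
\qquad
S(t) = e^{-H(t)} = \exp\!\left(-a\,t - \frac{b}{\theta}\,t^{\theta}\right),
\quad \theta > 0.
```

Setting b = 0 recovers the exponential distribution, while a = 0 with θ = 2 gives the Rayleigh distribution, so the model nests both constant and linearly increasing hazards.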

    Estimating rate of occurrence of rare events with empirical Bayes: a railway application

    Classical approaches to estimating the rate of occurrence of events perform poorly when data are few. Maximum likelihood estimators yield overly optimistic point estimates of zero when no events have been observed. Alternative empirical approaches based on median estimators or non-informative prior distributions have been proposed. While these alternatives improve on point estimates of zero, they can be overly conservative. Empirical Bayes procedures offer an unbiased approach by pooling data across different hazards to support stronger statistical inference. This paper considers the application of empirical Bayes to high-consequence, low-frequency events, where estimates are required for risk mitigation decision support, such as demonstrating that risks are as low as reasonably possible. A summary of empirical Bayes methods is given and the choice of estimation procedures to obtain interval estimates is discussed. The approaches are illustrated with a case study estimating the rate of occurrence of train derailments within the UK. The usefulness of empirical Bayes within this context is discussed.
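The pooling idea in the abstract can be sketched with a gamma-Poisson empirical Bayes model: fit a gamma prior to the observed rates across hazard groups, then shrink each group's rate toward the pool. The counts, exposures, and moment-matching shortcut below are hypothetical illustrations, not the paper's data or method:

```python
# Gamma-Poisson empirical Bayes for event rates (illustrative sketch).
events   = [3, 0, 5, 2]                  # hypothetical event counts per hazard group
exposure = [100.0, 120.0, 80.0, 150.0]   # hypothetical exposure, e.g. million train-miles

rates = [x / t for x, t in zip(events, exposure)]
m = sum(rates) / len(rates)                               # pooled mean rate
v = sum((r - m) ** 2 for r in rates) / (len(rates) - 1)   # sample variance of rates

# Moment-match a Gamma(alpha, beta) prior to the pooled mean and variance
# (this ignores the extra Poisson sampling noise in v -- a common simplification).
alpha = m * m / v
beta = m / v

# Posterior mean rate per group: (alpha + x_i) / (beta + T_i).
# Raw rates are shrunk toward the pool, and a group with zero observed
# events gets a small positive estimate instead of an ML estimate of zero.
posterior = [(alpha + x) / (beta + t) for x, t in zip(events, exposure)]
```

The zero-count group receives a strictly positive estimate below the pooled mean, while the highest-rate group is pulled part-way back toward it — exactly the behavior that avoids both the optimistic zero and the overly conservative alternatives the abstract mentions.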