
    Analysis of Generalized Inverted Exponential Distribution under Adaptive Type-I Progressive Hybrid Censored Competing Risks Data

    We discuss the estimation of the unknown parameters of the generalized inverted exponential distribution under the adaptive type-I progressive hybrid censoring scheme (AT-I PHCS) with competing risks data. AT-I PHCS has gained favour over failure-censored schemes because time-censored schemes enable analysts to accomplish their trials and experiments in a shorter time and with higher efficiency. In this regard, we obtain the maximum likelihood estimates of the parameters and the asymptotic confidence intervals for the unknown parameters. Further, Bayes estimates of the parameters are obtained based on squared error and LINEX loss functions under the assumption of independent gamma priors for the scale parameters. For Bayesian estimation, we take advantage of Markov chain Monte Carlo techniques to derive the Bayes estimators and credible intervals. Finally, a Monte Carlo simulation study with two data sets and a real data set are analyzed for illustrative purposes.
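
    Once posterior draws for a parameter are available from the Markov chain Monte Carlo run, the two loss functions lead to simple point estimators: the posterior mean under squared error loss, and -(1/a) log E[exp(-a*theta)] under LINEX loss with shape parameter a. A minimal Python sketch of this step (the posterior draws and the value of a below are illustrative, not taken from the paper):

```python
import numpy as np

def bayes_estimates(posterior_draws, a=0.5):
    """Point estimates from MCMC draws of a parameter theta.

    Squared error loss  -> posterior mean.
    LINEX loss exp(a*d) - a*d - 1, with d = estimate - theta
                        -> -(1/a) * log E[exp(-a * theta)].
    """
    theta = np.asarray(posterior_draws)
    se_estimate = theta.mean()
    linex_estimate = -np.log(np.mean(np.exp(-a * theta))) / a
    return se_estimate, linex_estimate

# Illustrative usage with synthetic posterior draws.
rng = np.random.default_rng(0)
draws = rng.gamma(shape=3.0, scale=0.5, size=5000)
print(bayes_estimates(draws, a=0.5))
```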

    Statistical inference for dependent competing risks data under adaptive Type-II progressive hybrid censoring

    In this article, we consider statistical inference based on dependent competing risks data from the Marshall-Olkin bivariate Weibull distribution. The maximum likelihood estimates of the unknown model parameters are computed using the Newton-Raphson method under adaptive Type-II progressive hybrid censoring with partially observed failure causes, and the existence and uniqueness of the maximum likelihood estimates are established. Approximate confidence intervals are constructed via the observed Fisher information matrix, using the asymptotic normality of the maximum likelihood estimates. Bayes estimates and highest posterior density credible intervals are calculated under a gamma-Dirichlet prior distribution using the Markov chain Monte Carlo technique, and convergence of the Markov chain Monte Carlo samples is checked. In addition, a Monte Carlo simulation is carried out to compare the effectiveness of the proposed methods. Further, three different optimality criteria are considered to obtain the most effective censoring plans. Finally, a real-life data set is analyzed to illustrate the operability and applicability of the proposed methods.
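
    The same recipe, maximizing the likelihood with a Newton-type method and reading the asymptotic covariance off the observed information matrix, can be sketched generically. The example below is only a stand-in: it fits a univariate Weibull model to complete synthetic data with a quasi-Newton optimiser (BFGS) instead of the paper's Marshall-Olkin bivariate Weibull likelihood under adaptive Type-II progressive hybrid censoring.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_log_lik(params, data):
    shape, scale = np.exp(params)   # log-parameterisation keeps both positive
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

rng = np.random.default_rng(1)
data = weibull_min.rvs(c=1.5, scale=2.0, size=200, random_state=rng)

fit = minimize(neg_log_lik, x0=np.log([1.0, 1.0]), args=(data,), method="BFGS")
mle = np.exp(fit.x)

# For BFGS, fit.hess_inv approximates the inverse observed information,
# i.e. the asymptotic covariance of the (log-scale) MLEs.
se = np.sqrt(np.diag(fit.hess_inv))
ci_log = np.column_stack([fit.x - 1.96 * se, fit.x + 1.96 * se])
print("MLE (shape, scale):", mle)
print("approximate 95% CIs:", np.exp(ci_log))
```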

    Robust statistical inference based on divergences for one-shot devices

    Unpublished doctoral thesis, Universidad Complutense de Madrid, Facultad de Ciencias Matemáticas, defended 30-06-2021. A one-shot device is a unit that performs its function only once and, after use, the device either gets destroyed or must be rebuilt. For this kind of device, one can only know whether the failure time is before or after a specific inspection time, and consequently the lifetimes are either left- or right-censored: the lifetime is less than the inspection time if the test outcome is a failure (left censoring), and more than the inspection time if the test outcome is a success (right censoring). An accelerated life test (ALT) plan is usually employed to evaluate the reliability of such products by increasing the levels of stress factors and then extrapolating the life characteristics from high stress conditions to normal operating conditions. This acceleration process shortens the life span of the devices and reduces the costs associated with the experiment. The study of one-shot devices from ALT data has developed considerably in recent years, mainly motivated by the work of Fan et al. [2009]...
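
    Because each unit only reveals whether it had failed by its inspection time, the likelihood for one-shot device data is built from the lifetime distribution evaluated at that time: a failure contributes F(tau) and a success contributes 1 - F(tau). A minimal sketch of this likelihood, assuming exponential lifetimes and a made-up inspection schedule (neither is the thesis's actual model or data):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# One-shot data: unit i is inspected once at time tau[i];
# y[i] = 1 if it had failed by then (left-censored lifetime),
# y[i] = 0 if it was still working (right-censored lifetime).
tau = np.array([2.0, 2.0, 5.0, 5.0, 10.0, 10.0])
y = np.array([0, 1, 0, 1, 1, 1])

def neg_log_lik(rate):
    p_fail = 1.0 - np.exp(-rate * tau)   # exponential lifetime: F(tau)
    return -np.sum(y * np.log(p_fail) + (1 - y) * np.log(1.0 - p_fail))

fit = minimize_scalar(neg_log_lik, bounds=(1e-6, 10.0), method="bounded")
print("MLE of the failure rate:", fit.x)
```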

    A RISK-INFORMED DECISION-MAKING METHODOLOGY TO IMPROVE LIQUID ROCKET ENGINE PROGRAM TRADEOFFS

    This work provides a risk-informed decision-making methodology to improve liquid rocket engine program tradeoffs among the conflicting areas of concern of affordability, reliability, and initial operational capability (IOC), taking into account psychological and economic theories in combination with reliability engineering. Technical program risks are associated with the number of predicted failures of the test-analyze-and-fix (TAAF) cycle, which is based on the maturity of the engine components. Financial and schedule program risks are associated with the epistemic uncertainty of the models that determine the measures of effectiveness in the three areas of concern. The inputs of the affordability and IOC models reflect non-technical and technical factors such as team experience, design scope, technology readiness level, and manufacturing readiness level. The reliability model introduces the Reliability-As-an-Independent-Variable (RAIV) strategy, which aggregates fictitious or actual hot-fire tests of testing profiles that differ from the actual mission profile to estimate the system reliability. The main RAIV strategy inputs are the physical or functional architecture of the system, the principal test plan strategy, a stated reliability-by-credibility requirement, and the failure mechanisms that define the reliable life of the system components. The results of the RAIV strategy, which are the number of hardware sets and the number of hot-fire tests, are used as inputs to the affordability and IOC models. Satisficing within each tradeoff is attained by maximizing the weighted sum of the normalized areas of concern, subject to constraints based on the decision-maker's targets and uncertainty about affordability, reliability, and IOC, using genetic algorithms. In the planning stage of an engine program, the decision variables of the genetic algorithm correspond to fictitious hot-fire tests that include TAAF cycle failures. In the program execution stage, the RAIV strategy is used as a reliability growth planning, tracking, and projection model. The main contributions of this work are the development of a comprehensible and consistent risk-informed tradeoff framework, the RAIV strategy that links affordability and reliability, a strategy to define an industry or government standard or guideline for liquid rocket engine hot-fire test plans, and an alternative to the U.S. Crow/AMSAA reliability growth model applying the RAIV strategy.
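
    The RAIV strategy itself aggregates heterogeneous hot-fire tests, but the way a stated reliability-by-credibility requirement drives the required number of tests can be illustrated with the classical zero-failure (success-run) relation. This is only a simplified stand-in for the approach described above, and the target values are illustrative:

```python
import math

def zero_failure_tests(reliability, confidence):
    """Smallest n such that n consecutive zero-failure tests demonstrate the
    target reliability at the given confidence: reliability**n <= 1 - confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Illustrative requirement: demonstrate R = 0.99 at 90% confidence.
print(zero_failure_tests(0.99, 0.90))   # 230 tests with no failures
```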

    Determinants of Bank Efficiency: the case of Brazil

    This paper analyzes the efficiency of the Brazilian banking sector over the post-privatization period 2000-2007. We employ a Bayesian stochastic frontier approach, which provides exact efficiency estimates and confidence intervals and thus allows an accurate comparison across institutions and bank groups. The results suggest that large banks are the most cost and profit efficient, supporting the concentration process observed in recent years. Foreign banks have achieved good performance through either the establishment of new affiliates or the acquisition of local banks. The remaining public banks have improved in cost efficiency but are relatively profit inefficient. Finally, we observe a positive impact of capitalization on efficiency.
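
    For readers unfamiliar with the method, the composed-error structure behind a (cost) stochastic frontier can be sketched in a few lines: log cost equals the frontier plus symmetric noise plus a non-negative inefficiency term, and cost efficiency is the exponential of minus that term. The simulation below only illustrates this structure; the distributions, coefficients, and data are not the paper's.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
log_output = rng.normal(10.0, 1.0, n)

beta0, beta1 = 1.0, 0.8
v = rng.normal(0.0, 0.1, n)           # two-sided noise
u = np.abs(rng.normal(0.0, 0.2, n))   # half-normal inefficiency, u >= 0
log_cost = beta0 + beta1 * log_output + v + u

cost_efficiency = np.exp(-u)          # 1 = fully efficient
print("mean cost efficiency:", cost_efficiency.mean())
```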

    An investigation of estimation performance for a multivariate Poisson-gamma model with parameter dependency

    Statistical analysis can be overly reliant on naive assumptions of independence between different data generating processes. This results in greater uncertainty when estimating the underlying characteristics of processes, as dependency creates an opportunity to boost the sample size by incorporating more data into the analysis. However, this assumes that dependency has been appropriately specified, as mis-specified dependency can provide misleading information from the data. The main aim of this research is to investigate the impact of incorporating dependency into the data analysis. Our motivation is estimating the reliability of items, and as such we have restricted our investigation to homogeneous Poisson processes (HPP), which can be used to model the rate of occurrence of events such as failures. In an HPP, dependency between rates can occur for numerous reasons, whether it is similarity in mechanical designs, failure occurrence due to a common management culture, or comparable failure counts across machines for the same failure modes. Multiple types of dependencies are considered. Dependencies can take different forms, such as simple linear dependency measured through the Pearson correlation, rank dependencies which capture non-linear dependencies, and tail dependencies where the strength of the dependency may be stronger in extreme events than in more moderate ones. The estimation of the measure of dependency between correlated processes can be challenging. We develop the research within a Bayes or empirical Bayes inferential framework, where uncertainty in the actual rate of occurrence of a process is modelled with a prior probability distribution. We take the prior distributions to be Gamma distributions, given their flexibility and mathematical association with the Poisson process. For dependency modelling between processes we consider copulas, which are a convenient and flexible way of capturing a variety of different dependency characteristics between distributions. We use a multivariate Poisson-Gamma probability model: the Poisson process captures aleatory uncertainty, the inherent variability in the data, whereas the Gamma prior describes the epistemic uncertainty. By pooling processes with correlated underlying mean rates we are able to incorporate data from these processes into the inferential process and reduce the estimation error.

    There are three key research themes investigated in this thesis. First, to investigate the value of reducing estimation error by incorporating dependency within the analysis, via theoretical analysis and simulation experiments. We show that correctly accounting for dependency can significantly reduce the estimation error. The findings should inform analysts a priori as to whether it is worth pursuing a more complex analysis for which the dependency parameter needs to be elicited. Second, to examine the consequences of mis-specifying the degree and form of dependency through controlled simulation experiments. We show the relative robustness of different ways of modelling the dependency using copula and Bayesian methods. The findings should inform analysts about the sensitivity of modelling choices. Third, to show how we can operationalise different methods for representing dependency through an industry case study. We show the consequences for a simple decision problem associated with the provision of spare parts to maintain operation of the industry process when dependency between the event rates of the machines is appropriately modelled rather than being treated as independent processes.
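
    The attraction of the Poisson-Gamma pairing is that each process has a closed-form conjugate posterior, which is what the pooling across correlated processes builds on. A minimal sketch of the single-process update (hyperparameters and data are illustrative; the copula-based dependency modelling described above is not shown):

```python
from scipy.stats import gamma

# Conjugate update for one HPP failure rate:
# prior      lambda ~ Gamma(a0, rate=b0)
# data       x failures over exposure time t
# posterior  lambda | x ~ Gamma(a0 + x, rate=b0 + t)
a0, b0 = 2.0, 4.0     # illustrative prior hyperparameters
x, t = 3, 10.0        # illustrative failure count and exposure

a_post, b_post = a0 + x, b0 + t
posterior_mean = a_post / b_post
interval = gamma.ppf([0.025, 0.975], a=a_post, scale=1.0 / b_post)
print("posterior mean rate:", posterior_mean)
print("95% credible interval:", interval)
```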

    A hierarchical Bayesian regression framework for enabling online reliability estimation and condition-based maintenance through accelerated testing

    Thanks to advances in the Internet of Things (IoT), Condition-Based Maintenance (CBM) has progressively become one of the most renowned strategies to mitigate the risk arising from failures. Within any CBM framework, non-linear correlation among data and the variability of condition monitoring data sources are among the main reasons that make the estimation of Reliability Indicators (RIs) complex; indeed, most classic approaches fail to fully account for these aspects. This work presents a novel methodology that employs Accelerated Life Testing (ALT) as multiple sources of data to define the impact of relevant PVs on RIs and, subsequently, to plan maintenance actions through an online reliability estimation. For this purpose, a Generalized Linear Model (GLM) is used to model the relationship between PVs and an RI, while a Hierarchical Bayesian Regression (HBR) is implemented to estimate the parameters of the GLM. The HBR can deal with the aforementioned uncertainties, allowing a better explanation of the correlation of the PVs. As a case study, we consider a numerical example that exploits five distinct operating conditions for ALT. The developed methodology provides asset managers with a solid tool to estimate reliability online and plan maintenance actions as soon as a given condition is reached.
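
    A generic sketch of the hierarchical Bayesian regression idea, partial pooling of condition-level GLM coefficients across the five ALT operating conditions, can be written with PyMC. Everything below (data, priors, the Poisson response with a log link) is an illustrative assumption rather than the paper's actual model:

```python
import numpy as np
import pymc as pm

# Synthetic ALT-style data: failure counts at five operating conditions,
# each with its own stress level (purely illustrative).
rng = np.random.default_rng(7)
n_conditions = 5
stress = np.repeat(np.linspace(0.5, 2.5, n_conditions), 20)
condition_idx = np.repeat(np.arange(n_conditions), 20)
failures = rng.poisson(np.exp(-1.0 + 0.9 * stress))

with pm.Model():
    # Condition-level slopes are partially pooled toward a shared mean.
    mu_beta = pm.Normal("mu_beta", mu=0.0, sigma=1.0)
    sigma_beta = pm.HalfNormal("sigma_beta", sigma=1.0)
    beta = pm.Normal("beta", mu=mu_beta, sigma=sigma_beta, shape=n_conditions)
    intercept = pm.Normal("intercept", mu=0.0, sigma=2.0)
    rate = pm.math.exp(intercept + beta[condition_idx] * stress)
    pm.Poisson("failures", mu=rate, observed=failures)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

print(idata.posterior["beta"].mean(dim=("chain", "draw")).values)
```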