
    Stochastic RUL calculation enhanced with TDNN-based IGBT failure modeling

    Power electronics are widely used in the transport and energy sectors. Hence, the reliability of these power electronic components is critical to reducing the maintenance cost of these assets. It is vital that the health of these components is monitored to increase the safety and availability of a system. The aim of this paper is to develop a prognostic technique for estimating the remaining useful life (RUL) of power electronic components. There is a need for an efficient prognostic algorithm that is embeddable and able to support on-board real-time decision-making. A time delay neural network (TDNN) is used to model failure modes of an insulated gate bipolar transistor (IGBT). Initially, the time delay neural network is trained on IGBT ageing samples. A stochastic process is then applied to the estimation results to compute the probability of the health state during the degradation process. The proposed fusion of the TDNN with a statistical approach exploits the probability distribution function to improve the accuracy of the TDNN's RUL predictions. The RUL (i.e., mean and confidence bounds) is then calculated from the simulation of the estimated degradation states. The prognostic results are evaluated using the root mean square error (RMSE) and relative accuracy (RA) prognostic evaluation metrics.
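    The abstract does not give the network configuration, but the core idea of a TDNN on a one-dimensional degradation signal can be sketched as a regression over a sliding window of lagged indicator values. In the sketch below, the window length, network size and the synthetic ageing signal (a stand-in for an IGBT health indicator such as collector-emitter voltage) are illustrative assumptions, not the authors' settings.

        # A minimal TDNN sketch: an MLP over lagged samples of a degradation indicator.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def delay_embed(series, n_lags):
            """Build (samples, n_lags) lagged inputs and next-step targets from a 1-D series."""
            X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
            y = series[n_lags:]
            return X, y

        # Synthetic ageing signal: slow drift plus noise (placeholder for IGBT training data).
        rng = np.random.default_rng(0)
        t = np.arange(2000)
        signal = 1.0 + 1e-4 * t + 0.01 * rng.standard_normal(t.size)

        X, y = delay_embed(signal, n_lags=10)
        tdnn = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
        tdnn.fit(X, y)

        # One-step-ahead degradation estimate from the most recent window; repeated
        # predictions of this kind feed the stochastic RUL simulation described above.
        print(tdnn.predict(X[-1:]))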

    Computer Simulation of Current Forces on Motion of Floating Production Storage and Offloading in Irregular Waves

    This paper presents the effect of current forces on the motion of a Floating Production Storage and Offloading (FPSO) vessel in irregular waves. The objective of this research is to compute the motion of the FPSO in irregular waves by time domain simulation, including the effect of current forces. A study is made on the slowly varying oscillations of a moored single body system in current and waves. Linear potential theory is used to describe the fluid motion, and three-dimensional source distribution techniques are applied to obtain the hydrodynamic forces and the transfer function of the wave exciting forces. OCIMF (1994) data are used for estimation of the current forces. Non-linear time domain simulations have been carried out in irregular waves, and on this basis the slowly varying motion responses are examined including the effect of the current forces. Several environmental conditions, such as the current angle of attack, current velocity, significant wave height and mean wave period, are considered, which may significantly affect FPSO motion in surge, sway and yaw. It is found that the effect of current forces is quite significant when the current velocity is increased. In this simulation, when the current velocity is increased to 3.0 m/s, the impact on FPSO motion is quite significant, which should be taken into consideration from the point of view of safety, failure of mooring systems, operating responses and the dynamic positioning of the FPSO.
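    The OCIMF (1994) coefficient tables are not reproduced in the abstract; the sketch below only illustrates the quasi-steady form of the current loads, in which surge and sway forces and the yaw moment scale with 0.5*rho*V^2 and a heading-dependent coefficient. The coefficient curves, vessel dimensions and current angle here are illustrative placeholders.

        # Hedged sketch of a quasi-steady current load model with OCIMF-style coefficients.
        import math

        RHO_WATER = 1025.0  # sea water density, kg/m^3

        def current_loads(v_current, heading_deg, lpp, draft,
                          cx=lambda a: -0.05 * math.cos(a),
                          cy=lambda a: 0.70 * math.sin(a),
                          cn=lambda a: 0.10 * math.sin(2.0 * a)):
            """Return (surge force N, sway force N, yaw moment N*m) on the hull."""
            a = math.radians(heading_deg)          # current angle of attack
            q = 0.5 * RHO_WATER * v_current ** 2   # dynamic pressure of the current
            f_surge = q * cx(a) * draft * lpp      # scaled by underwater lateral area
            f_sway = q * cy(a) * draft * lpp
            m_yaw = q * cn(a) * draft * lpp ** 2
            return f_surge, f_sway, m_yaw

        # Example: 3.0 m/s current at 45 degrees on a 320 m FPSO with 20 m draft.
        print(current_loads(3.0, 45.0, lpp=320.0, draft=20.0))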

    Regression analysis of correlated interval-censored failure time data with a cured subgroup

    Interval-censored failure time data commonly occur in many periodic follow-up studies such as epidemiological experiments, medical studies and clinical trials. By interval-censored data, we usually mean that one cannot observe the failure time of interest and instead knows only that it belongs to a time interval. Correlated failure time data commonly occur when there are multiple events on one individual or when the study subjects are clustered into small groups. In this situation, study subjects from the same cluster or failure events from the same individual are usually regarded as dependent, while subjects in different clusters or failure events from different individuals are assumed to be independent. Besides the within-cluster correlation, the cluster size may sometimes be informative or carry some information about the failure time of interest. The cured subgroup is another interesting topic that has been discussed by many authors. In this situation, unlike the assumption in traditional survival models that all study subjects would eventually experience the failure event of interest if the follow-up time is long enough, some subjects may never experience, or not be susceptible to, the event. Such subjects are treated as cured and assumed to belong to a cured subgroup of the study population. The research in this dissertation focuses on regression analysis of correlated interval-censored data with a cured subgroup via different approaches based on different data structures. In the first part of this dissertation, we discuss clustered interval-censored data with a cured subgroup and informative cluster size. To address this, we present a within-cluster resampling method in which a multiple imputation procedure is applied for the estimation of the unknown parameters. To assess the performance of the proposed method, a simulation study is conducted and suggests that it works well in practical situations. Also, the method is applied to the set of real data that motivated this study. In the second part of this dissertation, we consider clustered interval-censored data with a cured subgroup via a non-mixture cure model. We present a maximum likelihood estimation procedure under the semiparametric transformation non-mixture cure model. To estimate the unknown parameters, an expectation maximization (EM) algorithm based on a Poisson variable augmentation is developed. To assess the performance of the proposed method, a simulation study is conducted and suggests that it works well in practical situations. An application to a study conducted by the National Aeronautics and Space Administration that motivated this study is also provided. In the third part of this dissertation, we investigate bivariate interval-censored data with a cured subgroup. A sieve maximum likelihood estimation procedure under the semiparametric transformation non-mixture cure model based on Bernstein polynomials is presented. A simulation study is conducted to assess the finite sample performance of the proposed method and suggests that the proposed model works well. Also, a real data application from the AIDS Clinical Trial Group 181 study is provided.
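    As a small illustration of the sieve used in the third part, the sketch below evaluates a Bernstein polynomial approximation of a monotone baseline function on a finite interval. The degree, interval and coefficients are arbitrary choices for demonstration, not the dissertation's estimation procedure or results.

        # Bernstein-polynomial sieve sketch for a non-decreasing baseline function.
        import numpy as np
        from scipy.stats import binom

        def bernstein_basis(t, degree, lower, upper):
            """Evaluate the degree+1 Bernstein basis polynomials at times t on [lower, upper]."""
            x = (np.asarray(t, dtype=float) - lower) / (upper - lower)
            k = np.arange(degree + 1)
            return binom.pmf(k[None, :], degree, x[:, None])   # shape (len(t), degree + 1)

        def monotone_bernstein(t, increments, lower, upper):
            """Monotone sieve estimate: basis weighted by cumulative sums of positive increments."""
            coefs = np.cumsum(np.exp(increments))              # exp() keeps each increment positive
            return bernstein_basis(t, len(increments) - 1, lower, upper) @ coefs

        # Toy evaluation of a monotone baseline on [0, 10] with five basis functions.
        print(monotone_bernstein([1.0, 5.0, 9.0], increments=np.zeros(5), lower=0.0, upper=10.0))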

    Integrated survival analysis using an event-time approach in a Bayesian framework

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, and whose times may be interval-censored, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits the necessary parameter estimation. We provide the Bayesian model and its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest for chicks <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for completely known fate data.
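    A piece-wise constant hazard like the one used in the plover application can be written down compactly. The sketch below shows the cumulative hazard and survival function for age intervals split at arbitrary cut points, with illustrative rates rather than estimates from the study.

        # Piece-wise constant hazard building block: cumulative hazard and survival.
        import numpy as np

        def cumulative_hazard(t, cuts, rates):
            """Integrate a piecewise-constant hazard (one rate per interval between cuts) up to age t."""
            edges = np.concatenate(([0.0], cuts, [np.inf]))
            exposure = np.clip(t - edges[:-1], 0.0, edges[1:] - edges[:-1])
            return float(np.sum(np.asarray(rates) * exposure))

        def survival(t, cuts, rates):
            """Probability of surviving beyond age t."""
            return np.exp(-cumulative_hazard(t, cuts, rates))

        # Illustrative rates: higher hazard before 5 days of age, lower afterwards.
        cuts, rates = [5.0], [0.10, 0.03]
        print(survival(3.0, cuts, rates), survival(10.0, cuts, rates))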

    Estimation of peak outflow in dam failure using neural network approach under uncertainty analysis

    This paper presents two Artificial Neural Network (ANN) based models for the prediction of peak outflow from breached embankment dams using two effective parameters: the height and the volume of water behind the dam at the time of failure. Estimation of optimal weights and biases in the training phase of the ANN is performed by two different algorithms: Levenberg-Marquardt (LM), a standard technique used to solve nonlinear least squares problems, and the Imperialist Competitive Algorithm (ICA), a new evolutionary algorithm in the evolutionary computation field. Comparison of the obtained results with those of the conventional approach based on regression analysis shows a better performance of the ANN model trained with ICA. Investigation of the uncertainty band of the models indicates that the LM predictions have the narrowest uncertainty band whilst the ICA predictions have the lowest mean prediction error. Further analysis of the models' uncertainty is conducted by a Monte Carlo simulation in which 1000 randomly generated sets of input data are sampled from the database of historical dam failures. The results of the 1000 ANN models, analysed with three statistical measures (p-factor, d-factor, and DDR), confirm that the LM predictions have the more limited uncertainty band.
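    The paper's LM and ICA training schemes and its historical dam-failure database are not reproduced here; the sketch below only illustrates the two-input ANN structure (dam height and reservoir volume at failure mapped to peak outflow) and a small Monte Carlo pass over resampled inputs, with a standard optimiser and synthetic data standing in for the paper's setup.

        # Two-input ANN sketch for peak breach outflow, plus a Monte Carlo uncertainty pass.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(1)
        height = rng.uniform(5.0, 100.0, size=200)                 # dam height, m
        volume = rng.uniform(1e5, 1e9, size=200)                   # stored water volume, m^3
        peak_q = 0.6 * (height * volume) ** 0.42 * rng.lognormal(0.0, 0.1, 200)  # toy relation, m^3/s

        X = np.log(np.column_stack([height, volume]))              # log-scale inputs
        y = np.log(peak_q)
        model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1).fit(X, y)

        # Monte Carlo band: resample 1000 input sets and collect the predicted outflows.
        samples = np.log(np.column_stack([rng.uniform(5.0, 100.0, 1000),
                                          rng.uniform(1e5, 1e9, 1000)]))
        preds = np.exp(model.predict(samples))
        print(np.percentile(preds, [2.5, 50.0, 97.5]))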

    Extensions to the Regression Discontinuity Design with Applications in Biostatistics

    The regression discontinuity (RD) design is a method for estimating a treatment effect in an observational study where there is a treatment allocation guideline that can be linked to the value of a continuous assignment variable and a pre-determined threshold. Typically, treatment is offered to patients whose assignment variable values lie above (or below) the threshold. Patients whose assignment variable values lie close to the threshold can be seen as exchangeable, and typically, treatment effect estimation in an RD design involves comparing patients above and below the threshold. For a continuous outcome, estimating a treatment effect usually entails fitting local linear regression models for patients above and below the threshold. We propose the use of the thin plate regression spline to fit flexible regression models for patients above and below the threshold. Limited research has been done on the RD design for binary and time-to-event outcomes. For the binary outcome, we focus on the estimation of the risk ratio. The Wald and multiplicative structural mean models are approaches for estimating the risk ratio that can be applied to an RD design; however, they require additional assumptions. In this thesis, we propose an alternative approach for the estimation of the risk ratio that is based on the assumptions of the RD design. For the time-to-event outcome, the accelerated failure time (AFT) model was considered because it has some desirable properties in terms of interpreting causal effects. We propose an estimator of the acceleration factor that is based on the assumptions of an RD design. In addition, the structural AFT, a common approach for estimating the acceleration factor in observational studies, is discussed. Simulation studies were carried out to compare the proposed approaches with the existing ones; the results show that the proposed approaches compete favourably with, and in some cases perform better than, the existing methods. In addition, we provide Bayesian alternatives to the three proposed approaches. Finally, we demonstrate these methods by applying them to real datasets on statin and metformin prescriptions.
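    For the continuous-outcome case, the basic RD contrast that the thesis builds on can be sketched as two local linear fits on either side of the threshold, differenced at the cut-off. The bandwidth, threshold and simulated data below are illustrative; the thin plate regression spline, risk-ratio and AFT estimators proposed in the thesis are not reproduced.

        # Sharp RD sketch: local linear regression on each side of the threshold.
        import numpy as np

        def rd_local_linear(x, y, threshold, bandwidth):
            """Difference in fitted values at the threshold from two one-sided linear fits."""
            below = (x >= threshold - bandwidth) & (x < threshold)
            above = (x >= threshold) & (x <= threshold + bandwidth)
            fit_lo = np.polyfit(x[below], y[below], deg=1)
            fit_hi = np.polyfit(x[above], y[above], deg=1)
            return np.polyval(fit_hi, threshold) - np.polyval(fit_lo, threshold)

        # Toy example with a treatment effect of ~2.0 at an assignment threshold of 0.5.
        rng = np.random.default_rng(2)
        x = rng.uniform(0.0, 1.0, 2000)
        y = 1.0 + 3.0 * x + 2.0 * (x >= 0.5) + 0.5 * rng.standard_normal(x.size)
        print(rd_local_linear(x, y, threshold=0.5, bandwidth=0.2))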

    Techniques for the Fast Simulation of Models of Highly Dependable Systems

    With the ever-increasing complexity and requirements of highly dependable systems, their evaluation during design and operation is becoming more crucial. Realistic models of such systems are often not amenable to analysis using conventional analytic or numerical methods. Therefore, analysts and designers turn to simulation to evaluate these models. However, accurate estimation of dependability measures of these models requires that the simulation frequently observes system failures, which are rare events in highly dependable systems. This renders ordinary simulation impractical for evaluating such systems. To overcome this problem, simulation techniques based on importance sampling have been developed, and are very effective in certain settings. When importance sampling works well, simulation run lengths can be reduced by several orders of magnitude when estimating transient as well as steady-state dependability measures. This paper reviews some of the importance-sampling techniques that have been developed in recent years to estimate dependability measures efficiently in Markov and non-Markov models of highly dependable systems.
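    The failure-biasing schemes reviewed in the paper are model-specific, but the underlying likelihood-ratio idea can be shown on a toy rare event: sample from a biased distribution under which the event is common, then reweight each sample by the ratio of the original density to the biased one. The example below (the tail probability of an exponential lifetime) is purely illustrative and is not one of the Markov dependability models the paper covers.

        # Importance sampling of a rare event via an exponential change of measure.
        import numpy as np

        rng = np.random.default_rng(3)
        threshold, n = 15.0, 100_000

        # Ordinary Monte Carlo almost never observes the event (true probability ~3.1e-7).
        naive = np.mean(rng.exponential(1.0, n) > threshold)

        # Biased sampling distribution g: exponential with mean 10, so exceedances are common.
        x = rng.exponential(10.0, n)
        weights = np.exp(-x) / (np.exp(-x / 10.0) / 10.0)   # likelihood ratio f(x) / g(x)
        is_estimate = np.mean((x > threshold) * weights)

        print(naive, is_estimate, np.exp(-threshold))        # compare with the exact value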

    Bayesian subset simulation

    We consider the problem of estimating a probability of failure $\alpha$, defined as the volume of the excursion set of a function $f:\mathbb{X} \subseteq \mathbb{R}^{d} \to \mathbb{R}$ above a given threshold, under a given probability measure on $\mathbb{X}$. In this article, we combine the popular subset simulation algorithm (Au and Beck, Probab. Eng. Mech. 2001) and our sequential Bayesian approach for the estimation of a probability of failure (Bect, Ginsbourger, Li, Picheny and Vazquez, Stat. Comput. 2012). This makes it possible to estimate $\alpha$ when the number of evaluations of $f$ is very limited and $\alpha$ is very small. The resulting algorithm is called Bayesian subset simulation (BSS). A key idea, as in the subset simulation algorithm, is to estimate the probabilities of a sequence of excursion sets of $f$ above intermediate thresholds, using a sequential Monte Carlo (SMC) approach. A Gaussian process prior on $f$ is used to define the sequence of densities targeted by the SMC algorithm, and to drive the selection of evaluation points of $f$ to estimate the intermediate probabilities. Adaptive procedures are proposed to determine the intermediate thresholds and the number of evaluations to be carried out at each stage of the algorithm. Numerical experiments illustrate that BSS achieves significant savings in the number of function evaluations with respect to other Monte Carlo approaches.
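    BSS itself relies on a Gaussian process surrogate, which is not sketched here; the code below only illustrates the plain subset simulation mechanism it builds on, using adaptively chosen intermediate thresholds (conditional probability p0 per stage) and a simple Metropolis sampler for the conditional distributions. The limit-state function, input dimension and tuning constants are illustrative.

        # Plain subset simulation sketch for a small failure probability P(f(X) > threshold),
        # with X ~ N(0, I). BSS replaces many of the f evaluations with a GP surrogate.
        import numpy as np

        def _mcmc_step(x, fx, f, level, rng, step=0.5):
            """One Metropolis move targeting N(0, I) restricted to {f > level}."""
            prop = x + step * rng.standard_normal(x.shape)
            log_accept = 0.5 * (np.sum(x ** 2, axis=1) - np.sum(prop ** 2, axis=1))
            accept = np.log(rng.uniform(size=len(x))) < log_accept
            f_prop = f(prop)
            accept &= f_prop > level                       # reject moves leaving the excursion set
            return np.where(accept[:, None], prop, x), np.where(accept, f_prop, fx)

        def subset_simulation(f, dim, threshold, n=1000, p0=0.1, rng=None):
            rng = rng or np.random.default_rng(0)
            x = rng.standard_normal((n, dim))
            fx = f(x)
            prob = 1.0
            for _ in range(50):                            # stages with intermediate thresholds
                level = np.quantile(fx, 1.0 - p0)
                if level >= threshold:
                    return prob * np.mean(fx > threshold)
                prob *= p0
                keep = fx > level                          # reseed from the current excursion set
                idx = rng.integers(0, keep.sum(), size=n)
                x, fx = x[keep][idx], fx[keep][idx]
                for _ in range(10):
                    x, fx = _mcmc_step(x, fx, f, level, rng)
            return prob

        # Toy limit-state function: failure when the sum of two standard normals exceeds 5.
        print(subset_simulation(lambda z: z.sum(axis=1), dim=2, threshold=5.0))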