    Statistical Degradation Models for Electronics

    With the increasing presence of electronics in modern systems and everyday products, the reliability of these systems is inextricably dependent on that of their electronics. We develop reliability models for failure-time prediction under small failure-time samples with information on individual degradation history. The development extends the work of Whitmore et al. (1998) to incorporate two new data structures common to reliability testing. Reliability models traditionally use lifetime information to evaluate the reliability of a device or system. To analyze small failure-time samples within dynamic environments where failure mechanisms are unknown, there is a need for models that make use of auxiliary reliability information. In this thesis we present models suitable for reliability data where degradation variables are latent and can be tracked by related observable variables we call markers. We provide an engineering justification for our model and develop parametric and predictive inference equations for a data structure that includes terminal observations of the degradation variable and longitudinal marker measurements. We compare our maximum likelihood estimation and prediction results with those obtained by Whitmore et al. (1998) and show improvement in inference under small sample sizes. We introduce modeling of variable failure thresholds within the framework of bivariate degradation models and discuss ways of incorporating covariates. In the second part of the thesis we investigate anomaly detection through a Bayesian support vector machine and discuss its place in degradation modeling. We compute posterior class probabilities for time-indexed covariate observations, which we use as measures of degradation. Lastly, we present a multistate model for a recurrent event process and failure times. We compute the expected time to failure using counting process theory and investigate the effect of the event process on the expected failure-time estimates.
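
    As an illustration of the degradation-marker structure underlying this line of work, the sketch below simulates a latent Wiener degradation path together with a correlated observable marker and records the first time the latent path crosses a failure threshold. All parameter values, the correlation, and the threshold are illustrative assumptions, not quantities from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) parameters: drifts, volatilities, and the correlation
# linking the latent degradation process to its observable marker.
mu_d, sigma_d = 0.8, 0.3      # latent degradation drift / volatility
mu_m, sigma_m = 1.0, 0.4      # marker drift / volatility
rho = 0.9                     # degradation-marker correlation
threshold = 10.0              # failure threshold on the latent path
dt, n_steps, n_units = 0.05, 400, 1000

mean = np.array([mu_d, mu_m]) * dt
cov = np.array([[sigma_d**2, rho * sigma_d * sigma_m],
                [rho * sigma_d * sigma_m, sigma_m**2]]) * dt

failure_times = np.full(n_units, np.nan)
for i in range(n_units):
    increments = rng.multivariate_normal(mean, cov, size=n_steps)
    path = increments.cumsum(axis=0)            # column 0: latent degradation
    hit = np.argmax(path[:, 0] >= threshold)    # first index at/above threshold
    if path[hit, 0] >= threshold:               # guard: argmax is 0 if never hit
        failure_times[i] = (hit + 1) * dt

print("fraction failed within horizon:", np.mean(~np.isnan(failure_times)))
print("empirical mean time to failure:", np.nanmean(failure_times))
```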

    Degradation Modeling and RUL Prediction Using Wiener Process Subject to Multiple Change Points and Unit Heterogeneity

    Degradation modeling is critical for health condition monitoring and remaining useful life (RUL) prediction. Prognostic accuracy depends heavily on the capability of modeling the evolution of degradation signals. In many practical applications, however, the degradation signals show multiple phases, for which conventional degradation models are often inadequate. To better characterize degradation signals with multiple-phase characteristics, we propose a multiple change-point Wiener process as the degradation model. To take into account between-unit heterogeneity, a fully Bayesian approach is developed in which all model parameters are assumed random. At the offline stage, an empirical two-stage process is proposed for model estimation, and a cross-validation approach is adopted for model selection. At the online stage, an exact recursive model-updating algorithm is developed for online individual model estimation, and an effective Monte Carlo simulation approach is proposed for RUL prediction. The effectiveness of the proposed method is demonstrated through thorough simulation studies and a real case study.
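
    A minimal sketch of Monte Carlo RUL prediction for a piecewise (multi-phase) Wiener degradation path is given below. It is a generic simulation under assumed phase parameters and does not reproduce the paper's Bayesian estimation or exact recursive updating; the function name, phase values, and threshold are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_rul(x_now, t_now, threshold, phases, dt=0.1, n_paths=5000, max_steps=2000):
    """Monte Carlo RUL for a piecewise (multi-phase) Wiener degradation path.

    phases: list of (change_time, drift, sigma) triples; the last change_time
    should be np.inf. All numbers used below are illustrative assumptions.
    """
    ruls = np.full(n_paths, np.nan)
    for p in range(n_paths):
        x, t = x_now, t_now
        for _ in range(max_steps):
            # drift/volatility of the phase active at the current time t
            drift, sigma = next((m, s) for (ct, m, s) in phases if t < ct)
            x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
            if x >= threshold:
                ruls[p] = t - t_now
                break
    return ruls[~np.isnan(ruls)]

# Example: two phases with a change point at t = 30 (all values hypothetical).
phases = [(30.0, 0.05, 0.02), (np.inf, 0.20, 0.05)]
rul_samples = simulate_rul(x_now=2.0, t_now=25.0, threshold=8.0, phases=phases)
print("median RUL:", np.median(rul_samples))
```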

    A Data-Driven Predictive Model of Reliability Estimation Using State-Space Stochastic Degradation Model

    The concept of the Industrial Internet of Things (IIoT) provides the foundation to apply data-driven methodologies. Data-driven predictive models of reliability estimation can become a major tool in extending the life of assets, lowering capital cost, and reducing operating and maintenance costs. Classical models of reliability assessment rely mainly on lifetime data. Failure data may not be easily obtainable for highly reliable assets, and the collected historical lifetime data may not accurately describe the behavior of an asset in a unique application or environment. Therefore, reliability estimation based solely on classical models is no longer an optimal approach. Fortunately, most industrial assets have performance characteristics whose degradation or decay over operating time can be related to their reliability estimates. The application of degradation methods has recently been increasing due to their ability to keep track of the dynamic condition of the system over time. The main purpose of this study is to develop a data-driven predictive model of reliability assessment based on real-time data, using a state-space stochastic degradation model, to predict the critical time for initiating maintenance actions in order to enhance the value and prolong the life of assets. The new degradation model developed in this thesis introduces a new mapping function for the General Path Model based on a series of Gamma process degradation models in a state-space environment, with Poisson-distributed weights for each of the Gamma processes. The application of the developed algorithm is illustrated for distributed electrical systems as a generic use case. A data-driven algorithm is developed to estimate the parameters of the new degradation model. Once the parameter estimates are available, the distribution of the failure time, the time-dependent distribution of the degradation, and the reliability based on the current estimate of the degradation can be obtained.
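
    The basic building block referred to above, a Gamma process degrading toward a failure threshold, can be sketched as follows. This is not the thesis's state-space, Poisson-weighted construction; the shape and scale parameters, threshold, and evaluation time are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative (assumed) Gamma process parameters: increments over a window
# of length dt are Gamma(shape=alpha*dt, scale=beta).
alpha, beta = 1.5, 0.4
threshold = 12.0
dt, n_steps, n_paths = 0.25, 800, 2000

t_grid = dt * np.arange(1, n_steps + 1)
increments = rng.gamma(shape=alpha * dt, scale=beta, size=(n_paths, n_steps))
paths = increments.cumsum(axis=1)

# First-passage time of each simulated path over the failure threshold.
crossed = paths >= threshold
first_idx = crossed.argmax(axis=1)
has_crossed = crossed.any(axis=1)
failure_times = np.where(has_crossed, t_grid[first_idx], np.nan)

# Empirical reliability at a given time: P(no crossing by t).
t_eval = 15.0
reliability = np.mean(np.nan_to_num(failure_times, nan=np.inf) > t_eval)
print(f"estimated reliability at t={t_eval}: {reliability:.3f}")
```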

    Model-Assisted Estimators for Time-to-Event Data

    In this dissertation, I develop model-assisted estimators for estimating the proportion of a population that experienced some event by time t. I provide the theoretical justification for the new estimators using time-to-event models as the underlying framework. Using simulation, I compared these estimators to traditional methods, and then applied them to a study of nurses’ health, where I estimated the proportion of the population that had died after a certain period of time. The new estimators performed as well as, if not better than, existing methods. Finally, as this work assumes that all units are censored at the same point in time, I propose an extension that allows units' censoring times to vary.
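
    For context, a standard (non-model-assisted) estimate of the proportion experiencing the event by time t under right censoring can be obtained from the Kaplan-Meier survival curve, as in the sketch below. The dataset and function name are hypothetical, and this is not the dissertation's estimator.

```python
import numpy as np

def event_proportion_by_t(times, events, t):
    """Kaplan-Meier estimate of P(event by time t) under right censoring.

    A standard estimator shown for illustration only; it is not the
    model-assisted estimator developed in the dissertation.
    times: observed times (event or censoring); events: 1 = event, 0 = censored.
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)

    survival = 1.0
    for u in np.unique(times[(events == 1) & (times <= t)]):
        d = np.sum((times == u) & (events == 1))   # events at time u
        n = np.sum(times >= u)                     # number at risk just before u
        survival *= 1.0 - d / n
    return 1.0 - survival

# Small illustrative dataset (hypothetical values).
obs_times = [2.1, 3.5, 3.5, 4.0, 5.2, 6.1, 7.3]
obs_events = [1, 1, 0, 1, 0, 1, 0]
print(event_proportion_by_t(obs_times, obs_events, t=5.0))
```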

    Statistical and image analysis methods and applications

    On the integration of deterministic opinions into mortality smoothing and forecasting

    Threshold Regression Models Adapted to Case-Control Studies, and the Risk of Lung Cancer Due to Occupational Exposure to Asbestos in France

    Asbestos has been known for many years to be a powerful carcinogen. Our purpose is to quantify the relationship between occupational exposure to asbestos and an increase in the risk of lung cancer. Furthermore, we wish to tackle the delicate question of evaluating, in subjects suffering from lung cancer, how much the amount of exposure to asbestos explains the occurrence of the cancer. For this purpose, we rely on a recent French case-control study. We build a large collection of threshold regression models, data-adaptively select a better model from it by multi-fold likelihood-based cross-validation, and then fit the resulting model by maximum likelihood. A necessary preliminary step to eliminate the bias due to the case-control sampling design is made possible because the probability distribution of being a case can be computed beforehand from an independent study. The implications of the fitted model in terms of a notion of the maximum number of years of life guaranteed free of lung cancer are discussed.
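
    A threshold regression (first-hitting-time) risk calculation of the kind described above can be sketched as follows: a latent health process with exposure-dependent drift is absorbed at zero, and the inverse Gaussian first-hitting-time CDF gives the probability of cancer by a given age. The link function, coefficients, initial level, and volatility are all assumed for illustration and are not the fitted values from the French study.

```python
import numpy as np
from scipy.stats import norm

def fht_cdf(t, y0, mu, sigma):
    """P(first hitting time <= t) for a Wiener health process starting at y0 > 0
    with drift mu < 0 and volatility sigma, absorbed at 0 (inverse Gaussian CDF)."""
    s = sigma * np.sqrt(t)
    return norm.cdf((-mu * t - y0) / s) + np.exp(-2 * mu * y0 / sigma**2) * norm.cdf((mu * t - y0) / s)

# Illustrative (assumed) threshold regression: exposure shifts the drift downward.
def cancer_risk_by_age(age, exposure, beta0=-0.02, beta1=-0.015, y0=10.0, sigma=1.0):
    mu = beta0 + beta1 * exposure   # assumed linear link; coefficients are hypothetical
    return fht_cdf(age, y0, mu, sigma)

print(cancer_risk_by_age(age=70.0, exposure=5.0))   # exposed (hypothetical dose)
print(cancer_risk_by_age(age=70.0, exposure=0.0))   # unexposed
```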

    Uncovering Randomness and Success in Society

    An understanding of how individuals shape and impact the evolution of society is severely limited by the unavailability of large-scale, reliable datasets that can simultaneously capture information on individual movements and social interactions. We believe that the popular Indian film industry, 'Bollywood', can provide a social network apt for such a study. Bollywood provides massive amounts of real, unbiased data spanning more than 100 years, and hence this network is used as a model in the present paper. Nodes that maintain a moderate degree or cooperate widely with the other nodes of the network tend to be more fit (measured as the success of the node in the industry) than the other nodes. The analysis carried out in the current work, using a conjoined framework of complex network theory and random matrix theory, aims to quantify the elements that determine the fitness of an individual node and the factors that contribute to the robustness of the network. The authors believe that the method of study used in the current paper can be extended to study various other industries and organizations. Comment: 39 pages, 12 figures, 14 tables.
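
    A toy version of this conjoined network/random-matrix analysis might look like the sketch below: node degree as one simple ingredient of fitness, and nearest-neighbour spacings of the adjacency spectrum on the random-matrix side. The graph model, its size, and the crude unfolding are assumptions for illustration only, not the paper's Bollywood data or procedure.

```python
import networkx as nx
import numpy as np

# Toy collaboration network (hypothetical); in the paper the nodes are
# Bollywood actors linked by co-appearances.
G = nx.barabasi_albert_graph(n=200, m=3, seed=42)

# Complex-network side: degree as one simple ingredient of node "fitness".
degree = dict(G.degree())
top_nodes = sorted(degree, key=degree.get, reverse=True)[:5]
print("highest-degree nodes:", top_nodes)

# Random-matrix side: nearest-neighbour spacings of the adjacency spectrum.
eigvals = np.sort(np.linalg.eigvalsh(nx.to_numpy_array(G)))
spacings = np.diff(eigvals)
spacings /= spacings.mean()          # crude unfolding: rescale to unit mean spacing
print("spacing variance (normalised):", spacings.var())
```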

    Parallel simulation based adaptive prediction for equipment remaining useful life

    The latest demands on remaining useful life (RUL) prediction are online, real-time, and adaptive prediction. This paper addresses these demands and proposes a novel framework of parallel-simulation-based adaptive prediction of equipment RUL. Within the framework, a Wiener state-space model (WSSM) that accounts for the whole of the historical data and for monitoring noise is developed. Driven by the online observation data, the degradation state is estimated by Kalman-filter-based data assimilation, and the WSSM parameters are updated by the expectation-maximization algorithm. An analytical RUL distribution that accounts for the distribution of the degradation state is obtained from the concept of the first hitting time. A case study of a GaAs laser device is provided to substantiate the superiority of the proposed method over the competing traditional Wiener process method. The results show that the parallel simulation method provides better RUL prognostic accuracy.
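
    A minimal sketch of the WSSM idea, under assumed scalar parameters rather than the paper's GaAs laser estimates, is given below: a scalar Kalman filter assimilates noisy degradation observations, and the remaining distance to the threshold is plugged into an inverse Gaussian first-hitting-time density to approximate the RUL distribution (state-estimation uncertainty is ignored here for brevity).

```python
import numpy as np

# Hypothetical WSSM parameters (illustrative, not the paper's GaAs estimates):
#   x_k = x_{k-1} + eta * dt + w_k,  w_k ~ N(0, q * dt)   (latent Wiener degradation)
#   y_k = x_k + v_k,                 v_k ~ N(0, r)        (noisy monitoring data)
eta, q, r, dt = 0.12, 0.01, 0.05, 1.0
threshold = 10.0

def kalman_update(x_est, p_est, y_obs):
    """One predict/update step of the scalar Kalman filter for the WSSM."""
    x_pred = x_est + eta * dt            # predict the degradation state
    p_pred = p_est + q * dt
    k_gain = p_pred / (p_pred + r)       # assimilate the new observation
    x_new = x_pred + k_gain * (y_obs - x_pred)
    p_new = (1.0 - k_gain) * p_pred
    return x_new, p_new

def rul_pdf(l, x_now):
    """Inverse Gaussian first-hitting-time density of the remaining distance
    to the threshold (state-estimation uncertainty ignored for simplicity)."""
    d = threshold - x_now
    return d / np.sqrt(2 * np.pi * q * l**3) * np.exp(-(d - eta * l) ** 2 / (2 * q * l))

# Filter a short stream of (hypothetical) observations, then evaluate the RUL pdf.
x_est, p_est = 0.0, 1.0
for y in [0.10, 0.28, 0.35, 0.55, 0.71]:
    x_est, p_est = kalman_update(x_est, p_est, y)

grid = np.linspace(1.0, 200.0, 400)
print("most likely RUL:", grid[np.argmax(rul_pdf(grid, x_est))])
```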