
    Reliability estimation for the randomly censored Pareto distribution

    Random censoring is widely applied in life-testing experiments to estimate the reliability of engineering products or systems. Different parametric statistical models, such as the exponential, Rayleigh, Weibull and Maxwell distributions, have been used under the random censoring scheme. In this paper, random censoring under the Pareto distribution is considered. The maximum likelihood estimators (MLEs) of the model parameters and of the survival function were derived, along with the Fisher information matrix and asymptotic confidence intervals. A simulation study was performed to observe the behavior of the MLEs. The simulation results showed that the bias and MSE were reasonably small in all cases.
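
    To make the setup above concrete, here is a minimal sketch (not the paper's code) of maximum likelihood estimation for a randomly censored Pareto sample, simplified by treating the scale parameter as known. The parametrisation f(x) = a * b**a / x**(a + 1) for x >= b, the simulated censoring distribution and all numerical choices are assumptions made only for illustration.

        # Minimal sketch (not the paper's code): ML estimation of the Pareto shape
        # under random censoring, with the scale parameter b treated as known.
        # Assumed parametrisation: f(x) = a * b**a / x**(a + 1) for x >= b.
        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(1)
        a_true, b, n = 2.0, 1.0, 500
        x = b * rng.uniform(size=n) ** (-1 / a_true)   # Pareto(a_true, b) lifetimes
        c = b * rng.uniform(size=n) ** (-1 / 1.5)      # independent Pareto censoring times
        t = np.minimum(x, c)                           # observed times
        d = (x <= c).astype(float)                     # 1 = failure observed, 0 = censored

        def neg_loglik(a):
            # censored log-likelihood: d*log f(t) + (1 - d)*log S(t), with S(t) = (b/t)**a
            logf = np.log(a) + a * np.log(b) - (a + 1) * np.log(t)
            logS = a * (np.log(b) - np.log(t))
            return -np.sum(d * logf + (1 - d) * logS)

        res = minimize_scalar(neg_loglik, bounds=(1e-3, 50.0), method="bounded")
        a_hat = res.x
        a_closed = d.sum() / np.log(t / b).sum()       # closed form when b is known
        print(f"numerical MLE {a_hat:.3f}, closed form {a_closed:.3f}, true {a_true}")
        print(f"estimated S(2.0) = {(b / 2.0) ** a_hat:.3f}")

    When the scale is known, the closed-form expression in the last lines is a convenient check on the numerical optimum; estimating both parameters jointly, as the paper does, would replace minimize_scalar with a bivariate optimizer such as scipy.optimize.minimize.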

    Geoadditive hazard regression for interval censored survival times

    The Cox proportional hazards model is the most commonly used method when analyzing the impact of covariates on continuous survival times. In its classical form, the Cox model was introduced in the setting of right-censored observations. However, in practice other sampling schemes are frequently encountered, and therefore extensions allowing for interval and left censoring or left truncation are clearly desired. Furthermore, many applications require a more flexible modeling of covariate information than the usual linear predictor. For example, effects of continuous covariates are likely to be of nonlinear form, or spatial information has to be included appropriately. Further extensions should allow for time-varying effects of covariates or covariates that are themselves time-varying. Such models relax the assumption of proportional hazards. We propose a regression model for the hazard rate that combines and extends the above-mentioned features on the basis of a unifying Bayesian model formulation. Nonlinear and time-varying effects as well as the baseline hazard rate are modeled by penalized splines. Spatial effects can be included based on either Markov random fields or stationary Gaussian random fields. The model allows for arbitrary combinations of left, right and interval censoring as well as left truncation. Estimation is based on a reparameterisation of the model as a variance components mixed model. The variance parameters, corresponding to inverse smoothing parameters, can then be estimated based on an approximate marginal likelihood approach. As an application we present an analysis of childhood mortality in Nigeria, where the interval censoring framework also makes it possible to deal with the problem of heaped survival times caused by memory effects. In a simulation study, we investigate the effect of ignoring the interval-censored nature of the observations.
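
    For reference, the ingredients described above can be written schematically as follows; the notation is generic, assumed here for illustration rather than taken from the paper.

        % Generic notation (not the paper's): structured additive hazard with
        % penalized-spline terms g_0 and f_j, a spatial effect f_spat (Markov or
        % Gaussian random field), and fixed effects gamma.
        \[
          \lambda_i(t) = \exp\Bigl( g_0(t) + \sum_j f_j(x_{ij})
              + f_{\mathrm{spat}}(s_i) + u_i^\top \gamma \Bigr),
          \qquad
          S_i(t) = \exp\Bigl( -\int_0^t \lambda_i(u)\,du \Bigr).
        \]
        % Likelihood contributions under mixed censoring, each divided by S_i(a_i)
        % when the observation is left-truncated at a_i:
        \[
          L_i =
          \begin{cases}
            \lambda_i(t_i)\, S_i(t_i), & \text{event observed at } t_i,\\
            S_i(t_i), & \text{right-censored at } t_i,\\
            1 - S_i(t_i), & \text{left-censored at } t_i,\\
            S_i(t_i^{\mathrm{lo}}) - S_i(t_i^{\mathrm{up}}), & \text{interval-censored in } (t_i^{\mathrm{lo}}, t_i^{\mathrm{up}}].
          \end{cases}
        \]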

    Informed censoring: the parametric combination of data and expert information

    The statistical censoring setup is extended to the situation in which random measures can be assigned to the realization of data points, leading to a new way of incorporating expert information into the usual parametric estimation procedures. The asymptotic theory is provided for the resulting estimators, and some special cases of practical relevance are studied in more detail. Although the proposed framework mathematically generalizes censoring and coarsening at random, and borrows techniques from M-estimation theory, it provides a novel and transparent methodology which enjoys significant practical applicability in situations where expert information is present. The potential of the approach is illustrated by a concrete actuarial application of tail parameter estimation for a heavy-tailed MTPL dataset with limited available expert information.
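
    One plausible reading of the setup, sketched here in my own notation (an assumption, not the paper's formulation): each observation carries a measure on the sample space (a point mass for an exactly observed value, a restricted measure for a censored one, or an expert-assigned distribution), and the parametric fit maximizes the corresponding generalized likelihood.

        % One plausible reading (my notation, not the paper's): observation i
        % carries a measure mu_i on the sample space, and the fit maximizes the
        % generalized log-likelihood
        \[
          \hat\theta = \arg\max_{\theta} \sum_{i=1}^{n}
              \log \int f_\theta(x)\, \mathrm{d}\mu_i(x),
        \]
        % which reduces to ordinary maximum likelihood when mu_i is the point mass
        % at an exactly observed x_i, to the usual censored likelihood when mu_i is
        % Lebesgue measure restricted to a censoring interval, and to an
        % expert-weighted fit when mu_i encodes elicited information.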

    A shared-parameter continuous-time hidden Markov and survival model for longitudinal data with informative dropout

    A shared-parameter approach for jointly modeling longitudinal and survival data is proposed. With respect to available approaches, it allows for time-varying random effects that affect both the longitudinal and the survival processes. The distribution of these random effects is modeled according to a continuous-time hidden Markov chain, so that transitions may occur at any time point. For maximum likelihood estimation, we propose an algorithm based on a discretization of the time until censoring into an arbitrary number of time windows. The observed information matrix is used to obtain standard errors. We illustrate the approach by simulation, also assessing the effect of the number of time windows on the precision of the estimates, and by an application to data on patients suffering from mildly dilated cardiomyopathy.
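
    A schematic version of the resulting likelihood, in generic notation assumed here rather than taken from the paper (with, for simplicity, one longitudinal measurement per time window):

        % Schematic likelihood for one subject: U is the continuous-time hidden
        % Markov chain shared by the longitudinal outcome y and the hazard h;
        % time is discretized into windows t_0 < t_1 < ... < t_K, with d_k the
        % event indicator and y_k the longitudinal measurement in window k.
        \[
          L_i(\theta) = \sum_{u_1,\dots,u_K}
              \Pr(U_{1}=u_1,\dots,U_{K}=u_K)
              \prod_{k=1}^{K} f\bigl(y_{k}\mid u_k;\theta\bigr)\,
              h\bigl(t_k\mid u_k;\theta\bigr)^{d_{k}}
              \exp\Bigl( -\int_{t_{k-1}}^{t_k} h(s\mid u_k;\theta)\,ds \Bigr).
        \]
        % The path probability follows from the chain's generator evaluated over
        % each window, and the outer sum is computed by a forward recursion rather
        % than by enumerating all state paths.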

    Nonparametric estimation under censoring and passive registration

    The classical random censorship model assumes that we follow an individual continuously up to the time of failure or censoring, so we observe this time as well as the indicator of its type. Under passive registration we only get information on the state of the individual at random observation or registration times. In this paper we assume that these registration times are the times of events in an independent Poisson process stopped at failure or censoring; the time of failure is also observed if not censored. This problem turns up in historical demography, where the survival time of interest is the life length, censoring is by emigration, and the observation times are times of births of children and other life events. Church registers contain dates of births, marriages and deaths, but not emigrations. The model is shown to be related to the problem of estimating a density known to be monotone. This leads to an explicit description of the non-parametric maximum likelihood estimator of the survival function based on i.i.d. observations from this model, and to an analysis of its large sample properties.
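
    The observation scheme described above is easy to simulate. The sketch below is my own illustration, with arbitrary exponential lifetime and censoring distributions and a rate-0.5 Poisson registration process; it produces data of exactly this form: registration times for everyone, the failure time when it is not censored, and otherwise only the last time the individual was known to be alive.

        # Simulation sketch of the passive-registration scheme described above
        # (my own illustration; distributions and rates are arbitrary choices).
        import numpy as np

        rng = np.random.default_rng(7)

        def one_individual(rate=0.5):
            x = rng.exponential(20.0)          # lifetime (e.g. in years)
            c = rng.exponential(30.0)          # censoring time (emigration)
            end = min(x, c)
            # registration times: Poisson process on [0, end) (births, marriages, ...)
            k = rng.poisson(rate * end)
            reg = np.sort(rng.uniform(0.0, end, size=k))
            if x <= c:
                # failure not censored: observed exactly, plus the registration history
                return {"registrations": reg, "death": x}
            # censored: we only know the individual was alive at the last
            # registration time (or at time 0 if nothing was ever recorded)
            last_seen = reg[-1] if k > 0 else 0.0
            return {"registrations": reg, "last_seen_alive": last_seen}

        sample = [one_individual() for _ in range(5)]
        for rec in sample:
            print(rec)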

    A class of nonparametric bivariate survival function estimators for randomly censored and truncated data

    This paper proposes a class of nonparametric estimators for bivariate survival function estimation under both random truncation and random censoring. In practice, the pair of random variables under consideration may have a certain parametric relationship. The proposed class of nonparametric estimators uses such parametric information via a data transformation approach and thus provides more accurate estimates than existing methods that do not use such information. The large sample properties of the new class of estimators and general guidance on how to find a good data transformation are given. The proposed method is also justified via a simulation study and an application to an economic data set.

    Tests for random signs censoring in competing risks

    In the setting of competing risks, the marginal survival functions of the latent failure times are nonidentifiable without making further assumptions about the joint distribution, the majority of which are untestable. One exception is the random signs censoring assumption, which assumes that the main event time is independent of the indicator that the main event preceded the competing event. Few methods exist to formally test this assumption, and none consider a stratified test, which detects whether random signs censoring is met within subgroups of a categorical covariate. We develop a nonparametric stratified test for random signs censoring that is easy to implement. In addition, it is often of interest to model the effects of several covariates in relation to the cause of interest. Thus, as an extension of the stratified test, we also propose a test for conditional random signs censoring, which allows for the random signs censoring assumption to be met after adjusting for categorical and/or continuous covariates. Through Monte Carlo simulations, we show our proposed test statistics have empirical levels close to the nominal level and maintain adequate power even with relatively small sample sizes and random right censoring. Compared to the standard test, both of our proposed tests have nearly equivalent power under random signs censoring and are superior in situations of stratified or conditional random signs censoring, where the standard test fails to detect random signs censoring within subgroups or after adjusting for covariates, respectively. Their ease of implementation and utility are illustrated through an application to liver transplant data from the United Network for Organ Sharing. Public Health Significance: Clinicians must make decisions affecting patients' lives using the information available to them. Relying on research results based on models that use unverifiable assumptions can lead to inaccurate conclusions. The methods proposed here offer a solution to allow for more accurate modeling of marginal survival functions with competing risk data. Through use of these new methods, patient outcomes can be improved over time.
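
    The random signs censoring assumption itself is easy to illustrate by simulation. The sketch below is not the authors' test and uses arbitrary distributional choices; it generates competing risks data in which the main event time is independent of the indicator that it precedes the competing event, which is the independence the proposed tests assess, within strata or after adjusting for covariates.

        # Illustration of the random signs censoring assumption (not the authors'
        # test): the main event time x is drawn independently of the indicator xi
        # that the main event precedes the competing one; the competing event time
        # is then placed consistently with xi. All distributions are arbitrary.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 1000
        x = rng.weibull(1.5, size=n) * 10.0              # latent main event times
        xi = rng.uniform(size=n) < 0.6                   # P(main event first) = 0.6, independent of x
        # competing event time: after x when xi is True, strictly before x otherwise
        y = np.where(xi, x * (1 + rng.exponential(0.5, size=n)),
                         x * rng.uniform(size=n))
        z = np.minimum(x, y)                             # observed time
        cause = np.where(xi, 1, 2)                       # 1 = main event, 2 = competing event
        print("empirical P(main event first):", round(xi.mean(), 3))
        print("mean observed time by cause:",
              round(z[cause == 1].mean(), 2), round(z[cause == 2].mean(), 2))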

    A mixed model approach for structured hazard regression

    The classical Cox proportional hazards model is a benchmark approach to analyze continuous survival times in the presence of covariate information. In a number of applications, there is a need to relax one or more of its inherent assumptions, such as linearity of the predictor or the proportional hazards property. Also, one is often interested in jointly estimating the baseline hazard together with covariate effects or one may wish to add a spatial component for spatially correlated survival data. We propose an extended Cox model, where the (log-)baseline hazard is weakly parameterized using penalized splines and the usual linear predictor is replaced by a structured additive predictor incorporating nonlinear effects of continuous covariates and further time scales, spatial effects, frailty components, and more complex interactions. Inclusion of time-varying coefficients leads to models that relax the proportional hazards assumption. Nonlinear and time-varying effects are modelled through penalized splines, and spatial components are treated as correlated random effects following either a Markov random field or a stationary Gaussian random field. All model components, including smoothing parameters, are specified within a unified framework and are estimated simultaneously based on mixed model methodology. The estimation procedure for such general mixed hazard regression models is derived using penalized likelihood for regression coefficients and (approximate) marginal likelihood for smoothing parameters. Performance of the proposed method is studied through simulation and an application to leukemia survival data in Northwest England.
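
    The penalized-likelihood / mixed-model correspondence underlying the estimation strategy can be summarized as follows, in standard P-spline notation assumed here rather than quoted from the paper:

        % Standard P-spline / mixed-model correspondence (generic notation, not
        % quoted from the paper): each smooth term f_j has basis coefficients
        % gamma_j, penalty matrix K_j and smoothing parameter lambda_j.
        \[
          \ell_{\mathrm{pen}}(\gamma) = \ell(\gamma)
              - \tfrac{1}{2}\sum_{j} \lambda_j\, \gamma_j^\top K_j \gamma_j,
          \qquad
          \gamma_j \sim N\bigl(0,\; \tau_j^{2} K_j^{-}\bigr),
          \quad \lambda_j = 1/\tau_j^{2},
        \]
        % so penalized-likelihood estimation of the coefficients and (approximate)
        % marginal-likelihood estimation of the variance parameters tau_j^2 can be
        % carried out with mixed-model machinery; K_j^- denotes a generalized
        % inverse of the penalty matrix.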