
    Are Nested Case-Control Studies Biased?

    It has been recently asserted that the nested case-control study design, in which case-control sets are sampled from cohort risk sets, can introduce bias (“study design bias”) when there are lagged exposures. The bases for this claim include a theoretical argument and an “empirical evaluation” argument. Both of these arguments are examined and found to be incorrect. Appropriate methods to explore the performance of nested case-control study designs and analysis methods, and to compute power and sample size from an existing cohort, are described. This empirical evaluation approach relies on simulating case-control outcomes from risk sets in the cohort from which the case-control study is to be performed. Because it is based on the underlying cohort structure, the empirical evaluation can provide an assessment that is tailored to the specific characteristics of the study under consideration. The methods are illustrated using samples from the Colorado Plateau uranium miners cohort.
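The core of such an empirical evaluation can be illustrated with a minimal simulation sketch (not the authors' code; all parameters are hypothetical). Within each risk set, one member is drawn as the case with probability proportional to exp(beta * exposure), a few controls are sampled from the remaining members, and the log rate ratio is recovered by maximizing the conditional logistic likelihood:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
beta_true = 0.5                      # hypothetical log rate ratio per unit exposure
n_risk_sets, set_size, n_controls = 300, 50, 4

records = []  # (case exposure, control exposures) for each sampled set
for _ in range(n_risk_sets):
    x = rng.exponential(1.0, size=set_size)      # exposures in the risk set
    p = np.exp(beta_true * x); p /= p.sum()      # case chosen with prob ∝ exp(beta*x)
    case = rng.choice(set_size, p=p)
    others = np.delete(np.arange(set_size), case)
    ctrl = rng.choice(others, size=n_controls, replace=False)
    records.append((x[case], x[ctrl]))

def neg_cond_loglik(b):
    # conditional logistic likelihood: case versus its sampled controls
    ll = 0.0
    for xc, xk in records:
        xs = np.concatenate(([xc], xk))
        ll += b * xc - np.log(np.exp(b * xs).sum())
    return -ll

bhat = minimize_scalar(neg_cond_loglik, bounds=(-2, 2), method="bounded").x
print(f"true beta = {beta_true}, estimate = {bhat:.3f}")
```

The recovered estimate clusters around the true value, which is the behavior the abstract argues a correctly sampled nested case-control design should show.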

    Background stratified Poisson regression analysis of cohort data

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as ‘nuisance’ parameters and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this ‘conditional’ regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.
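The equivalence claimed above can be checked numerically with a small sketch (simulated data; this is an illustration of the general idea, not the authors' software). For a log-linear model, profiling out the stratum intercepts reduces the Poisson likelihood to a multinomial-type conditional likelihood within each stratum, and the dose coefficient from the two fits coincides:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n_strata, n_per = 20, 30
s = np.repeat(np.arange(n_strata), n_per)         # background stratum index
x = rng.gamma(2.0, 1.0, size=s.size)              # dose covariate (hypothetical)
pt = rng.uniform(0.5, 2.0, size=s.size)           # person-time
alpha = rng.normal(-2.0, 0.5, size=n_strata)      # stratum baseline log rates
beta_true = 0.3
y = rng.poisson(pt * np.exp(alpha[s] + beta_true * x))

def neg_uncond(theta):
    # full Poisson log-likelihood with one intercept per stratum
    a, b = theta[:n_strata], theta[n_strata]
    m = pt * np.exp(a[s] + b * x)
    return -(y * np.log(m) - m).sum()

def neg_cond(b):
    # conditional likelihood: stratum intercepts profiled out, so only
    # the dose coefficient remains (multinomial within each stratum)
    ll = 0.0
    for k in range(n_strata):
        i = s == k
        eta = b[0] * x[i] + np.log(pt[i])
        ll += (y[i] * eta).sum() - y[i].sum() * np.log(np.exp(eta).sum())
    return -ll

fit_u = minimize(neg_uncond, np.zeros(n_strata + 1), method="BFGS")
fit_c = minimize(neg_cond, np.zeros(1), method="BFGS")
print(fit_u.x[-1], fit_c.x[0])   # dose estimates agree
```

The conditional fit optimizes over a single parameter regardless of the number of strata, which is why the approach scales to models with very many background strata.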

    Lagging Exposure Information in Cumulative Exposure-Response Analyses

    Lagging exposure information is often undertaken to allow for a latency period in cumulative exposure-disease analyses. The authors first consider bias and confidence interval coverage when using the standard approaches of fitting models under several lag assumptions and selecting the lag that maximizes either the effect estimate or model goodness of fit. Next, they consider bias that occurs when the assumption that the latency period is a fixed constant does not hold. Expressions are derived for bias due to misspecification of lag assumptions, and simulations are conducted. Finally, the authors describe a method for joint estimation of parameters describing an exposure-response association and the latency distribution. Analyses of associations between cumulative asbestos exposure and lung cancer mortality among textile workers illustrate this approach. Selecting the lag that maximizes the effect estimate may lead to bias away from the null; selecting the lag that maximizes model goodness of fit may lead to confidence intervals that are too narrow. These problems tend to increase as the within-person exposure variation diminishes. Lagging exposure assignment by a constant will lead to bias toward the null if the distribution of latency periods is not a fixed constant. Direct estimation of latency periods can minimize bias and improve confidence interval coverage.
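The first bias described above, from selecting the lag that maximizes the effect estimate, can be demonstrated with a toy simulation (all values hypothetical; this is a sketch of the phenomenon, not the authors' analysis). In each replicate the selected estimate is the maximum over candidate lags, so on average it exceeds the estimate obtained at the true lag:

```python
import numpy as np

rng = np.random.default_rng(2)
n, n_years, true_lag, beta = 200, 30, 10, 0.5
lags = range(0, 21, 5)               # candidate lags in years
reps, sel, fixed = 500, [], []

for _ in range(reps):
    expo = rng.exponential(1.0, size=(n, n_years))   # annual exposures
    # cumulative exposure lagged L years: drop the most recent L years
    cum = {L: expo[:, :n_years - L].sum(axis=1) for L in lags}
    y = beta * cum[true_lag] + rng.normal(0, 5, size=n)
    est = {L: np.polyfit(cum[L], y, 1)[0] for L in lags}
    sel.append(max(est.values()))    # lag chosen to maximize the estimate
    fixed.append(est[true_lag])      # estimate at the correct lag

print(np.mean(sel), np.mean(fixed))  # selection inflates the mean estimate
```

Because the selected value is a maximum over several noisy, correlated estimates, the selection rule cannot produce a smaller average than the true-lag estimate, which is the bias away from the null that the abstract warns about.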

    Crude incidence in two-phase designs in the presence of competing risks.

    Background: In many studies some information might not be available for the whole cohort; some covariates, or even the outcome, might be ascertained only in selected subsamples. These studies fall into a broad category termed two-phase studies. Common examples include the nested case-control and the case-cohort designs. For two-phase studies, appropriate weighted survival estimates have been derived; however, no estimator of cumulative incidence accounting for competing events has been proposed. This is relevant in the presence of multiple types of events, where estimation of event-type-specific quantities is needed for evaluating outcome.
    Methods: We develop a nonparametric estimator of the cumulative incidence function of events that accounts for possible competing events. It handles a general sampling design through weights derived from the sampling probabilities. The variance is derived from the influence function of the subdistribution hazard.
    Results: The proposed method shows good performance in simulations. It is applied to estimate the crude incidence of relapse in childhood acute lymphoblastic leukemia in groups defined by a genotype not available for everyone in a cohort of nearly 2000 patients, where death due to toxicity acted as a competing event. In a second example, the aim was to estimate engagement in care in a cohort of HIV patients in a resource-limited setting, where for some patients the outcome itself was missing because of loss to follow-up. A sampling-based approach was used to ascertain the outcome in a subsample of lost patients and to obtain a valid estimate of connection to care.
    Conclusions: A valid estimator of the cumulative incidence of events accounting for competing risks under a general sampling design from an infinite target population is derived.
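A minimal sketch of a weighted cumulative incidence estimator with competing risks follows (an Aalen-Johansen-style construction with sampling weights; function name, data, and weights are hypothetical, and the variance machinery from the paper is omitted):

```python
import numpy as np

def weighted_cuminc(time, cause, weight, cause_of_interest=1):
    """Weighted cumulative incidence under competing risks.

    cause: 0 = censored, 1, 2, ... = event types; weight: inverse
    sampling probabilities from a two-phase design (all 1 = full cohort).
    Returns event times and the estimated CIF for cause_of_interest.
    """
    order = np.argsort(time)
    time, cause, weight = time[order], cause[order], weight[order]
    at_risk = weight.sum()
    surv, cif, times, vals = 1.0, 0.0, [], []
    for t, c, w in zip(time, cause, weight):
        if c > 0:
            haz = w / at_risk                 # weighted hazard increment
            if c == cause_of_interest:
                cif += surv * haz             # Aalen-Johansen update
            surv *= 1.0 - haz                 # weighted overall survival
        at_risk -= w                          # subject leaves the risk set
        times.append(t); vals.append(cif)
    return np.array(times), np.array(vals)

rng = np.random.default_rng(3)
t = rng.exponential(1.0, 500)
c = rng.choice([0, 1, 2], size=500, p=[0.2, 0.5, 0.3])
w = np.ones(500)                              # full-cohort weights
times, cif1 = weighted_cuminc(t, c, w)
print(cif1[-1])
```

With all weights equal to 1 this reduces to the standard cumulative incidence estimator; under a two-phase design the weights would instead be the inverse of each subject's sampling probability.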

    Molecularly Engineered Self-Assembling Membranes for Cell-Mediated Degradation

    The use of peptide engineering to develop self-assembling membranes that are responsive to cellular enzyme activities is reported. The membranes are obtained by combining hyaluronan (HA) and a rationally designed peptide amphiphile (PA) containing a proteolytic domain (GPQGIWGQ octapeptide) sensitive to matrix metalloproteinase-1 (MMP-1). Insertion of the octapeptide into a typical PA structure disturbs neither its self-assembly into fibrillar nanostructures nor its ability to form membranes with HA. In vitro enzymatic degradation with hyaluronidase and MMP-1 shows that membranes containing the MMP-1 substrate exhibit enhanced enzymatic degradation compared with control membranes (lacking the MMP-1-cleavable peptide or containing an MMP-1-insensitive sequence), being completely degraded after 7 days. Cell viability and proliferation are minimally affected by the enzymatically cleavable functionality of the membrane, but the presence of the MMP-1-cleavable sequence does stimulate the secretion of MMP-1 by fibroblasts and interfere with matrix deposition, particularly the deposition of collagen. By showing cell responsiveness to biochemical signals presented on self-assembling membranes, this study highlights the possibility of modulating certain cellular activities through matrix engineering. This concept can be further explored to understand the cellular remodeling process and as a strategy to develop artificial matrices with more biomimetic degradation for tissue engineering applications. This work was funded by the European Regional Development Fund (ERDF) through the Operational Competitiveness Programme "COMPETE" (FCOMP-01-0124-FEDER-014758) and national funds through the Portuguese Foundation for Science and Technology (FCT) under the project PTDC/EBB-BIO/114523/2009. The authors also acknowledge a start-up grant provided by the School of Engineering and Materials Science at QMUL. D.S.F. gratefully acknowledges FCT for the PhD scholarship (SFRH/BD/44977/2008).

    Hierarchical Latency Models for Dose-Time-Response Associations

    Exposure lagging and exposure-time window analysis are 2 widely used approaches to allow for induction and latency periods in analyses of exposure-disease associations. Exposure lagging implies a strong parametric assumption about the temporal evolution of the exposure-disease association. An exposure-time window analysis allows for a more flexible description of temporal variation in exposure effects but may result in unstable risk estimates that are sensitive to how windows are defined. The authors describe a hierarchical regression approach that combines time window analysis with a parametric latency model. They illustrate this approach using data from 2 occupational cohort studies: studies of lung cancer mortality among 1) asbestos textile workers and 2) uranium miners. For each cohort, an exposure-time window analysis was compared with a hierarchical regression analysis with shrinkage toward a simpler, second-stage parametric latency model. In each cohort analysis, substantial stability is gained in time window-specific estimates of association by using the hierarchical regression approach. The proposed hierarchical regression model couples a time window analysis with a parametric latency model; this approach provides a way to stabilize risk estimates derived from a time window analysis and a way to reduce bias arising from misspecification of a parametric latency model.
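The shrinkage step at the heart of this kind of hierarchical analysis can be sketched in a few lines (all numbers are hypothetical, and the second-stage latency curve here is a simple linear trend rather than the models used in the paper):

```python
import numpy as np

# Hypothetical first-stage window-specific estimates (log rate ratios
# per unit exposure) at window midpoints, with their sampling variances
midpoints = np.array([7.5, 15.0, 25.0, 35.0])   # years since exposure
est = np.array([0.10, 0.45, 0.30, 0.05])
var = np.array([0.04, 0.02, 0.03, 0.06])

# Second stage: fit a simple parametric latency trend (linear in time)
# to the window estimates by weighted least squares
X = np.column_stack([np.ones_like(midpoints), midpoints])
W = np.diag(1.0 / var)
coef = np.linalg.solve(X.T @ W @ X, X.T @ W @ est)
fitted = X @ coef

# Shrink each window estimate toward the curve; tau2 (hypothetical)
# controls how far windows may deviate from the second-stage model
tau2 = 0.01
shrink = tau2 / (tau2 + var)
posterior = shrink * est + (1 - shrink) * fitted
print(posterior)
```

Each shrunken estimate is a precision-weighted compromise between the noisy window-specific value and the smooth parametric curve, which is how the approach stabilizes the time window analysis while limiting reliance on any single parametric latency form.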