
    A comparison of full model specification and backward elimination of potential confounders when estimating marginal and conditional causal effects on binary outcomes from observational data

    A common view in epidemiology is that automated confounder selection methods, such as backward elimination, should be avoided because they can lead to biased effect estimates and underestimation of their variance. Nevertheless, backward elimination remains regularly applied. We investigated whether, and under which conditions, causal effect estimation in observational studies can be improved by applying backward elimination to a prespecified set of potential confounders. An expression was derived that quantifies how variable omission relates to the bias and variance of effect estimators. Additionally, 3960 scenarios were defined and investigated by simulations comparing the bias and mean squared error (MSE) of the conditional log odds ratio, log(cOR), and the marginal log risk ratio, log(mRR), between full models including all prespecified covariates and backward elimination of these covariates. Applying backward elimination resulted in a mean bias of 0.03 for log(cOR) and 0.02 for log(mRR), compared to 0.56 and 0.52, respectively, for a model without any covariate adjustment, and no bias for the full model. In fewer than 3% of the scenarios considered, the MSE of the log(cOR) or log(mRR) was slightly lower (at most 3%) when backward elimination was used rather than the full model. When an initial set of potential confounders can be specified based on background knowledge, backward elimination adds minimal value. We advise against using it, and otherwise recommend providing ample arguments supporting its use.

    Data from a pre-publication independent replication initiative examining ten moral judgement effects

    We present the data from a crowdsourced project seeking to replicate findings in independent laboratories before (rather than after) they are published. In this Pre-Publication Independent Replication (PPIR) initiative, 25 research groups attempted to replicate 10 moral judgment effects from a single laboratory's research pipeline of unpublished findings. The 10 effects were investigated using online and laboratory surveys containing psychological manipulations (vignettes) followed by questionnaires. Results revealed a mix of reliable, unreliable, and culturally moderated findings. Unlike any previous replication project, this dataset includes the data from not only the replications but also the original studies, creating a unique corpus that researchers can use to better understand reproducibility and irreproducibility in science.

    The pipeline project: Pre-publication independent replications of a single laboratory's research pipeline

    This crowdsourced project introduces a collaborative approach to improving the reproducibility of scientific research, in which findings are replicated in qualified independent laboratories before (rather than after) they are published. Our goal is to establish a non-adversarial replication process with highly informative final results. To illustrate the Pre-Publication Independent Replication (PPIR) approach, 25 research groups conducted replications of all ten moral judgment effects that the last author and his collaborators had “in the pipeline” as of August 2014. Six findings replicated according to all replication criteria, one finding replicated but with a significantly smaller effect size than the original, one finding replicated consistently in the original culture but not outside of it, and two findings failed to replicate. In total, 40% of the original findings failed at least one major replication criterion. Potential ways to implement and incentivize pre-publication independent replication on a large scale are discussed.

    Transport policy for the 21st century: A new long-term concept for Berlin-Brandenburg [Verkehrspolitik für das 21. Jahrhundert: Ein neues Langfristkonzept für Berlin-Brandenburg]

    Available from TIB Hannover: RA 2325(69) / FIZ - Fachinformationszentrum Karlsruhe / TIB - Technische Informationsbibliothek (SIGLE; DE; German)

    Targeting elastase for molecular imaging of early atherosclerotic lesions.

    Objective: Neutrophils accumulate in early atherosclerotic lesions and promote lesion growth. In this study, we evaluated an elastase-specific near-infrared imaging agent for molecular imaging using hybrid fluorescence molecular tomography/x-ray computed tomography.

    Approach and Results: Murine neutrophils were isolated from bone marrow and incubated with the neutrophil-targeted near-infrared imaging agent Neutrophil Elastase 680 FAST for proof-of-principle experiments, verifying that the elastase-targeted fluorescent agent is specifically cleaved and activated by neutrophil content after lysis or cell stimulation. For in vivo experiments, low-density lipoprotein receptor–deficient mice were placed on a Western-type diet and imaged after 4, 8, and 12 weeks by fluorescence molecular tomography/x-ray computed tomography. Although this agent remains silent on injection, it produces fluorescent signal after cleavage by neutrophil elastase. After hybrid fluorescence molecular tomography/x-ray computed tomography imaging, mice were euthanized for whole-body cryosectioning and histological analyses. The in vivo fluorescent signal in the area of the aortic arch was highest after 4 weeks of high-fat diet feeding and decreased at 8 and 12 weeks. Ex vivo whole-body cryoslicing confirmed that the fluorescent signal localized to the aortic arch and originated from the atherosclerotic arterial wall. Histological analysis demonstrated the presence of neutrophils in atherosclerotic lesions.

    Conclusions: This study provides evidence that elastase-targeted imaging can be used for in vivo detection of early atherosclerosis. This imaging approach may harbor potential in the clinical setting for earlier diagnosis and treatment of atherosclerosis.