
    Stability of response characteristics of a Delphi panel: application of bootstrap data expansion

    BACKGROUND: Delphi surveys with panels of experts in a particular area of interest have been widely utilized in the fields of clinical medicine, nursing practice, medical education and healthcare services. Despite this wide applicability of the Delphi methodology, there is no clear identification of what constitutes a sufficient number of Delphi survey participants to ensure stability of results. METHODS: The study analyzed the response characteristics from the first round of a Delphi survey conducted with 23 experts in healthcare quality and patient safety. The panel members had similar training and subject matter understanding of the Malcolm Baldrige Criteria for Performance Excellence in Healthcare. The raw data from the first round sampling, which usually contains the largest diversity of responses, were augmented via bootstrap sampling to obtain computer-generated results for two larger samples obtained by sampling with replacement. Response characteristics (mean, trimmed mean, standard deviation and 95% confidence intervals) for 54 survey items were compared for the responses of the 23 actual study participants and two computer-generated samples of 1000 and 2000 resampling iterations. RESULTS: The results indicate that the response characteristics of a small expert panel in a well-defined knowledge area are stable under augmented sampling. CONCLUSION: A small panel of similarly trained experts who share a general understanding of the field of interest can be used effectively and reliably to develop criteria that inform judgment and support decision-making.
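
    A minimal Python sketch of the bootstrap expansion described above: the 23 first-round responses to a single item are resampled with replacement for 1000 and 2000 iterations and summarized. The rating vector, the 10% trimming fraction, and the percentile form of the 95% confidence interval are illustrative assumptions, not details taken from the study.
```python
import numpy as np
from scipy.stats import trim_mean

rng = np.random.default_rng(0)
responses = np.array([4, 5, 3, 4, 4, 5, 2, 4, 3, 5, 4, 4,
                      5, 3, 4, 4, 5, 4, 3, 4, 5, 4, 4])   # 23 hypothetical item ratings

print("observed : mean=%.2f trimmed=%.2f sd=%.2f"
      % (responses.mean(), trim_mean(responses, 0.10), responses.std(ddof=1)))

for n_boot in (1000, 2000):
    # Each iteration redraws a panel of 23 responses with replacement.
    resamples = rng.choice(responses, size=(n_boot, responses.size), replace=True)
    pooled = resamples.ravel()
    boot_means = resamples.mean(axis=1)
    ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
    print("%d iterations: mean=%.2f trimmed=%.2f sd=%.2f 95%% CI=(%.2f, %.2f)"
          % (n_boot, pooled.mean(), trim_mean(pooled, 0.10),
             pooled.std(ddof=1), ci_low, ci_high))
```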

    Selection Bias Due to Loss to Follow Up in Cohort Studies

    Selection bias due to loss to follow up represents a threat to the internal validity of estimates derived from cohort studies. Over the last fifteen years, stratification-based techniques as well as methods such as inverse probability-of-censoring weighted estimation have been more prominently discussed and offered as a means to correct for selection bias. However, unlike correcting for confounding bias using inverse weighting, uptake of inverse probability-of-censoring weighted estimation as well as competing methods has been limited in the applied epidemiologic literature. To motivate greater use of inverse probability-of-censoring weighted estimation and competing methods, we use causal diagrams to describe the sources of selection bias in cohort studies employing a time-to-event framework when the quantity of interest is an absolute measure (e.g., absolute risk, survival function) or relative effect measure (e.g., risk difference, risk ratio). We highlight that whether a given estimate obtained from standard methods is potentially subject to selection bias depends on the causal diagram and the measure. We first broadly describe inverse probability-of-censoring weighted estimation and then give a simple example to demonstrate in detail how inverse probability-of-censoring weighted estimation mitigates selection bias and describe challenges to estimation. We then modify complex, real-world data from the University of North Carolina Center for AIDS Research HIV clinical cohort study and estimate the absolute and relative change in the occurrence of death with and without inverse probability-of-censoring weighted correction using the modified University of North Carolina data. We provide SAS code to aid with implementation of inverse probability-of-censoring weighted techniques.
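
    A stripped-down Python sketch of inverse probability-of-censoring weighting (IPCW) for a single time horizon, in the spirit of the approach described above. The simulated data, the single year-5 censoring visit, and the single baseline covariate are illustrative assumptions, not the authors' SAS implementation.
```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 20_000
x = rng.binomial(1, 0.3, n)                      # baseline covariate, e.g. injection drug use
early_death = rng.binomial(1, 0.05 + 0.05 * x)   # death before the year-5 visit (always observed)
late_death = rng.binomial(1, 0.10 + 0.15 * x)    # death between years 5 and 10 if alive at 5
death_by_10 = np.where(early_death == 1, 1, late_death)

# Loss to follow-up at the year-5 visit depends on x, so a complete-case risk is biased.
at_risk_at_5 = early_death == 0
lost = at_risk_at_5 & (rng.random(n) < 0.1 + 0.5 * x)
observed = ~lost

print("true risk     :", death_by_10.mean())
print("complete-case :", death_by_10[observed].mean())

# Model P(remaining under observation | x) among those alive at year 5,
# then upweight retained subjects to stand in for those who were lost.
cens_model = sm.Logit((~lost[at_risk_at_5]).astype(int),
                      sm.add_constant(x[at_risk_at_5])).fit(disp=0)
p_uncens = cens_model.predict(sm.add_constant(x))
weights = np.where(at_risk_at_5, 1.0 / p_uncens, 1.0)   # early deaths keep weight 1

ipcw_risk = np.average(death_by_10[observed], weights=weights[observed])
print("IPCW-weighted :", ipcw_risk)
```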

    Parametric mixture models to evaluate and summarize hazard ratios in the presence of competing risks with time-dependent hazards and delayed entry

    In the analysis of survival data, there are often competing events that preclude an event of interest from occurring. Regression analysis with competing risks is typically undertaken using a cause-specific proportional hazards model. However, modern alternative methods exist for the analysis of the subdistribution hazard with a corresponding subdistribution proportional hazards model. In this paper, we introduce a flexible parametric mixture model as a unifying method to obtain estimates of the cause-specific and subdistribution hazards and hazard ratio functions. We describe how these estimates can be summarized over time to give a single number that is comparable to the hazard ratio that is obtained from a corresponding cause-specific or subdistribution proportional hazards model. An application to the Women’s Interagency HIV Study is provided to investigate injection drug use and the time to either the initiation of effective antiretroviral therapy or clinical disease progression as a competing event.
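
    A sketch of how a parametric mixture model yields both cause-specific and subdistribution hazards. The two-cause Weibull mixture, the parameter values, and the simple time-averaged summary of the hazard-ratio function are illustrative assumptions, not the model fitted in the paper.
```python
import numpy as np
from scipy.stats import weibull_min

t = np.linspace(0.01, 10, 500)

def hazards(pi, shapes, scales, cause=0):
    """Cause-specific and subdistribution hazards for `cause` in a mixture model.

    pi[k]                : probability that the eventual event is of type k
    shapes[k], scales[k] : Weibull parameters of the conditional time-to-event density f_k
    """
    f = np.array([weibull_min.pdf(t, c, scale=s) for c, s in zip(shapes, scales)])
    F = np.array([weibull_min.cdf(t, c, scale=s) for c, s in zip(shapes, scales)])
    sub_density = pi[cause] * f[cause]                   # d/dt of the cumulative incidence F_k(t)
    overall_surv = 1.0 - np.sum(pi[:, None] * F, axis=0)
    cif = pi[cause] * F[cause]
    return sub_density / overall_surv, sub_density / (1.0 - cif)

# Hypothetical unexposed vs exposed groups.
cs0, sd0 = hazards(np.array([0.6, 0.4]), [1.5, 1.2], [4.0, 6.0])
cs1, sd1 = hazards(np.array([0.7, 0.3]), [1.5, 1.2], [3.0, 6.0])

# The hazard ratios vary over time; one simple single-number summary is the
# geometric mean of the ratio over the follow-up grid.
print("time-averaged cause-specific HR :", np.exp(np.mean(np.log(cs1 / cs0))))
print("time-averaged subdistribution HR:", np.exp(np.mean(np.log(sd1 / sd0))))
```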

    Behavior change interventions: the potential of ontologies for advancing science and practice

    A central goal of behavioral medicine is the creation of evidence-based interventions for promoting behavior change. Scientific knowledge about behavior change could be more effectively accumulated using "ontologies." In information science, an ontology is a systematic method for articulating a "controlled vocabulary" of agreed-upon terms and their inter-relationships. It involves three core elements: (1) a controlled vocabulary specifying and defining existing classes; (2) specification of the inter-relationships between classes; and (3) codification in a computer-readable format to enable knowledge generation, organization, reuse, integration, and analysis. This paper introduces ontologies, provides a review of current efforts to create ontologies related to behavior change interventions, and suggests future work. This paper was written by behavioral medicine and information science experts and was developed in partnership between the Society of Behavioral Medicine's Technology Special Interest Group (SIG) and the Theories and Techniques of Behavior Change Interventions SIG. In recent years, significant progress has been made in the foundational work needed to develop ontologies of behavior change. Ontologies of behavior change could facilitate a transformation of behavioral science from a field in which data from different experiments are siloed into one in which data across experiments could be compared and/or integrated. This could facilitate new approaches to hypothesis generation and knowledge discovery in behavioral science.
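
    A toy Python illustration of the three core elements listed above: a controlled vocabulary with definitions, explicit inter-relationships, and a machine-readable encoding. The class names and relations are invented for the example and are not drawn from any published behavior change ontology.
```python
from dataclasses import dataclass, field

@dataclass
class OntologyClass:
    name: str
    definition: str
    parents: list = field(default_factory=list)     # "is_a" relationships
    relations: dict = field(default_factory=dict)   # other typed relationships

vocabulary = {
    "BehaviorChangeTechnique": OntologyClass(
        "BehaviorChangeTechnique",
        "An observable, replicable component of an intervention designed to alter behavior."),
    "GoalSetting": OntologyClass(
        "GoalSetting",
        "Agreeing on a behavioral goal with the person receiving the intervention.",
        parents=["BehaviorChangeTechnique"],
        relations={"targets": "Behavior"}),
    "Behavior": OntologyClass(
        "Behavior",
        "An activity performed by an individual in response to internal or external events."),
}

def ancestors(name):
    """Walk 'is_a' links so knowledge attached to parent classes can be reused."""
    for parent in vocabulary[name].parents:
        yield parent
        yield from ancestors(parent)

print(list(ancestors("GoalSetting")))   # ['BehaviorChangeTechnique']
```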

    Lagging Exposure Information in Cumulative Exposure-Response Analyses

    Exposure information is often lagged to allow for a latency period in cumulative exposure-disease analyses. The authors first consider bias and confidence interval coverage when using the standard approaches of fitting models under several lag assumptions and selecting the lag that maximizes either the effect estimate or model goodness of fit. Next, they consider bias that occurs when the assumption that the latency period is a fixed constant does not hold. Expressions were derived for bias due to misspecification of lag assumptions, and simulations were conducted. Finally, the authors describe a method for joint estimation of parameters describing an exposure-response association and the latency distribution. Analyses of associations between cumulative asbestos exposure and lung cancer mortality among textile workers illustrate this approach. Selecting the lag that maximizes the effect estimate may lead to bias away from the null; selecting the lag that maximizes model goodness of fit may lead to confidence intervals that are too narrow. These problems tend to increase as the within-person exposure variation diminishes. Lagging exposure assignment by a constant will lead to bias toward the null if the distribution of latency periods is not a fixed constant. Direct estimation of latency periods can minimize bias and improve confidence interval coverage.
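
    A small Python sketch of the lagging operation described above: cumulative exposure at follow-up year t counts only exposure accrued more than `lag` years earlier. The exposure history and the 10-year lag are illustrative assumptions.
```python
import numpy as np

annual_exposure = np.array([0.0, 1.2, 1.5, 1.1, 0.8, 0.0, 0.0, 0.9, 1.3, 0.5,
                            0.0, 0.0, 0.0, 0.0, 0.0])   # hypothetical fibre-years/mL per year

def lagged_cumulative(exposure, lag):
    """Cumulative exposure at the end of each year, ignoring the most recent `lag` years."""
    cumulative = np.cumsum(exposure)
    if lag == 0:
        return cumulative
    lagged = np.zeros_like(cumulative)
    lagged[lag:] = cumulative[:-lag]
    return lagged

for lag in (0, 10):
    print(f"lag {lag:2d}:", np.round(lagged_cumulative(annual_exposure, lag), 1))
```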

    Development of the morpholino gene knockdown technique in Fundulus heteroclitus : a tool for studying molecular mechanisms in an established environmental model

    Author Posting. © Elsevier B.V., 2008. This is the author's version of the work. It is posted here by permission of Elsevier B.V. for personal use, not for redistribution. The definitive version was published in Aquatic Toxicology 87 (2008): 289-295, doi:10.1016/j.aquatox.2008.02.010.
    A significant challenge in environmental toxicology is that many genetic and genomic tools available in laboratory models are not developed for commonly used environmental models. The Atlantic killifish (Fundulus heteroclitus) is one of the most studied teleost environmental models, yet few genetic or genomic tools have been developed for use in this species. The advancement of genetic and evolutionary toxicology will require that many of the tools developed in laboratory models be transferred into species more applicable to environmental toxicology. Antisense morpholino oligonucleotide (MO) gene knockdown technology has been widely utilized to study development in zebrafish and has been proven to be a powerful tool in toxicological investigations through direct manipulation of molecular pathways. To expand the utility of killifish as an environmental model, MO gene knockdown technology was adapted for use in Fundulus. Morpholino microinjection methods were altered to overcome the significant differences between these two species. Morpholino efficacy and functional duration were evaluated with molecular and phenotypic methods. A cytochrome P450-1A (CYP1A) MO was used to confirm effectiveness of the methodology. For CYP1A MO-injected embryos, a 70% reduction in CYP1A activity, an 86% reduction in total CYP1A protein, a significant increase in β-naphthoflavone-induced teratogenicity, and estimates of functional duration (a 50% reduction in activity at 10 dpf and an 86% reduction in total protein at 12 dpf) conclusively demonstrated that MO technologies can be used effectively in killifish and will likely be just as informative as they have been in zebrafish.
    This work was funded in part by the National Institute of Environmental Health Sciences through the Duke Superfund Basic Research Center (P42ES010356), the Boston University Superfund Basic Research Program (P42ES007381), and the Duke Integrated Toxicology and Environmental Health Program (ES-T32-0007031). Additional support was provided by a U.S. Environmental Protection Agency STAR fellowship awarded to C.R.F

    Ultra-fine dark matter structure in the Solar neighbourhood

    The direct detection of dark matter on Earth depends crucially on its density and its velocity distribution on a milliparsec scale. Conventional N-body simulations are unable to access this scale, making the development of other approaches necessary. In this paper, we apply the method developed in Fantin et al. 2008 to a cosmologically-based merger tree, transforming it into a useful instrument to reproduce and analyse the merger history of a Milky Way-like system. The aim of the model is to investigate the implications of any ultra-fine structure for the current and next generation of directional dark matter detectors. We find that the velocity distribution of a Milky Way-like Galaxy is almost smooth, due to the overlap of many streams of particles generated by multiple mergers. Only the merger of a 10^10 Msun halo can generate significant features in the ultra-local velocity distribution, detectable at the resolution attainable by current experiments.
    Comment: 9 pages, 6 figures, accepted for publication in MNRAS
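
    A toy Python illustration of the smoothing argument above: the superposition of many kinematically cold streams, each a narrow Gaussian in velocity, approaches a smooth distribution. The stream counts, dispersions, and velocity ranges are arbitrary choices for the sketch, not values from the merger-tree model.
```python
import numpy as np

rng = np.random.default_rng(7)
v = np.linspace(-300, 300, 601)   # line-of-sight velocity grid, km/s

def summed_streams(n_streams, sigma=5.0):
    """Sum n_streams Gaussian streams with random mean velocities and weights."""
    means = rng.uniform(-250, 250, n_streams)
    weights = rng.dirichlet(np.ones(n_streams))
    return sum(w * np.exp(-0.5 * ((v - m) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
               for w, m in zip(weights, means))

for n in (5, 50, 5000):
    f = summed_streams(n)
    # Fractional graininess: rms deviation from a heavily smoothed version of f.
    smooth = np.convolve(f, np.ones(51) / 51, mode="same")
    graininess = float(np.sqrt(np.mean((f - smooth) ** 2)) / f.mean())
    print(f"{n:5d} streams -> rms graininess: {graininess:.3f}")
```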

    Worth the Weight: Using Inverse Probability Weighted Cox Models in AIDS Research

    In an observational study with a time-to-event outcome, the standard analytical approach is the Cox proportional hazards regression model. As an alternative to the standard Cox model, in this article we present a method that uses inverse probability (IP) weights to estimate the effect of a baseline exposure on a time-to-event outcome. IP weighting can be used to adjust for multiple measured confounders of a baseline exposure in order to estimate marginal effects, which compare the distribution of outcomes when the entire population is exposed versus when the entire population is unexposed. For example, IP-weighted Cox models allow for estimation of the marginal hazard ratio and marginal survival curves. IP weights can also be employed to adjust for selection bias due to loss to follow-up. This approach is illustrated using an example that estimates the effect of injection drug use on time until AIDS or death among HIV-infected women.
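
    A minimal Python sketch of an IP-weighted Cox model for a baseline exposure, in the spirit of the approach described above. The simulated data, the single confounder, and the use of statsmodels/lifelines (rather than the authors' software) are assumptions made for the example.
```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 5_000
confounder = rng.normal(size=n)
exposure = rng.binomial(1, 1 / (1 + np.exp(-confounder)))           # confounded exposure
time = rng.exponential(1 / np.exp(0.5 * exposure + 0.7 * confounder))
event = (time < 5).astype(int)
time = np.minimum(time, 5)                                           # administrative censoring at t=5

# Stabilized inverse probability-of-exposure weights from a logistic model.
ps_model = sm.Logit(exposure, sm.add_constant(confounder)).fit(disp=0)
ps = ps_model.predict(sm.add_constant(confounder))
p_exposed = exposure.mean()
weights = np.where(exposure == 1, p_exposed / ps, (1 - p_exposed) / (1 - ps))

df = pd.DataFrame({"time": time, "event": event, "exposure": exposure, "w": weights})

# Weighted Cox model with only the exposure: the weights, not covariates in the
# model, handle confounding, so the coefficient estimates a marginal hazard ratio.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event", weights_col="w", robust=True)
print(cph.summary[["coef", "exp(coef)"]])
```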

    Evolution of the Luminosity Function and Colors of Galaxies in a Lambda-CDM Universe

    The luminosity function of galaxies is derived from a cosmological hydrodynamic simulation of a Lambda cold dark matter (CDM) universe with the aid of a stellar population synthesis model. At z=0, the resulting B band luminosity function has a flat faint end slope of \alpha \approx -1.15 with the characteristic luminosity and the normalization in fair agreement with observations, while the dark matter halo mass function is steep with a slope of \alpha \approx -2. The colour distribution of galaxies also agrees well with local observations. We also discuss the evolution of the luminosity function and the colour distribution of galaxies from z=0 to 5. A large evolution of the characteristic mass in the stellar mass function due to number evolution is compensated by luminosity evolution; the characteristic luminosity increases only by 0.8 mag from z=0 to 2, and then declines towards higher redshift, while the B band luminosity density continues to increase from z=0 to 5 (but only slowly at z>3).
    Comment: 6 pages, including 4 figures, mn2e style. Accepted to MNRAS pink pages
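
    The faint-end slope and characteristic luminosity quoted above are the parameters of a Schechter-function parameterization of the luminosity function. A minimal Python sketch evaluating that form in absolute magnitudes; the values of M_star and phi_star are placeholders, and only alpha = -1.15 comes from the abstract.
```python
import numpy as np

def schechter_mag(M, phi_star, M_star, alpha):
    """Schechter luminosity function phi(M), per magnitude per Mpc^3."""
    x = 10.0 ** (0.4 * (M_star - M))          # L / L_star
    return 0.4 * np.log(10.0) * phi_star * x ** (alpha + 1.0) * np.exp(-x)

M = np.arange(-22.0, -15.0, 1.0)
for m, phi in zip(M, schechter_mag(M, phi_star=1.6e-2, M_star=-19.7, alpha=-1.15)):
    print(f"M_B = {m:5.1f}  phi = {phi:.2e}  /mag/Mpc^3")
```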

    A model for the postcollapse equilibrium of cosmological structure: truncated isothermal spheres from top-hat density perturbations

    The postcollapse structure of objects which form by gravitational condensation out of the expanding cosmological background universe is a key element in the theory of galaxy formation. Towards this end, we have reconsidered the outcome of the nonlinear growth of a uniform, spherical density perturbation in an unperturbed background universe - the cosmological "top-hat" problem. We adopt the usual assumption that the collapse to infinite density at a finite time predicted by the top-hat solution is interrupted by a rapid virialization caused by the growth of small-scale inhomogeneities in the initial perturbation. We replace the standard description of the postcollapse object as a uniform sphere in virial equilibrium by a more self-consistent one as a truncated, nonsingular, isothermal sphere in virial and hydrostatic equilibrium, including for the first time a proper treatment of the finite-pressure boundary condition on the sphere. The results differ significantly from both the uniform sphere and the singular isothermal sphere approximations for the postcollapse objects. These results will have a significant effect on a wide range of applications of the Press-Schechter and other semi-analytical models to cosmology. The truncated isothermal sphere solution presented here predicts remarkably well the virial temperature and integrated mass distribution of the X-ray clusters formed in the CDM model as found by detailed, 3D, numerical gas and N-body dynamical simulations. This solution allows us to derive analytically the numerically-calibrated mass-temperature and radius-temperature scaling laws for X-ray clusters which were derived empirically by Evrard, Metzler and Navarro from simulation results for the CDM model. (Shortened)
    Comment: 29 pages, 7 ps figures, MNRAS-style, LaTeX. Accepted for publication in MNRAS. Minor revisions only (including additional panel in Fig.3 and additional comparison with X-ray cluster simulations).
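
    The nonsingular isothermal sphere invoked above is the solution of the isothermal Lane-Emden equation (hydrostatic equilibrium with density proportional to exp(-psi)). A minimal numerical Python sketch follows; the dimensionless truncation radius chosen here is arbitrary, not the paper's derived value.
```python
import numpy as np
from scipy.integrate import solve_ivp

def lane_emden_isothermal(xi, y):
    """y = [psi, dpsi/dxi]; d/dxi(xi^2 dpsi/dxi) = xi^2 exp(-psi)."""
    psi, dpsi = y
    return [dpsi, np.exp(-psi) - 2.0 * dpsi / xi]

# Start slightly off xi = 0 using the series expansion psi ~ xi^2/6 to avoid the
# coordinate singularity at the centre.
xi0 = 1e-6
sol = solve_ivp(lane_emden_isothermal, [xi0, 30.0], [xi0**2 / 6.0, xi0 / 3.0],
                dense_output=True, rtol=1e-8, atol=1e-10)

xi_t = 10.0                                  # illustrative dimensionless truncation radius
psi, dpsi = sol.sol(xi_t)
rho_ratio = np.exp(-psi)                     # rho(xi_t) / rho_central
mass_dimless = xi_t**2 * dpsi                # proportional to the enclosed mass
print(f"rho(xi_t)/rho_0 = {rho_ratio:.4f},  dimensionless mass = {mass_dimless:.3f}")
```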