1,274 research outputs found

    Capture the fracture: a best practice framework and global campaign to break the fragility fracture cycle

    Summary The International Osteoporosis Foundation (IOF) Capture the Fracture Campaign aims to support implementation of Fracture Liaison Services (FLS) throughout the world. Introduction FLS have been shown to close the ubiquitous secondary fracture prevention care gap, ensuring that fragility fracture sufferers receive appropriate assessment and intervention to reduce future fracture risk. Methods Capture the Fracture has developed internationally endorsed standards for best practice, will facilitate change at the national level to drive adoption of FLS, and will increase awareness among key stakeholders of the challenges and opportunities presented by secondary fracture prevention. The Best Practice Framework (BPF) sets an international benchmark for FLS, defining essential and aspirational elements of service delivery. Results The BPF has been reviewed by leading experts from many countries and subjected to beta-testing to ensure that it is internationally relevant and fit for purpose. The BPF will also serve as a measurement tool for IOF to award ‘Capture the Fracture Best Practice Recognition’, to celebrate successful FLS worldwide and drive service development in areas of unmet need. The Capture the Fracture website will provide a suite of resources related to FLS and secondary fracture prevention, which will be updated as new materials become available. A mentoring programme will enable those in the early stages of developing an FLS to learn from colleagues elsewhere who have achieved Best Practice Recognition. A grant programme is in development to aid clinical systems that require financial assistance to establish FLS in their localities. Conclusion Nearly half a billion people will reach retirement age during the next 20 years. IOF has developed Capture the Fracture because this is the single most important thing that can be done to directly improve patient care, of both women and men, and reduce the spiralling fracture-related care costs worldwide.

    Comparison of techniques for handling missing covariate data within prognostic modelling studies: a simulation study

    Background: There is no consensus on the most appropriate approach to handling missing covariate data within prognostic modelling studies. A simulation study was therefore performed to assess the effects of different missing data techniques on the performance of a prognostic model. Methods: Datasets were generated to resemble the skewed distributions seen in a motivating breast cancer example. Multivariate missing data were imposed on four covariates using four different mechanisms: missing completely at random (MCAR), missing at random (MAR), missing not at random (MNAR) and a combination of all three mechanisms. Five proportions of incomplete cases, from 5% to 75%, were considered. Complete case analysis (CC), single imputation (SI) and five multiple imputation (MI) techniques available within the R statistical software were investigated: a) a data augmentation (DA) approach assuming a multivariate normal distribution, b) DA assuming a general location model, c) regression switching imputation, d) regression switching with predictive mean matching (MICE-PMM) and e) flexible additive imputation models. A Cox proportional hazards model was fitted and appropriate estimates for the regression coefficients and model performance measures were obtained. Results: Performing a CC analysis produced unbiased regression estimates but inflated standard errors, which affected the significance of the covariates in the model with 25% or more missingness. Using SI underestimated the variability, resulting in poor coverage even with 10% missingness. Of the MI approaches, MICE-PMM produced, in general, the least biased estimates, better coverage for the incomplete covariates and better model performance for all mechanisms. However, this MI approach still produced biased regression coefficient estimates for the incomplete skewed continuous covariates when 50% or more cases had missing data imposed with a MCAR, MAR or combined mechanism. When the missingness depended on the incomplete covariates, i.e. MNAR, estimates were biased with more than 10% incomplete cases for all MI approaches. Conclusion: The results from this simulation study suggest that MICE-PMM may be the preferred MI approach, provided that less than 50% of the cases have missing data and the missing data are not MNAR.
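
    To illustrate the kind of MICE-PMM workflow the abstract evaluates (this is not the authors' code, which used R), the sketch below imputes incomplete covariates with predictive mean matching via statsmodels' MICEData, fits a Cox model on each completed dataset with lifelines, and pools the coefficients with Rubin's rules. The column names (time, event and the covariates) are placeholders.

```python
# Minimal sketch: MICE with predictive mean matching, Cox model per
# imputation, and Rubin's rules pooling. Column names are illustrative.
import numpy as np
import pandas as pd
from statsmodels.imputation.mice import MICEData
from lifelines import CoxPHFitter

def mice_pmm_cox(df, duration_col="time", event_col="event", m=10, burnin=10):
    """Impute incomplete covariates with MICE-PMM, fit a Cox proportional
    hazards model on each completed dataset, and pool with Rubin's rules."""
    covars = [c for c in df.columns if c not in (duration_col, event_col)]
    imp = MICEData(df)                       # imputes via predictive mean matching
    coefs, variances = [], []
    for _ in range(m):
        imp.update_all(burnin)               # cycle the chained equations
        completed = imp.data.copy()          # current completed dataset
        cph = CoxPHFitter().fit(completed, duration_col=duration_col,
                                event_col=event_col)
        coefs.append(cph.params_[covars].values)
        variances.append(cph.standard_errors_[covars].values ** 2)
    coefs, variances = np.array(coefs), np.array(variances)
    qbar = coefs.mean(axis=0)                          # pooled coefficients
    within = variances.mean(axis=0)                    # within-imputation variance
    between = coefs.var(axis=0, ddof=1)                # between-imputation variance
    total_se = np.sqrt(within + (1 + 1 / m) * between)
    return pd.DataFrame({"coef": qbar, "se": total_se}, index=covars)
```

    Rubin's rules are written out by hand here so the pooling step is explicit; in practice dedicated MI software (such as the R mice package referenced in the study) handles this automatically.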

    Application of the speed-duration relationship to normalize the intensity of high-intensity interval training

    The tolerable duration of continuous high-intensity exercise is determined by the hyperbolic speed-tolerable duration (S-tLIM) relationship. However, application of the S-tLIM relationship to normalize the intensity of high-intensity interval training (HIIT) has yet to be considered, and this was the aim of the present study. Subjects completed a ramp-incremental test and a series of 4 constant-speed tests to determine the S-tLIM relationship. A sub-group of subjects (n = 8) then repeated 4 min bouts of exercise at the speeds predicted to induce intolerance at 4 min (WR4), 6 min (WR6) and 8 min (WR8), interspersed with 4 min recovery bouts, to the point of exercise intolerance (fixed WR HIIT) on different days, with the aim of establishing the work rate that could be sustained for 960 s (i.e. 4×4 min). A sub-group of subjects (n = 6) also completed 4 bouts of exercise interspersed with 4 min recovery, with each bout continued to the point of exercise intolerance (maximal HIIT), to determine the appropriate protocol for maximizing the amount of high-intensity work that can be completed during 4×4 min HIIT. For fixed WR HIIT, the tLIM of the HIIT sessions was 399±81 s for WR4, 892±181 s for WR6 and 1517±346 s for WR8, with total exercise durations all significantly different from each other (P<0.050). For maximal HIIT, there was no difference in the tLIM of each of the 4 bouts (Bout 1: 229±27 s; Bout 2: 262±37 s; Bout 3: 235±49 s; Bout 4: 235±53 s; P>0.050). However, there was significantly less high-intensity work completed during bouts 2 (153.5±40.9 m), 3 (136.9±38.9 m) and 4 (136.7±39.3 m) compared with bout 1 (264.9±58.7 m; P<0.050). These data establish that WR6 provides the appropriate work rate to normalize the intensity of HIIT between subjects. Maximal HIIT provides a protocol that allows the relative contribution of the work rate profile to physiological adaptations to be considered during alternative intensity-matched HIIT protocols.
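
    For readers unfamiliar with how the S-tLIM relationship can be used to set speeds such as WR4, WR6 and WR8, the sketch below (illustrative only, not the study's analysis code) fits S = CS + D'/tLIM to constant-speed test data by least squares and predicts the speed expected to bring a subject to intolerance at a target duration. The example speeds and durations are made-up placeholders.

```python
# Minimal sketch: fit the hyperbolic speed-duration relationship and predict
# the speed for a target time to intolerance. Data below are placeholders.
import numpy as np

def fit_speed_duration(speeds_m_s, tlims_s):
    """Fit S = CS + D'/tLIM; returns critical speed CS (m/s) and D' (m).
    The model is linear in 1/tLIM, so ordinary least squares suffices."""
    tlims = np.asarray(tlims_s, dtype=float)
    A = np.column_stack([np.ones_like(tlims), 1.0 / tlims])
    cs, d_prime = np.linalg.lstsq(A, np.asarray(speeds_m_s, dtype=float),
                                  rcond=None)[0]
    return cs, d_prime

def speed_for_duration(cs, d_prime, t_s):
    """Speed predicted to bring exercise to intolerance after t_s seconds."""
    return cs + d_prime / t_s

# Placeholder results from four constant-speed tests (speed in m/s, tLIM in s)
speeds = [4.2, 4.0, 3.8, 3.6]
tlims = [150, 240, 420, 720]
cs, d_prime = fit_speed_duration(speeds, tlims)
for label, t in [("WR4", 240), ("WR6", 360), ("WR8", 480)]:
    print(f"{label}: {speed_for_duration(cs, d_prime, t):.2f} m/s")
```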

    A Guide to Handling Missing Data in Cost-Effectiveness Analysis Conducted Within Randomised Controlled Trials

    The authors would like to thank Professor Adrian Grant and the team at the University of Aberdeen (Professor Craig Ramsay, Janice Cruden, Charles Boachie, Professor Marion Campbell and Seonaidh Cotton), who kindly allowed the REFLUX dataset to be used for this work, and Eldon Spackman for kindly sharing the Stata code for calculating the probability that an intervention is cost effective following MI. The authors are grateful to the reviewers for their comments, which greatly improved this paper. M. G. is the recipient of a Medical Research Council Early Career Fellowship in Economics of Health (grant number: MR/K02177X/1). I. R. W. was supported by the Medical Research Council [Unit Programme U105260558]. No specific funding was obtained to produce this paper. The authors declare no conflicts of interest.

    Missing data are a frequent problem in cost-effectiveness analysis (CEA) within a randomised controlled trial. Inappropriate methods to handle missing data can lead to misleading results and ultimately can affect the decision of whether an intervention is good value for money. This article provides practical guidance on how to handle missing data in within-trial CEAs following a principled approach: (i) the analysis should be based on a plausible assumption for the missing data mechanism, i.e. whether the probability that data are missing is independent of or dependent on the observed and/or unobserved values; (ii) the method chosen for the base case should fit with the assumed mechanism; and (iii) sensitivity analysis should be conducted to explore to what extent the results change with the assumption made. This approach is implemented in three stages, which are described in detail: (1) descriptive analysis to inform the assumption on the missing data mechanism; (2) how to choose between alternative methods given their underlying assumptions; and (3) methods for sensitivity analysis. The case study illustrates how to apply this approach in practice, including software code. The article concludes with recommendations for practice and suggestions for future research.

    Funding: Medical Research Council Early Career Fellowship in Economics of Health (MR/K02177X/1); Medical Research Council UK (MRC) U105260558 and MC_U105260558.
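
    As a hedged illustration of the final pooling step in such a within-trial CEA (the article itself provides Stata code; this is not it), the sketch below computes the incremental net monetary benefit on each multiply imputed dataset and combines the estimates with Rubin's rules. The column names (cost, qaly, arm) and the willingness-to-pay threshold are assumptions made for the example.

```python
# Minimal sketch: pool the incremental net monetary benefit across m imputed
# datasets with Rubin's rules. Column names and threshold are illustrative.
import numpy as np
import statsmodels.formula.api as smf

def pooled_inb(completed_datasets, wtp=20000):
    """Estimate the incremental net benefit (treatment vs control) in each
    completed dataset and pool the estimates with Rubin's rules."""
    estimates, variances = [], []
    for df in completed_datasets:
        df = df.assign(nb=wtp * df["qaly"] - df["cost"])  # net monetary benefit
        fit = smf.ols("nb ~ arm", data=df).fit()          # arm: 0 control, 1 treatment
        estimates.append(fit.params["arm"])
        variances.append(fit.bse["arm"] ** 2)
    m = len(estimates)
    qbar = np.mean(estimates)                             # pooled INB
    total_var = np.mean(variances) + (1 + 1 / m) * np.var(estimates, ddof=1)
    return qbar, np.sqrt(total_var)
```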

    Epidemiology of a Daphnia-Multiparasite System and Its Implications for the Red Queen

    The Red Queen hypothesis can explain the maintenance of host and parasite diversity. However, the Red Queen requires genetic specificity for infection risk (i.e., that infection depends on the exact combination of host and parasite genotypes) and strongly virulent effects of infection on host fitness. A European crustacean (Daphnia magna)-bacterium (Pasteuria ramosa) system typifies such specificity and high virulence. We studied the North American host Daphnia dentifera and its natural parasite Pasteuria ramosa, and also found strong genetic specificity for infection success and high virulence. These results suggest that Pasteuria could promote Red Queen dynamics within D. dentifera populations as well. However, the Red Queen might be undermined in this system by selection from a more common yeast parasite (Metschnikowia bicuspidata). Resistance to the yeast did not correlate with resistance to Pasteuria among host genotypes, suggesting that selection by Metschnikowia should proceed relatively independently of selection by Pasteuria.

    Automated array-CGH optimized for archival formalin-fixed, paraffin-embedded tumor material

    BACKGROUND: Array comparative genomic hybridization (aCGH) is a rapidly evolving technology that still lacks complete standardization. Yet it is of great importance to obtain robust and reproducible data to enable meaningful comparisons across multiple hybridizations. Special difficulties arise when aCGH is performed on archival formalin-fixed, paraffin-embedded (FFPE) tissue due to its variable DNA quality. Recently, we developed an effective DNA quality test that predicts the suitability of archival samples for BAC aCGH. METHODS: In this report, we first used DNA from a cancer cell line (SKBR3) to optimize the aCGH protocol for automated hybridization, and subsequently optimized and validated the procedure for FFPE breast cancer samples. We aimed for the highest throughput, accuracy and reproducibility applicable to FFPE samples, which can also be important in future diagnostic use. RESULTS: Our protocol of automated array-CGH on archival FFPE ULS-labeled DNA showed results very similar to published data and to our previous manual hybridization method. CONCLUSION: This report combines automated aCGH on unamplified archival FFPE DNA with non-enzymatic ULS labeling, and describes an optimized protocol for this combination, resulting in improved quality and reproducibility.