
    A Bayesian adaptive design for biomarker trials with linked treatments.

    BACKGROUND: Response to treatments is highly heterogeneous in cancer. Increased availability of biomarkers and targeted treatments has led to the need for trial designs that efficiently test new treatments in biomarker-stratified patient subgroups. METHODS: We propose a novel Bayesian adaptive randomisation (BAR) design for use in multi-arm phase II trials where biomarkers exist that are potentially predictive of a linked treatment's effect. The design is motivated in part by two phase II trials that are currently in development. The design starts by randomising patients to the control treatment or to experimental treatments that the biomarker profile suggests should be active. At interim analyses, data from treated patients are used to update the allocation probabilities. If the linked treatments are effective, their allocation remains high; if they are ineffective, the allocation shifts over the course of the trial to unlinked treatments that are more effective. RESULTS: Our proposed design has high power to detect treatment effects if the pairings of treatment with biomarker are correct, but also performs well when alternative pairings are true. The design is consistently more powerful than parallel-groups stratified trials. CONCLUSIONS: This BAR design is a powerful approach to use when there are pairings of biomarkers with treatments available for testing simultaneously. This work was supported by the Medical Research Council (grant number G0800860) and the NIHR Cambridge Biomedical Research Centre. This is the final version of the article. It first appeared from NPG via http://dx.doi.org/10.1038/bjc.2015.27
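    The interim reallocation described in the METHODS section can be sketched with a simple binary-endpoint model: allocate to each experimental arm in proportion to the posterior probability that it beats control. This is a minimal illustrative sketch, not the trial's actual algorithm; the arm names, counts, prior, and Beta-Binomial response model are all assumptions.

```python
import random

def bar_allocation(arm_data, prior=(1, 1), n_draws=5000, seed=0):
    """Bayesian adaptive randomisation sketch for a binary response endpoint.

    arm_data maps arm name -> (responses, patients), e.g. {"control": (5, 20), ...}
    (hypothetical names/counts). With a Beta(a0, b0) prior, each arm's response
    rate has a Beta(a0 + r, b0 + n - r) posterior; allocation to each
    experimental arm is proportional to P(arm beats control), estimated by
    Monte Carlo.
    """
    rng = random.Random(seed)
    a0, b0 = prior
    # Posterior samples of the response rate for every arm.
    draws = {
        arm: [rng.betavariate(a0 + r, b0 + n - r) for _ in range(n_draws)]
        for arm, (r, n) in arm_data.items()
    }
    # Monte Carlo estimate of P(arm's response rate > control's).
    p_beat = {
        arm: sum(d > c for d, c in zip(draws[arm], draws["control"])) / n_draws
        for arm in arm_data
        if arm != "control"
    }
    total = sum(p_beat.values()) or 1.0
    # Normalise into allocation probabilities over the experimental arms.
    return {arm: p / total for arm, p in p_beat.items()}

# Hypothetical interim data: armA looks effective (12/20), armB does not (4/20).
alloc = bar_allocation({"control": (5, 20), "armA": (12, 20), "armB": (4, 20)})
```

    As the abstract describes, an arm whose interim data look effective keeps a high allocation, while an ineffective arm's allocation drains toward the better-performing arms.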

    Prospective study evaluating the relative sensitivity of 18F-NaF PET/CT for detecting skeletal metastases from renal cell carcinoma in comparison to multidetector CT and 99mTc-MDP bone scintigraphy, using an adaptive trial design.

    BACKGROUND: The detection of occult bone metastases is a key factor in determining the management of patients with renal cell carcinoma (RCC), especially when curative surgery is considered. This prospective study assessed the sensitivity of (18)F-labelled sodium fluoride in conjunction with positron emission tomography/computed tomography ((18)F-NaF PET/CT) for detecting RCC bone metastases, compared with conventional imaging by bone scintigraphy or CT. PATIENTS AND METHODS: An adaptive two-stage trial design was utilized, which was stopped after the first stage because the efficacy criterion had been met. Ten patients with stage IV RCC and bone metastases were imaged with (18)F-NaF PET/CT and (99m)Tc-labelled methylene diphosphonate ((99m)Tc-MDP) bone scintigraphy including pelvic single photon emission computed tomography (SPECT). Images were reported independently by experienced radiologists and nuclear medicine physicians using a 5-point scoring system. RESULTS: Seventy-seven lesions were diagnosed as malignant: 100% were identified by (18)F-NaF PET/CT, 46% by CT and 29% by bone scintigraphy/SPECT. Standard-of-care imaging with CT and bone scintigraphy identified 65% of the metastases reported by (18)F-NaF PET/CT. On an individual patient basis, (18)F-NaF PET/CT detected more RCC metastases than (99m)Tc-MDP bone scintigraphy/SPECT or CT alone (P = 0.007). The metabolic volumes and mean and maximum standardized uptake values (SUVmean and SUVmax) of the malignant lesions were significantly greater than those of the benign lesions (P < 0.001). CONCLUSIONS: (18)F-NaF PET/CT is significantly more sensitive at detecting RCC skeletal metastases than conventional bone scintigraphy or CT. The detection of occult bone metastases could greatly alter patient management, particularly when standard-of-care imaging is negative for skeletal metastases. This work was supported by Cancer Research UK [grant number C19212/A16628]. The authors also received research support from the National Institute for Health Research Cambridge Biomedical Research Centre, the Engineering and Physical Sciences Research Council Imaging Centre in Cambridge and Manchester, and the Cambridge Experimental Cancer Medicine Centre. The research was also partly funded by a generous donation from the family and friends of a patient. This is the final version of the article. It first appeared from Oxford University Press via http://dx.doi.org/10.1093/annonc/mdv28

    Evaluating the drivers of and obstacles to the willingness to use cognitive enhancement drugs: the influence of drug characteristics, social environment, and personal characteristics

    Sattler S, Mehlkop G, Graeff P, Sauer C. Evaluating the drivers of and obstacles to the willingness to use cognitive enhancement drugs: the influence of drug characteristics, social environment, and personal characteristics. Substance Abuse Treatment, Prevention, and Policy. 2014;9(1):8. Background: The use of cognitive enhancement (CE) by means of pharmaceutical agents has been the subject of intense debate both among scientists and in the media. This study investigates several drivers of and obstacles to the willingness to use prescription drugs non-medically for augmenting brain capacity. Methods: We conducted a web-based study among 2,877 students from randomly selected disciplines at German universities. Using a factorial survey, respondents expressed their willingness to take various hypothetical CE-drugs; the drugs were described by five experimentally varied characteristics, and the social environment by three varied characteristics. Personal characteristics and demographic controls were also measured. Results: We found that 65.3% of the respondents staunchly refused to use CE-drugs. The results of a multivariate negative binomial regression indicated that respondents' willingness to use CE-drugs increased if the potential drugs promised a significant augmentation of mental capacity and a high probability of achieving this augmentation. Willingness decreased when there was a high probability of side effects and a high price. Prevalent CE-drug use among peers increased willingness, whereas a social environment that strongly disapproved of these drugs decreased it. Regarding the respondents' characteristics, pronounced academic procrastination, high cognitive test anxiety, low intrinsic motivation, low internalization of social norms against CE-drug use, and past experiences with CE-drugs increased willingness. The potential severity of side effects, social recommendations about using CE-drugs, risk preferences, and competencies had no measured effects on willingness. Conclusions: These findings contribute to understanding factors that influence the willingness to use CE-drugs. They support the assumption of instrumental drug use and may contribute to the development of prevention, policy, and educational strategies.

    Imaging biomarker roadmap for cancer studies.

    Imaging biomarkers (IBs) are integral to the routine management of patients with cancer. IBs used daily in oncology include clinical TNM stage, objective response and left ventricular ejection fraction. Other CT, MRI, PET and ultrasonography biomarkers are used extensively in cancer research and drug development. New IBs need to be established either as useful tools for testing research hypotheses in clinical trials and research studies, or as clinical decision-making tools for use in healthcare, by crossing 'translational gaps' through validation and qualification. Important differences exist between IBs and biospecimen-derived biomarkers and, therefore, the development of IBs requires a tailored 'roadmap'. Recognizing this need, Cancer Research UK (CRUK) and the European Organisation for Research and Treatment of Cancer (EORTC) assembled experts to review, debate and summarize the challenges of IB validation and qualification. This consensus group has produced 14 key recommendations for accelerating the clinical translation of IBs, which highlight the role of parallel (rather than sequential) tracks of technical (assay) validation, biological/clinical validation and assessment of cost-effectiveness; the need for IB standardization and accreditation systems; the need to continually revisit IB precision; an alternative framework for biological/clinical validation of IBs; and the essential requirements for multicentre studies to qualify IBs for clinical use. Development of this roadmap received support from Cancer Research UK and the Engineering and Physical Sciences Research Council (grant references A/15267, A/16463, A/16464, A/16465, A/16466 and A/18097), the EORTC Cancer Research Fund, and the Innovative Medicines Initiative Joint Undertaking (grant agreement number 115151), resources of which are composed of a financial contribution from the European Union's Seventh Framework Programme (FP7/2007-2013) and European Federation of Pharmaceutical Industries and Associations (EFPIA) companies' in-kind contribution.

    Effects of hospital facilities on patient outcomes after cancer surgery: an international, prospective, observational study

    Background: Early death after cancer surgery is higher in low-income and middle-income countries (LMICs) than in high-income countries, yet the impact of facility characteristics on early postoperative outcomes is unknown. The aim of this study was to examine the association of hospital infrastructure, resource availability, and processes with early outcomes after cancer surgery worldwide. Methods: A multimethods analysis was performed as part of the GlobalSurg 3 study, a multicentre, international, prospective cohort study of patients who had surgery for breast, colorectal, or gastric cancer. The primary outcomes were 30-day mortality and 30-day major complication rates. Potentially beneficial hospital facilities were identified by variable selection based on their association with 30-day mortality. Adjusted outcomes were determined using generalised estimating equations to account for patient characteristics and country-income group, with population stratification by hospital. Findings: Between April 1, 2018, and April 23, 2019, facility-level data were collected for 9685 patients across 238 hospitals in 66 countries (91 hospitals in 20 high-income countries; 57 hospitals in 19 upper-middle-income countries; and 90 hospitals in 27 low-income to lower-middle-income countries). The availability of five hospital facilities was inversely associated with mortality: ultrasound, CT scanner, critical care unit, opioid analgesia, and oncologist. After adjustment for case-mix and country income group, hospitals with three or fewer of these facilities (62 hospitals, 1294 patients) had higher mortality compared with those with four or five (adjusted odds ratio [OR] 3.85 [95% CI 2.58-5.75]; p<0.0001), with excess mortality predominantly explained by a limited capacity to rescue following the development of major complications (63.0% vs 82.7%; OR 0.35 [0.23-0.53]; p<0.0001). Across LMICs, improvements in hospital facilities would prevent one to three deaths for every 100 patients undergoing surgery for cancer. Interpretation: Hospitals with higher levels of infrastructure and resources have better outcomes after cancer surgery, independent of country income. Without urgent strengthening of hospital infrastructure and resources, the reductions in cancer-associated mortality associated with improved access will not be realised.

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study

    Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1 infected, ART-treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART-treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miR-21, miR-122 and miR-200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miR-31, miR-150 and miR-223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found for any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality or of the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries. Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable for more than 90 per cent of patients except for reducing general anaesthesia (84 per cent) and re-sterilization of 'single-use' consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion: This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low–middle-income countries.

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the following working groups: D:A:D Study Group; Royal Free Hospital Clinic Cohort; INSIGHT Study Group; SMART Study Group; ESPRIT Study Group. Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >=3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR >60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (>3 mo apart) eGFR <=60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with progressively higher risks in the medium and high risk groups (risk score >=5; 505 events). Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 developed CKD (3.7%) during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 developed CKD (1.6%) during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians, to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
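    The NNTH figures in this abstract follow from a standard definition: NNTH is the reciprocal of the absolute risk difference between exposed and unexposed patients over the time horizon. A minimal sketch; the on-drug risk below is a hypothetical value back-derived from the abstract's low-risk-group figures (baseline 5-y risk of 1/393, NNTH of 1,702), not data from the study.

```python
def nnth(risk_unexposed, risk_exposed):
    """Number needed to harm: patients who must start the exposure for one
    additional adverse event (here, CKD) over the time horizon."""
    ard = risk_exposed - risk_unexposed  # absolute risk difference
    if ard <= 0:
        raise ValueError("NNTH is defined only when exposure increases risk")
    return 1.0 / ard

# Hypothetical illustration: a baseline 5-y risk of 1/393 and an NNTH of
# 1,702 together imply an on-drug 5-y risk of 1/393 + 1/1,702.
baseline = 1 / 393
on_drug = baseline + 1 / 1702
```

    The same arithmetic explains why NNTH shrinks so sharply in the high-risk group: a higher baseline risk amplified by the same rate ratio yields a much larger absolute risk difference, and hence a much smaller NNTH.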
