
    Gemcitabine and docetaxel versus doxorubicin as first-line treatment in previously untreated advanced unresectable or metastatic soft-tissue sarcomas (GeDDiS): a randomised controlled phase 3 trial.

    BACKGROUND: For many years, first-line treatment for locally advanced or metastatic soft-tissue sarcoma has been doxorubicin. This study compared gemcitabine and docetaxel versus doxorubicin as first-line treatment for advanced or metastatic soft-tissue sarcoma. METHODS: The GeDDiS trial was a randomised controlled phase 3 trial done in 24 UK hospitals and one Swiss Group for Clinical Cancer Research (SAKK) hospital. Eligible patients had histologically confirmed locally advanced or metastatic soft-tissue sarcoma of Trojani grade 2 or 3, disease progression before enrolment, and no previous chemotherapy for sarcoma or previous doxorubicin for any cancer. Patients were randomly assigned 1:1 to receive six cycles of intravenous doxorubicin 75 mg/m² on day 1 every 3 weeks, or intravenous gemcitabine 675 mg/m² on days 1 and 8 and intravenous docetaxel 75 mg/m² on day 8 every 3 weeks. Treatment was assigned using a minimisation algorithm incorporating a random element. Randomisation was stratified by age (≤18 years vs >18 years) and histological subtype. The primary endpoint was the proportion of patients alive and progression free at 24 weeks in the intention-to-treat population. Adherence to treatment and toxicity were analysed in the safety population, consisting of all patients who received at least one dose of their randomised treatment. The trial was registered with the European Clinical Trials (EudraCT) database (no 2009-014907-29) and with the International Standard Randomised Controlled Trial registry (ISRCTN07742377), and is now closed to patient entry. FINDINGS: Between Dec 3, 2010, and Jan 20, 2014, 257 patients were enrolled and randomly assigned to the two treatment groups (129 to doxorubicin and 128 to gemcitabine and docetaxel). Median follow-up was 22 months (IQR 15·7-29·3). The proportion of patients alive and progression free at 24 weeks did not differ between those who received doxorubicin and those who received gemcitabine and docetaxel (46·3% [95% CI 37·5-54·6] vs 46·4% [37·5-54·8]). Median progression-free survival was 23·3 weeks (95% CI 19·6-30·4) with doxorubicin versus 23·7 weeks (18·1-25·0) with gemcitabine and docetaxel (hazard ratio [HR] for progression-free survival 1·28, 95% CI 0·99-1·65; p=0·06). The most common grade 3 and 4 adverse events were neutropenia (32 [25%] of 128 patients who received doxorubicin and 25 [20%] of 126 patients who received gemcitabine and docetaxel), febrile neutropenia (26 [20%] and 15 [12%]), fatigue (eight [6%] and 17 [14%]), oral mucositis (18 [14%] and two [2%]), and pain (ten [8%] and 13 [10%]). The three most common serious adverse events, representing 111 (39%) of all 285 serious adverse events recorded, were febrile neutropenia (27 [17%] of 155 serious adverse events in patients who received doxorubicin and 15 [12%] of 130 serious adverse events in patients who received gemcitabine and docetaxel), fever (18 [12%] and 19 [15%]), and neutropenia (22 [14%] and ten [8%]). 154 (60%) of 257 patients in the intention-to-treat population died: 74 (57%) of 129 patients in the doxorubicin group and 80 (63%) of 128 in the gemcitabine and docetaxel group. No deaths were related to the treatment, but two deaths were due to a combination of disease progression and treatment. INTERPRETATION: Doxorubicin should remain the standard first-line treatment for most patients with advanced soft-tissue sarcoma. These results provide evidence for clinicians to consider with their patients when selecting first-line treatment for locally advanced or metastatic soft-tissue sarcoma.
FUNDING: Cancer Research UK, Sarcoma UK, and Clinical Trial Unit Kantonsspital St Gallen.
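
    The abstract states that treatment was assigned by a minimisation algorithm incorporating a random element, stratified by age group and histological subtype. The sketch below illustrates how such an allocation scheme can work; the arm labels and stratification factors follow the abstract, but the imbalance scoring, the 0.8 preference probability, and all variable names are illustrative assumptions rather than the trial's actual procedure.

```python
import random

ARMS = ("doxorubicin", "gemcitabine+docetaxel")
FACTORS = ("age_group", "histology")  # illustrative stand-ins for the trial's strata


def minimisation_assign(new_patient, allocated, p_preferred=0.8):
    """Allocate a patient by minimisation with a random element.

    For each arm, the imbalance score counts previously allocated patients in
    that arm who share the new patient's level of each stratification factor,
    summed over factors. The arm with the lower score is preferred and chosen
    with probability p_preferred (an assumed value, not taken from the trial);
    ties are broken purely at random.
    """
    scores = {arm: 0 for arm in ARMS}
    for patient in allocated:
        for factor in FACTORS:
            if patient[factor] == new_patient[factor]:
                scores[patient["arm"]] += 1
    low, high = sorted(ARMS, key=lambda arm: scores[arm])
    if scores[low] == scores[high]:
        return random.choice(ARMS)          # balanced: pure randomisation
    return low if random.random() < p_preferred else high


# Example allocation given two patients already on study (hypothetical data).
history = [
    {"arm": "doxorubicin", "age_group": ">18", "histology": "leiomyosarcoma"},
    {"arm": "gemcitabine+docetaxel", "age_group": ">18", "histology": "synovial sarcoma"},
]
print(minimisation_assign({"age_group": ">18", "histology": "leiomyosarcoma"}, history))
```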

    Implementation of the "FASTHUG" concept decreases the incidence of ventilator-associated pneumonia in a surgical intensive care unit

    Background: Ventilator-associated pneumonia (VAP) is a leading cause of morbidity and mortality in critically ill patients. The Institute for Healthcare Improvement 100,000 Lives Campaign made VAP a target of prevention and performance improvement. Additionally, the Joint Commission on Accreditation of Healthcare Organizations' 2007 Disease Specific National Patient Safety Goals included the reduction of healthcare-associated infections. We report implementation of a performance improvement project that dramatically reduced our VAP rate, which had exceeded the 90th percentile nationally. Methods: From 1 January 2004 to 31 December 2005 a performance improvement project was undertaken to decrease our critical care unit VAP rate. In year one (2004) procedural interventions were highlighted: aggressive oral care, early extubation, management of soiled or malfunctioning respiratory equipment, hand washing surveillance, and maximal sterile barrier precautions. In year two (2005) an evaluative concept called FASTHUG (daily evaluation of patients' feeding, analgesia, sedation, thromboembolic prophylaxis, elevation of the head of the bed, ulcer prophylaxis, and glucose control) was implemented. To determine the long-term effectiveness of the intervention, the historical control period (2003) and the procedural intervention period (2004), together constituting the pre-FASTHUG period (months 1–24), were compared with an extended post-FASTHUG period (months 25–54). Results: The 2003 surgical intensive care VAP rate of 19.3/1000 ventilator-days served as a historical control. Procedural interventions in 2004 were not effective in reducing VAP (p = 0.62). However, implementation of FASTHUG in 2005, directed by a critical care team, resulted in a rate of 7.3/1000 ventilator-days (p ≤ .01). The median pneumonia rate was lower after implementation of FASTHUG when compared with the historical control year (p = .028) and the first year after the procedural interventions (p = .041) using follow-up pairwise comparisons. The pre-FASTHUG period (2003–2004, months 1–24) compared with the extended post-FASTHUG period (2005–2007, months 25–54) also demonstrated a significant decrease in the VAP rate (p = .0004). This reduction in the post-FASTHUG period occurred despite a rising Severity of Illness index in critically ill patients (p = .001). Conclusion: Implementation of the FASTHUG concept in the daily evaluation of mechanically ventilated patients significantly decreased our surgical intensive care unit VAP rate.
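
    The rates quoted above are expressed per 1000 ventilator-days. As a minimal illustration, the snippet below lists the seven FASTHUG checklist items named in the abstract and shows how a rate per 1000 ventilator-days is computed; the case counts and denominators in the example are invented solely to reproduce rates of the same magnitude as those reported, and are not study data.

```python
# The seven daily-evaluation items that make up FASTHUG, as enumerated above.
FASTHUG = [
    "Feeding",
    "Analgesia",
    "Sedation",
    "Thromboembolic prophylaxis",
    "Head-of-bed elevation",
    "Ulcer prophylaxis",
    "Glucose control",
]


def vap_rate_per_1000(vap_cases, ventilator_days):
    """VAP incidence expressed per 1000 ventilator-days."""
    return 1000.0 * vap_cases / ventilator_days


# Hypothetical counts chosen only to yield rates of the magnitude reported
# (about 19.3/1000 historically vs 7.3/1000 after FASTHUG).
print(round(vap_rate_per_1000(58, 3000), 1))   # 19.3
print(round(vap_rate_per_1000(22, 3000), 1))   # 7.3
```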

    Capturing sequence diversity in metagenomes with comprehensive and scalable probe design.

    Metagenomic sequencing has the potential to transform microbial detection and characterization, but new tools are needed to improve its sensitivity. Here we present CATCH, a computational method to enhance nucleic acid capture for enrichment of diverse microbial taxa. CATCH designs optimal probe sets, with a specified number of oligonucleotides, that achieve full coverage of, and scale well with, known sequence diversity. We focus on applying CATCH to capture viral genomes in complex metagenomic samples. We design, synthesize, and validate multiple probe sets, including one that targets the whole genomes of the 356 viral species known to infect humans. Capture with these probe sets enriches unique viral content on average 18-fold, allowing us to assemble genomes that could not be recovered without enrichment, and accurately preserves within-sample diversity. We also use these probe sets to recover genomes from the 2018 Lassa fever outbreak in Nigeria and to improve detection of uncharacterized viral infections in human and mosquito samples. The results demonstrate that CATCH enables more sensitive and cost-effective metagenomic sequencing.
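
    CATCH is described as choosing probe sets that achieve full coverage of known sequence diversity with a specified number of oligonucleotides. The toy sketch below shows the underlying coverage idea as a greedy set cover over sequence windows with a mismatch tolerance; it is an assumption-laden simplification for illustration, not the published CATCH algorithm, which additionally models hybridisation, cover extension, and per-taxon probe budgets and scales to whole viral genomes.

```python
def windows(seq, k, stride):
    """Fixed-length windows tiled across a sequence."""
    return [seq[i:i + k] for i in range(0, len(seq) - k + 1, stride)]


def hamming_ok(a, b, max_mismatches):
    """True if two equal-length strings differ at no more than max_mismatches sites."""
    return sum(x != y for x, y in zip(a, b)) <= max_mismatches


def greedy_probe_set(genomes, probe_len=8, stride=4, max_mismatches=1):
    """Toy greedy set cover for probe selection.

    A candidate probe (a window drawn from the input genomes) 'covers' a target
    window if they differ at no more than max_mismatches positions; probes are
    picked greedily by how many still-uncovered windows they hit, until every
    window is covered.
    """
    targets = []
    for genome in genomes:
        targets.extend(windows(genome, probe_len, stride))
    candidates = list(dict.fromkeys(targets))        # de-duplicated candidate probes
    uncovered = set(range(len(targets)))
    chosen = []
    while uncovered:
        best_probe, best_hits = None, set()
        for probe in candidates:
            hits = {i for i in uncovered
                    if hamming_ok(probe, targets[i], max_mismatches)}
            if len(hits) > len(best_hits):
                best_probe, best_hits = probe, hits
        chosen.append(best_probe)
        uncovered -= best_hits
    return chosen


# Three short, partly diverged toy "genomes"; real inputs would be viral genomes.
toy_genomes = ["ACGTACGTACGTACGT", "ACGTACGAACGTACGT", "TTGTACGTACGTACGA"]
print(greedy_probe_set(toy_genomes))
```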

    Harnessing the NEON data revolution to advance open environmental science with a diverse and data-capable community

    It is a critical time to reflect on the National Ecological Observatory Network (NEON) science to date as well as envision what research can be done right now with NEON (and other) data and what training is needed to enable a diverse user community. NEON became fully operational in May 2019 and has pivoted from planning and construction to operation and maintenance. In this overview, the history of and foundational thinking around NEON are discussed. A framework of open science is described with a discussion of how NEON can be situated as part of a larger data constellation—across existing networks and different suites of ecological measurements and sensors. Next, a synthesis of early NEON science, based on >100 existing publications, funded proposal efforts, and emergent science at the very first NEON Science Summit (hosted by Earth Lab at the University of Colorado Boulder in October 2019) is provided. Key questions that the ecology community will address with NEON data in the next 10 yr are outlined, from understanding drivers of biodiversity across spatial and temporal scales to defining complex feedback mechanisms in human–environmental systems. Last, the essential elements needed to engage and support a diverse and inclusive NEON user community are highlighted: training resources and tools that are openly available, funding for broad community engagement initiatives, and a mechanism to share and advertise those opportunities. NEON users require both the skills to work with NEON data and the ecological or environmental science domain knowledge to understand and interpret them. This paper synthesizes early directions in the community’s use of NEON data, and opportunities for the next 10 yr of NEON operations in emergent science themes, open science best practices, education and training, and community building.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
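
    The primary outcome combines hospital survival with days alive and free of cardiovascular or respiratory organ support through 21 days; the interquartile ranges reported above extend to –1, which is consistent with in-hospital death being assigned the worst possible value. The function below is a hedged reconstruction of that composite under those assumptions, not the trial's analysis code.

```python
def organ_support_free_days(survived_hospitalisation, days_on_organ_support, horizon=21):
    """Organ support-free days (OSFD) through a 21-day horizon.

    In-hospital death is assigned -1 (worse than any survivor), consistent with
    the negative lower bounds of the interquartile ranges reported above;
    survivors score the number of days alive without cardiovascular or
    respiratory organ support, capped at the horizon. This is an assumed
    reconstruction of the outcome definition for illustration only.
    """
    if not survived_hospitalisation:
        return -1
    return horizon - min(days_on_organ_support, horizon)


print(organ_support_free_days(True, 5))    # survivor, 5 support days -> 16
print(organ_support_free_days(True, 0))    # survivor, never on support -> 21
print(organ_support_free_days(False, 12))  # died in hospital -> -1
```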

    Immuno-spin trapping of protein and DNA radicals: “tagging” free radicals to locate and understand the redox process

    Biomolecule-centered radicals are intermediate species produced during both reversible (redox modulation) and irreversible (oxidative stress) oxidative modification of biomolecules. These oxidative processes must be studied in situ and in real time in order to understand the molecular mechanism of cell adaptation or death in response to changes in the extracellular environment. In this regard, we have developed and validated immuno-spin trapping to tag the redox process, tracing the oxidatively-generated modification of biomolecules, in situ and in real time, by detecting protein- and DNA-centered radicals. The purpose of this method article is to introduce and update the basic methods and applications of immuno-spin trapping for the study of redox biochemistry in oxidative stress and redox regulation. We describe in detail the production, detection and location of protein and DNA radicals in biochemical systems, cells, and tissues, and in the whole animal as well, by using immuno-spin trapping with the nitrone spin trap 5,5-dimethyl-1-pyrroline N-oxide (DMPO).