
    Seafood in Food Security: A Call for Bridging the Terrestrial-Aquatic Divide

    The contribution of seafood to global food security is being increasingly highlighted in policy. However, the extent to which such claims are supported in the current food security literature is unclear. This review assesses the extent to which seafood is represented in the recent food security literature, both individually and, from a food systems perspective, in combination with terrestrially based production systems. The results demonstrate that seafood remains under-researched compared with the role of terrestrial animal and plant production in food security. Furthermore, seafood and terrestrial production remain siloed, with very few papers addressing their combined contribution or the relations between terrestrial and aquatic systems. We conclude that far more attention is needed to the specific and relative role of seafood in global food security, and we call for the integration of seafood into a wider interdisciplinary approach to global food system research.

    TRY plant trait database – enhanced coverage and open access

    Plant traits - the morphological, anatomical, physiological, biochemical and phenological characteristics of plants - determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait-based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. Best species coverage is achieved for categorical traits - almost complete coverage for 'plant growth form'. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait-environment relationships. These traits have to be measured on individual plants in their respective environments. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many respects. We therefore conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.

    Percutaneous revascularization for ischemic left ventricular dysfunction: Cost-effectiveness analysis of the REVIVED-BCIS2 trial

    BACKGROUND: Percutaneous coronary intervention (PCI) is frequently undertaken in patients with ischemic left ventricular systolic dysfunction. The REVIVED (Revascularization for Ischemic Ventricular Dysfunction)-BCIS2 (British Cardiovascular Society-2) trial concluded that PCI did not reduce the incidence of all-cause death or heart failure hospitalization; however, patients assigned to PCI reported better initial health-related quality of life than those assigned to optimal medical therapy (OMT) alone. The aim of this study was to assess the cost-effectiveness of PCI+OMT compared with OMT alone. METHODS: REVIVED-BCIS2 was a prospective, multicenter UK trial, which randomized patients with severe ischemic left ventricular systolic dysfunction to either PCI+OMT or OMT alone. Health care resource use (including planned and unplanned revascularizations, medication, device implantation, and heart failure hospitalizations) and health outcomes data (EuroQol 5-dimension 5-level questionnaire) on each patient were collected at baseline and up to 8 years post-randomization. Resource use was costed using publicly available national unit costs. Within the trial, mean total costs and quality-adjusted life-years (QALYs) were estimated from the perspective of the UK health system. Cost-effectiveness was evaluated using estimated mean costs and QALYs in both groups. Regression analysis was used to adjust for clinically relevant predictors. RESULTS: Between 2013 and 2020, 700 patients were recruited (mean age: PCI+OMT=70 years, OMT=68 years; male (%): PCI+OMT=87, OMT=88); median follow-up was 3.4 years. Over all follow-ups, patients undergoing PCI yielded similar health benefits at higher costs compared with OMT alone (PCI+OMT: 4.14 QALYs, £22 352; OMT alone: 4.16 QALYs, £15 569; difference: −0.015, £6782). For both groups, most health resource consumption occurred in the first 2 years post-randomization. 
Probabilistic results showed that the probability of PCI being cost-effective was 0. CONCLUSIONS: A minimal difference in total QALYs was identified between arms, and PCI+OMT was not cost-effective compared with OMT alone, given its additional cost. A strategy of routine PCI to treat ischemic left ventricular systolic dysfunction does not seem to be a justifiable use of health care resources in the United Kingdom.
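
The headline economic conclusion follows from simple arithmetic on the reported means. A minimal sketch in Python (the £20,000-per-QALY willingness-to-pay threshold is an assumption for illustration, a conventional UK benchmark; the abstract does not state the threshold used, and its quoted differences are regression-adjusted):

```python
# Cost-effectiveness arithmetic for the REVIVED-BCIS2 comparison, using the
# unadjusted mean costs and QALYs reported in the abstract. The abstract's
# quoted differences (-0.015 QALYs, £6,782) are regression-adjusted, so raw
# subtraction gives slightly different numbers.
cost_pci, qaly_pci = 22_352, 4.14   # PCI + optimal medical therapy
cost_omt, qaly_omt = 15_569, 4.16   # optimal medical therapy alone

delta_cost = cost_pci - cost_omt    # PCI costs more per patient
delta_qaly = qaly_pci - qaly_omt    # about -0.02 QALYs: PCI yields fewer QALYs

# PCI is "dominated": it costs more and delivers less health, so its
# incremental net monetary benefit is negative at any positive threshold.
WTP = 20_000                        # assumed £/QALY willingness-to-pay threshold
inmb = delta_qaly * WTP - delta_cost
print(f"Incremental net monetary benefit at £{WTP:,}/QALY: £{inmb:,.0f}")
```

Because the incremental net monetary benefit stays negative at every positive threshold, the probability of PCI being cost-effective is 0, as the probabilistic results report.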

    Arrhythmia and death following percutaneous revascularization in ischemic left ventricular dysfunction: Prespecified analyses from the REVIVED-BCIS2 trial

    BACKGROUND: Ventricular arrhythmia is an important cause of mortality in patients with ischemic left ventricular dysfunction. Revascularization with coronary artery bypass graft or percutaneous coronary intervention is often recommended for these patients before implantation of a cardiac defibrillator because it is assumed that this may reduce the incidence of fatal and potentially fatal ventricular arrhythmias, although this premise has not been evaluated in a randomized trial to date. METHODS: Patients with severe left ventricular dysfunction, extensive coronary disease, and viable myocardium were randomly assigned to receive either percutaneous coronary intervention (PCI) plus optimal medical and device therapy (OMT) or OMT alone. The composite primary outcome was all-cause death or aborted sudden death (defined as an appropriate implantable cardioverter defibrillator therapy or a resuscitated cardiac arrest) at a minimum of 24 months, analyzed as time to first event on an intention-to-treat basis. Secondary outcomes included cardiovascular death or aborted sudden death, appropriate implantable cardioverter defibrillator (ICD) therapy or sustained ventricular arrhythmia, and number of appropriate ICD therapies. RESULTS: Between August 28, 2013, and March 19, 2020, 700 patients were enrolled across 40 centers in the United Kingdom. A total of 347 patients were assigned to the PCI+OMT group and 353 to the OMT alone group. The mean age of participants was 69 years; 88% were male; 56% had hypertension; 41% had diabetes; and 53% had a clinical history of myocardial infarction. The median left ventricular ejection fraction was 28%; 53.1% had an implantable defibrillator inserted before randomization or during follow-up. All-cause death or aborted sudden death occurred in 144 patients (41.6%) in the PCI group and 142 patients (40.2%) in the OMT group (hazard ratio, 1.03 [95% CI, 0.82–1.30]; P =0.80). 
There was no between-group difference in the occurrence of any of the secondary outcomes. CONCLUSIONS: PCI was not associated with a reduction in all-cause mortality or aborted sudden death. In patients with ischemic cardiomyopathy, PCI is not beneficial solely for the purpose of reducing potentially fatal ventricular arrhythmias. REGISTRATION: URL: https://www.clinicaltrials.gov ; Unique identifier: NCT01920048

    Effectiveness of a national quality improvement programme to improve survival after emergency abdominal surgery (EPOCH): a stepped-wedge cluster-randomised trial

    Background: Emergency abdominal surgery is associated with poor patient outcomes. We studied the effectiveness of a national quality improvement (QI) programme to implement a care pathway to improve survival for these patients. Methods: We did a stepped-wedge cluster-randomised trial of patients aged 40 years or older undergoing emergency open major abdominal surgery. Eligible UK National Health Service (NHS) hospitals (those that had an emergency general surgical service, a substantial volume of emergency abdominal surgery cases, and contributed data to the National Emergency Laparotomy Audit) were organised into 15 geographical clusters and commenced the QI programme in a random order, based on a computer-generated random sequence, over an 85-week period, with one geographical cluster commencing the intervention every 5 weeks from the second to the 16th time period. Patients were masked to the study group, but it was not possible to mask hospital staff or investigators. The primary outcome measure was mortality within 90 days of surgery. Analyses were done on an intention-to-treat basis. This study is registered with the ISRCTN registry, number ISRCTN80682973. Findings: Treatment took place between March 3, 2014, and Oct 19, 2015. 22 754 patients were assessed for eligibility. Of 15 873 eligible patients from 93 NHS hospitals, primary outcome data were analysed for 8482 patients in the usual care group and 7374 in the QI group. Eight patients in the usual care group and nine patients in the QI group were not included in the analysis because of missing primary outcome data. The primary outcome of 90-day mortality occurred in 1210 (16%) patients in the QI group compared with 1393 (16%) patients in the usual care group (HR 1·11, 95% CI 0·96–1·28). Interpretation: No survival benefit was observed from this QI programme to implement a care pathway for patients undergoing emergency abdominal surgery. Future QI programmes should ensure that teams have both the time and resources needed to improve patient care. Funding: National Institute for Health Research Health Services and Delivery Research Programme.
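
The stepped-wedge rollout described above can be sketched as a schedule: seventeen 5-week periods spanning the 85-week window, with the 15 geographical clusters crossing from usual care to the QI programme one per period from the 2nd to the 16th. A minimal illustration (the seed and cluster labels are hypothetical; the trial used its own computer-generated random sequence):

```python
import random

PERIOD_WEEKS = 5
N_CLUSTERS = 15
N_PERIODS = 17   # 17 x 5 weeks = 85-week window; period 1 is all usual care

# Put the 15 geographical clusters into a random crossover order.
order = list(range(1, N_CLUSTERS + 1))
random.Random(0).shuffle(order)  # fixed seed purely for reproducibility

# Each cluster begins the QI programme at the start of its assigned
# period (2..16), i.e. at week 5, 10, ..., 75 of the trial.
crossover_week = {
    cluster: (period - 1) * PERIOD_WEEKS
    for period, cluster in enumerate(order, start=2)
}
# Before its crossover week a cluster contributes usual-care patients;
# from that week onward it contributes QI-group patients.
```

This is what makes the design "stepped-wedge": every cluster eventually receives the intervention, and each contributes both control and intervention time, so the analysis can compare within as well as between clusters.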

    Rehabilitation versus surgical reconstruction for non-acute anterior cruciate ligament injury (ACL SNNAP): a pragmatic randomised controlled trial

    Background: Anterior cruciate ligament (ACL) rupture is a common debilitating injury that can cause instability of the knee. We aimed to investigate the best management strategy between reconstructive surgery and non-surgical treatment for patients with a non-acute ACL injury and persistent symptoms of instability. Methods: We did a pragmatic, multicentre, superiority, randomised controlled trial in 29 secondary care National Health Service orthopaedic units in the UK. Patients with symptomatic knee problems (instability) consistent with an ACL injury were eligible. We excluded patients with meniscal pathology with characteristics that indicate immediate surgery. Patients were randomly assigned (1:1) by computer to either surgery (reconstruction) or rehabilitation (physiotherapy, but with subsequent reconstruction permitted if instability persisted after treatment), stratified by site and baseline Knee Injury and Osteoarthritis Outcome Score, 4-domain version (KOOS4). This management design represented normal practice. The primary outcome was KOOS4 at 18 months after randomisation. The principal analyses were intention-to-treat based, with KOOS4 results analysed using linear regression. This trial is registered with ISRCTN, ISRCTN10110685, and ClinicalTrials.gov, NCT02980367. Findings: Between Feb 1, 2017, and April 12, 2020, we recruited 316 patients. 156 (49%) participants were randomly assigned to the surgical reconstruction group and 160 (51%) to the rehabilitation group. Mean KOOS4 at 18 months was 73·0 (SD 18·3) in the surgical group and 64·6 (21·6) in the rehabilitation group. The adjusted mean difference was 7·9 (95% CI 2·5–13·2; p=0·0053) in favour of surgical management. 65 (41%) of 160 patients allocated to rehabilitation underwent subsequent surgery according to protocol within 18 months. 43 (28%) of 156 patients allocated to surgery did not receive their allocated treatment. We found no differences between groups in the proportion of intervention-related complications. Interpretation: Surgical reconstruction as a management strategy for patients with non-acute ACL injury with persistent symptoms of instability was clinically superior and more cost-effective in comparison with rehabilitation management.

    Fludarabine, cytarabine, granulocyte colony-stimulating factor, and idarubicin with gemtuzumab ozogamicin improves event-free survival in younger patients with newly diagnosed AML and overall survival in patients with NPM1 and FLT3 mutations

    Purpose: To determine the optimal induction chemotherapy regimen for younger adults with newly diagnosed AML without known adverse-risk cytogenetics. Patients and Methods: One thousand thirty-three patients were randomly assigned to intensified (fludarabine, cytarabine, granulocyte colony-stimulating factor, and idarubicin [FLAG-Ida]) or standard (daunorubicin and Ara-C [DA]) induction chemotherapy, with one or two doses of gemtuzumab ozogamicin (GO). The primary end point was overall survival (OS). Results: There was no difference in remission rate after two courses between FLAG-Ida + GO and DA + GO (complete remission [CR] + CR with incomplete hematologic recovery, 93% v 91%) or in day 60 mortality (4.3% v 4.6%). There was no difference in OS (66% v 63%; P = .41); however, the risk of relapse was lower with FLAG-Ida + GO (24% v 41%; P < .001) and 3-year event-free survival was higher (57% v 45%; P < .001). In patients with an NPM1 mutation (30%), 3-year OS was significantly higher with FLAG-Ida + GO (82% v 64%; P = .005). NPM1 measurable residual disease (MRD) clearance was also greater, with 88% versus 77% becoming MRD-negative in peripheral blood after cycle 2 (P = .02). Three-year OS was also higher in patients with a FLT3 mutation (64% v 54%; P = .047). Fewer transplants were performed in patients receiving FLAG-Ida + GO (238 v 278; P = .02). There was no difference in outcome according to the number of GO doses, although NPM1 MRD clearance was higher with two doses in the DA arm. Patients with core binding factor AML treated with DA and one dose of GO had a 3-year OS of 96%, with no survival benefit from FLAG-Ida + GO. Conclusion: Overall, FLAG-Ida + GO significantly reduced relapse without improving OS. However, exploratory analyses show that patients with NPM1 and FLT3 mutations had substantial improvements in OS. By contrast, in patients with core binding factor AML, outcomes were excellent with DA + GO, with no FLAG-Ida benefit.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively.
Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
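
The posterior probabilities quoted above can be roughly reproduced from the credible intervals alone. A sketch, assuming (for illustration only) a lognormal posterior for each odds ratio rather than re-running the trial's Bayesian cumulative logistic model:

```python
import math
from statistics import NormalDist

def prob_worsened(or_median: float, ci_low: float, ci_high: float) -> float:
    """P(OR < 1): posterior probability that treatment worsened organ
    support-free days, assuming a lognormal posterior summarized by its
    median and central 95% credible interval."""
    mu = math.log(or_median)                                   # log-odds center
    sigma = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)  # log-odds spread
    return NormalDist().cdf((0 - mu) / sigma)                  # mass below OR = 1

p_ace = prob_worsened(0.77, 0.58, 1.06)  # abstract reports 94.9%
p_arb = prob_worsened(0.76, 0.56, 1.05)  # abstract reports 95.4%
```

Both come out near 0.95, consistent with the reported 94.9% and 95.4%; the small discrepancies reflect the fact that the actual posterior was not exactly lognormal.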