
    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
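The mortality analysis above pairs multivariable logistic regression with bootstrapped simulation. The sketch below illustrates only the bootstrap step, using invented patient records and a crude (unadjusted) odds ratio; the study's actual models adjusted for patient and disease factors, which this does not attempt.

```python
import random

# Hypothetical patient records (checklist_used, died) -- illustrative only,
# not the study's data: ~5% mortality with checklist, ~8% without.
random.seed(1)
patients = ([(1, random.random() < 0.05) for _ in range(3000)]
            + [(0, random.random() < 0.08) for _ in range(2000)])

def crude_or(sample):
    """Unadjusted odds ratio for death, checklist vs no checklist."""
    a = sum(1 for used, died in sample if used and died)
    b = sum(1 for used, died in sample if used and not died)
    c = sum(1 for used, died in sample if not used and died)
    d = sum(1 for used, died in sample if not used and not died)
    return (a * d) / (b * c)

point = crude_or(patients)
# Percentile bootstrap: resample records with replacement, refit each time
boots = sorted(crude_or(random.choices(patients, k=len(patients)))
               for _ in range(500))
lo, hi = boots[12], boots[486]   # ~2.5th and 97.5th percentiles of 500 draws
print(f"OR {point:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

The percentile interval simply reads off the middle 95% of the resampled odds ratios, which avoids any normality assumption on the OR itself.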

    Single-dose administration and the influence of the timing of the booster dose on immunogenicity and efficacy of ChAdOx1 nCoV-19 (AZD1222) vaccine: a pooled analysis of four randomised trials.

    BACKGROUND: The ChAdOx1 nCoV-19 (AZD1222) vaccine has been approved for emergency use by the UK regulatory authority, Medicines and Healthcare products Regulatory Agency, with a regimen of two standard doses given with an interval of 4-12 weeks. The planned roll-out in the UK will involve vaccinating people in high-risk categories with their first dose immediately, and delivering the second dose 12 weeks later. Here, we provide both a further prespecified pooled analysis of trials of ChAdOx1 nCoV-19 and exploratory analyses of the impact on immunogenicity and efficacy of extending the interval between priming and booster doses. In addition, we show the immunogenicity and protection afforded by the first dose, before a booster dose has been offered. METHODS: We present data from three single-blind randomised controlled trials - one phase 1/2 study in the UK (COV001), one phase 2/3 study in the UK (COV002), and a phase 3 study in Brazil (COV003) - and one double-blind phase 1/2 study in South Africa (COV005). As previously described, individuals 18 years and older were randomly assigned 1:1 to receive two standard doses of ChAdOx1 nCoV-19 (5 × 10¹⁰ viral particles) or a control vaccine or saline placebo. In the UK trial, a subset of participants received a lower dose (2·2 × 10¹⁰ viral particles) of the ChAdOx1 nCoV-19 vaccine for the first dose. The primary outcome was virologically confirmed symptomatic COVID-19 disease, defined as a nucleic acid amplification test (NAAT)-positive swab combined with at least one qualifying symptom (fever ≥37·8°C, cough, shortness of breath, or anosmia or ageusia) more than 14 days after the second dose. Secondary efficacy analyses included cases occurring at least 22 days after the first dose. Antibody responses measured by immunoassay and by pseudovirus neutralisation were exploratory outcomes. All cases of COVID-19 with a NAAT-positive swab were adjudicated for inclusion in the analysis by a masked independent endpoint review committee. 
The primary analysis included all participants who were SARS-CoV-2 N protein seronegative at baseline, had had at least 14 days of follow-up after the second dose, and had no evidence of previous SARS-CoV-2 infection from NAAT swabs. Safety was assessed in all participants who received at least one dose. The four trials are registered at ISRCTN89951424 (COV003) and ClinicalTrials.gov, NCT04324606 (COV001), NCT04400838 (COV002), and NCT04444674 (COV005). FINDINGS: Between April 23 and Dec 6, 2020, 24 422 participants were recruited and vaccinated across the four studies, of whom 17 178 were included in the primary analysis (8597 receiving ChAdOx1 nCoV-19 and 8581 receiving control vaccine). The data cutoff for these analyses was Dec 7, 2020. 332 NAAT-positive infections met the primary endpoint of symptomatic infection more than 14 days after the second dose. Overall vaccine efficacy more than 14 days after the second dose was 66·7% (95% CI 57·4-74·0), with 84 (1·0%) cases in the 8597 participants in the ChAdOx1 nCoV-19 group and 248 (2·9%) in the 8581 participants in the control group. There were no hospital admissions for COVID-19 in the ChAdOx1 nCoV-19 group after the initial 21-day exclusion period, and 15 in the control group. 108 (0·9%) of 12 282 participants in the ChAdOx1 nCoV-19 group and 127 (1·1%) of 11 962 participants in the control group had serious adverse events. There were seven deaths considered unrelated to vaccination (two in the ChAdOx1 nCoV-19 group and five in the control group), including one COVID-19-related death in one participant in the control group. Exploratory analyses showed that vaccine efficacy after a single standard dose of vaccine from day 22 to day 90 after vaccination was 76·0% (59·3-85·9). Our modelling analysis indicated that protection did not wane during this initial 3-month period. 
Similarly, antibody levels were maintained during this period with minimal waning by day 90 (geometric mean ratio [GMR] 0·66 [95% CI 0·59-0·74]). In the participants who received two standard doses, after the second dose, efficacy was higher in those with a longer prime-boost interval (vaccine efficacy 81·3% [95% CI 60·3-91·2] at ≥12 weeks) than in those with a short interval (vaccine efficacy 55·1% [33·0-69·9] at <6 weeks). These observations are supported by immunogenicity data that showed binding antibody responses more than two-fold higher after an interval of 12 or more weeks compared with an interval of less than 6 weeks in those who were aged 18-55 years (GMR 2·32 [2·01-2·68]). INTERPRETATION: The results of this primary analysis of two doses of ChAdOx1 nCoV-19 were consistent with those seen in the interim analysis of the trials and confirm that the vaccine is efficacious, with results varying by dose interval in exploratory analyses. A 3-month dose interval might have advantages over a programme with a short dose interval for roll-out of a pandemic vaccine to protect the largest number of individuals in the population as early as possible when supplies are scarce, while also improving protection after receiving a second dose. FUNDING: UK Research and Innovation, National Institutes of Health Research (NIHR), The Coalition for Epidemic Preparedness Innovations, the Bill & Melinda Gates Foundation, the Lemann Foundation, Rede D'Or, the Brava and Telles Foundation, NIHR Oxford Biomedical Research Centre, Thames Valley and South Midland's NIHR Clinical Research Network, and AstraZeneca
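The headline efficacy figure can be sanity-checked from the reported case counts, since vaccine efficacy is one minus the relative risk. The crude calculation below lands near, but not exactly on, the published 66·7%, because the trial analysis used a model accounting for follow-up time rather than raw attack rates.

```python
# Crude vaccine-efficacy check from the pooled case counts (VE = 1 - relative risk).
cases_vax, n_vax = 84, 8597     # ChAdOx1 nCoV-19 group
cases_ctl, n_ctl = 248, 8581    # control group
relative_risk = (cases_vax / n_vax) / (cases_ctl / n_ctl)
ve = 1 - relative_risk
print(f"crude VE = {ve:.1%}")   # -> crude VE = 66.2%
```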

    A multi-country test of brief reappraisal interventions on emotions during the COVID-19 pandemic.

    The COVID-19 pandemic has increased negative emotions and decreased positive emotions globally. Left unchecked, these emotional changes might have a wide array of adverse impacts. To reduce negative emotions and increase positive emotions, we tested the effectiveness of reappraisal, an emotion-regulation strategy that modifies how one thinks about a situation. Participants from 87 countries and regions (n = 21,644) were randomly assigned to one of two brief reappraisal interventions (reconstrual or repurposing) or one of two control conditions (active or passive). Results revealed that both reappraisal interventions (versus both control conditions) consistently reduced negative emotions and increased positive emotions across different measures. Reconstrual and repurposing interventions had similar effects. Importantly, planned exploratory analyses indicated that reappraisal interventions did not reduce intentions to practice preventive health behaviours. The findings demonstrate the viability of creating scalable, low-cost interventions for use around the world

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone
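The odds ratios above come from a multilevel, multivariable model. As a simpler illustration of how an odds ratio and its confidence interval relate to the underlying counts, the sketch below computes a crude OR with a Wald interval from a hypothetical 2 × 2 table; the counts are invented, not taken from the study.

```python
import math

def or_wald_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% CI from a 2x2 table:
    a, b = colostomy yes/no among exposed; c, d = colostomy yes/no among unexposed."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for emergency vs elective presentation (illustrative only)
or_, lo, hi = or_wald_ci(60, 90, 120, 720)
print(f"OR {or_:.2f} (95% c.i. {lo:.2f} to {hi:.2f})")
```

The interval is computed on the log scale and exponentiated, which is why it is asymmetric around the point estimate.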

    Technical considerations for the generation of novel pseudotyped viruses

    A pseudotyped virus (PV) is a virus particle with an envelope protein originating from a different virus. The ability to dictate which envelope proteins are expressed on the particle surface has made pseudotyping an important tool for basic virological studies, such as determining the cellular targets of a viral envelope protein, as well as for identifying potential antiviral compounds and measuring specific antibody responses. In this review, we describe the common methodologies employed to generate PVs, with a focus on approaches to improve the efficiency of PV generation

    Using geospatial modelling to optimize the rollout of antiretroviral-based pre-exposure HIV interventions in Sub-Saharan Africa

    Antiretroviral-based pre-exposure HIV interventions may soon be rolled out in resource-constrained Sub-Saharan African countries, but rollout plans have yet to be designed. Here we use geospatial modeling and optimization techniques to compare two rollout plans for ARV-based microbicides in South Africa: a utilitarian plan that minimizes incidence by using geographic targeting, and an egalitarian plan that maximizes geographic equity in access to interventions. We find significant geographic variation in the efficiency of interventions in reducing HIV transmission, and that efficiency increases disproportionately with increasing incidence. The utilitarian plan would result in considerable geographic inequity in access to interventions, but (by exploiting geographic variation in incidence) could prevent ~40% more infections than the egalitarian plan. Our results show that the geographic resource allocation decisions made at the beginning of a rollout, and the location where the rollout is initiated, will be crucial in determining the success of interventions in reducing HIV epidemics
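A stripped-down version of the utilitarian-versus-egalitarian comparison can be written in a few lines. Everything here is assumed for illustration: the district names, populations and incidences, the dose budget, the 50 per cent efficacy, and the linear model of infections averted; the study's actual geospatial optimization is far richer.

```python
# Hypothetical districts: name -> (population, HIV incidence per 1000 person-years)
districts = {"A": (9_000, 4.1), "B": (12_000, 1.2),
             "C": (6_000, 7.5), "D": (10_000, 2.3)}
DOSES = 15_000       # assumed total person-courses of the intervention
EFFICACY = 0.5       # assumed per-person reduction in acquisition risk

def infections_averted(alloc):
    # Each covered person avoids EFFICACY * (incidence / 1000) infections per year
    return sum(EFFICACY * alloc[d] * inc / 1000
               for d, (_, inc) in districts.items())

# Utilitarian plan: saturate the highest-incidence districts first
utilitarian = {d: 0 for d in districts}
remaining = DOSES
for d, (pop, _) in sorted(districts.items(), key=lambda kv: -kv[1][1]):
    utilitarian[d] = min(pop, remaining)
    remaining -= utilitarian[d]

# Egalitarian plan: the same coverage fraction in every district
frac = DOSES / sum(pop for pop, _ in districts.values())
egalitarian = {d: pop * frac for d, (pop, _) in districts.items()}

print(infections_averted(utilitarian), infections_averted(egalitarian))
```

Even in this toy setting the geographically targeted plan averts substantially more infections than equal coverage, at the cost of leaving some districts entirely uncovered, which mirrors the trade-off the abstract describes.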

    Rhizome, root/sediment interactions, aerenchyma and internal pressure changes in seagrasses

    © Springer International Publishing AG, part of Springer Nature 2018. Life in seawater presents several challenges for seagrasses owing to low O2 and CO2 solubility and slow gas diffusion rates. Seagrasses have evolved numerous adaptations to these environmental conditions, including porous tissue providing low-resistance internal gas channels (aerenchyma) and carbon concentration mechanisms involving the enzyme carbonic anhydrase. Moreover, seagrasses grow in reduced, anoxic sediments, and aerobic metabolism in roots and rhizomes therefore has to be sustained via rapid O2 transport through the aerenchyma. Tissue aeration is driven by internal concentration gradients between leaves and belowground tissues, where the leaves are the source of O2 and the rhizomes and roots function as O2 sinks. Inadequate internal aeration, e.g. due to low O2 availability in the surrounding water at night, can lead to sulphide intrusion into roots and rhizomes, which has been linked to enhanced seagrass mortality. Under favourable conditions, however, seagrasses leak O2 and dissolved organic carbon into the rhizosphere, where they maintain oxic microzones protecting the plant against reduced phytotoxic compounds and generate dynamic chemical microgradients that modulate the rhizosphere microenvironment. Local radial O2 loss from belowground tissues of seagrasses leads to sulphide oxidation in the rhizosphere, which generates protons and results in local acidification. Such low-pH microniches can lead to dissolution of carbonates and protolytic phosphorus solubilisation in carbonate-rich sediments. The seagrass rhizosphere is also characterised by numerous high-pH microniches indicative of local stimulation of proton-consuming microbial processes such as sulphate reduction via root/rhizome exudates and/or release of alkaline substances. High sediment pH shifts the sulphide speciation away from H2S towards non-tissue-penetrating HS− ions, which can alleviate the belowground tissue exposure to phytotoxic H2S. High sulphide production can also lead to iron and phosphorus mobilization through sulphide-induced reduction of insoluble Fe(III) oxyhydroxides to dissolved Fe(II) with concomitant phosphorus release to the porewater. Adequate internal tissue aeration is thus of vital importance for seagrasses as it ensures aerobic metabolism in distal parts of the roots and provides protection against intrusion of phytotoxins from the surrounding sediment
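The shift from H2S to HS− with rising pH follows directly from acid-base equilibrium. A minimal sketch, assuming a first dissociation constant (pKa1) of about 7.0 for H2S; the true value varies with temperature and salinity:

```python
PKA1 = 7.0   # assumed pKa1 of H2S; shifts with temperature and salinity

def fraction_h2s(ph):
    """Fraction of dissolved sulphide present as tissue-penetrating H2S
    (Henderson-Hasselbalch applied to the H2S / HS- pair)."""
    return 1.0 / (1.0 + 10 ** (ph - PKA1))

for ph in (6.0, 7.0, 8.0):
    print(f"pH {ph}: {fraction_h2s(ph):.0%} of sulphide as H2S")
```

One pH unit either side of the pKa moves the H2S fraction from roughly 90% down to roughly 10%, which is why the high-pH microniches described above can meaningfully reduce phytotoxic exposure.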