
    Nucleation of nitric acid hydrates in polar stratospheric clouds by meteoric material

    Heterogeneous nucleation of crystalline nitric acid hydrates in polar stratospheric clouds (PSCs) enhances ozone depletion. However, the identity and mode of action of the particles responsible for nucleation remain unknown. It has been suggested that meteoric material may trigger nucleation of nitric acid trihydrate (NAT, or other nitric acid phases), but this has never been quantitatively demonstrated in the laboratory. Meteoric material is present in two forms in the stratosphere: smoke that results from the ablation and re-condensation of vapours, and fragments that result from the break-up of meteoroids entering the atmosphere. Here we show that analogues of both materials have a capacity to nucleate nitric acid hydrates. In combination with estimates from a global model of the amount of meteoric smoke and fragments in the polar stratosphere, we show that meteoric material probably accounts for NAT observations in early-season polar stratospheric clouds in the absence of water ice.

    Heterogeneous Ice Nucleation by Soufriere Hills Volcanic Ash Immersed in Water Droplets

    Fine particles of ash emitted during volcanic eruptions may sporadically influence cloud properties on a regional or global scale, as well as influence the dynamics of volcanic clouds and the subsequent dispersion of volcanic aerosol and gases. It has been shown that volcanic ash can trigger ice nucleation, but ash from relatively few volcanoes has been studied for its ice-nucleating ability. In this study we quantify the efficiency with which ash from the Soufriere Hills volcano on Montserrat nucleates ice when immersed in supercooled water droplets. Using an ash sample from the 11 February 2010 eruption, we report ice-nucleating efficiencies from 246 to 265 K. This wide range of temperatures was achieved using two separate droplet-freezing instruments, one employing nanolitre droplets and the other microlitre droplets. Soufriere Hills volcanic ash was significantly more efficient than all other ash samples that have previously been examined. At present the reasons for these differences are not understood, but they may be related to mineralogy, amorphous content and surface chemistry.
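    A common way to reduce droplet-freezing data of this kind to a sample-independent metric is the ice-nucleating active site density, n_s(T) = -ln(1 - f(T))/A, where f(T) is the fraction of droplets frozen at temperature T and A is the ash surface area per droplet. The sketch below is illustrative only: the frozen fractions and surface area are invented placeholders, and this is a standard analysis rather than necessarily the exact one used in the study.

```python
import numpy as np

def ns(frozen_fraction, area_per_droplet_cm2):
    """Active site density n_s(T) = -ln(1 - f) / A, per cm^2 of ash."""
    f = np.asarray(frozen_fraction, dtype=float)
    return -np.log(1.0 - f) / area_per_droplet_cm2

# Hypothetical frozen fractions at a few temperatures (K)
T = np.array([265.0, 260.0, 255.0, 250.0, 246.0])
f = np.array([0.05, 0.20, 0.50, 0.85, 0.99])
A = 2.0e-5  # assumed ash surface area per droplet, cm^2

for Ti, ni in zip(T, ns(f, A)):
    print(f"T = {Ti:.0f} K  ->  n_s ~ {ni:.2e} cm^-2")
```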

    Early Secreted Antigen ESAT-6 of Mycobacterium tuberculosis Promotes Protective T Helper 17 Cell Responses in a Toll-Like Receptor-2-dependent Manner

    Despite its relatively poor efficacy, Bacillus Calmette-Guérin (BCG) has been used as a tuberculosis (TB) vaccine since its development in 1921. BCG induces robust T helper 1 (Th1) immune responses but, for many individuals, this is not sufficient for host resistance against Mycobacterium tuberculosis (M. tb) infection. Here we provide evidence that early secreted antigenic target protein 6 (ESAT-6), expressed by the virulent M. tb strain H37Rv but not by BCG, promotes vaccine-enhancing Th17 cell responses. These activities of ESAT-6 were dependent on TLR-2/MyD88 signalling and involved IL-6 and TGF-β production by dendritic cells. Thus, animals that were previously infected with H37Rv or recombinant BCG containing the RD1 region (BCG::RD1) exhibited improved protection upon re-challenge with virulent H37Rv compared with mice previously infected with BCG or RD1-deficient H37Rv (H37RvΔRD1). However, TLR-2 knockout (TLR-2-/-) animals neither showed Th17 responses nor exhibited improved protection in response to immunization with H37Rv. Furthermore, H37Rv and BCG::RD1 infection had little effect on the expression of the anti-inflammatory microRNA-146a (miR146a) in dendritic cells (DCs), whereas BCG and H37RvΔRD1 profoundly induced its expression in DCs. Consistent with these findings, ESAT-6 had no effect on miR146a expression in uninfected DCs, but dramatically inhibited its upregulation in BCG-infected or LPS-treated DCs. Collectively, our findings indicate that, in addition to Th1 immunity induced by BCG, RD1/ESAT-6-induced Th17 immune responses are essential for optimal vaccine efficacy.

    Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.

    BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in surgical management and outcomes of appendicitis within low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This is a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted. Follow-up lasted 30 days. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 high-, 1540 middle-, and 507 low-HDI groups). Surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries after adjustment. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case-mix, laparoscopy was still associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score matched groups within low-/middle-HDI countries, laparoscopy was still associated with fewer overall complications (OR 0.23, 95% CI 0.11-0.44) and SSIs (OR 0.21, 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes, and availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112.
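    The odds ratios quoted above come from multivariable and propensity-score-matched models. As a minimal, purely illustrative sketch of where such figures originate, the following computes an unadjusted odds ratio with a Wald 95% confidence interval from a hypothetical 2x2 table; the counts are invented and no adjustment for case-mix is performed.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: SSIs after laparoscopic vs open appendectomy
or_, lo, hi = odds_ratio_ci(a=20, b=1600, c=90, d=1700)
print(f"OR {or_:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```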

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
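    The methods mention bootstrapped simulation alongside logistic regression. The sketch below shows a percentile bootstrap for an odds ratio on simulated patient-level data; the event rates, sample size and resulting interval are all placeholders, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated (not real) data: checklist use (0/1) and 30-day death (0/1)
n = 4000
checklist = rng.integers(0, 2, n)
death = rng.binomial(1, np.where(checklist == 1, 0.05, 0.08))

def odds_ratio(x, y):
    a = np.sum((x == 1) & (y == 1))  # exposed, event
    b = np.sum((x == 1) & (y == 0))  # exposed, no event
    c = np.sum((x == 0) & (y == 1))  # unexposed, event
    d = np.sum((x == 0) & (y == 0))  # unexposed, no event
    return (a * d) / (b * c)

# Percentile bootstrap: resample patients with replacement
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)  # resampled patient indices
    boot.append(odds_ratio(checklist[idx], death[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"OR {odds_ratio(checklist, death):.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```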

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.

    Time is Money: Disentangling Higher Education Cost-Sharing and Commodification Through Deferred Graduate Retirement

    Current higher education policy debates in Europe are increasingly focused on raising the share of private funding. To date, policy proposals have centred on a relatively small number of alternatives: full public funding; tuition fees, either up-front or delayed and income-contingent; or a surtax on graduate incomes. Here, I present an alternative that, to my knowledge, has not been suggested previously, but which sidesteps some important objections to other forms of private contribution. The basic idea explored here is to increase the statutory retirement age for higher education graduates relative to non-graduates. In principle, the resulting decrease in future public pension liabilities can be converted into increased funds for present spending on higher education. In this first discussion of the proposal, I consider important caveats, perform an order-of-magnitude estimate of whether the financial implications of Deferred Graduate Retirement (DGR) are comparable to those of tuition fees, and discuss advantages and disadvantages relative to more established policy options. I conclude that, at least in the continental European context, DGR promises a number of economically and politically desirable properties compared with established alternatives, and deserves more serious investigation.
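    To make the order-of-magnitude comparison mentioned above concrete, here is a back-of-envelope sketch in the same spirit. Every figure in it is a placeholder assumption chosen for illustration; none is taken from the paper.

```python
# All figures are invented placeholders, not values from the paper.
annual_pension = 20_000      # assumed public pension per retiree, EUR/year
deferral_years = 2           # assumed extra working years for graduates
years_to_retirement = 40     # assumed gap between graduation and retirement
discount_rate = 0.03         # assumed real discount rate

# Saving in public pension liabilities per graduate, and its present value
saving = annual_pension * deferral_years
pv_saving = saving / (1 + discount_rate) ** years_to_retirement

# A tuition-fee alternative for comparison: 5 years at EUR 2000/year
tuition_total = 5 * 2_000

print(f"Nominal pension saving per graduate: EUR {saving:,}")
print(f"Present value of that saving:        EUR {pv_saving:,.0f}")
print(f"Comparable total tuition fees:       EUR {tuition_total:,}")
```

    Under these placeholder assumptions the discounted saving lands in the same order of magnitude as the tuition total, which is the kind of comparison the paper formalizes.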