
    Inhibition of PbGP43 expression may suggest that gp43 is a virulence factor in Paracoccidioides brasiliensis

    Glycoprotein gp43 is an immunodominant diagnostic antigen for paracoccidioidomycosis, caused by Paracoccidioides brasiliensis, and is abundantly secreted in isolates such as Pb339. It is structurally related to beta-1,3-exoglucanases, although enzymatically inactive. Its function in fungal biology is unknown, but it elicits humoral, innate, and protective cellular immune responses, and it binds to extracellular matrix-associated proteins. In this study we applied antisense RNA (aRNA) technology and Agrobacterium tumefaciens-mediated transformation to generate mitotically stable PbGP43 mutants (PbGP43 aRNA) derived from wild-type Pb339, in order to study the role of gp43 in P. brasiliensis biology and during infection. The control strain, PbEV, was transformed with empty vector. Growth curve, cell vitality, and morphology of PbGP43 aRNA mutants were indistinguishable from those of controls. PbGP43 expression was reduced by 80-85% in mutants 1 and 2, as determined by real-time PCR, correlating with a marked decrease in secreted gp43. This was shown by immunoblotting of culture supernatants probed with anti-gp43 mouse monoclonal and rabbit polyclonal antibodies, and also by affinity-ligand assays of extracellular molecules with laminin and fibronectin. In vitro, there was significantly increased TNF-α production and reduced yeast recovery when PbGP43 aRNA1 was exposed to IFN-γ-stimulated macrophages, suggesting reduced binding/uptake and/or increased killing. In vivo, the fungal burden in lungs of BALB/c mice infected with the silenced mutant was negligible and was associated with decreased lung IL-10 and IL-6. Our results therefore correlate low gp43 expression with lower pathogenicity in mice, although this will only be definitively proven when PbGP43 knockouts become available.

    Epidemiology of Invasive Fungal Infections in Latin America

    The burden of invasive fungal infections (IFIs) has increased during the past two decades in Latin America and worldwide, and the number of patients at risk has risen dramatically. Working habits and leisure activities have also become a focus of attention for public health officials, as endemic mycoses have provoked a number of outbreaks. An extensive search of the medical literature from Latin America suggests that the incidence of IFIs from both endemic and opportunistic fungi has increased. The increase in endemic mycoses is probably related to population changes (migration, tourism, and increased population growth), whereas the increase in opportunistic mycoses may be associated with the greater number of people at risk. In both cases, the early and appropriate use of diagnostic procedures has improved diagnosis and outcome.

    The impact of migration on tuberculosis epidemiology and control in high-income countries: a review.

    Tuberculosis (TB) causes significant morbidity and mortality in high-income countries, with foreign-born individuals bearing a disproportionate share of the overall TB case burden. In this review of tuberculosis and migration we discuss the impact of migration on the epidemiology of TB in low-burden countries, describe the various screening strategies to address this issue, review the yield and cost-effectiveness of these programs, and outline the gaps in knowledge as well as possible future solutions. The main driver of the TB burden in the migrant population is likely to be the reactivation of remotely acquired latent tuberculosis infection (LTBI) following migration from low- and intermediate-income, high-TB-burden settings to high-income, low-TB-burden countries. TB control in high-income countries has historically focused on the early identification and treatment of active TB, with accompanying contact tracing. In the face of the TB case-load in migrant populations, however, there is ongoing discussion about how best to identify TB in these groups. Countries have generally focused on two methods: first, identification of active TB (either at or after arrival, or increasingly before arrival in countries of origin); and second, conditionally supported by WHO guidance, identification of LTBI in migrants from high-TB-burden countries. Although health-economic analyses have shown that TB control in high-income settings would benefit from providing targeted LTBI screening and treatment to certain migrants from high-TB-burden countries, implementation issues and barriers such as sub-optimal treatment completion will need to be addressed to ensure program efficacy.

    Mapping geographical inequalities in oral rehydration therapy coverage in low-income and middle-income countries, 2000-17

    Background Oral rehydration solution (ORS) is a form of oral rehydration therapy (ORT) for diarrhoea that has the potential to drastically reduce child mortality; yet, according to UNICEF estimates, less than half of children younger than 5 years with diarrhoea in low-income and middle-income countries (LMICs) received ORS in 2016. A variety of recommended home fluids (RHF) exist as alternative forms of ORT; however, it is unclear whether RHF prevent child mortality. Previous studies have shown considerable variation between countries in ORS and RHF use, but subnational variation is unknown. This study aims to produce high-resolution geospatial estimates of relative and absolute coverage of ORS, RHF, and ORT (use of either ORS or RHF) in LMICs. Methods We used a Bayesian geostatistical model including 15 spatial covariates and data from 385 household surveys across 94 LMICs to estimate annual proportions of children younger than 5 years of age with diarrhoea who received ORS or RHF (or both) on continuous continent-wide surfaces in 2000-17, and aggregated results to policy-relevant administrative units. Additionally, we analysed geographical inequality in coverage across administrative units and estimated the number of diarrhoeal deaths averted by increased coverage over the study period. Uncertainty in the mean coverage estimates was calculated by taking 250 draws from the posterior joint distribution of the model and creating uncertainty intervals (UIs) with the 2·5th and 97·5th percentiles of those 250 draws. Findings While ORS use among children with diarrhoea increased in some countries from 2000 to 2017, coverage remained below 50% in the majority (62·6%; 12 417 of 19 823) of second administrative-level units, and an estimated 6 519 000 children (95% UI 5 254 000-7 733 000) with diarrhoea were not treated with any form of ORT in 2017. Increases in ORS use corresponded with declines in RHF in many locations, resulting in relatively constant overall ORT coverage from 2000 to 2017. Although ORS was uniformly distributed subnationally in some countries, within-country geographical inequalities persisted in others; 11 countries had at least a 50% difference in one of their units compared with the country mean. Increases in ORS use over time were correlated with declines in RHF use and in diarrhoeal mortality in many locations, and an estimated 52 230 diarrhoeal deaths (36 910-68 860) were averted by scaling up of ORS coverage between 2000 and 2017. Finally, we identified key subnational areas in Colombia, Nigeria, and Sudan as examples of where diarrhoeal mortality remains higher than average, while ORS coverage remains lower than average. Interpretation To our knowledge, this study is the first to produce and map subnational estimates of ORS, RHF, and ORT coverage and attributable child diarrhoeal deaths across LMICs from 2000 to 2017, allowing for tracking of progress over time. Our novel results, combined with detailed subnational estimates of diarrhoeal morbidity and mortality, can support subnational needs assessments aimed at furthering policy makers' understanding of within-country disparities. Over 50 years after the discovery that led to this simple, cheap, and life-saving therapy, large gains in reducing mortality could still be made by reducing geographical inequalities in ORS coverage. Copyright (c) 2020 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license. Peer reviewed.
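
    The percentile construction of the uncertainty intervals described above is mechanical and easy to reproduce. The sketch below illustrates it with simulated stand-in draws; the study's actual draws come from its fitted Bayesian geostatistical model, which is not reproduced here:

```python
import numpy as np

# Illustration of the UI step described above: the mean of 250 posterior
# draws is the point estimate, and the 2.5th and 97.5th percentiles of the
# same draws bound the 95% UI. The beta draws are simulated placeholders,
# not estimates from the study.
rng = np.random.default_rng(0)
draws = rng.beta(20, 25, size=250)  # stand-in for 250 posterior coverage draws

point_estimate = draws.mean()
lower, upper = np.percentile(draws, [2.5, 97.5])
print(f"ORS coverage {point_estimate:.1%} (95% UI {lower:.1%}-{upper:.1%})")
```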

    Mapping geographical inequalities in access to drinking water and sanitation facilities in low-income and middle-income countries, 2000-17

    Background Universal access to safe drinking water and sanitation facilities is an essential human right, recognised in the Sustainable Development Goals as crucial for preventing disease and improving human wellbeing. Comprehensive, high-resolution estimates are important to inform progress towards achieving this goal. We aimed to produce high-resolution geospatial estimates of access to drinking water and sanitation facilities. Methods We used a Bayesian geostatistical model and data from 600 sources across more than 88 low-income and middle-income countries (LMICs) to estimate access to drinking water and sanitation facilities on continuous continent-wide surfaces from 2000 to 2017, and aggregated results to policy-relevant administrative units. We estimated mutually exclusive and collectively exhaustive subcategories of facilities for drinking water (piped water on or off premises, other improved facilities, unimproved, and surface water) and sanitation facilities (septic or sewer sanitation, other improved, unimproved, and open defecation) with use of ordinal regression. We also estimated the number of diarrhoeal deaths in children younger than 5 years attributed to unsafe facilities and estimated deaths that were averted by increased access to safe facilities in 2017, and analysed geographical inequality in access within LMICs. Findings Across LMICs, access to both piped water and improved water overall increased between 2000 and 2017, with progress varying spatially. For piped water, the safest water facility type, access increased from 40.0% (95% uncertainty interval [UI] 39.4-40.7) to 50.3% (50.0-50.5), but was lowest in sub-Saharan Africa, where access to piped water was mostly concentrated in urban centres. Access to both sewer or septic sanitation and improved sanitation overall also increased across all LMICs during the study period. For sewer or septic sanitation, access was 46.3% (95% UI 46.1-46.5) in 2017, compared with 28.7% (28.5-29.0) in 2000. Although some units improved access to the safest drinking water or sanitation facilities since 2000, a large absolute number of people still lacked access in several units with high (>80%) access to such facilities in 2017. More than 253 000 people did not have access to sewer or septic sanitation facilities in the city of Harare, Zimbabwe, despite 88.6% (95% UI 87.2-89.7) access overall. Many units were able to transition from the least safe facilities in 2000 to safe facilities by 2017; for units in which populations primarily practised open defecation in 2000, 686 (95% UI 664-711) of the 1830 (1797-1863) units transitioned to the use of improved sanitation. Geographical disparities in access to improved water across units decreased in 76.1% (95% UI 71.6-80.7) of countries from 2000 to 2017, and in 53.9% (50.6-59.6) of countries for access to improved sanitation, but remained evident subnationally in most countries in 2017. Interpretation Our estimates, combined with geospatial trends in diarrhoeal burden, identify where efforts to increase access to safe drinking water and sanitation facilities are most needed. By highlighting areas with successful approaches or in need of targeted interventions, our estimates can enable precision public health to effectively progress towards universal access to safe water and sanitation. Copyright (C) 2020 The Author(s). Published by Elsevier Ltd. Peer reviewed.
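
    The ordinal-regression step is what makes the facility subcategories mutually exclusive and collectively exhaustive by construction. A minimal proportional-odds sketch is shown below; the cutpoints and linear predictor are invented for illustration and are not the study's estimates:

```python
import numpy as np

# Cumulative-logit (proportional-odds) sketch: P(Y <= k) = logistic(c_k - eta)
# with increasing cutpoints c_k. Differencing the cumulative probabilities
# yields category probabilities that are mutually exclusive and sum to 1,
# mirroring the ordered water-facility subcategories estimated above.
CATEGORIES = ["surface water", "unimproved", "other improved", "piped"]

def category_probabilities(eta, cutpoints):
    logistic = lambda x: 1.0 / (1.0 + np.exp(-x))
    cumulative = np.append(logistic(np.asarray(cutpoints) - eta), 1.0)
    probs = np.diff(np.concatenate(([0.0], cumulative)))
    return {c: float(p) for c, p in zip(CATEGORIES, probs)}

# Illustrative values only; a larger eta shifts mass toward safer categories.
print(category_probabilities(eta=0.5, cutpoints=[-1.0, 0.0, 1.5]))
```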

    The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study

    AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after the treatment decision in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only, and a sensitivity analysis explored the impact of longer delays. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male, and more comorbid, to have a higher body mass index, and to have rectal cancer and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P = 0.224), a result that was consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P = 0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely to be due to micro-metastatic disease.
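
    The unadjusted contrast quoted above can be cross-checked directly from the reported rates. The sketch below recovers the crude odds ratio for complete resection; note that the paper's OR of 1.18 is adjusted for the case-mix differences listed in the abstract, which this calculation ignores:

```python
# Crude odds ratio for complete resection, delayed vs non-delayed surgery,
# from the unadjusted rates quoted above (93.7% vs 91.9%).
def odds(p: float) -> float:
    return p / (1.0 - p)

crude_or = odds(0.937) / odds(0.919)
print(f"crude OR: {crude_or:.2f}")  # ~1.31 unadjusted; 1.18 after adjustment
```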

    Prevalence and attributable health burden of chronic respiratory diseases, 1990–2017: a systematic analysis for the Global Burden of Disease Study 2017

    Background Previous attempts to characterise the burden of chronic respiratory diseases have focused only on specific disease conditions, such as chronic obstructive pulmonary disease (COPD) or asthma. In this study, we aimed to characterise the burden of chronic respiratory diseases globally, providing a comprehensive and up-to-date analysis of geographical and time trends from 1990 to 2017. Methods Using data from the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2017, we estimated the prevalence, morbidity, and mortality attributable to chronic respiratory diseases through an analysis of deaths, disability-adjusted life-years (DALYs), and years of life lost (YLLs) by GBD super-region, from 1990 to 2017, stratified by age and sex. Specific diseases analysed included asthma, COPD, interstitial lung disease and pulmonary sarcoidosis, pneumoconiosis, and other chronic respiratory diseases. We also assessed the contribution of risk factors (smoking, second-hand smoke, ambient particulate matter and ozone pollution, household air pollution from solid fuels, and occupational risks) to chronic respiratory disease-attributable DALYs. Findings In 2017, 544·9 million people (95% uncertainty interval [UI] 506·9–584·8) worldwide had a chronic respiratory disease, representing an increase of 39·8% compared with 1990. Chronic respiratory disease prevalence showed wide variability across GBD super-regions, with the highest prevalence among both males and females in high-income regions, and the lowest prevalence in sub-Saharan Africa and south Asia. The age-sex-specific prevalence of each chronic respiratory disease in 2017 was also highly variable geographically. Chronic respiratory diseases were the third leading cause of death in 2017 (7·0% [95% UI 6·8–7·2] of all deaths), behind cardiovascular diseases and neoplasms. Deaths due to chronic respiratory diseases numbered 3 914 196 (95% UI 3 790 578–4 044 819) in 2017, an increase of 18·0% since 1990, while total DALYs increased by 13·3%. However, when accounting for ageing and population growth, declines were observed in age-standardised prevalence (14·3% decrease), age-standardised death rates (42·6%), and age-standardised DALY rates (38·2%). In both males and females, most chronic respiratory disease-attributable deaths and DALYs were due to COPD. In regional analyses, mortality rates from chronic respiratory diseases were greatest in south Asia and lowest in sub-Saharan Africa, across both sexes. Notably, although absolute prevalence was lower in south Asia than in most other super-regions, YLLs due to chronic respiratory diseases across the subcontinent were the highest in the world. Death rates due to interstitial lung disease and pulmonary sarcoidosis were greater than those due to pneumoconiosis in all super-regions. Smoking was the leading risk factor for chronic respiratory disease-related disability across all regions for men. Among women, household air pollution from solid fuels was the predominant risk factor for chronic respiratory diseases in south Asia and sub-Saharan Africa, while ambient particulate matter represented the leading risk factor in southeast Asia, east Asia, and Oceania, and in the Middle East and north Africa super-region. Interpretation Our study shows that chronic respiratory diseases remain a leading cause of death and disability worldwide, with growth in absolute numbers but sharp declines in several age-standardised measures since 1990. Premature mortality from chronic respiratory diseases seems to be highest in regions with less-resourced health systems on a per-capita basis. Funding Bill & Melinda Gates Foundation.
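
    The contrast between rising absolute deaths and falling age-standardised rates follows from direct age-standardisation, in which age-specific rates are weighted by a fixed standard population so that ageing and population growth do not distort comparisons over time. The sketch below uses invented age bands, rates, and weights (GBD applies its own world standard population, not reproduced here):

```python
import numpy as np

# Direct age-standardisation: a weighted sum of age-specific death rates,
# with weights from a fixed standard population. All values are invented.
age_specific_rates = np.array([5.0, 40.0, 300.0, 1200.0])  # deaths per 100 000, by age band
standard_weights   = np.array([0.35, 0.30, 0.25, 0.10])    # standard population shares (sum to 1)

age_standardised_rate = float(np.sum(age_specific_rates * standard_weights))
print(f"age-standardised death rate: {age_standardised_rate:.1f} per 100 000")
# An ageing population can raise crude death counts even while every
# age-specific rate, and hence this standardised rate, is falling.
```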

    Single-dose administration and the influence of the timing of the booster dose on immunogenicity and efficacy of ChAdOx1 nCoV-19 (AZD1222) vaccine: a pooled analysis of four randomised trials.

    BACKGROUND: The ChAdOx1 nCoV-19 (AZD1222) vaccine has been approved for emergency use by the UK regulatory authority, the Medicines and Healthcare products Regulatory Agency, with a regimen of two standard doses given with an interval of 4-12 weeks. The planned roll-out in the UK will involve vaccinating people in high-risk categories with their first dose immediately, and delivering the second dose 12 weeks later. Here, we provide both a further prespecified pooled analysis of trials of ChAdOx1 nCoV-19 and exploratory analyses of the impact on immunogenicity and efficacy of extending the interval between priming and booster doses. In addition, we show the immunogenicity and protection afforded by the first dose, before a booster dose has been offered. METHODS: We present data from three single-blind randomised controlled trials (one phase 1/2 study in the UK [COV001], one phase 2/3 study in the UK [COV002], and a phase 3 study in Brazil [COV003]) and one double-blind phase 1/2 study in South Africa (COV005). As previously described, individuals 18 years and older were randomly assigned 1:1 to receive two standard doses of ChAdOx1 nCoV-19 (5 × 10¹⁰ viral particles) or a control vaccine or saline placebo. In the UK trial, a subset of participants received a lower dose (2·2 × 10¹⁰ viral particles) of ChAdOx1 nCoV-19 as the first dose. The primary outcome was virologically confirmed symptomatic COVID-19 disease, defined as a nucleic acid amplification test (NAAT)-positive swab combined with at least one qualifying symptom (fever ≥37·8°C, cough, shortness of breath, or anosmia or ageusia) more than 14 days after the second dose. Secondary efficacy analyses included cases occurring at least 22 days after the first dose. Antibody responses measured by immunoassay and by pseudovirus neutralisation were exploratory outcomes. All cases of COVID-19 with a NAAT-positive swab were adjudicated for inclusion in the analysis by a masked independent endpoint review committee. The primary analysis included all participants who were SARS-CoV-2 N protein seronegative at baseline, had had at least 14 days of follow-up after the second dose, and had no evidence of previous SARS-CoV-2 infection from NAAT swabs. Safety was assessed in all participants who received at least one dose. The four trials are registered at ISRCTN89951424 (COV003) and ClinicalTrials.gov, NCT04324606 (COV001), NCT04400838 (COV002), and NCT04444674 (COV005). FINDINGS: Between April 23 and Dec 6, 2020, 24 422 participants were recruited and vaccinated across the four studies, of whom 17 178 were included in the primary analysis (8597 receiving ChAdOx1 nCoV-19 and 8581 receiving control vaccine). The data cutoff for these analyses was Dec 7, 2020. 332 NAAT-positive infections met the primary endpoint of symptomatic infection more than 14 days after the second dose. Overall vaccine efficacy more than 14 days after the second dose was 66·7% (95% CI 57·4-74·0), with 84 (1·0%) cases in the 8597 participants in the ChAdOx1 nCoV-19 group and 248 (2·9%) in the 8581 participants in the control group. There were no hospital admissions for COVID-19 in the ChAdOx1 nCoV-19 group after the initial 21-day exclusion period, and 15 in the control group. 108 (0·9%) of 12 282 participants in the ChAdOx1 nCoV-19 group and 127 (1·1%) of 11 962 participants in the control group had serious adverse events. There were seven deaths considered unrelated to vaccination (two in the ChAdOx1 nCoV-19 group and five in the control group), including one COVID-19-related death in the control group. Exploratory analyses showed that vaccine efficacy after a single standard dose, from day 22 to day 90 after vaccination, was 76·0% (59·3-85·9). Our modelling analysis indicated that protection did not wane during this initial 3-month period. Similarly, antibody levels were maintained during this period with minimal waning by day 90 (geometric mean ratio [GMR] 0·66 [95% CI 0·59-0·74]). Among participants who received two standard doses, efficacy after the second dose was higher in those with a longer prime-boost interval (vaccine efficacy 81·3% [95% CI 60·3-91·2] at ≥12 weeks) than in those with a short interval (vaccine efficacy 55·1% [33·0-69·9] at <6 weeks). These observations are supported by immunogenicity data showing binding antibody responses more than two-fold higher after an interval of 12 or more weeks than after an interval of less than 6 weeks in those aged 18-55 years (GMR 2·32 [2·01-2·68]). INTERPRETATION: The results of this primary analysis of two doses of ChAdOx1 nCoV-19 were consistent with those seen in the interim analysis of the trials and confirm that the vaccine is efficacious, with results varying by dose interval in exploratory analyses. A 3-month dose interval might have advantages over a programme with a short dose interval for roll-out of a pandemic vaccine to protect the largest number of individuals in the population as early as possible when supplies are scarce, while also improving protection after receiving a second dose. FUNDING: UK Research and Innovation, National Institute for Health Research (NIHR), the Coalition for Epidemic Preparedness Innovations, the Bill & Melinda Gates Foundation, the Lemann Foundation, Rede D'Or, the Brava and Telles Foundation, NIHR Oxford Biomedical Research Centre, Thames Valley and South Midlands NIHR Clinical Research Network, and AstraZeneca.
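
    The headline efficacy figure can be approximately recovered from the case counts quoted above. The sketch below computes crude vaccine efficacy from attack rates; the published 66·7% comes from the trial's adjusted model, so the crude value differs slightly:

```python
# Crude vaccine efficacy = 1 - (attack rate, vaccine) / (attack rate, control),
# using the primary-endpoint case counts reported above.
cases_vaccine, n_vaccine = 84, 8597
cases_control, n_control = 248, 8581

crude_ve = 1.0 - (cases_vaccine / n_vaccine) / (cases_control / n_control)
print(f"crude vaccine efficacy: {crude_ve:.1%}")  # ~66.2%, vs 66.7% published
```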

    Worldwide trends in hypertension prevalence and progress in treatment and control from 1990 to 2019: a pooled analysis of 1201 population-representative studies with 104 million participants

    Background Hypertension can be detected at the primary health-care level and low-cost treatments can effectively control hypertension. We aimed to measure the prevalence of hypertension and progress in its detection, treatment, and control from 1990 to 2019 for 200 countries and territories. Methods We used data from 1990 to 2019 on people aged 30–79 years from population-representative studies with measurement of blood pressure and data on blood pressure treatment. We defined hypertension as having systolic blood pressure 140 mm Hg or greater, diastolic blood pressure 90 mm Hg or greater, or taking medication for hypertension. We applied a Bayesian hierarchical model to estimate the prevalence of hypertension and the proportion of people with hypertension who had a previous diagnosis (detection), who were taking medication for hypertension (treatment), and whose hypertension was controlled to below 140/90 mm Hg (control). The model allowed for trends over time to be non-linear and to vary by age. Findings The number of people aged 30–79 years with hypertension doubled from 1990 to 2019, from 331 (95% credible interval 306–359) million women and 317 (292–344) million men in 1990 to 626 (584–668) million women and 652 (604–698) million men in 2019, despite stable global age-standardised prevalence. In 2019, age-standardised hypertension prevalence was lowest in Canada and Peru for both men and women; in Taiwan, South Korea, Japan, and some countries in western Europe including Switzerland, Spain, and the UK for women; and in several low-income and middle-income countries such as Eritrea, Bangladesh, Ethiopia, and Solomon Islands for men. Hypertension prevalence surpassed 50% for women in two countries and men in nine countries, in central and eastern Europe, central Asia, Oceania, and Latin America. Globally, 59% (55–62) of women and 49% (46–52) of men with hypertension reported a previous diagnosis of hypertension in 2019, and 47% (43–51) of women and 38% (35–41) of men were treated. Control rates among people with hypertension in 2019 were 23% (20–27) for women and 18% (16–21) for men. In 2019, treatment and control rates were highest in South Korea, Canada, and Iceland (treatment >70%; control >50%), followed by the USA, Costa Rica, Germany, Portugal, and Taiwan. Treatment rates were less than 25% for women and less than 20% for men in Nepal, Indonesia, and some countries in sub-Saharan Africa and Oceania. Control rates were below 10% for women and men in these countries and for men in some countries in north Africa, central and south Asia, and eastern Europe. Treatment and control rates have improved in most countries since 1990, but we found little change in most countries in sub-Saharan Africa and Oceania. Improvements were largest in high-income countries, central Europe, and some upper-middle-income and recently high-income countries including Costa Rica, Taiwan, Kazakhstan, South Africa, Brazil, Chile, Turkey, and Iran. Interpretation Improvements in the detection, treatment, and control of hypertension have varied substantially across countries, with some middle-income countries now outperforming most high-income nations. The dual approach of reducing hypertension prevalence through primary prevention and enhancing its treatment and control is achievable not only in high-income countries but also in low-income and middle-income settings. Funding WHO
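
    The case definitions in the methods translate directly into classification rules. The sketch below encodes the hypertension and control definitions quoted above; function and argument names are hypothetical:

```python
# Hypertension per the definition above: systolic >= 140 mm Hg, diastolic
# >= 90 mm Hg, or taking medication for hypertension. "Control" means a
# person with hypertension whose measured pressure is below 140/90 mm Hg.
def has_hypertension(systolic: float, diastolic: float, on_medication: bool) -> bool:
    return systolic >= 140 or diastolic >= 90 or on_medication

def is_controlled(systolic: float, diastolic: float, on_medication: bool) -> bool:
    return (has_hypertension(systolic, diastolic, on_medication)
            and systolic < 140 and diastolic < 90)

print(has_hypertension(138, 92, False))  # True via the diastolic criterion
print(is_controlled(128, 82, True))      # True: treated and below 140/90
```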