78 research outputs found

    Recent updates and perspectives on approaches for the development of vaccines against visceral leishmaniasis

    Visceral leishmaniasis (VL) is one of the most important tropical diseases worldwide. Although chemotherapy has been widely used to treat this disease, the development of parasite resistance and the side effects associated with the compounds used are well documented. Hence, alternative approaches for VL control are desirable. Some methods, such as vector control and the culling of infected dogs, are insufficiently effective, and the latter is not ethically recommended. The development of vaccines to prevent VL is a feasible and desirable measure for disease control; for example, some vaccines designed to protect dogs against VL have recently been brought to market. These vaccines are based on combinations of parasite fractions or recombinant proteins with adjuvants that are able to induce cellular immune responses. However, their partial efficacy and the absence of a vaccine to protect against human leishmaniasis underline the need to characterize new vaccine candidates. This review presents recent advances in control measures for VL based on vaccine development, describing extensively studied antigens as well as new antigenic proteins recently identified using immuno-proteomic techniques.

    This work was supported by grants from Instituto Nacional de Ciência e Tecnologia em Nano-Biofarmacêutica, Rede Nanobiotec/Brasil-Universidade Federal de Uberlândia/CAPES, PRONEX-FAPEMIG (APQ-01019-09), FAPEMIG (CBB-APQ-00819-12 and CBB-APQ-01778-2014), and CNPq (APQ-482976/2012-8, APQ-488237/2013-0, and APQ-467640/2014-9). EAFC and LRG are recipients of grants from CNPq. MACF is the recipient of grants from FAPEMIG/CAPE

    Chitosan–Starch–Keratin composites: Improving thermo-mechanical and degradation properties through chemical modification

    Chitosan–starch polymers are reinforced with different keratin materials obtained from chicken feathers. The keratin materials are treated with sodium hydroxide; the modified surfaces are rougher than the untreated surfaces, as observed by Scanning Electron Microscopy. Results obtained by Differential Scanning Calorimetry show an increase in the endothermic peak related to water evaporation of the films, from 92 °C (matrix) up to 102–114 °C (reinforced composites). The glass transition temperature increases from 126 °C in the polymer matrix up to 170–200 °C for the composites. Additionally, the storage modulus of the composites is enhanced by up to 1614 % for composites with modified ground quill, 2522 % for composites with modified long fiber, and 3206 % for composites with modified short fiber. The lysozyme test shows an improvement in degradation behaviour: the weight loss of the films at 21 days is reduced from 73 % for the chitosan–starch matrix to 16 % for the composites with 5 wt% quill, but all films show a biodegradable character depending on keratin type and chemical modification. The outstanding properties conferred by the treated keratin materials show that these natural composites are a remarkable alternative for potentiating chitosan–starch films with sustainable features.

    Evidence for Reductive Genome Evolution and Lateral Acquisition of Virulence Functions in Two Corynebacterium pseudotuberculosis Strains

    Ruiz JC, D'Afonseca V, Silva A, et al. Evidence for Reductive Genome Evolution and Lateral Acquisition of Virulence Functions in Two Corynebacterium pseudotuberculosis Strains. PLoS ONE. 2011;6(4):e18551.

    Background: Corynebacterium pseudotuberculosis, a Gram-positive, facultative intracellular pathogen, is the etiologic agent of the disease known as caseous lymphadenitis (CL). CL mainly affects small ruminants, such as goats and sheep; it also causes infections in humans, though rarely. This species is distributed worldwide, but it has the most serious economic impact in Oceania, Africa and South America. Although C. pseudotuberculosis causes major health and productivity problems for livestock, little is known about the molecular basis of its pathogenicity.

    Methodology and Findings: We characterized two C. pseudotuberculosis genomes (Cp1002, isolated from goats; and CpC231, isolated from sheep). Analysis of the predicted genomes showed high similarity in genomic architecture, gene content and genetic order. When C. pseudotuberculosis was compared with other Corynebacterium species, it became evident that this pathogenic species has lost numerous genes, resulting in one of the smallest genomes in the genus. Other differences that could be part of the adaptation to pathogenicity include a lower GC content, of about 52%, and a reduced gene repertoire. The C. pseudotuberculosis genome also includes seven putative pathogenicity islands, which contain several classical virulence factors, including genes for fimbrial subunits, adhesion factors, iron uptake and secreted toxins. Additionally, all of the virulence factors in the islands have characteristics that indicate horizontal transfer.

    Conclusions: These particular genome characteristics of C. pseudotuberculosis, as well as its acquired virulence factors in pathogenicity islands, provide evidence of its lifestyle and of the pathogenicity pathways used by this pathogen in the infection process.

    All genomes cited in this study are available in the NCBI GenBank database (http://www.ncbi.nlm.nih.gov/genbank/) under accession numbers CP001809 and CP001829.
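The ~52% GC content reported above is a simple per-base statistic. As a minimal illustrative sketch (not the authors' analysis pipeline), GC content can be computed directly from a sequence string; the short sequence below is invented for the example, while the real genomes are the GenBank records cited above:

```python
def gc_content(seq: str) -> float:
    """Return the fraction of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

# Toy sequence for illustration only; the actual C. pseudotuberculosis
# chromosomes (GenBank CP001809, CP001829) are ~2.3 Mb with ~52% GC.
print(round(gc_content("ATGCGCGCATAT"), 2))  # → 0.5
```

For whole genomes one would normally stream the FASTA record rather than hold the sequence in one string, but the arithmetic is the same.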

    Global, regional, and national burden of neurological disorders during 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015

    Background: Comparable data on the global and country-specific burden of neurological disorders and their trends are crucial for health-care planning and resource allocation. The Global Burden of Diseases, Injuries, and Risk Factors (GBD) Study provides such information but does not routinely aggregate results that are of interest to clinicians specialising in neurological conditions. In this systematic analysis, we quantified the global disease burden due to neurological disorders in 2015 and its relationship with country development level.

    Methods: We estimated global and country-specific prevalence, mortality, disability-adjusted life-years (DALYs), years of life lost (YLLs), and years lived with disability (YLDs) for various neurological disorders that in the GBD classification have previously been spread across multiple disease groupings. The more inclusive grouping of neurological disorders included stroke, meningitis, encephalitis, tetanus, Alzheimer's disease and other dementias, Parkinson's disease, epilepsy, multiple sclerosis, motor neuron disease, migraine, tension-type headache, medication overuse headache, brain and nervous system cancers, and a residual category of other neurological disorders. We also analysed results based on the Socio-demographic Index (SDI), a compound measure of income per capita, education, and fertility, to identify patterns associated with development and how countries fare against expected outcomes relative to their level of development.

    Findings: Neurological disorders ranked as the leading cause group of DALYs in 2015 (250·7 [95% uncertainty interval (UI) 229·1 to 274·7] million, comprising 10·2% of global DALYs) and the second-leading cause group of deaths (9·4 [9·1 to 9·7] million, comprising 16·8% of global deaths). The most prevalent neurological disorders were tension-type headache (1505·9 million cases [UI 1337·3 to 1681·6]), migraine (958·8 [872·1 to 1055·6] million), medication overuse headache (58·5 [50·8 to 67·4] million), and Alzheimer's disease and other dementias (46·0 [40·2 to 52·7] million). Between 1990 and 2015, the number of deaths from neurological disorders increased by 36·7%, and the number of DALYs by 7·4%. These increases occurred despite decreases in age-standardised rates of death and DALYs of 26·1% and 29·7%, respectively; stroke and communicable neurological disorders were responsible for most of these decreases. Communicable neurological disorders were the largest cause of DALYs in countries with low SDI. Stroke rates were highest at middle levels of SDI and lowest at the highest SDI. Most of the changes in DALY rates of neurological disorders with development were driven by changes in YLLs.

    Interpretation: Neurological disorders are an important cause of disability and death worldwide. Globally, the burden of neurological disorders has increased substantially over the past 25 years because of expanding population numbers and ageing, despite substantial decreases in mortality rates from stroke and communicable neurological disorders. The number of patients who will need care by clinicians with expertise in neurological conditions will continue to grow in coming decades. Policy makers and health-care providers should be aware of these trends to provide adequate services.
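The DALY metric used throughout GBD analyses is the sum of years of life lost (YLLs) and years lived with disability (YLDs). As a back-of-the-envelope consistency check (not GBD methodology), the reported figures above imply a global total: 250·7 million neurological DALYs at 10·2% of global DALYs gives roughly 2458 million DALYs overall.

```python
# DALY = YLL + YLD; here we only invert the reported share to recover
# the implied global total. Figures are taken from the abstract above.
neuro_dalys_millions = 250.7   # neurological DALYs, 2015
share_of_global = 0.102        # 10.2% of all global DALYs
global_dalys_millions = neuro_dalys_millions / share_of_global
print(round(global_dalys_millions))  # → 2458
```

The uncertainty intervals quoted in the abstract would propagate through this division as well, so the implied total is itself only approximate.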

    Global, regional, and national burden of stroke and its risk factors, 1990-2019: a systematic analysis for the Global Burden of Disease Study 2019

    Background: Regularly updated data on stroke and its pathological types, including data on their incidence, prevalence, mortality, disability, risk factors, and epidemiological trends, are important for evidence-based stroke care planning and resource allocation. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) aims to provide a standardised and comprehensive measurement of these metrics at global, regional, and national levels.

    Methods: We applied GBD 2019 analytical tools to calculate stroke incidence, prevalence, mortality, disability-adjusted life-years (DALYs), and the population attributable fraction (PAF) of DALYs (with corresponding 95% uncertainty intervals [UIs]) associated with 19 risk factors, for 204 countries and territories from 1990 to 2019. These estimates were provided for ischaemic stroke, intracerebral haemorrhage, subarachnoid haemorrhage, and all strokes combined, and stratified by sex, age group, and World Bank country income level.

    Findings: In 2019, there were 12·2 million (95% UI 11·0–13·6) incident cases of stroke, 101 million (93·2–111) prevalent cases of stroke, 143 million (133–153) DALYs due to stroke, and 6·55 million (6·00–7·02) deaths from stroke. Globally, stroke remained the second-leading cause of death (11·6% [10·8–12·2] of total deaths) and the third-leading cause of death and disability combined (5·7% [5·1–6·2] of total DALYs) in 2019. From 1990 to 2019, the absolute number of incident strokes increased by 70·0% (67·0–73·0), prevalent strokes increased by 85·0% (83·0–88·0), deaths from stroke increased by 43·0% (31·0–55·0), and DALYs due to stroke increased by 32·0% (22·0–42·0). During the same period, age-standardised rates of stroke incidence decreased by 17·0% (15·0–18·0), mortality decreased by 36·0% (31·0–42·0), prevalence decreased by 6·0% (5·0–7·0), and DALYs decreased by 36·0% (31·0–42·0). However, among people younger than 70 years, prevalence rates increased by 22·0% (21·0–24·0) and incidence rates increased by 15·0% (12·0–18·0). In 2019, the age-standardised stroke-related mortality rate was 3·6 (3·5–3·8) times higher in the World Bank low-income group than in the World Bank high-income group, and the age-standardised stroke-related DALY rate was 3·7 (3·5–3·9) times higher in the low-income group than the high-income group. Ischaemic stroke constituted 62·4% of all incident strokes in 2019 (7·63 million [6·57–8·96]), while intracerebral haemorrhage constituted 27·9% (3·41 million [2·97–3·91]) and subarachnoid haemorrhage constituted 9·7% (1·18 million [1·01–1·39]). In 2019, the five leading risk factors for stroke were high systolic blood pressure (contributing to 79·6 million [67·7–90·8] DALYs or 55·5% [48·2–62·0] of total stroke DALYs), high body-mass index (34·9 million [22·3–48·6] DALYs or 24·3% [15·7–33·2]), high fasting plasma glucose (28·9 million [19·8–41·5] DALYs or 20·2% [13·8–29·1]), ambient particulate matter pollution (28·7 million [23·4–33·4] DALYs or 20·1% [16·6–23·0]), and smoking (25·3 million [22·6–28·2] DALYs or 17·6% [16·4–19·0]).

    Interpretation: The annual number of strokes and deaths due to stroke increased substantially from 1990 to 2019, despite substantial reductions in age-standardised rates, particularly among people older than 70 years. The highest age-standardised stroke-related mortality and DALY rates were in the World Bank low-income group. The fastest-growing risk factor for stroke between 1990 and 2019 was high body-mass index. Without urgent implementation of effective primary prevention strategies, the stroke burden will probably continue to grow across the world, particularly in low-income countries.

    Funding: Bill & Melinda Gates Foundation
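The population attributable fraction (PAF) reported above links risk-factor burden to total burden: attributable DALYs = PAF × total DALYs. As a simple cross-check of the abstract's point estimates (not the GBD estimation machinery, which works with full uncertainty distributions), the high systolic blood pressure figures are internally consistent:

```python
# Point estimates from the abstract above (millions of DALYs, 2019).
total_stroke_dalys = 143.0  # all stroke DALYs
hsbp_dalys = 79.6           # DALYs attributable to high systolic BP

# PAF recovered from the two point estimates.
paf = hsbp_dalys / total_stroke_dalys
print(f"{paf:.1%}")  # → 55.7%, matching the reported 55.5% within rounding
```

The small residual difference arises because the published 55·5% was computed on unrounded draws before the point estimates were rounded for the abstract.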

    Mapping geographical inequalities in access to drinking water and sanitation facilities in low-income and middle-income countries, 2000-17

    Background: Universal access to safe drinking water and sanitation facilities is an essential human right, recognised in the Sustainable Development Goals as crucial for preventing disease and improving human wellbeing. Comprehensive, high-resolution estimates are important to inform progress towards achieving this goal. We aimed to produce high-resolution geospatial estimates of access to drinking water and sanitation facilities.

    Methods: We used a Bayesian geostatistical model and data from 600 sources across more than 88 low-income and middle-income countries (LMICs) to estimate access to drinking water and sanitation facilities on continuous continent-wide surfaces from 2000 to 2017, and aggregated results to policy-relevant administrative units. We estimated mutually exclusive and collectively exhaustive subcategories of facilities for drinking water (piped water on or off premises, other improved facilities, unimproved, and surface water) and sanitation facilities (septic or sewer sanitation, other improved, unimproved, and open defecation) with use of ordinal regression. We also estimated the number of diarrhoeal deaths in children younger than 5 years attributed to unsafe facilities and estimated deaths that were averted by increased access to safe facilities in 2017, and analysed geographical inequality in access within LMICs.

    Findings: Across LMICs, access to both piped water and improved water overall increased between 2000 and 2017, with progress varying spatially. For piped water, the safest water facility type, access increased from 40.0% (95% uncertainty interval [UI] 39.4-40.7) to 50.3% (50.0-50.5), but was lowest in sub-Saharan Africa, where access to piped water was mostly concentrated in urban centres. Access to both sewer or septic sanitation and improved sanitation overall also increased across all LMICs during the study period. For sewer or septic sanitation, access was 46.3% (95% UI 46.1-46.5) in 2017, compared with 28.7% (28.5-29.0) in 2000. Although some units improved access to the safest drinking water or sanitation facilities since 2000, a large absolute number of people continued to lack access in several units with high access to such facilities (>80%) in 2017. More than 253 000 people did not have access to sewer or septic sanitation facilities in the city of Harare, Zimbabwe, despite 88.6% (95% UI 87.2-89.7) access overall. Many units were able to transition from the least safe facilities in 2000 to safe facilities by 2017; of the 1830 (1797-1863) units in which populations primarily practised open defecation in 2000, 686 (95% UI 664-711) transitioned to the use of improved sanitation. Geographical disparities in access to improved water across units decreased in 76.1% (95% UI 71.6-80.7) of countries from 2000 to 2017, and in 53.9% (50.6-59.6) of countries for access to improved sanitation, but remained evident subnationally in most countries in 2017.

    Interpretation: Our estimates, combined with geospatial trends in diarrhoeal burden, identify where efforts to increase access to safe drinking water and sanitation facilities are most needed. By highlighting areas with successful approaches or in need of targeted interventions, our estimates can enable precision public health to effectively progress towards universal access to safe water and sanitation.

    Copyright (C) 2020 The Author(s). Published by Elsevier Ltd.
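The aggregation step described above, rolling continuous grid-cell surfaces up to administrative units, amounts to a population-weighted average of cell-level access estimates. The sketch below illustrates that step only, under simplified assumptions; the cell values and populations are invented for the example, and the paper's actual pipeline additionally propagates posterior uncertainty from the Bayesian geostatistical model:

```python
# Population-weighted aggregation of grid-cell access estimates to one
# hypothetical administrative unit. All numbers are illustrative.
cells = [
    {"access": 0.92, "pop": 120_000},  # urban cell, high piped-water access
    {"access": 0.35, "pop": 40_000},   # rural cell, low access
    {"access": 0.60, "pop": 40_000},   # peri-urban cell
]

total_pop = sum(c["pop"] for c in cells)
unit_access = sum(c["access"] * c["pop"] for c in cells) / total_pop
print(round(unit_access, 3))  # → 0.742
```

Weighting by population rather than by area matters because access is typically concentrated in dense urban cells, so an unweighted mean would understate unit-level coverage.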