Rewriting the history of leishmaniasis in Sri Lanka: An untold story since 1904
Leishmaniasis is widely considered a disease that emerged in Sri Lanka in the 1990s. However, a comprehensive case report from 1904 suggests that leishmaniasis was present in Sri Lanka long before that. The Annual Administration Reports of Ceylon/Sri Lanka from 1895 to 1970 and the Ceylon Blue Book from 1821 to 1937 are official historical documents that record the annual performance, progress, goals achieved, and finances of Sri Lanka during that time. Both documents are available in the National Archives. The Ceylon Administrative Report of 1904 contains the first full record of an observation of Leishman-Donovan bodies in Sri Lanka. These reports record a total of 33,438 cases of leishmaniasis in the years 1928 to 1938, 1953, 1956, 1957, 1959, 1960, and 1961 to 1962. Up to 1938, the term "cutaneous leishmaniasis" was used in these reports; after 1938, the term "leishmaniasis" was used. "Kala-azar" was also mentioned in 11 administrative reports between 1900 and 1947. In 1947, an extensive vector study was carried out that reported kala-azar cases. This well-documented government health information clearly shows that the history of leishmaniasis in Sri Lanka closely parallels the global history, in which the first case with Leishman-Donovan bodies was reported in 1903. [Abstract copyright: © 2022 Nuwangi et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.]
The psychosocial burden of cutaneous leishmaniasis in rural Sri Lanka: A multi-method qualitative study
Leishmaniasis is a tropical infectious disease affecting some of the world's most economically disadvantaged and resource-poor regions. Cutaneous leishmaniasis (CL) is the most common of the three clinical forms of leishmaniasis, and the disease has been endemic in Sri Lanka since 1904. CL is considered a disfiguring, stigmatising disease that carries a high psychosocial burden. However, an in-depth, holistic understanding of this psychosocial burden is lacking, both locally and internationally, and an understanding of the disease burden beyond morbidity and mortality is required to provide people-centred care. We explored the psychosocial burden of CL in rural Sri Lanka using a complex multi-method qualitative approach with community engagement and involvement. Data collection included participant observation, an auto-ethnographic diary study by community researchers with post-diary interviews, and a Participant Experience Reflection Journal (PERJ) study with post-PERJ interviews with community members with CL. The thematic analysis revealed three major burden-related themes in perceptions of and reflections on the disease: wound-related, treatment-related, and illness-experience-related burden. Fear, disgust, body image concerns, and being subjected to negative societal reactions were wound-related. Treatment interfering with day-to-day life, pain, the time-consuming nature of the treatment, problems due to the ineffectiveness of the treatment, and the burden of attending a government hospital clinic were treatment-related. Anxiety and worry due to wrongly perceived disease severity and negative emotions due to the nature of the disease made the illness experience more burdensome. Addressing this multifaceted psychosocial burden is paramount to ensuring healthcare seeking, treatment compliance, and disease control and prevention. We propose a people-centred healthcare model to understand the contextual nature of the disease and improve patient outcomes.
Stigma associated with cutaneous and mucocutaneous leishmaniasis: A systematic review
Background
Cutaneous leishmaniasis (CL) and mucocutaneous leishmaniasis (MCL) are parasitic diseases caused by parasites of the genus Leishmania; the disfigurement they cause can lead to stigma. This study aimed to systematically review the dimensions, measurement methods, and implications of CL- and MCL-associated stigma and the potential interventions to reduce it, synthesising the current evidence according to an accepted stigma framework.
Methods
This systematic review followed the PRISMA guidelines and was registered in PROSPERO (ID: CRD42021274925). The eligibility criteria included primary articles discussing stigma associated with CL and MCL published in English, Spanish, or Portuguese up to January 2023. An electronic search was conducted in Medline, Embase, Scopus, PubMed, EBSCO, Web of Science, Global Index Medicus, Trip, and the Cochrane Library. The Mixed Methods Appraisal Tool (MMAT) was used for quality checking. A narrative synthesis was conducted to summarise the findings.
Results
A total of 16 studies were included. The studies report the cognitive, affective, and behavioural reactions associated with public stigma. Cognitive reactions included misconceptions about disease transmission, treatment, and death. Affective reactions encompassed emotions such as disgust and shame, often triggered by the presence of scars. Behavioural reactions included avoidance, discrimination, rejection, mockery, and disruption of interpersonal relationships. The review also highlights manifestations of self-stigma, including enacted, internalised, and felt stigma. Enacted stigma manifested as barriers to forming proper interpersonal relationships, avoidance, isolation, and the perception of CL lesions/scars as marks of shame. Felt stigma led to experiences of marginalisation, rejection, mockery, and disrupted interpersonal relationships, as well as the anticipation of discrimination, fear of social stigmatisation, and facing disgust. Internalised stigma affected self-identity and caused psychological distress.
Conclusions
Stigma associated with CL and MCL manifests in various ways. This review highlights the lack of knowledge on the structural stigma associated with CL, the scarcity of stigma-reduction interventions, and the need for a dedicated tool to measure stigma associated with CL and MCL.
A primary human T-cell spectral library to facilitate large scale quantitative T-cell proteomics.
Data-independent acquisition (DIA), exemplified by sequential window acquisition of all theoretical mass spectra (SWATH-MS), provides robust quantitative proteomics data, but the lack of a public primary human T-cell spectral library is a current resource gap. Here, we report the generation of a high-quality spectral library containing data for 4,833 distinct proteins from human T-cells across genetically unrelated donors, covering ~24% of the proteins in the UniProt/SwissProt reviewed human proteome. SWATH-MS analysis of 18 primary T-cell samples using the new human T-cell spectral library reliably identified and quantified 2,850 proteins at 1% false discovery rate (FDR). In comparison, the larger Pan-human spectral library identified and quantified 2,794 T-cell proteins in the same dataset. As the libraries identified an overlapping set of proteins, combining the two libraries resulted in the quantification of 4,078 human T-cell proteins. Collectively, this large data archive will be a useful public resource for human T-cell proteomic studies. The human T-cell library is available at SWATHAtlas, and the data are available via ProteomeXchange (PXD019446 and PXD019542) and PeptideAtlas (PASS01587).
Nitrous oxide and nitric oxide fluxes differ from tea plantation and tropical forest soils after nitrogen addition
South Asia is experiencing a rapid increase in nitrogen (N) pollution, which is predicted to continue in the future. One possible implication is an increase in gaseous reactive N losses from soil, notably in the form of nitrous oxide (N2O) and nitric oxide (NO). Current knowledge of N2O and NO dynamics in forest ecosystems is not sufficient to understand and mitigate the impacts on climate and air quality. To improve the understanding of emissions from two major land uses in Sri Lanka, a secondary montane tropical forest and a tea plantation, we carried out laboratory incubations of repacked soil cores and measured N2O and NO fluxes by absorption spectroscopy and chemiluminescence, respectively, in response to three N addition levels (the equivalent of 0, 40, and 100 kg N ha−1 yr−1 of deposition in the form of NH4+). In line with initial expectations, we observed an increase in NO fluxes that was directly proportional to the amount of N applied (maximum fluxes of 6–8 ng NO-N g−1 d−1 and 16–68 ng NO-N g−1 d−1 in forest and tea plantation soils, respectively). However, N2O fluxes did not show a clear response to N addition: the highest treatment (100 N) did not result in the highest fluxes. Moreover, N2O fluxes were higher following the addition of a source of carbon (in the form of glucose) across treatment levels and both land uses (maximum fluxes of 2–34 ng N2O-N g−1 d−1 in forest and 808–3,939 ng N2O-N g−1 d−1 in tea plantation soils). Both N2O and NO fluxes were higher from tea plantation soils than from forest soils irrespective of treatment level, highlighting the importance of land use and land management for gaseous reactive N fluxes and therefore N dynamics.
Estimation of ammonia deposition to forest ecosystems in Scotland and Sri Lanka using wind-controlled NH3 enhancement experiments
Ammonia (NH3) pollution has emerged as a major cause of concern as atmospheric concentrations continue to increase globally. Environmentally damaging NH3 levels are expected to severely affect sensitive and economically important organisms, but evidence is lacking in many parts of the world. We describe the design and operation of a wind-controlled NH3 enhancement system to assess effects on forests in two contrasting climates. We established structurally identical NH3 enhancement systems in a temperate birch woodland in the UK and a tropical sub-montane forest in central Sri Lanka, both simulating real-world NH3 pollution conditions. Vertical and horizontal NH3 concentrations were monitored at two different time scales to understand NH3 transport within the forest canopies. We applied a bi-directional resistance model with four canopy layers to calculate net deposition fluxes. At both sites, NH3 concentrations and deposition were found to decrease exponentially with distance from the source, consistent with expectations. In contrast, we found differences in vertical mixing of NH3 between the two experiments, with more vertically uniform NH3 concentrations in the dense, multi-layered sub-montane forest canopy in Sri Lanka. Monthly NH3 concentrations downwind of the source ranged from 3 to 29 μg m−3 at the UK site and from 2 to 47 μg m−3 at the Sri Lankan site, compared with background values of 0.63 and 0.35 μg m−3, respectively. The total calculated NH3 dry deposition flux to all the canopy layers along the NH3 transects ranged from 12 to 162 kg N ha−1 yr−1 in the UK and from 16 to 426 kg N ha−1 yr−1 in Sri Lanka, representative of conditions in the vicinity of a range of common NH3 sources. This multi-layer model is applicable for identifying the fate of NH3 in forest ecosystems where the gas enters the canopy laterally through the trunk space and exposes the understorey to high NH3 levels.
In both study sites, we found that cuticular deposition was the dominant flux in the vegetation layers, with a smaller contribution from stomatal uptake. The new facilities now allow the first-ever field comparison of NH3 impacts on forest ecosystems, with a special focus on lichen bio-indicators, which will provide vital evidence to inform NH3 critical levels and associated nitrogen policy development in South Asia.
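The exponential decline of NH3 concentration with distance from the source, reported for both sites, can be illustrated with a minimal numerical sketch. All values and names here (the initial and background concentrations, the decay length, and the function nh3_concentration) are hypothetical and chosen only to mirror the reported pattern; they are not the study's fitted parameters.

```python
import numpy as np

def nh3_concentration(x_m, c0=29.0, c_bg=0.63, decay_length_m=15.0):
    """Illustrative model: concentration decays exponentially from a
    near-source value c0 towards the background c_bg with distance x_m.
    C(x) = c_bg + (c0 - c_bg) * exp(-x / L)."""
    return c_bg + (c0 - c_bg) * np.exp(-x_m / decay_length_m)

# Evaluate at the source, one decay length away, and four decay lengths away.
distances = np.array([0.0, 15.0, 60.0])
conc = nh3_concentration(distances)
```

With these invented parameters, the concentration starts at the near-source value, falls steeply over the first decay length, and approaches the background value farther downwind, which is the qualitative behaviour the transect measurements describe.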
Global burden and strength of evidence for 88 risk factors in 204 countries and 811 subnational locations, 1990–2021: a systematic analysis for the Global Burden of Disease Study 2021
Background: Understanding the health consequences associated with exposure to risk factors is necessary to inform public health policy and practice. To systematically quantify the contributions of risk factor exposures to specific health outcomes, the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 aims to provide comprehensive estimates of exposure levels, relative health risks, and attributable burden of disease for 88 risk factors in 204 countries and territories and 811 subnational locations, from 1990 to 2021. Methods: The GBD 2021 risk factor analysis used data from 54 561 total distinct sources to produce epidemiological estimates for 88 risk factors and their associated health outcomes for a total of 631 risk–outcome pairs. Pairs were included on the basis of data-driven determination of a risk–outcome association. Age-sex-location-year-specific estimates were generated at global, regional, and national levels. Our approach followed the comparative risk assessment framework predicated on a causal web of hierarchically organised, potentially combinative, modifiable risks. Relative risks (RRs) of a given outcome occurring as a function of risk factor exposure were estimated separately for each risk–outcome pair, and summary exposure values (SEVs), representing risk-weighted exposure prevalence, and theoretical minimum risk exposure levels (TMRELs) were estimated for each risk factor. These estimates were used to calculate the population attributable fraction (PAF; ie, the proportional change in health risk that would occur if exposure to a risk factor were reduced to the TMREL). The product of PAFs and disease burden associated with a given outcome, measured in disability-adjusted life-years (DALYs), yielded measures of attributable burden (ie, the proportion of total disease burden attributable to a particular risk factor or combination of risk factors). 
Adjustments for mediation were applied to account for relationships involving risk factors that act indirectly on outcomes via intermediate risks. Attributable burden estimates were stratified by Socio-demographic Index (SDI) quintile and presented as counts, age-standardised rates, and rankings. To complement estimates of RR and attributable burden, newly developed burden of proof risk function (BPRF) methods were applied to yield supplementary, conservative interpretations of risk–outcome associations based on the consistency of underlying evidence, accounting for unexplained heterogeneity between input data from different studies. Estimates reported represent the mean value across 500 draws from the estimate's distribution, with 95% uncertainty intervals (UIs) calculated as the 2·5th and 97·5th percentile values across the draws. Findings: Among the specific risk factors analysed for this study, particulate matter air pollution was the leading contributor to the global disease burden in 2021, contributing 8·0% (95% UI 6·7–9·4) of total DALYs, followed by high systolic blood pressure (SBP; 7·8% [6·4–9·2]), smoking (5·7% [4·7–6·8]), low birthweight and short gestation (5·6% [4·8–6·3]), and high fasting plasma glucose (FPG; 5·4% [4·8–6·0]). For younger demographics (ie, those aged 0–4 years and 5–14 years), risks such as low birthweight and short gestation and unsafe water, sanitation, and handwashing (WaSH) were among the leading risk factors, while for older age groups, metabolic risks such as high SBP, high body-mass index (BMI), high FPG, and high LDL cholesterol had a greater impact. 
From 2000 to 2021, there was an observable shift in global health challenges, marked by a decline in the number of all-age DALYs broadly attributable to behavioural risks (decrease of 20·7% [13·9–27·7]) and environmental and occupational risks (decrease of 22·0% [15·5–28·8]), coupled with a 49·4% (42·3–56·9) increase in DALYs attributable to metabolic risks, all reflecting ageing populations and changing lifestyles on a global scale. Age-standardised global DALY rates attributable to high BMI and high FPG rose considerably (15·7% [9·9–21·7] for high BMI and 7·9% [3·3–12·9] for high FPG) over this period, with exposure to these risks increasing annually at rates of 1·8% (1·6–1·9) for high BMI and 1·3% (1·1–1·5) for high FPG. By contrast, the global risk-attributable burden and exposure to many other risk factors declined, notably for risks such as child growth failure and unsafe water source, with age-standardised attributable DALYs decreasing by 71·5% (64·4–78·8) for child growth failure and 66·3% (60·2–72·0) for unsafe water source. We separated risk factors into three groups according to trajectory over time: those with a decreasing attributable burden, due largely to declining risk exposure (eg, diet high in trans-fat and household air pollution) but also to proportionally smaller child and youth populations (eg, child and maternal malnutrition); those for which the burden increased moderately in spite of declining risk exposure, due largely to population ageing (eg, smoking); and those for which the burden increased considerably due to both increasing risk exposure and population ageing (eg, ambient particulate matter air pollution, high BMI, high FPG, and high SBP). Interpretation: Substantial progress has been made in reducing the global disease burden attributable to a range of risk factors, particularly those related to maternal and child health, WaSH, and household air pollution. 
Maintaining efforts to minimise the impact of these risk factors, especially in low-SDI locations, is necessary to sustain progress. Successes in moderating the smoking-related burden by reducing risk exposure highlight the need to advance policies that reduce exposure to other leading risk factors, such as ambient particulate matter air pollution and high SBP. Troubling increases in high FPG, high BMI, and other risk factors related to obesity and metabolic syndrome indicate an urgent need to identify and implement interventions.
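The draw-level arithmetic described in the methods above (attributable burden as the product of the PAF and the disease burden in DALYs, with the point estimate taken as the mean across draws and the 95% UI as the 2·5th and 97·5th percentiles) can be sketched as follows. The draw count matches the 500 draws the abstract mentions, but the means and spreads are invented for illustration and are not GBD estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical draws for one risk-outcome pair: 500 samples of the
# population attributable fraction (PAF) and of total DALYs for the outcome.
# Values are illustrative only, not GBD estimates.
paf_draws = rng.normal(loc=0.08, scale=0.007, size=500).clip(0.0, 1.0)
daly_draws = rng.normal(loc=2.8e9, scale=1.0e8, size=500)

# Attributable burden is the product of PAF and disease burden, per draw.
attrib_draws = paf_draws * daly_draws

# Point estimate: mean across draws; 95% UI: 2.5th and 97.5th percentiles.
mean_burden = attrib_draws.mean()
ui_low, ui_high = np.percentile(attrib_draws, [2.5, 97.5])
```

Propagating the uncertainty by multiplying at the draw level, rather than multiplying the summary statistics, is what keeps the UI of the attributable burden consistent with the correlated uncertainty in its inputs.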
Global burden of 288 causes of death and life expectancy decomposition in 204 countries and territories and 811 subnational locations, 1990–2021: a systematic analysis for the Global Burden of Disease Study 2021
BACKGROUND Regular, detailed reporting on population health by underlying cause of death is fundamental for public health decision making. Cause-specific estimates of mortality and the subsequent effects on life expectancy worldwide are valuable metrics to gauge progress in reducing mortality rates. These estimates are particularly important following large-scale mortality spikes, such as the COVID-19 pandemic. When systematically analysed, mortality rates and life expectancy allow comparisons of the consequences of causes of death globally and over time, providing a nuanced understanding of the effect of these causes on global populations. METHODS The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 cause-of-death analysis estimated mortality and years of life lost (YLLs) from 288 causes of death by age-sex-location-year in 204 countries and territories and 811 subnational locations for each year from 1990 until 2021. The analysis used 56 604 data sources, including data from vital registration and verbal autopsy as well as surveys, censuses, surveillance systems, and cancer registries, among others. As with previous GBD rounds, cause-specific death rates for most causes were estimated using the Cause of Death Ensemble model (a modelling tool developed for GBD to assess the out-of-sample predictive validity of different statistical models and covariate permutations and to combine those results to produce cause-specific mortality estimates), with alternative strategies adapted to model causes with insufficient data, substantial changes in reporting over the study period, or unusual epidemiology. YLLs were computed as the product of the number of deaths for each cause-age-sex-location-year and the standard life expectancy at each age. As part of the modelling process, uncertainty intervals (UIs) were generated using the 2·5th and 97·5th percentiles from a 1000-draw distribution for each metric.
We decomposed life expectancy by cause of death, location, and year to show cause-specific effects on life expectancy from 1990 to 2021. We also used the coefficient of variation and the fraction of population affected by 90% of deaths to highlight concentrations of mortality. Findings are reported in counts and age-standardised rates. Methodological improvements for cause-of-death estimates in GBD 2021 include the expansion of the under-5-years age group into four new age groups, enhanced methods to account for stochastic variation in sparse data, and the inclusion of COVID-19 and other pandemic-related mortality (which includes excess mortality associated with the pandemic, excluding COVID-19, lower respiratory infections, measles, malaria, and pertussis). For this analysis, 199 new country-years of vital registration cause-of-death data, 5 country-years of surveillance data, 21 country-years of verbal autopsy data, and 94 country-years of other data types were added to those used in previous GBD rounds. FINDINGS The leading causes of age-standardised deaths globally were the same in 2019 as they were in 1990; in descending order, these were ischaemic heart disease, stroke, chronic obstructive pulmonary disease, and lower respiratory infections. In 2021, however, COVID-19 replaced stroke as the second-leading age-standardised cause of death, with 94·0 deaths (95% UI 89·2-100·0) per 100 000 population. The COVID-19 pandemic shifted the rankings of the leading five causes, lowering stroke to the third-leading and chronic obstructive pulmonary disease to the fourth-leading position. In 2021, the highest age-standardised death rates from COVID-19 occurred in sub-Saharan Africa (271·0 deaths [250·1-290·7] per 100 000 population) and Latin America and the Caribbean (195·4 deaths [182·1-211·4] per 100 000 population).
The lowest age-standardised death rates from COVID-19 were in the high-income super-region (48·1 deaths [47·4-48·8] per 100 000 population) and southeast Asia, east Asia, and Oceania (23·2 deaths [16·3-37·2] per 100 000 population). Globally, life expectancy steadily improved between 1990 and 2019 for 18 of the 22 investigated causes. Decomposition of global and regional life expectancy showed the positive effect that reductions in deaths from enteric infections, lower respiratory infections, stroke, and neonatal deaths, among others, have had on survival over the study period. However, a net reduction of 1·6 years occurred in global life expectancy between 2019 and 2021, primarily due to increased death rates from COVID-19 and other pandemic-related mortality. Life expectancy was highly variable between super-regions over the study period, with southeast Asia, east Asia, and Oceania gaining 8·3 years (6·7-9·9) overall while having the smallest reduction in life expectancy due to COVID-19 (0·4 years). The largest reduction in life expectancy due to COVID-19 occurred in Latin America and the Caribbean (3·6 years). Additionally, 53 of the 288 causes of death were highly concentrated in locations with less than 50% of the global population as of 2021, and these causes of death have become progressively more concentrated since 1990, when only 44 causes showed this pattern. The concentration phenomenon is discussed heuristically with respect to enteric and lower respiratory infections, malaria, HIV/AIDS, neonatal disorders, tuberculosis, and measles. INTERPRETATION Long-standing gains in life expectancy and reductions in many of the leading causes of death have been disrupted by the COVID-19 pandemic, the adverse effects of which were spread unevenly among populations. Despite the pandemic, there has been continued progress in combatting several notable causes of death, leading to improved global life expectancy over the study period.
Each of the seven GBD super-regions showed an overall improvement from 1990 to 2021, obscuring the negative effect in the years of the pandemic. Additionally, our findings regarding regional variation in the causes of death driving increases in life expectancy hold clear policy utility. Analyses of shifting mortality trends reveal that several causes, once widespread globally, are now increasingly concentrated geographically. These changes in mortality concentration, alongside further investigation of changing risks, interventions, and relevant policy, present an important opportunity to deepen our understanding of mortality-reduction strategies. Examining patterns in mortality concentration might reveal areas where successful public health interventions have been implemented. Translating these successes to locations where certain causes of death remain entrenched can inform policies that work to improve life expectancy for people everywhere. FUNDING Bill & Melinda Gates Foundation
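The YLL computation described in the methods above, deaths multiplied by the standard life expectancy at the age of death and summed over age groups, reduces to a simple weighted sum. The sketch below uses invented deaths and life-expectancy values purely for illustration; they are not GBD inputs or the GBD standard life table.

```python
# Hypothetical inputs for one cause-sex-location-year: deaths by age group,
# and a GBD-style standard life expectancy remaining at each age.
# All numbers are illustrative only.
deaths_by_age = {50: 1200.0, 60: 3400.0, 70: 5100.0}
std_life_expectancy = {50: 37.0, 60: 27.8, 70: 19.2}

# YLLs are the product of deaths and standard life expectancy at each age,
# summed over age groups.
ylls = sum(deaths_by_age[age] * std_life_expectancy[age]
           for age in deaths_by_age)
```

Because a single standard life table is applied everywhere, YLLs weight deaths at younger ages more heavily than deaths at older ages, which is what makes them comparable across locations.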
Placing Leishmaniasis in the limelight through the communicable disease surveillance system: An experience from Sri Lanka
Having an effective surveillance system is imperative to take timely and appropriate actions for disease control and prevention. In Sri Lanka, leishmaniasis was declared a notifiable disease in 2008. This paper presents a comprehensive compilation of up-to-date documents on communicable disease and leishmaniasis surveillance in Sri Lanka in order to describe the importance of the existing leishmaniasis surveillance system and to identify gaps that need to be addressed. The documents perused included circulars, reports, manuals, guidelines, ordinances, presentations, and published articles. The reported disease trends were linked to important landmarks in leishmaniasis surveillance. The findings suggest that Sri Lanka has a well-established surveillance system that has had a massive impact on increased case detection, resulting in improved attention to leishmaniasis. However, the system is not without its shortcomings, and there is room for further improvement.