130 research outputs found
Health status of returning refugees, internally displaced persons, and the host community in a post-conflict district in northern Sri Lanka: a cross-sectional survey.
BACKGROUND: Although the adverse impacts of conflict-driven displacement on health are well-documented, less is known about how health status and associated risk factors differ according to displacement experience. This study quantifies health status and quality of life among returning refugees, internally displaced persons, and the host community in a post-conflict district in Northern Sri Lanka, and explores associated risk factors. METHODS: We analysed data collected through a household survey (n = 570) in Vavuniya district, Sri Lanka. The effects of displacement status and other risk factors on perceived quality of life (estimated from the 36-item Short Form Questionnaire), mental health status (from the 9-item Patient Health Questionnaire), and self-reported chronic disease status were examined using univariable analyses and multivariable regressions. RESULTS: We found strong evidence that perceived quality of life was significantly lower for internally displaced persons than for the host community and returning refugees, after adjusting for covariates. Neither mental health status nor chronic disease status varied markedly among the groups, suggesting that other risk factors may be more important determinants of these outcomes. CONCLUSIONS: Our study provides important insights into the overall health and well-being of the different displaced sub-populations in a post-conflict setting. Findings reinforce existing evidence on the relationship between displacement and health but also highlight gaps in research on the long-term health effects of prolonged displacement. Understanding the heterogeneity of conflict-affected populations has important implications for effective and equitable humanitarian service delivery in a post-conflict setting.
Environmental exposure to metallic soil elements and risk of cancer in the UK population, using a unique linkage between THIN and BGS databases
Background: Many epidemiological studies have examined the influence of exposure to toxic elements on cancer risk, mainly in highly exposed occupational groups or in populations near industrial sources. Toxic elements such as arsenic, copper, nickel, and uranium have been shown to increase the risk of several different types of cancer in these highly exposed groups. Many of these elements occur naturally in the soil, yet the health impact of such environmental exposure on the general population has received little attention to date, possibly because soil concentrations of these elements are believed to be too low to cause harm. The long-term effect of chronic exposure to metals in soil therefore remains unclear.
Aims and objectives: The goal was to use a new linked resource, THIN-GBASE, to conduct a series of environmental epidemiological studies testing the hypothesis that basal cell carcinoma (BCC), lung and gastrointestinal tract (GIT) cancers are associated with elevated exposure to certain metals present at low levels in soil. We also sought to use this resource to determine which soil metals should be tested as predictors of each cancer outcome.
Methods: For BCC, an ecological study was first undertaken to assess the overall regional variation in BCC and provide national, contemporary breakdowns of incidence rates across the UK. The primary exposure of interest for BCC was low-level soil arsenic, so soil arsenic exposure levels were quantified against the UK national safety limit for arsenic (As-C4SL = 35 mg/kg). A population-based cohort study was then conducted to quantify the association between the development of BCC and increasing levels of exposure to soil arsenic. For lung cancer, a two-stage process was adopted: 1) a data mining analysis using a correlation-based filter selection model was used to identify the restricted set of soil metals that best predicted lung cancer; and 2) a prospective cohort study was used in which these elements were fitted together (adjusted for confounding variables) in a multivariable Cox proportional-hazards model to determine the association between the development of lung cancer and increasing levels of exposure to each element. For GIT cancers, a three-stage process was adopted: stages 1 and 2 used a methodology similar to that of the lung cancer study. In stage 3, GIT cancers were divided into three broader outcomes: upper GIT (mouth and oesophagus), stomach (standalone) and colorectal (small intestine, large intestine, rectum and anal canal) cancers. A multivariable competing-risks survival model, treating the three GIT cancer groups as competing events, was used to identify associations between the metals selected in stage 1 and GIT-specific cancers.
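As a concrete illustration of the exposure classification step above, the sketch below bins soil arsenic concentrations against the 35 mg/kg C4SL limit and the 70 mg/kg band used in the cohort analysis. It is a minimal, hypothetical example: the function name and input values are illustrative, not taken from the study's code.

```python
def arsenic_band(conc_mg_kg: float) -> str:
    """Assign a soil arsenic concentration (mg/kg) to an exposure band.

    Cut-points follow the study's bands: below the 35 mg/kg C4SL limit
    (reference), 35 to under 70, and 70 or above.
    """
    if conc_mg_kg < 35:
        return "<35 (reference)"
    if conc_mg_kg < 70:
        return "35-70"
    return ">=70"

# Hypothetical area-level concentrations, not real measurements
concentrations = [12.0, 35.0, 48.5, 70.0, 102.3]
bands = [arsenic_band(c) for c in concentrations]
```

In a cohort analysis of this kind, the band labels would enter the Cox model as a categorical covariate, with the reference band as baseline.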
Results: For BCC, the ecological study showed that the overall European and World age-standardised rates (EASR and WASR) of BCC in the UK were 98.6 and 66.9 per 100,000 person-years, respectively. There was large geographical variation in the age-sex standardised incidence of BCC, with the South East having the highest incidence (202.7/100,000 person-years), followed by South Central (193.5/100,000 person-years) and Wales (185.7/100,000 person-years). Incidence rates of BCC were substantially higher in the least socioeconomically deprived groups, and increasing deprivation was associated with a decreasing rate of BCC (p < 0.001). In terms of age groups, the largest annual increase was observed among those aged 30-49 years. Individuals living in areas with soil arsenic concentrations ≥35 mg/kg had a significantly increased hazard of developing BCC (35-70 mg/kg: adjusted HR 1.08, 95% CI: 1.02-1.14; ≥70 mg/kg: adjusted HR 1.17, 95% CI: 1.09-1.28). Urban residents with the highest exposure to soil arsenic had the greatest risk of developing BCC (≥70.0 mg/kg: HR 1.18, 95% CI: 1.06-1.36). For lung cancer, the correlation-based filter selection model identified aluminium, lead and uranium as the appropriate set of exposures for modelling lung cancer risk. The fully adjusted hazards model showed evidence of an increased risk of developing lung cancer only for elevated soil aluminium at medium concentrations (47,000-61,600 mg/kg). Urban residents with the highest exposure to soil aluminium had the greatest risk of developing lung cancer (≥61,600 mg/kg: HR 1.12, 95% CI: 1.04-1.22). For GIT cancers, the correlation-based filter selection model identified seven elements (aluminium, phosphorus, zinc, uranium, calcium, manganese, and lead) as the appropriate set of exposures for predicting GIT cancer risk.
The fully adjusted hazards model indicated that the risk of developing GIT cancer overall was significantly associated with elevated exposure to soil phosphorus only (873-1,127 mg/kg: HR 1.08, 95% CI: 1.02-1.14; 1,127-1,456 mg/kg: HR 1.07, 95% CI: 1.01-1.13; and ≥1,456 mg/kg: HR 1.07, 95% CI: 1.01-1.13). No consistent relationships were identified between any of the selected elements and the GIT-specific cancer outcomes when the different GIT cancers were treated as competing events.
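The stage-3 competing-risks analysis rests on cumulative incidence functions in which the other GIT cancers act as competing events. The sketch below implements the basic Aalen-Johansen estimator in plain Python on hypothetical data; the study itself used a fitted multivariable competing-risks survival model, so this shows only the unadjusted building block.

```python
from collections import Counter

def cumulative_incidence(times, events, cause):
    """Aalen-Johansen cumulative incidence for one cause among competing events.

    times  : event/censoring times per subject
    events : 0 = censored, otherwise an integer cause code
             (e.g. 1 = upper GIT, 2 = stomach, 3 = colorectal)
    cause  : the cause of interest
    Returns a list of (time, CIF) points at each distinct event time.
    """
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk = n
    surv = 1.0      # all-cause Kaplan-Meier survival just before t
    cif = 0.0
    out = []
    i = 0
    while i < n:
        t = times[order[i]]
        d = Counter()   # event counts by cause at this tied time
        tied = 0
        while i < n and times[order[i]] == t:
            d[events[order[i]]] += 1
            tied += 1
            i += 1
        d_all = tied - d[0]             # events of any cause (0 = censored)
        if d_all > 0:
            cif += surv * d[cause] / at_risk
            surv *= 1.0 - d_all / at_risk
            out.append((t, cif))
        at_risk -= tied
    return out

# Four hypothetical subjects: causes 1, 2, censored, 1
cif_cause1 = cumulative_incidence([1, 2, 3, 4], [1, 2, 0, 1], cause=1)
```

Unlike a cause-specific Kaplan-Meier curve, this estimator does not treat competing events as censoring, so the cause-specific incidences never sum to more than one.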
Conclusion: There appears to be modest evidence of increased BCC, lung and GIT cancer risk with elevated exposure to soil arsenic, aluminium and phosphorus, respectively. The investigations conducted for this research are among the first contemporary UK-based studies to present estimates for this group of under-studied pollutants. This research demonstrates that linking geochemical data with electronic primary care medical records can be a valuable approach for assessing whether long-term exposure to low-level soil contaminants has health consequences in the population.
Positive and negative emotions during the COVID-19 pandemic: A longitudinal survey study of the UK population
The COVID-19 pandemic has had a profound impact on society; it changed the way we work, learn, socialise, and move throughout the world. In the United Kingdom, policies such as business closures, travel restrictions, and social distancing mandates were introduced to slow the spread of COVID-19, and were applied and relaxed intermittently throughout the response period. While negative emotions such as distress and anxiety were to be expected during this time of crisis, we also see signs of human resilience, including positive feelings such as determination, pride, and strength. A longitudinal study using online survey tools was conducted to assess people's changing moods during the pandemic in the UK. The Positive and Negative Affect Schedule (PANAS) was used to measure self-reported feelings and emotions across six study periods (phases) from March 2020 to July 2021. A total of 4,222 respondents participated in the survey. The results were analysed using a cross-sectional design for the full group in each study phase, while a prospective cohort analysis was used for the subset of participants who answered the survey in all six phases (n = 167). Gender, age and employment status were the factors most strongly associated with PANAS scores: older people, retirees, and women generally reported more positive moods, while younger people and unemployed people generally reported lower positive scores and higher negative scores, indicating more negative emotions. Additionally, people generally reported more positive feelings in the summer of 2021, which may be related to the relaxation of COVID-19-related policies in the UK as well as the introduction of vaccines for the general population.
This study is an important investigation into what allows for positivity during a crisis, and gives insights into periods and groups that may be vulnerable to increased negative emotions.
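PANAS scoring itself is mechanical: each of the two subscales sums ten items rated 1-5, giving subscale totals from 10 to 50. A minimal sketch with one hypothetical respondent (the item lists are the standard PANAS items; the response values are invented):

```python
# The ten positive-affect and ten negative-affect items of the PANAS
POSITIVE_ITEMS = ["interested", "excited", "strong", "enthusiastic", "proud",
                  "alert", "inspired", "determined", "attentive", "active"]
NEGATIVE_ITEMS = ["distressed", "upset", "guilty", "scared", "hostile",
                  "irritable", "ashamed", "nervous", "jittery", "afraid"]

def panas_scores(responses: dict) -> tuple:
    """Return (positive affect, negative affect) subscale totals (10-50 each)."""
    pa = sum(responses[item] for item in POSITIVE_ITEMS)
    na = sum(responses[item] for item in NEGATIVE_ITEMS)
    return pa, na

# One illustrative respondent: moderately positive, mildly negative
responses = {item: 4 for item in POSITIVE_ITEMS}
responses.update({item: 2 for item in NEGATIVE_ITEMS})
pa, na = panas_scores(responses)
```

Because the two subscales are scored independently, a respondent can legitimately report both high positive and high negative affect in the same phase.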
Analysis of environmental factors influence on endemic cholera risks in sub-Saharan Africa
The recurring cholera outbreaks in sub-Saharan Africa are of growing concern, especially considering the potential acceleration in the global trend of larger and more lethal cholera outbreaks due to the impacts of climate change. However, there is a scarcity of evidence-based research addressing the environmental and infrastructure factors that sustain cholera recurrence in Africa. This study adopts a statistical approach to investigate over two decades of endemic cholera outbreaks and their relationship with five environmental factors: water provision, sanitation provision, rising temperatures, increased rainfall, and GDP. The analysis covers thirteen of the forty-two countries in the mainland sub-Saharan region, collectively representing one-third of the region's territory and half of its population; this breadth enables the findings to be generalised at a regional level. Results from all analyses consistently associate water provision with cholera reduction. The stratified model links increased water provision with a reduction in cholera risk ranging from 4.2% to 84.1% in eight of the 13 countries, and increased sanitation provision with a risk reduction ranging from 9.8% to 68.9% in nine of the 13. These results indicate that the population's limited access to water and sanitation, as well as rising temperatures, are critical infrastructure and environmental factors contributing to endemic cholera and the heightened risk of outbreaks across the sub-Saharan region. These are therefore key areas for targeted interventions and cross-border collaboration to enhance resilience to outbreaks and help end endemic cholera in the region.
However, these results should be interpreted with caution, and further investigation is recommended to analyse in more detail the impact of infrastructure and environmental factors on reducing cholera risk.
Call detail record aggregation methodology impacts infectious disease models informed by human mobility
This paper demonstrates how two different methods of calculating population-level mobility from Call Detail Records (CDR) produce varying predictions of the spread of epidemics informed by these data. Our findings are based on one CDR dataset describing inter-district movement in Ghana in 2021, produced using two different aggregation methodologies. One methodology, "all pairs," is designed to retain long-distance network connections, while the other, "sequential," is designed to accurately reflect the volume of travel between locations. We show how the choice of methodology feeds through models of human mobility into the predictions of a metapopulation SEIR model of disease transmission. We also show that this impact varies depending on the location of pathogen introduction and the transmissibility of the infection. For central locations or highly transmissible diseases, we do not observe significant differences between aggregation methodologies in the predicted spread of disease. For less transmissible diseases, or those introduced into remote locations, the choice of aggregation methodology influences the speed of spatial spread as well as the size of the peak number of infections in individual districts. Our findings can help researchers and users of epidemiological models understand how methodological choices at the level of model inputs may influence the results of models of infectious disease transmission, as well as the circumstances in which these choices do not alter model predictions.
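To see how a mobility matrix feeds a metapopulation SEIR model, the sketch below advances a two-district toy system by one Euler step. The mobility matrix, rate parameters and populations are illustrative assumptions, not the paper's fitted Ghana model.

```python
def step_seir(S, E, I, R, M, beta=0.3, sigma=0.2, gamma=0.1, dt=1.0):
    """Advance a metapopulation SEIR model by one Euler step.

    S, E, I, R : per-district compartment counts
    M          : mobility matrix; M[i][j] is the fraction of district i's
                 contacts made in district j (rows sum to 1), as might be
                 derived from aggregated CDR data
    """
    n = len(S)
    pop = [S[k] + E[k] + I[k] + R[k] for k in range(n)]
    # force of infection felt by residents of district i, mixing via M
    lam = [beta * sum(M[i][j] * I[j] / pop[j] for j in range(n))
           for i in range(n)]
    S2, E2, I2, R2 = [], [], [], []
    for i in range(n):
        inf = lam[i] * S[i] * dt   # new exposures (S -> E)
        prog = sigma * E[i] * dt   # incubation ends (E -> I)
        rec = gamma * I[i] * dt    # recovery (I -> R)
        S2.append(S[i] - inf)
        E2.append(E[i] + inf - prog)
        I2.append(I[i] + prog - rec)
        R2.append(R[i] + rec)
    return S2, E2, I2, R2

# Two districts; infection seeded only in district 0, linked by 10% mobility
M = [[0.9, 0.1], [0.1, 0.9]]
S, E, I, R = step_seir([990.0, 1000.0], [0.0, 0.0], [10.0, 0.0],
                       [0.0, 0.0], M)
```

Because the off-diagonal entries of M couple the districts' forces of infection, changing the aggregation methodology that produced M changes how fast the epidemic reaches district 1, which is exactly the sensitivity the paper examines.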
Resource availability and capacity to implement multi-stranded cholera interventions in the north-east region of Nigeria
Background:
Limited healthcare facility (HCF) resources and capacity to implement multi-stranded cholera interventions (water, sanitation, and hygiene (WASH); surveillance; case management; and community engagement) can hinder the actualisation of the global strategic roadmap goals for cholera control, especially in settings made fragile by armed conflict, such as the north-east region of Nigeria. We therefore aimed to assess HCF resource availability and capacity to implement these cholera interventions in Adamawa and Bauchi States in Nigeria, and to assess their coordination in both states and in Abuja, where national coordination of cholera control is based.
Methods:
We conducted a cross-sectional survey using a face-to-face structured questionnaire to collect data on multi-stranded cholera interventions and their respective indicators in HCFs. We generated scores to describe the resource availability of each cholera intervention and categorised them as follows: 0–50 (low), 51–70 (moderate), 71–90 (high), and over 90 (excellent). Further, we defined an HCF with a high capacity to implement a cholera intervention as one with a score equal to or above the average intervention score.
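The scoring rules above translate directly into code. A minimal sketch, with the band cut-points taken from the text and the capacity rule applied against the average intervention score (function names are illustrative):

```python
def categorise_score(score: float) -> str:
    """Map a resource-availability score to the study's bands:
    0-50 low, 51-70 moderate, 71-90 high, over 90 excellent."""
    if score <= 50:
        return "low"
    if score <= 70:
        return "moderate"
    if score <= 90:
        return "high"
    return "excellent"

def high_capacity(score: float, intervention_scores: list) -> bool:
    """An HCF has high capacity for an intervention when its score is at
    or above the average score for that intervention across facilities."""
    return score >= sum(intervention_scores) / len(intervention_scores)

# Hypothetical WASH scores for three facilities
wash_scores = [60.0, 70.0, 80.0]
bands = [categorise_score(s) for s in wash_scores]
```

Note that "high capacity" is relative to the surveyed facilities, so a facility can be high-capacity for one intervention while its absolute score still falls in the "moderate" band.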
Results:
One hundred and twenty HCFs (55 in Adamawa and 65 in Bauchi) were surveyed in March 2021, most of which were primary healthcare centres (83%; 99/120). In both states, resource availability for WASH indicators had high to excellent median scores; surveillance and community engagement indicators had low median scores. Median resource availability scores for case management indicators ranged from low to moderate. Coordination of cholera interventions in Adamawa State and Abuja was high but low in Bauchi State. Overall, HCF capacity to implement multi-stranded cholera interventions was high, though higher in Adamawa State than in Bauchi State.
Conclusions:
The study found marked variation in HCF resource availability and capacity across locations and cholera interventions, and identified surveillance and laboratory services, case management, and community engagement as the interventions that should be prioritised for strengthening. The findings support adopting a differential approach to strengthening cholera interventions for better preparedness and response to cholera outbreaks.
Impact of water and sanitation services on cholera outbreaks in sub-Saharan Africa
While most parts of the world appear to have controlled cholera, the sub-Saharan African region still suffers from recurring cholera outbreaks and struggles to restrain their incidence. Recent research attributes eighty-three percent of cholera deaths between 2000 and 2015 to the sub-Saharan region. Poor water, sanitation and hygiene (WASH) services can be among the main risk factors contributing to the public health burden of cholera. Close human proximity in environments with poor hygiene conditions and little access to clean water is one explanation for how cholera takes root in non-coastal areas. The combination of these factors with the vulnerability of surface and groundwater resources to faecal contamination can favour the onset and propagation of outbreaks. This study investigated the correlation between cholera rates per population and the lack of basic drinking water and sanitation services in the sub-Saharan African countries where incident cases of cholera have been regularly reported to the World Health Organization (WHO) since 1991.
Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study
Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world.
Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231.
Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001).
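The crude incidence figures quoted above can be reproduced directly from the reported counts; the sketch below does so and adds an approximate Wilson interval for context (the study's own adjusted estimates came from Bayesian multilevel logistic regression, which this does not attempt).

```python
import math

def incidence_pct(events: int, n: int) -> float:
    """Crude 30-day SSI incidence as a percentage."""
    return 100.0 * events / n

def wilson_ci(events: int, n: int, z: float = 1.96):
    """Approximate 95% Wilson score interval for a proportion."""
    p = events / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# Counts reported in the abstract, by HDI group: (SSI events, patients)
groups = {"high": (691, 7339), "middle": (549, 3918), "low": (298, 1282)}
rates = {g: round(incidence_pct(e, n), 1) for g, (e, n) in groups.items()}
```

Running this reproduces the 9·4%, 14·0% and 23·2% crude incidences in the abstract; the Wilson intervals are unadjusted, so they are narrower in meaning than the reported credible interval for the adjusted odds ratio.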
Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in low-income and middle-income countries (LMICs) are needed to assess measures aiming to reduce this preventable complication.
Coalescing disparate data sources for the geospatial prediction of mosquito abundance, using Brazil as a motivating case study
One of the barriers to performing geospatial surveillance of mosquito occupancy or infestation anywhere in the world is the paucity of primary entomologic survey data geolocated at a residential property level and matched to important risk factor information (e.g., anthropogenic, environmental, and climate) that enables the spatial risk prediction of mosquito occupancy or infestation. Such data are invaluable pieces of information for academics, policy makers, and public health program managers operating in low-resource settings in Africa, Latin America, and Southeast Asia, where mosquitoes are typically endemic. The reality is that such data remain elusive in these low-resource settings and, where available, high-quality data that include both individual and spatial characteristics to inform the geospatial description and risk patterning of infestation remain rare. There are many reliable online sources of open-source spatial data that can be used to address such data paucity in this context. Therefore, the aims of this article are threefold: (1) to highlight where these reliable open-source data can be acquired and how they can be used as risk factors for making spatial predictions of mosquito occupancy in general; (2) to use Brazil as a case study to demonstrate how these datasets can be combined to predict the presence of arboviruses through ecological niche modeling using the maximum entropy algorithm; and (3) to discuss the benefits of using bespoke applications beyond these open-source online data sources, demonstrating how they can become the new "gold-standard" approach for gathering primary entomologic survey data. The scope of this article was mainly limited to a Brazilian context because it builds on an existing partnership with academics and stakeholders from environmental surveillance agencies in the states of Pernambuco and Paraiba.
The analysis presented in this article was also limited to a specific mosquito species, Aedes aegypti, due to its endemic status in Brazil.
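At its core, the maximum entropy algorithm mentioned in aim (2) assigns each grid cell a probability proportional to the exponential of a weighted sum of its environmental features. The sketch below shows that normalisation step with hand-picked weights and hypothetical cells; a real Maxent fit would estimate the weights from occurrence and background data.

```python
import math

def maxent_probs(features, weights):
    """Maximum-entropy (Gibbs) distribution over grid cells.

    features : one feature vector per cell (e.g. temperature, rainfall
               extracted from open geospatial layers, rescaled to [0, 1])
    weights  : feature weights (lambdas); here chosen by hand, not fitted
    Returns P(cell) proportional to exp(weights . features), summing to 1.
    """
    scores = [math.exp(sum(w * f for w, f in zip(weights, fv)))
              for fv in features]
    total = sum(scores)
    return [s / total for s in scores]

# Three hypothetical cells with (temperature, rainfall) features
features = [(0.9, 0.8), (0.5, 0.4), (0.1, 0.2)]
weights = (1.2, 0.7)   # illustrative, not fitted to real occurrence data
probs = maxent_probs(features, weights)
```

With positive weights on both features, warmer and wetter cells receive higher predicted suitability, which is the qualitative pattern a fitted Aedes aegypti niche model would be expected to show.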
- …
