    Food-borne disease and climate change in the United Kingdom

    This review examined the likely impact of climate change upon food-borne disease in the UK, using Campylobacter and Salmonella as example organisms. Campylobacter is an important cause of food-borne disease and an increasing public health threat. There is a reasonable evidence base that the environment and weather play a role in its transmission to humans; however, uncertainty as to the precise mechanisms through which weather affects disease makes it difficult to assess the likely impact of climate change. There are strong positive associations between Salmonella cases and ambient temperature, and a clear understanding of the mechanisms behind this. However, because the incidence of Salmonella disease is declining in the UK, any climate-change-related increases are likely to be small. For both Salmonella and Campylobacter, disease incidence is greatest in older adults and young children. There are many pathways through which climate change may affect food, but only a few of these have been rigorously examined, leaving a high degree of uncertainty as to what the impacts of climate change will be. Food is highly regulated at the national and EU levels. This provides the UK with resilience to climate change, as well as the potential to adapt to its consequences, but it is unknown whether these safeguards are sufficient in the context of a changing climate.

    Regional Differences in Presence of Shiga toxin-producing Escherichia coli Virulence-Associated Genes in the Environment in the North West and East Anglian regions of England

    Shiga toxin-producing Escherichia coli (STEC) is carried in the intestine of ruminant animals, and outbreaks have occurred after contact with ruminant animals or their environment. The presence of STEC virulence genes in the environment was investigated along recreational walking paths in the North West and East Anglia regions of England. In all, 720 boot sock samples from walkers' shoes were collected between April 2013 and July 2014. Multiplex PCR was used to detect E. coli, based on amplification of the uidA gene, and to investigate the STEC-associated virulence genes eaeA, stx1 and stx2. The eaeA virulence gene was detected in 45·5% of the samples, while stx1 and/or stx2 were detected in 12·4% of samples. There was a difference between the two regions sampled, with the North West exhibiting a higher proportion of stx-positive boot socks than East Anglia. In univariate analysis, ground conditions, river flow and temperature were associated with positive boot socks. The detection of stx genes in the soil samples suggests that STEC is present in the English countryside and that individuals may be at risk of infection after outdoor activities even where there is no direct contact with animals. Significance and Impact of the Study: Several outbreaks within the UK have highlighted the danger of contracting Shiga toxin-producing Escherichia coli from contact with areas recently vacated by livestock. This is more likely to occur for STEC infections than for other zoonotic bacteria, given the low infectious dose required. While studies have determined the prevalence of STEC within farms and petting zoos, the risk to individuals enjoying recreational outdoor activities near where livestock may be present has been less well researched. This study describes the prevalence with which stx genes, indicative of STEC bacteria, were found in the environment in the English countryside.
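
    As an illustration of the univariate analysis mentioned above, the sketch below fits a single-covariate logistic regression of boot sock stx positivity on ambient temperature. The data frame, column names and simulated values are assumptions for illustration, not the study's data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        socks = pd.DataFrame({
            "stx_positive": rng.integers(0, 2, 720),   # 1 if stx1 and/or stx2 detected on the boot sock
            "temperature": rng.normal(12, 5, 720),     # ambient temperature at sampling (deg C)
        })

        # Univariate logistic regression: one environmental covariate at a time.
        fit = smf.logit("stx_positive ~ temperature", data=socks).fit(disp=False)
        print(np.exp(fit.params["temperature"]))       # odds ratio per 1 deg C increase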

    The effects of river flooding on dioxin and PCBs in beef

    In 2008-2010, samples of meat from 40 beef cattle, along with grass, soil and commercial feed, taken from ten matched pairs of flood-prone and control farms, were analysed for PCDD/Fs and PCBs. Concentrations were higher in soil and grass from flood-prone farms. The beef samples from flood-prone farms had total TEQ levels about 20% higher than those from control farms, and a majority of flood-prone farms (7/10) had higher median levels in beef than the corresponding control farm. This first controlled investigation into PCDD/F and PCB contamination in beef produced on flood-prone land presents robust evidence that flooding is a contaminant transfer mechanism to cattle raised on river catchments with a history of urbanisation and industrialisation. PCDD/F and PCB sources in these river systems are likely to be a legacy of contamination from previous industrialisation, as well as more recent combustion activity or pollution events.
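
    A minimal sketch of how the matched-pair design above could be analysed, assuming ten invented per-farm median TEQ values; the figures are placeholders, not the study's measurements.

        import numpy as np
        from scipy.stats import wilcoxon

        # Hypothetical median total TEQ in beef (pg TEQ/g fat) for ten matched farm pairs.
        flood_prone = np.array([0.42, 0.35, 0.50, 0.28, 0.61, 0.33, 0.45, 0.30, 0.55, 0.40])
        control     = np.array([0.35, 0.30, 0.38, 0.30, 0.48, 0.36, 0.37, 0.27, 0.44, 0.41])

        # Paired, non-parametric comparison across the matched farm pairs.
        stat, p = wilcoxon(flood_prone, control)
        print(f"median pair difference = {np.median(flood_prone - control):.3f}, p = {p:.3f}")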

    Community use of facemasks and similar barriers to prevent respiratory illness such as COVID-19: A rapid scoping review

    Background: Evidence for face-mask wearing in the community to protect against respiratory disease is unclear. Aim: To assess the effectiveness of wearing face masks in the community to prevent respiratory disease, and to recommend improvements to this evidence base. Methods: We systematically searched Scopus, Embase and MEDLINE for studies evaluating respiratory disease incidence after face-mask wearing (or not). Narrative synthesis and random-effects meta-analysis of attack rates for primary and secondary prevention were performed, subgrouped by design, setting, face barrier type, and who wore the mask. The preferred outcome was influenza-like illness. Grading of Recommendations, Assessment, Development and Evaluations (GRADE) quality assessment was undertaken and deficits in the evidence base described. Results: 33 studies (12 randomised controlled trials (RCTs)) were included. Mask wearing reduced primary infection by 6% (odds ratio (OR): 0.94; 95% CI: 0.75–1.19 for RCTs) to 61% (OR: 0.85; 95% CI: 0.32–2.27; OR: 0.39; 95% CI: 0.18–0.84; and OR: 0.61; 95% CI: 0.45–0.85 for cohort, case–control and cross-sectional studies respectively). RCTs suggested the lowest secondary attack rates when both well and ill household members wore masks (OR: 0.81; 95% CI: 0.48–1.37). While RCTs might underestimate effects due to poor compliance and controls wearing masks, observational studies likely overestimate effects, as mask wearing might be associated with other risk-averse behaviours. GRADE quality was low or very low. Conclusion: Wearing face masks may reduce primary respiratory infection risk, probably by 6–15%. It is important to balance evidence from RCTs and observational studies when their conclusions differ widely and both are at risk of significant bias. COVID-19-specific studies are required.
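
    To make the random-effects meta-analysis concrete, the sketch below pools study odds ratios with the DerSimonian-Laird estimator, one common form of such an analysis. The three ORs and standard errors are illustrative placeholders rather than values from the review.

        import numpy as np

        def random_effects_pool(log_or, se):
            """Pool study log-odds-ratios using DerSimonian-Laird between-study variance."""
            w = 1.0 / se**2                                   # fixed-effect weights
            fixed = np.sum(w * log_or) / np.sum(w)
            q = np.sum(w * (log_or - fixed) ** 2)             # Cochran's Q
            c = np.sum(w) - np.sum(w**2) / np.sum(w)
            tau2 = max(0.0, (q - (len(log_or) - 1)) / c)      # between-study variance
            w_star = 1.0 / (se**2 + tau2)                     # random-effects weights
            pooled = np.sum(w_star * log_or) / np.sum(w_star)
            pooled_se = np.sqrt(1.0 / np.sum(w_star))
            return (np.exp(pooled),
                    np.exp(pooled - 1.96 * pooled_se),
                    np.exp(pooled + 1.96 * pooled_se))

        # Illustrative odds ratios and standard errors from three hypothetical mask RCTs.
        log_or_example = np.log(np.array([0.90, 1.05, 0.80]))
        se_example = np.array([0.20, 0.25, 0.30])
        print(random_effects_pool(log_or_example, se_example))    # pooled OR and 95% CI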

    Using infectious intestinal disease surveillance data to explore illness aetiology: a cryptosporidiosis case study

    Infectious intestinal disease (IID) surveillance data are an under-utilised source of information on the geography of illness. This paper uses a case study of cryptosporidiosis in England and Wales to demonstrate how these data can be converted into area-based rates and how the factors underlying illness geography can be investigated. Ascertainment bias is common in surveillance datasets, and we develop techniques to investigate and control for it. Rural areas, locations with many livestock and localities with poor water treatment had elevated levels of cryptosporidiosis. These findings accord with previous research, validating the techniques developed. Their use in future studies investigating the geography of IID is therefore recommended.
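
    A minimal sketch of converting surveillance counts into area-based rates, assuming a hypothetical table of case counts and populations by area; the paper's specific ascertainment-bias adjustments are not reproduced here.

        import pandas as pd

        # Hypothetical surveillance table: one row per area over the study period.
        surveillance = pd.DataFrame({
            "area":       ["A", "B", "C"],
            "cases":      [120, 45, 300],                 # reported cryptosporidiosis cases
            "population": [250_000, 80_000, 900_000],     # mid-period population
        })

        surveillance["rate_per_100k"] = 100_000 * surveillance["cases"] / surveillance["population"]

        # Rates can then be modelled against area-level covariates (rurality, livestock
        # density, water treatment), with reporting-centre effects as a crude control
        # for ascertainment differences between laboratories.
        print(surveillance)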

    Application of kernel smoothing to estimate the spatio-temporal variation in risk of STEC O157 in England

    Identifying geographical areas with significantly higher or lower rates of infectious disease can provide important aetiological clues to inform the development of public health policy and interventions designed to reduce morbidity. We applied kernel smoothing to estimate the spatial and spatio-temporal variation in risk of STEC O157 infection in England between 2009 and 2015, and to explore differences between the residential locations of cases reporting travel and those not reporting travel. We provide evidence that the distribution of STEC O157 infection in England is non-uniform with respect to the distribution of the at-risk population; that the spatial distribution of the three main genetic lineages infecting humans (I, II and I/II) differs significantly; and that the spatio-temporal risk is highly dynamic. Our results also indicate that cases of STEC O157 reporting travel within or outside the UK are more likely to live in the south/south-east of the country, meaning that their residential location may not reflect the location of the exposure that led to their infection. We suggest that the observed variation in risk reflects exposure to sources of STEC O157 that are geographically prescribed. These differences may be related to a combination of changes in the strains circulating in the ruminant reservoir, animal movements (livestock, birds or wildlife) and the behaviour of individuals prior to infection. Further work is needed to identify the importance of behaviours and exposures reported by cases relative to residential location.
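
    The sketch below illustrates the general idea of a kernel-smoothed relative risk surface (case density relative to at-risk population density) on synthetic coordinates. The bandwidth selection and spatio-temporal extension used in the paper are not reproduced; the data are simulated for illustration only.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        cases = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2))        # case residences (x, y)
        population = rng.normal(loc=[0.2, 0.1], scale=1.0, size=(2000, 2))  # at-risk population sample

        case_kde = gaussian_kde(cases.T)          # f(x): smoothed case intensity
        pop_kde = gaussian_kde(population.T)      # g(x): smoothed population intensity

        # Log relative risk surface on a grid: areas where the value is well above 0 show excess risk.
        grid_x, grid_y = np.meshgrid(np.linspace(-2, 2, 50), np.linspace(-2, 2, 50))
        grid = np.vstack([grid_x.ravel(), grid_y.ravel()])
        log_rel_risk = np.log(case_kde(grid)) - np.log(pop_kde(grid))
        print(log_rel_risk.reshape(50, 50).max())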

    Risk perception from the consumption of untreated drinking water in a small island community

    A small island community in Malaysia uses a gravity-fed drinking water supply and has rejected water treatment offered by the authorities. This study evaluated the community's risk perception of its untreated water supply by interviewing one adult per household in four of the eight villages on the island. The survey asked questions on risk perception, socioeconomic characteristics and perception of water supply quality. Water samples were collected from 24 sampling locations across the four villages, and 91.7% of them were positive for E. coli. The study surveyed 218 households and found that 61.5% of respondents agreed to some degree that the water is safe to drink without treatment, 67.9% disagreed to some degree that drinking tap water is associated with health risks, and 73.3% agreed to some degree that it is safe to drink directly from taps fitted with water filters. Using factor analysis to group the risk perception questions and a multivariable GLM to explore relationships with underlying factors, the study found that older age, lower income, positive perception of water odour and positive perception of water supply reliability were associated with lower risk perception. The village of residence also had a significant effect on risk perception in the model.
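
    A sketch of the two-stage analysis described above: factor analysis of the risk perception items followed by a GLM of the resulting factor score on respondent characteristics. The item names, covariates and simulated responses are assumptions for illustration, not the survey data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(1)
        n = 218
        # Four hypothetical 5-point Likert items on perceived water safety.
        items = pd.DataFrame(rng.integers(1, 6, size=(n, 4)),
                             columns=["safe_untreated", "health_risk", "filter_safe", "reliability"])

        fa = FactorAnalysis(n_components=1, random_state=0)
        risk_score = fa.fit_transform(items)[:, 0]          # single risk-perception factor

        df = pd.DataFrame({
            "risk_score": risk_score,
            "age": rng.integers(18, 80, n),
            "income": rng.choice(["low", "mid", "high"], n),
            "village": rng.choice(["V1", "V2", "V3", "V4"], n),
        })
        model = smf.glm("risk_score ~ age + income + village", data=df).fit()
        print(model.summary())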

    Modifiable Risk Factors for Common Ragweed (Ambrosia artemisiifolia) Allergy and Disease in Children: A Case-Control Study

    Ragweed allergy is a major public health concern. Within Europe, ragweed is an introduced species, and research has indicated that amounts of ragweed pollen are likely to increase across Europe due to climate change, with corresponding increases in ragweed allergy. To address this threat, we need to improve our understanding of factors that predispose to allergic sensitisation to ragweed and to disease, focusing specifically on factors that are potentially modifiable (i.e., environmental). In this study, a total of 4013 children aged 2–13 years were recruited across Croatia to undergo skin prick tests to determine sensitisation to ragweed and other aeroallergens. A parental questionnaire collected information on the home environment, lifestyle, family and personal medical history, and socioeconomic status. Environmental variables were obtained using Geographical Information Systems and data from nearby pollen, weather and air pollution stations. Logistic regression (clustered on school) was performed to identify risk factors for allergic sensitisation and disease. Ragweed sensitisation was strongly associated with ragweed pollen at levels over 5000 grains m⁻³ year⁻¹; above these levels, the risk of sensitisation was 12–16 times greater than in low pollen areas with about 400 grains m⁻³ year⁻¹. Genetic factors were strongly associated with sensitisation, but nearly all potentially modifiable factors, including measures of local land use and proximity to potential sources of ragweed pollen, were not significant. Rural residence was protective (odds ratio (OR) 0.73, 95% confidence interval (CI) 0.55–0.98), but the factors underlying this association were unclear. Being sensitised to ragweed doubled (OR 2.17, 95% CI 1.59–2.96) the risk of rhinoconjunctivitis; no other potentially modifiable risk factors were associated with rhinoconjunctivitis. In summary, ragweed sensitisation was strongly associated with ragweed pollen levels, and sensitisation was significantly associated with rhinoconjunctivitis. Apart from ragweed pollen levels, few potentially modifiable factors were significantly associated with ragweed sensitisation. Hence, strategies to lower the risk of sensitisation should focus on ragweed control.
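
    A minimal sketch of a logistic regression for sensitisation with standard errors clustered on school, in the spirit of the analysis described above. The variable names and simulated data are illustrative assumptions only.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 4013
        df = pd.DataFrame({
            "sensitised": rng.integers(0, 2, n),      # skin prick test positive for ragweed
            "high_pollen": rng.integers(0, 2, n),     # lives in area over 5000 grains per m3 per year
            "rural": rng.integers(0, 2, n),
            "family_atopy": rng.integers(0, 2, n),
            "school": rng.integers(0, 120, n),        # clustering unit
        })

        model = smf.logit("sensitised ~ high_pollen + rural + family_atopy", data=df)
        result = model.fit(disp=False, cov_type="cluster", cov_kwds={"groups": df["school"]})
        print(np.exp(result.params))                  # odds ratios with cluster-robust inference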

    Machine learning to refine decision making within a syndromic surveillance service

    Background: Worldwide, syndromic surveillance is increasingly used for improved and timely situational awareness and early identification of public health threats. Syndromic data streams are fed into detection algorithms, which produce statistical alarms highlighting potential activity of public health importance. All alarms must be assessed to confirm whether they are of public health importance. In England, approximately 100 alarms are generated daily and, although their analysis is formalised through a risk assessment process, the process requires considerable time, training and maintenance of an expertise base to determine which alarms are of public health importance. The process is further complicated by the observation that only 0.1% of statistical alarms are deemed to be of public health importance. The aim of this study was therefore to evaluate machine learning as a tool for computer-assisted human decision-making when assessing statistical alarms. Methods: A record of the risk assessment process was obtained from Public Health England for all 67,505 statistical alarms between August 2013 and October 2015. This record contained information on the characteristics of each alarm (e.g. size, location). We used three Bayesian classifiers (naïve Bayes, tree-augmented naïve Bayes and Multinets) to examine the risk assessment record with respect to the final ‘Decision’ outcome made by an epidemiologist: ‘Alert’, ‘Monitor’ or ‘No-action’. Two further classifiers based upon tree-augmented naïve Bayes and Multinets were implemented to account for the predominance of ‘No-action’ outcomes. Results: The attributes of each individual risk assessment were linked to the final decision made by an epidemiologist, providing confidence in the current process. The naïve Bayes classifier performed best, correctly classifying 51.5% of ‘Alert’ outcomes. If the ‘Alert’ and ‘Monitor’ actions are combined, performance increases to 82.6% correctly classified. We demonstrate how a decision support system based upon a naïve Bayes classifier could be operationalised within a live syndromic surveillance system. Conclusions: Within syndromic surveillance systems, machine learning techniques have the potential to make risk assessment following statistical alarms more automated, robust and rigorous. However, our results also highlight the importance of specialist human input to the process.
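
    A minimal sketch of how a naïve Bayes classifier could map alarm attributes to the epidemiologist's decision, assuming hypothetical attributes (syndrome, region, alarm size) and synthetic records; the real risk assessment features and data are not public here.

        import pandas as pd
        from sklearn.naive_bayes import CategoricalNB
        from sklearn.preprocessing import OrdinalEncoder
        from sklearn.model_selection import train_test_split

        # Hypothetical risk-assessment records: one row per statistical alarm.
        alarms = pd.DataFrame({
            "syndrome":   ["GI", "respiratory", "GI", "rash", "respiratory", "GI"] * 50,
            "region":     ["NW", "SE", "London", "NW", "SE", "London"] * 50,
            "alarm_size": ["small", "large", "small", "small", "large", "small"] * 50,
            "decision":   ["No-action", "Alert", "No-action", "Monitor", "Alert", "No-action"] * 50,
        })

        # Encode categorical attributes as integer codes for the categorical naïve Bayes model.
        X = OrdinalEncoder().fit_transform(alarms[["syndrome", "region", "alarm_size"]]).astype(int)
        y = alarms["decision"]
        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

        clf = CategoricalNB().fit(X_train, y_train)   # P(decision) * product of P(attribute | decision)
        print(clf.score(X_test, y_test))              # proportion of decisions recovered on held-out alarms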