
    Machine learning to refine decision making within a syndromic surveillance service

    Background: Worldwide, syndromic surveillance is increasingly used for improved and timely situational awareness and early identification of public health threats. Syndromic data streams are fed into detection algorithms, which produce statistical alarms highlighting potential activity of public health importance. All alarms must be assessed to confirm whether they are of public health importance. In England, approximately 100 alarms are generated daily and, although their analysis is formalised through a risk assessment process, the process requires notable time, training, and maintenance of an expertise base to determine which alarms are of public health importance. The process is made more complicated by the observation that only 0.1% of statistical alarms are deemed to be of public health importance. Therefore, the aim of this study was to evaluate machine learning as a tool for computer-assisted human decision-making when assessing statistical alarms. Methods: A record of the risk assessment process was obtained from Public Health England for all 67,505 statistical alarms generated between August 2013 and October 2015. This record contained information on the characteristics of each alarm (e.g. size, location). We used three Bayesian classifiers (naïve Bayes, tree-augmented naïve Bayes and Multinets) to examine the risk assessment record in England with respect to the final ‘Decision’ outcome made by an epidemiologist: ‘Alert’, ‘Monitor’ or ‘No-action’. Two further classifications based upon tree-augmented naïve Bayes and Multinets were implemented to account for the predominance of ‘No-action’ outcomes. Results: The attributes of each individual risk assessment were linked to the final decision made by an epidemiologist, providing confidence in the current process. The naïve Bayesian classifier performed best, correctly classifying 51.5% of ‘Alert’ outcomes. When the ‘Alert’ and ‘Monitor’ actions were combined, performance increased to 82.6% correctly classified. We demonstrate how a decision support system based upon a naïve Bayes classifier could be deployed within an operational syndromic surveillance system. Conclusions: Within syndromic surveillance systems, machine learning techniques have the potential to make risk assessment following statistical alarms more automated, robust, and rigorous. However, our results also highlight the importance of specialist human input to the process.
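
    The classification step in the Methods can be sketched as follows. This is a minimal illustration only: the alarm attributes, values and labels below are invented placeholders rather than the actual PHE risk-assessment fields, and scikit-learn's CategoricalNB stands in for the study's naïve Bayes classifier; the tree-augmented naïve Bayes and Multinet variants are not shown.

        import pandas as pd
        from sklearn.model_selection import train_test_split
        from sklearn.naive_bayes import CategoricalNB
        from sklearn.preprocessing import OrdinalEncoder

        # Illustrative alarm attributes and the epidemiologist's final decision
        # ('Alert', 'Monitor' or 'No-action'); the real record holds many more fields.
        alarms = pd.DataFrame({
            "size":     ["small", "large", "small", "medium", "large", "small"] * 50,
            "region":   ["London", "North", "South", "London", "North", "South"] * 50,
            "syndrome": ["GI", "respiratory", "GI", "rash", "respiratory", "GI"] * 50,
            "decision": ["No-action", "Monitor", "No-action", "Alert", "Monitor", "No-action"] * 50,
        })

        # Encode the categorical attributes as integers for the categorical naive Bayes model.
        X = OrdinalEncoder().fit_transform(alarms[["size", "region", "syndrome"]])
        y = alarms["decision"]
        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

        clf = CategoricalNB().fit(X_train, y_train)
        print("Held-out accuracy:", clf.score(X_test, y_test))

        # Class posteriors for new alarms could drive a decision-support display,
        # flagging alarms with a non-trivial probability of 'Alert' for human review.
        print(clf.predict_proba(X_test[:3]))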

    Exploring Campylobacter seasonality across Europe (2008-2016) using The European Surveillance System TESSy

    Background: Campylobacteriosis is the most commonly reported food-borne infection in the European Union, with an annual number of cases estimated at around 9 million. In many countries, campylobacteriosis has a striking seasonal peak during early/mid-summer. In the early 2000s, several publications reported on campylobacteriosis seasonality across Europe and associations with temperature and precipitation. Subsequently, many European countries have introduced new measures against this foodborne disease. Aim: To examine how the seasonality of campylobacteriosis varied across Europe from 2008 to 2016, to explore associations with temperature and precipitation, and to compare these results with previous studies. We also sought to assess the utility of The European Surveillance System (TESSy) for cross-European seasonal analysis of campylobacteriosis. Methods: Ward’s Minimum Variance Clustering was used to group countries with similar seasonal patterns of campylobacteriosis. A two-stage multivariate meta-analysis methodology was used to explore associations with temperature and precipitation. Results: Nordic countries had a pronounced seasonal campylobacteriosis peak in mid- to late summer (weeks 29–32), while most other European countries had a less pronounced peak earlier in the year. The United Kingdom, Ireland, Hungary and Slovakia had a slightly earlier peak (week 24). Campylobacteriosis cases were positively associated with temperature and, to a lesser degree, precipitation. Conclusion: Across Europe, the strength and timing of campylobacteriosis peaks have remained similar to those observed previously. In addition, TESSy is a useful resource for cross-European seasonal analysis of infectious diseases such as campylobacteriosis, but its utility depends upon each country’s reporting infrastructure.
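
    The clustering step can be sketched as below. The weekly country profiles are synthetic stand-ins for the TESSy case counts, and SciPy's Ward linkage plays the role of Ward's Minimum Variance Clustering; the two-stage meta-analysis of temperature and precipitation is not shown.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        rng = np.random.default_rng(0)
        weeks = np.arange(1, 53)

        def seasonal_profile(peak_week, amplitude):
            """Synthetic, normalised weekly case profile with a single summer peak."""
            profile = 1 + amplitude * np.exp(-0.5 * ((weeks - peak_week) / 4.0) ** 2)
            profile += rng.normal(0, 0.02, weeks.size)
            return profile / profile.sum()

        # Invented profiles: Nordic-style late, pronounced peaks; earlier, flatter peaks elsewhere.
        countries = {
            "Finland": seasonal_profile(30, 3.0),
            "Sweden":  seasonal_profile(31, 2.8),
            "UK":      seasonal_profile(24, 1.2),
            "Ireland": seasonal_profile(24, 1.1),
            "Spain":   seasonal_profile(26, 0.6),
        }

        X = np.vstack(list(countries.values()))
        Z = linkage(X, method="ward")                    # Ward's minimum variance linkage
        labels = fcluster(Z, t=2, criterion="maxclust")  # cut the tree into two groups
        for country, label in zip(countries, labels):
            print(country, "-> cluster", label)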

    Projecting the risk of mosquito-borne diseases in a warmer and more populated world: a multi-model, multi-scenario intercomparison modelling study

    Background: Mosquito-borne diseases are expanding their range, and re-emerging in areas where they had subsided for decades. The extent to which climate change influences the transmission suitability and population at risk of mosquito-borne diseases across different altitudes and population densities has not been investigated. The aim of this study was to quantify the extent to which climate change will influence the length of the transmission season and estimate the population at risk of mosquito-borne diseases in the future, given different population densities across an altitudinal gradient. Methods: Using a multi-model multi-scenario framework, we estimated changes in the length of the transmission season and global population at risk of malaria and dengue for different altitudes and population densities for the period 1951–99. We generated projections from six mosquito-borne disease models, driven by four global circulation models, using four representative concentration pathways, and three shared socioeconomic pathways. Findings: We show that malaria suitability will increase by 1·6 additional months (mean 0·5, SE 0·03) in tropical highlands in the African region, the Eastern Mediterranean region, and the region of the Americas. Dengue suitability will increase in lowlands in the Western Pacific region and the Eastern Mediterranean region by 4·0 additional months (mean 1·7, SE 0·2). Increases in the climatic suitability of both diseases will be greater in rural areas than in urban areas. The epidemic belt for both diseases will expand towards temperate areas. The population at risk of both diseases might increase by up to an additional 4·7 billion people by 2070 relative to 1970–99, particularly in lowlands and urban areas. Interpretation: Rising global mean temperature will increase the climatic suitability of both diseases, particularly in already endemic areas. The predicted expansion towards higher altitudes and temperate regions suggests that outbreaks can occur in areas where people might be immunologically naive and public health systems unprepared. The population at risk of malaria and dengue will be higher in densely populated urban areas in the WHO African region, South-East Asia region, and the region of the Americas, although we did not account for urban heat-island effects, which can further alter the risk of disease transmission.
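
    The two headline quantities, the change in the length of the transmission season and the population at risk, can be illustrated on a toy grid. Everything below, including the suitability threshold and the three-month risk criterion, is a hypothetical placeholder rather than the study's disease models or climate projections.

        import numpy as np

        rng = np.random.default_rng(1)
        n_cells = 1_000                                     # toy grid cells
        population = rng.integers(0, 50_000, n_cells)       # people per cell

        # Monthly climatic suitability (0-1) per cell: a baseline field and a
        # warmer-scenario field shifted upwards as a crude stand-in for projections.
        baseline_suit = rng.random((n_cells, 12))
        future_suit = np.clip(baseline_suit + 0.1, 0.0, 1.0)

        THRESHOLD = 0.5                                     # hypothetical suitability cut-off
        baseline_months = (baseline_suit > THRESHOLD).sum(axis=1)
        future_months = (future_suit > THRESHOLD).sum(axis=1)

        # Change in the length of the transmission season, in months per year.
        extra_months = future_months - baseline_months
        print("Mean additional suitable months:", extra_months.mean())

        # Population at risk: people living in cells suitable for >= 3 months per year.
        at_risk = population[future_months >= 3].sum()
        print("Population at risk under the scenario:", at_risk)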

    Environmental factors associated with general practitioner consultations for allergic rhinitis in London, England: a retrospective time series analysis

    Objectives: To identify key predictors of general practitioner (GP) consultations for allergic rhinitis (AR) using meteorological and environmental data. Design: A retrospective, time series analysis of GP consultations for AR. Setting: A large GP surveillance network of GP practices in the London area. Participants: The study population was all persons who presented to general practices in London that report to the Public Health England GP in-hours syndromic surveillance system during the study period (3 April 2012 to 11 August 2014). Primary measure: Consultations for AR (numbers of consultations). Results: During the study period, there were 186 401 GP consultations for AR. High grass and nettle pollen counts (combined) were associated with the highest increases in consultations (for the category 216–270 grains/m³, relative risk (RR) 3.33, 95% CI 2.69 to 4.12), followed by high tree (oak, birch and plane combined) pollen counts (for the category 260–325 grains/m³, RR 1.69, 95% CI 1.32 to 2.15) and average daily temperatures between 15°C and 20°C (RR 1.47, 95% CI 1.20 to 1.81). Higher levels of nitrogen dioxide (NO2) appeared to be associated with increased consultations (for the category 70–85 µg/m³, RR 1.33, 95% CI 1.03 to 1.71), but a significant effect was not found with ozone. Higher daily rainfall was associated with fewer consultations (15–20 mm/day; RR 0.812, 95% CI 0.674 to 0.980). Conclusions: Changes in grass, nettle or tree pollen counts, temperatures between 15°C and 20°C, and (to a lesser extent) NO2 concentrations were found to be associated with increased consultations for AR. Rainfall had a negative effect. In the context of climate change and continued exposure to environmental air pollution, intelligent use of these data will aid the targeting of public health messages and the planning of healthcare demand.
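
    The relative risks quoted above come from a regression of daily consultation counts on categorised exposures; a hedged sketch of that kind of model is shown below. The data frame is simulated and the variable names are invented; the published analysis used more exposure categories and adjustment terms than this toy Poisson model.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n_days = 500
        df = pd.DataFrame({
            "consultations": rng.poisson(200, n_days),              # daily AR consultations
            "grass_pollen": pd.cut(rng.integers(0, 300, n_days),    # categorised pollen count
                                   bins=[-1, 50, 150, 300], labels=["low", "mid", "high"]),
            "temp_15_20": rng.integers(0, 2, n_days),               # 1 if mean temperature in 15-20 C
            "no2": rng.normal(40, 15, n_days),                      # NO2 concentration, ug/m3
            "rainfall_mm": rng.gamma(1.5, 3.0, n_days),             # daily rainfall, mm
        })

        model = smf.glm(
            "consultations ~ C(grass_pollen) + temp_15_20 + no2 + rainfall_mm",
            data=df, family=sm.families.Poisson(),
        ).fit()

        # Relative risks and 95% CIs are the exponentiated coefficients and their intervals.
        print(np.exp(model.params))
        print(np.exp(model.conf_int()))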

    Climate change and the emergence of vector-borne diseases in Europe: Case study of dengue fever

    Background: Dengue fever is the most prevalent mosquito-borne viral disease worldwide. Dengue transmission is critically dependent on climatic factors, and there is much concern as to whether climate change would spread the disease to areas currently unaffected. The occurrence of autochthonous infections in Croatia and France in 2010 has raised concerns about a potential re-emergence of dengue in Europe. The objective of this study is to estimate dengue risk in Europe under climate change scenarios. Methods: We used a Generalized Additive Model (GAM) to estimate dengue fever risk as a function of climatic variables (maximum temperature, minimum temperature, precipitation, humidity) and socioeconomic factors (population density, urbanisation, GDP per capita and population size), under contemporary conditions (1985–2007) in Mexico. We then used our model estimates to project dengue incidence across Europe under baseline conditions (1961–1990) and three climate change scenarios: short-term (2011–2040), medium-term (2041–2070) and long-term (2071–2100). The model was used to calculate the average number of yearly dengue cases on a 10 × 10 km grid covering the entire land surface of the 27 current EU member states. To our knowledge, this is the first attempt to model dengue fever risk in Europe in terms of disease occurrence rather than mosquito presence. Results: The results were mapped using a Geographical Information System (GIS), allowing identification of areas at high risk. Dengue fever hot spots were clustered around the coastal areas of the Mediterranean and Adriatic seas and the Po Valley in northern Italy. Conclusions: This risk assessment study is likely to be a valuable tool assisting effective and targeted adaptation responses to reduce the likely increased burden of dengue fever in a warmer world.
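
    A hedged sketch of a model in this spirit is shown below: a Poisson regression with B-spline terms standing in for the study's Generalized Additive Model. The covariates and data are synthetic placeholders, not the Mexican surveillance data on which the published model was fitted, and projection onto a European 10 × 10 km grid is only indicated in a comment.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n = 1_000
        df = pd.DataFrame({
            "cases": rng.poisson(5, n),                # dengue cases per unit area and year
            "tmax": rng.normal(30, 4, n),              # maximum temperature, C
            "tmin": rng.normal(18, 4, n),              # minimum temperature, C
            "precip": rng.gamma(2.0, 30.0, n),         # precipitation, mm
            "pop_density": rng.lognormal(4, 1, n),     # people per km2
        })

        # B-spline terms (via patsy's bs) approximate the smooth terms of a GAM.
        gam_like = smf.glm(
            "cases ~ bs(tmax, df=4) + bs(tmin, df=4) + bs(precip, df=4) + np.log(pop_density)",
            data=df, family=sm.families.Poisson(),
        ).fit()
        print(gam_like.summary())

        # Projected incidence for a future scenario would be obtained with
        # gam_like.predict(new_df), where new_df holds the projected covariates
        # for each grid cell.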

    COVID-19 symptoms at hospital admission vary with age and sex: results from the ISARIC prospective multinational observational study

    Background: The ISARIC prospective multinational observational study is the largest cohort of hospitalized patients with COVID-19. We present relationships of age, sex, and nationality to presenting symptoms. Methods: International, prospective observational study of 60 109 hospitalized symptomatic patients with laboratory-confirmed COVID-19 recruited from 43 countries between 30 January and 3 August 2020. Logistic regression was performed to evaluate relationships of age and sex to published COVID-19 case definitions and the most commonly reported symptoms. Results: ‘Typical’ symptoms of fever (69%), cough (68%) and shortness of breath (66%) were the most commonly reported, and 92% of patients experienced at least one of these. Prevalence of typical symptoms was greatest in 30- to 60-year-olds (80%, 79% and 69%, respectively; at least one 95%). They were reported less frequently in children (≤18 years: 69%, 48%, 23%; at least one 85%), older adults (≥70 years: 61%, 62%, 65%; 90%), and women (66%, 66%, 64%; 90%; vs men: 71%, 70%, 67%; 93%), each P < 0.001. The most common atypical presentations in patients under 60 years of age were nausea and vomiting and abdominal pain; in those over 60 years, it was confusion. Regression models showed significant differences in symptoms with sex, age and country. Interpretation: This international collaboration has allowed us to report reliable symptom data from the largest cohort of patients admitted to hospital with COVID-19. Adults over 60 and children admitted to hospital with COVID-19 are less likely to present with typical symptoms. Nausea and vomiting are common atypical presentations in patients under 30 years. Confusion is a frequent atypical presentation of COVID-19 in adults over 60 years. Women are less likely to experience typical symptoms than men.
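
    A hedged sketch of the regression step: a logistic model for one 'typical' symptom as a function of age group and sex, fitted to simulated admissions. The prevalences, group cut-points and coefficients are invented and only illustrate how the reported comparisons are obtained from such a model.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(4)
        n = 5_000
        df = pd.DataFrame({
            "age_group": rng.choice(["<18", "18-59", "60-69", "70+"], n),
            "sex": rng.choice(["female", "male"], n),
        })

        # Simulated outcome: fever reported at admission (1 = yes), with invented prevalences.
        base = {"<18": 0.50, "18-59": 0.80, "60-69": 0.70, "70+": 0.60}
        p = df["age_group"].map(base) + np.where(df["sex"] == "male", 0.03, 0.0)
        df["fever"] = rng.binomial(1, p.clip(0, 1))

        model = smf.logit(
            "fever ~ C(age_group, Treatment('18-59')) + C(sex)", data=df
        ).fit(disp=False)

        # Odds ratios for each age group (relative to 18-59) and for male vs female sex.
        print(np.exp(model.params))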

    Empowering Latina scientists


    Timing of nasogastric tube insertion and the risk of postoperative pneumonia: an international, prospective cohort study

    Aim: Aspiration is a common cause of pneumonia in patients with postoperative ileus. Insertion of a nasogastric tube (NGT) is often performed, but this can be distressing. The aim of this study was to determine whether the timing of NGT insertion after surgery (before versus after vomiting) was associated with reduced rates of pneumonia in patients undergoing elective colorectal surgery. Method: This was a preplanned secondary analysis of a multicentre, prospective cohort study. Patients undergoing elective colorectal surgery between January 2018 and April 2018 were eligible. Those receiving an NGT were divided into three groups based on the timing of insertion: routine NGT (inserted at the time of surgery), prophylactic NGT (inserted after surgery but before vomiting) and reactive NGT (inserted after surgery and after vomiting). The primary outcome was the development of pneumonia within 30 days of surgery, which was compared between the prophylactic and reactive NGT groups using multivariable regression analysis. Results: A total of 4715 patients were included in the analysis, of whom 1536 (32.6%) received an NGT. These were classified as routine in 926 (60.3%), reactive in 461 (30.0%) and prophylactic in 149 (9.7%). Two hundred patients (4.2%) developed pneumonia (no NGT 2.7%; routine NGT 5.2%; reactive NGT 10.6%; prophylactic NGT 11.4%). After adjustment for confounding factors, no significant difference in pneumonia rates was detected between the prophylactic and reactive NGT groups (odds ratio 1.03, 95% CI 0.56–1.87, P = 0.932). Conclusion: In patients who required the insertion of an NGT after surgery, prophylactic insertion was not associated with fewer cases of pneumonia within 30 days of surgery compared with reactive insertion.
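
    The comparison between prophylactic and reactive insertion can be sketched with simulated data: a crude odds ratio from the 2 × 2 table, followed by an age-adjusted odds ratio from a multivariable logistic regression, standing in for the fuller confounder adjustment described above. The event rates and covariates are invented.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)
        n = 600
        df = pd.DataFrame({
            "group": rng.choice(["prophylactic", "reactive"], n, p=[0.25, 0.75]),
            "age": rng.normal(65, 12, n),
        })
        # Simulated 30-day pneumonia risk, loosely increasing with age.
        risk = 0.08 + 0.002 * (df["age"] - 65) + np.where(df["group"] == "prophylactic", 0.01, 0.0)
        df["pneumonia"] = rng.binomial(1, risk.clip(0.01, 0.95))

        # Crude odds ratio from the 2 x 2 table.
        table = pd.crosstab(df["group"], df["pneumonia"])
        crude_or = (table.loc["prophylactic", 1] * table.loc["reactive", 0]) / (
            table.loc["prophylactic", 0] * table.loc["reactive", 1]
        )
        print("Crude OR:", round(float(crude_or), 2))

        # Age-adjusted odds ratio from a multivariable logistic regression.
        model = smf.logit("pneumonia ~ C(group, Treatment('reactive')) + age", data=df).fit(disp=False)
        print("Adjusted OR (prophylactic vs reactive):", round(float(np.exp(model.params.iloc[1])), 2))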

    Guidelines for the use and interpretation of assays for monitoring autophagy (4th edition)

    In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to formulate, on a regular basis, updated guidelines for monitoring autophagy in different organisms. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways, including apoptosis, not all of them can be used as specific markers for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.

    Thrombotic and hemorrhagic complications of COVID-19 in adults hospitalized in high-income countries compared with those in adults hospitalized in low- and middle-income countries in an international registry

    Background: COVID-19 has been associated with a broad range of thromboembolic, ischemic, and hemorrhagic complications (coagulopathy complications). Most studies have focused on patients with severe disease from high-income countries (HICs). Objectives: The main aims were to compare the frequency of coagulopathy complications in developing countries (low- and middle-income countries [LMICs]) with those in HICs, delineate the frequency across a range of treatment levels, and determine associations with in-hospital mortality. Methods: Adult patients enrolled in an observational, multinational registry, the International Severe Acute Respiratory and Emerging Infections COVID-19 study, between January 1, 2020, and September 15, 2021, were included if they were admitted to a hospital for laboratory-confirmed, acute COVID-19 and had data on complications and survival. The advanced-treatment cohort received care such as admission to the intensive care unit, mechanical ventilation, or inotropes or vasopressors; the basic-treatment cohort did not receive any of these interventions. Results: The study population included 495,682 patients from 52 countries, with 63% from LMICs and 85% in the basic-treatment cohort. The frequency of coagulopathy complications was higher in HICs (0.76%–3.4%) than in LMICs (0.09%–1.22%). Complications were more frequent in the advanced-treatment cohort than in the basic-treatment cohort. Coagulopathy complications were associated with increased in-hospital mortality (odds ratio, 1.58; 95% CI, 1.52–1.64). The mortality associated with these complications was higher in LMICs (58.5%) than in HICs (35.4%). After controlling for coagulopathy complications, treatment intensity, and multiple other factors, mortality was higher among patients in LMICs than among patients in HICs (odds ratio, 1.45; 95% CI, 1.39–1.51). Conclusion: In a large, international registry of patients hospitalized for COVID-19, coagulopathy complications were more frequent in HICs than in LMICs (developing countries). The increased mortality associated with coagulopathy complications was of a greater magnitude among patients in LMICs. Additional research is needed regarding timely diagnosis of, and intervention for, coagulation derangements associated with COVID-19, particularly in limited-resource settings.