
    A statistical modelling approach for source attribution meta-analysis of sporadic infection with foodborne pathogens

    Numerous source attribution studies for foodborne pathogens based on epidemiological and microbiological methods are available. These studies provide empirical data for modelling frameworks that synthesise the quantitative evidence at our disposal and reduce reliance on expert elicitations. Here, we develop a statistical model within a Bayesian estimation framework to integrate attribution estimates from expert elicitations with estimates from microbial subtyping and case-control studies for sporadic infections with four major bacterial zoonotic pathogens in the Netherlands (Campylobacter, Salmonella, Shiga toxin-producing E. coli [STEC] O157 and Listeria). For each pathogen, we pooled the published fractions of human cases attributable to each animal reservoir from the microbial subtyping studies, accounting for the uncertainty arising from the different typing methods, attribution models, and year(s) of data collection. We then combined the population attributable fractions (PAFs) from the case-control studies according to five transmission pathways (domestic food, environment, direct animal contact, human-human transmission and travel) and 11 groups within the foodborne pathway (beef/lamb, pork, poultry meat, eggs, dairy, fish/shellfish, fruit/vegetables, beverages, grains, composite foods and food handlers/vermin). The attribution estimates were biologically plausible, allowing the human cases to be attributed in several ways according to reservoirs, transmission pathways and food groups. All pathogens were predominantly foodborne, with Campylobacter being mostly attributable to the chicken reservoir, Salmonella to pigs (albeit closely followed by layers), and Listeria and STEC O157 to cattle. Food-wise, the attributions reflected those at the reservoir level in terms of ranking. We provide a modelling solution for reaching consensus attribution estimates that reflect the empirical evidence in the literature; it is particularly useful for policy-making and is extensible to other pathogens and domains.
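
    The Bayesian pooling step can be illustrated with a minimal conjugate Dirichlet-multinomial sketch: each subtyping study contributes counts of human cases attributed to each reservoir, and posterior draws of the pooled attribution fractions give point estimates with credible intervals. The study labels, counts and flat prior below are hypothetical, and the paper's actual model is richer (it also integrates expert elicitation and case-control PAFs and propagates between-study uncertainty).

    # Minimal sketch: Bayesian pooling of reservoir-attribution counts from
    # several microbial-subtyping studies via a conjugate Dirichlet-multinomial
    # model. Counts and prior are illustrative, not the paper's data or model.
    import numpy as np

    rng = np.random.default_rng(1)

    reservoirs = ["chicken", "cattle", "pigs", "layers"]

    # Hypothetical per-study counts of human cases attributed to each reservoir.
    study_counts = np.array([
        [60, 20, 10, 10],   # subtyping study A
        [55, 25, 12,  8],   # subtyping study B
        [70, 15,  8,  7],   # subtyping study C
    ])

    alpha_prior = np.ones(len(reservoirs))               # flat Dirichlet prior
    alpha_post = alpha_prior + study_counts.sum(axis=0)  # conjugate update

    # Posterior draws of the pooled attribution fractions.
    draws = rng.dirichlet(alpha_post, size=10_000)
    lo, med, hi = np.percentile(draws, [2.5, 50, 97.5], axis=0)

    for i, name in enumerate(reservoirs):
        print(f"{name:8s} {med[i]:.2f} (95% CrI {lo[i]:.2f}-{hi[i]:.2f})")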

    Detection of opsonizing antibodies directed against a recently circulating Bordetella pertussis strain in paired plasma samples from symptomatic and recovered pertussis patients.

    Correlates of protection (CoPs) against the highly contagious respiratory disease whooping cough, caused by Bordetella pertussis, remain elusive. Characterizing the antibody response to this pathogen is essential towards identifying potential CoPs. Here, we evaluate the levels, avidity and functionality of B. pertussis-specific antibodies in paired plasma samples derived from symptomatic and recovered pertussis patients, as well as controls. Natural infection is expected to induce protective immunity. IgG levels and avidity to nine B. pertussis antigens were determined using a novel multiplex panel. Furthermore, opsonophagocytosis of a B. pertussis clinical isolate by neutrophils was measured. The findings indicate that following infection, B. pertussis-specific antibody levels of (ex-)pertussis patients waned, while the avidity of antibodies directed against the majority of studied antigens increased. Opsonophagocytosis indices decreased upon recovery but remained higher than in controls. Random forest analysis of all the data revealed that 28% of the variance in the opsonophagocytosis index could be explained by filamentous hemagglutinin-specific antibodies, followed by pertussis toxin-specific antibodies. We propose to further explore which other B. pertussis-specific antibodies can better predict opsonophagocytosis. Moreover, other B. pertussis-specific antibody functions, as well as the possible integration of these functions with other immune cell properties, should be evaluated towards the identification of CoPs against pertussis.
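
    As an illustration of the random-forest step reported above, the sketch below regresses an opsonophagocytosis index on antigen-specific IgG levels and ranks the antigens by variable importance. The data are synthetic and the antigen names and effect sizes are placeholders; only the general approach mirrors the analysis.

    # Minimal sketch of a random-forest regression of an opsonophagocytosis
    # index on antigen-specific IgG levels. Synthetic data; illustrative only.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    n = 120
    antigens = ["FHA", "PT", "Prn", "Fim2/3", "OMV"]

    X = rng.normal(size=(n, len(antigens)))
    # Assume (for illustration) that FHA- and PT-specific antibodies carry most signal.
    y = 0.8 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=n)

    rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(X, y)

    print(f"OOB R^2 (variance explained): {rf.oob_score_:.2f}")
    for name, imp in sorted(zip(antigens, rf.feature_importances_),
                            key=lambda t: -t[1]):
        print(f"{name:7s} importance {imp:.2f}")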

    Test, trace, isolate: Evidence for declining SARS-CoV-2 PCR sensitivity in a clinical cohort

    Real-time reverse transcription-polymerase chain reaction (RT-PCR) on upper respiratory tract (URT) samples is the primary method to diagnose SARS-CoV-2 infections and guide public health measures, with a supportive role for serology. We reinforce previous findings on the limited sensitivity of PCR testing and strengthen them statistically by drawing on multiple tests per individual. We also stratify by patient characteristics such as severity of disease and time since onset of symptoms. Bayesian statistical modelling was used to retrospectively determine the sensitivity of RT-PCR using SARS-CoV-2 serology in 644 COVID-19-suspected patients with varying degrees of disease severity and duration. The sensitivity of RT-PCR ranged between 80% and 95%; it increased with disease severity and decreased rapidly over time in mild COVID-19 cases. Negative URT RT-PCR results should be interpreted in the context of clinical characteristics, especially with regard to containment of viral transmission based on 'test, trace and isolate'.
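
    A much-simplified version of the sensitivity estimation can be sketched as a Beta-binomial model in which seropositive patients serve as the infected reference group. The counts below are hypothetical, the prior is uniform, and the sketch ignores the within-patient correlation from repeated testing that the study's Bayesian model handles explicitly.

    # Minimal sketch: Beta-binomial estimate of RT-PCR sensitivity by severity
    # stratum, using serology-confirmed patients as the reference. Hypothetical counts.
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical RT-PCR results among sero-confirmed patients, by severity.
    strata = {
        "mild":   (160, 200),   # (positive tests, total tests)
        "severe": (285, 300),
    }

    for name, (pos, total) in strata.items():
        a, b = 1 + pos, 1 + (total - pos)      # Beta(1, 1) prior, conjugate update
        draws = rng.beta(a, b, size=10_000)
        lo, med, hi = np.percentile(draws, [2.5, 50, 97.5])
        print(f"{name:6s} sensitivity {med:.2f} (95% CrI {lo:.2f}-{hi:.2f})")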

    Quantifying reporting timeliness to improve outbreak control

    The extent to which reporting delays should be reduced to gain substantial improvement in outbreak control is unclear. We developed a model to quantitatively assess reporting timeliness. Using reporting speed data for 6 infectious diseases in the notification system in the Netherlands, we calculated the proportion of infections produced by index and secondary cases until the index case is reported. We assumed interventions that immediately stop transmission. Reporting delays render useful only those interventions that stop transmission from index and secondary cases. We found that current reporting delays are adequate for hepatitis A and B control. However, reporting delays should be reduced by a few days to improve measles and mumps control, by at least 10 days to improve shigellosis control, and by at least 5 weeks to substantially improve pertussis control. Our method provides quantitative insight into the reporting delay reductions needed to achieve outbreak control and other transmission prevention goals.
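
    The timeliness measure can be sketched as follows: given a generation-interval distribution and an onset-to-report delay for the index case, compute the fraction of onward infections that the index case, and its secondary cases, have already produced by the time the index case is reported. The gamma parameters and the 14-day delay below are illustrative rather than the disease-specific values used in the study.

    # Minimal sketch of the timeliness measure: fraction of transmission from the
    # index case and its secondary cases already produced at the time of reporting.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    gen_interval = stats.gamma(a=4.0, scale=4.0)   # illustrative 16-day mean generation interval
    report_delay = 14.0                            # days from symptom onset of index case to report

    # Index case: fraction of its onward infections produced before the report.
    frac_index = gen_interval.cdf(report_delay)

    # Secondary cases: Monte Carlo over their infection times t ~ generation interval;
    # a secondary case infected at time t has produced a fraction F(report_delay - t).
    t = gen_interval.rvs(size=100_000, random_state=rng)
    frac_secondary = gen_interval.cdf(np.clip(report_delay - t, 0.0, None)).mean()

    print(f"Index case:      {frac_index:.0%} of onward infections already produced")
    print(f"Secondary cases: {frac_secondary:.0%} of onward infections already produced")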

    Acute illness from Campylobacter jejuni may require high doses while infection occurs at low doses.

    Data from a set of different studies on the infectivity and pathogenicity of Campylobacter jejuni were analyzed with a multilevel model, allowing for effects of host species (nonhuman primates and humans) and different strains of the pathogen. All challenge studies involved high doses of the pathogen, resulting in all exposed subjects becoming infected. A dose-response effect for infection (an increasing trend with dose) was observed in only one study. High susceptibility to infection with C. jejuni was found in a joint analysis of outbreaks and challenge studies; for that reason, four outbreaks associated with raw milk consumption were also included in the present study. The high doses used for inoculation did not cause all infected subjects to develop acute enteric symptoms. The observed outcomes are consistent with a dose-response effect for acute symptoms among infected subjects: a conditional illness dose-response relation. Nonhuman primates and human volunteers did not appear to differ in susceptibility to developing enteric symptoms, but exposure in outbreaks (raw milk) did lead to a higher probability of symptomatic campylobacteriosis.
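
    To make the conditional dose-response idea concrete, the sketch below combines an approximate Beta-Poisson curve for infection with a separate conditional curve for acute illness among the infected, so that infection is already likely at low doses while illness requires much higher doses. The functional forms are standard choices and the parameter values are placeholders, not the fitted multilevel estimates.

    # Minimal sketch: infection dose-response plus a conditional illness
    # dose-response among infected subjects. Parameter values are illustrative.
    import numpy as np

    def p_infection(dose, alpha=0.15, beta=8.0):
        """Approximate Beta-Poisson probability of infection at a given dose."""
        return 1.0 - (1.0 + dose / beta) ** (-alpha)

    def p_illness_given_infection(dose, eta=1e7, r=0.3):
        """Conditional probability of acute illness among infected subjects."""
        return 1.0 - (1.0 + dose / eta) ** (-r)

    for dose in [1e2, 1e4, 1e6, 1e8, 1e10]:
        p_inf = p_infection(dose)
        p_ill = p_inf * p_illness_given_infection(dose)
        print(f"dose {dose:9.0e}  P(infection) {p_inf:.2f}  P(illness) {p_ill:.2f}")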

    Timeliness of infectious disease reporting, the Netherlands, 2003 to 2017: law change reduced reporting delay, disease identification delay is next.

    Background: Timely notification of infectious diseases is essential for effective disease control and needs regular evaluation. Aim: Our objective was to evaluate the effects that statutory adjustments in the Netherlands in 2008 and raising awareness during outbreaks had on notification timeliness. Methods: In a retrospective analysis of routine surveillance data obtained between July 2003 and November 2017, delays between disease onset and laboratory confirmation (disease identification delay), between laboratory confirmation and notification to Municipal Health Services (notification delay) and between notification and reporting to the National Institute for Public Health and the Environment (reporting delay) were analysed for 28 notifiable diseases. Delays before (period 1) and after the law change (periods 2 and 3) were compared with legal timeframes. We studied the effect of outbreak awareness in 10 outbreaks and the effect of specific guidance messages on disease identification delay for two diseases. Results: We included 144,066 notifications. Average notification delay decreased from 1.4 to 0.4 days across the three periods (six diseases; p < 0.05); reporting delay decreased mainly in period 2 (from 0.5 to 0.1 days, six diseases; p < 0.05). In 2016-2017, legal timeframes were met overall. Awareness resulted in decreased disease identification delay for three diseases: measles and rubella (outbreaks) and psittacosis (specific guidance messages). Conclusions: Legal adjustments decreased notification and reporting delays, and increased awareness reduced identification delays. As disease identification delay dominates the notification chain, insight into patient, doctor and laboratory delay is necessary to further improve timeliness and monitor the impact of control measures during outbreaks.
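
    The three delay components can be computed directly from notification records, as in the sketch below. The column names and the two toy records are hypothetical; the study analysed 144,066 notifications for 28 diseases.

    # Minimal sketch: deriving identification, notification and reporting delays
    # from notification records. Toy data with hypothetical column names.
    import pandas as pd

    records = pd.DataFrame({
        "disease":       ["measles", "psittacosis"],
        "onset":         ["2016-05-01", "2016-06-10"],
        "lab_confirmed": ["2016-05-04", "2016-06-20"],
        "notified_mhs":  ["2016-05-05", "2016-06-21"],
        "reported_rivm": ["2016-05-05", "2016-06-22"],
    })
    for col in ["onset", "lab_confirmed", "notified_mhs", "reported_rivm"]:
        records[col] = pd.to_datetime(records[col])

    records["identification_delay"] = (records["lab_confirmed"] - records["onset"]).dt.days
    records["notification_delay"]   = (records["notified_mhs"] - records["lab_confirmed"]).dt.days
    records["reporting_delay"]      = (records["reported_rivm"] - records["notified_mhs"]).dt.days

    print(records[["disease", "identification_delay",
                   "notification_delay", "reporting_delay"]])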