
    Population-Level Human Secretor Status Is Associated With Genogroup 2 Type 4 Norovirus Predominance

    Background. Noroviruses are a leading cause of acute gastroenteritis. Genogroup 2 type 4 (GII.4) has been the dominant norovirus genotype worldwide since its emergence in the mid-1990s. Individuals with a functional fucosyltransferase-2 gene, known as secretors, have increased susceptibility to GII.4 noroviruses. We hypothesized that this individual-level trait may drive GII.4 norovirus predominance at the human population level. Methods. We conducted a systematic review of studies reporting norovirus outbreak or sporadic case genotypes and merged these data with country-level proportions of human secretor status from a separate systematic review. We used inverse variance-weighted linear regression to estimate the magnitude of the association between population secretor proportion and GII.4 proportion. Results. Two hundred nineteen genotype studies and 112 secretor studies with data from 38 countries were included in the analysis. Study-level GII.4 proportion among all noroviruses ranged from 0% to 100%. Country secretor proportion ranged from 43.8% to 93.9%. We observed a 0.69% (95% confidence interval, 0.19-1.18) increase in GII.4 proportion for each percentage increase in human secretor proportion, controlling for Human Development Index. Conclusions. Norovirus evolution and diversity may be driven by local population human host genetics. Our results may have vaccine development implications, including whether specific antigenic formulations would be required for different populations. This work was funded by the National Institutes of Health/National Institute of General Medical Sciences (R01GM124280; to B. L.) and the Centers for Disease Control and Prevention (IPA 48195; to B. L.).
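    The regression described above can be illustrated with a minimal sketch of an inverse variance-weighted linear model in Python using pandas and statsmodels. The file name, column names, and binomial-variance weighting scheme below are assumptions for illustration, not the published analysis code.

```python
# Sketch: inverse variance-weighted regression of study-level GII.4 proportion
# on country secretor proportion, adjusting for Human Development Index.
# "norovirus_studies.csv" and its columns are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("norovirus_studies.csv")

# Weight each study by the inverse of an approximate variance of its GII.4
# proportion (binomial variance p*(1-p)/n, floored to avoid division by zero).
p = df["gii4_proportion"]
n = df["n_genotyped"]
variance = (p * (1 - p)).clip(lower=1e-6) / n
weights = 1.0 / variance

X = sm.add_constant(df[["secretor_proportion", "hdi"]])
model = sm.WLS(df["gii4_proportion"], X, weights=weights).fit()

# The coefficient on secretor_proportion is the estimated change in GII.4
# proportion per unit increase in population secretor proportion.
print(model.summary())
```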

    GII.4 detection rates worldwide among norovirus-positive children with sporadic acute gastroenteritis.

    Data were collected from 53 studies conducted in 31 countries among children with sporadic acute gastroenteritis. Studies with samples collected after the year 2000, with sample collection spanning at least one year, and with genotyping based on the capsid region are included. Because of insufficient available data from Africa, studies lasting less than one year were also included for this region. The data shown are from a total of 6,283 genotyped samples: 28 studies from Asia, 13 from Africa, five from South America, one from North America, and three each from Europe and Central America (S1 Table: http://www.plospathogens.org/article/info:doi/10.1371/journal.ppat.1005385#ppat.1005385.s001). The number (n) in the figure is the number of genotyped samples, and the duration of sample collection for each region is given in parentheses. GII.NT represents non-typeable samples. Broken lines represent approximate percentages of secretors in the population.

    Predicting norovirus and rotavirus resurgence in the United States following the COVID-19 pandemic: a mathematical modelling study

    Abstract Background To reduce the burden of the COVID-19 pandemic in the United States, federal, state, and local governments implemented restrictions such as limitations on gatherings, restaurant dining, and travel, and recommended non-pharmaceutical interventions (NPIs) including physical distancing, mask-wearing, surface disinfection, and increased hand hygiene. The resulting behavioral changes affected other infectious diseases, including enteropathogens such as norovirus and rotavirus, which had fairly regular seasonal patterns prior to the COVID-19 pandemic. The study objective was to project future incidence of norovirus and rotavirus gastroenteritis as contacts resume and other NPIs are relaxed. Methods We fitted compartmental mathematical models to pre-pandemic U.S. surveillance data (2012–2019) for norovirus and rotavirus using maximum likelihood estimation. We then projected incidence for 2022–2030 under scenarios in which the number of contacts a person has per day is 70%, 80%, 90%, or 100% (full resumption) of pre-pandemic levels. Results We found that population susceptibility to both viruses increased between March 2020 and November 2021. The 70–90% contact resumption scenarios led to lower incidence than observed pre-pandemic for both viruses. However, we found a greater than two-fold increase in community incidence relative to the pre-pandemic period under the 100% contact scenario for both viruses. For rotavirus, for which population immunity is driven partially by vaccination, patterns settled into a new steady state quickly in 2022 under the 70–90% scenarios. For norovirus, for which immunity is relatively short-lasting and acquired only through infection, incidence surged under the 100% contact scenario. Conclusions These results, which quantify the consequences of population susceptibility build-up, can help public health agencies prepare for a potential resurgence of enteric viruses.
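    The contact-resumption scenarios described above can be illustrated with a minimal SIRS-type compartmental sketch in Python using scipy. The model structure, seasonal forcing term, and all parameter values are placeholders chosen for illustration, not the maximum-likelihood fits reported in the study.

```python
# Sketch: SIRS model with a contact-reduction scalar applied to transmission,
# run for the 70%, 80%, 90%, and 100% contact resumption scenarios.
import numpy as np
from scipy.integrate import solve_ivp

def sirs(t, y, beta0, contact_scale, gamma, omega):
    S, I, R = y
    # Transmission rate with illustrative annual seasonal forcing.
    beta = beta0 * contact_scale * (1 + 0.3 * np.cos(2 * np.pi * t / 365.0))
    N = S + I + R
    dS = -beta * S * I / N + omega * R   # waning immunity returns R to S
    dI = beta * S * I / N - gamma * I
    dR = gamma * I - omega * R
    return [dS, dI, dR]

y0 = [0.95, 0.001, 0.049]  # initial proportions susceptible, infectious, recovered
for scale in (0.7, 0.8, 0.9, 1.0):  # contact resumption scenarios
    sol = solve_ivp(sirs, (0, 365 * 8), y0,
                    args=(0.3, scale, 1 / 5.0, 1 / 180.0))
    print(scale, sol.y[1].max())  # peak infectious proportion under each scenario
```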

    Factors driving norovirus transmission in long-term care facilities: A case-level analysis of 107 outbreaks

    Norovirus is the most common cause of gastroenteritis outbreaks in long-term care facilities (LTCFs) in the United States, causing a high burden of disease in both residents and staff. Understanding how case symptoms and characteristics contribute to norovirus transmission can lead to more informed outbreak control measures in LTCFs. We examined line lists for 107 norovirus outbreaks that took place in LTCFs in five U.S. states from 2015 to 2019. We estimated the individual effective reproduction number, Ri, to quantify individual case infectiousness and examined the contribution of vomiting, diarrhea, and being a resident (vs. staff) to case infectiousness. The associations between case characteristics and Ri were estimated using a multivariable, log-linear mixed model with inverse variance weighting. We found that cases with vomiting infected 1.28 (95% CI: 1.11, 1.48) times as many secondary cases as cases without vomiting, and LTCF residents infected 1.31 (95% CI: 1.15, 1.50) times as many secondary cases as staff. There was no difference in infectiousness between cases with and without diarrhea (1.07; 95% CI: 0.90, 1.29). This suggests that vomiting, particularly by LTCF residents, was a primary driver of norovirus transmission. These results support control measures that limit exposure to vomitus during norovirus outbreaks in LTCFs.
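    The regression step described above can be sketched in Python with a log-linear mixed model, assuming each case already carries an estimated individual reproduction number Ri. The data file, column names, and model call below are hypothetical, and this sketch omits the inverse variance weighting used in the study.

```python
# Sketch: relate case characteristics (vomiting, diarrhea, resident vs. staff)
# to log(Ri), with outbreak as a grouping factor so cases within the same
# outbreak are not treated as independent. "ltcf_line_list.csv" is hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

cases = pd.read_csv("ltcf_line_list.csv")

# Log-linear outcome; floor Ri so cases with zero observed secondary cases
# do not produce -inf. (The study additionally used inverse variance weights.)
cases["log_Ri"] = np.log(cases["Ri"].clip(lower=0.01))

model = smf.mixedlm(
    "log_Ri ~ vomiting + diarrhea + resident",
    data=cases,
    groups=cases["outbreak_id"],
).fit()

# Exponentiated fixed-effect coefficients approximate ratios of secondary
# cases, e.g. cases with vomiting vs. without.
print(np.exp(model.fe_params))
```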

    Effect of Triage-Based Use of the Ottawa Foot and Ankle Rules on the Number of Orders for Radiographic Imaging.

    CONTEXT: Reducing unnecessary testing lessens the cost burden of medical care, but decreasing use depends on consistently following evidence-based clinical decision rules. The Ottawa foot and ankle rules (OFARs) are validated, longstanding evidence-based guidelines to predict fractures. Frequently, radiography is automatically ordered for acute ankle injuries despite OFAR findings suggesting no fracture. OBJECTIVES: First, to determine whether implementation of protocol-driven use of the OFARs at triage would decrease the number of radiography orders and length of stay (LOS) in the emergency department. Second, to quantify the incidence of OFAR use at triage and to assess patient expectations of radiography use and patient satisfaction as rated by both patients and clinicians. METHODS: In this prospective, 2-stage sequential pilot study, patients with acute ankle and foot injuries were screened in the emergency department between January 2013 and October 2013. In the first stage, clinicians (physician assistants, residents, and attending physicians) followed their usual practice for radiography use; these patients formed the control group. In the second stage, clinicians were educated to apply the OFARs appropriately before ordering radiography. For patients suspected of having a fracture at triage, nursing staff ordered radiography. Patients not suspected of having a fracture at triage were reassessed by a clinician using the OFARs after the triage assessment, and radiography was then ordered at the discretion of the clinician. Results gathered after training in the OFARs comprised the intervention group. After discharge, patients were surveyed regarding their expectations and satisfaction, and clinicians were surveyed on their perceptions of patient satisfaction. RESULTS: A total of 131 patients were screened, 62 patients were enrolled after consent was obtained, and 2 patients withdrew prematurely, leaving 30 patients in each group. Fifty-eight of the 60 patients (97%) underwent radiography. Emergency department LOS decreased from 103 minutes to 96.5 minutes (P=.297) after the OFARs were applied. There was also a decrease in LOS among patients with a fracture (137 minutes vs 103 minutes [P=.112]). Radiography was expected by 27 of 30 patients in the control group (90%) and 24 of 30 in the intervention group (80%) (P=.472). Patients were equally satisfied in both groups (54 of 60 [90%]), and 27 of 30 (90%) vs 30 of 30 (100%) clinicians in the control and intervention groups, respectively, perceived that patients were satisfied with their treatment. CONCLUSION: There was no statistical evidence that application of the OFARs decreases the number of imaging orders or decreases LOS. This observation suggests that even when clinicians are being observed and instructed to use clinical decision rules, their evaluation bias tends toward recommendations for testing.
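    As an illustration of the kind of decision rule the study protocolized, the ankle portion of the OFARs can be encoded as a simple boolean check. The field names and structure below are assumptions for illustration based on the commonly cited criteria, not the study protocol, and this sketch is not a substitute for the validated clinical rule.

```python
# Sketch: the ankle component of the Ottawa foot and ankle rules as a
# boolean decision aid. Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class AnkleExam:
    malleolar_zone_pain: bool
    tender_posterior_lateral_malleolus: bool
    tender_posterior_medial_malleolus: bool
    unable_to_bear_weight_four_steps: bool

def ankle_xray_indicated(exam: AnkleExam) -> bool:
    """Radiography is indicated only with malleolar-zone pain plus either
    malleolar bone tenderness or inability to bear weight for four steps."""
    if not exam.malleolar_zone_pain:
        return False
    return (
        exam.tender_posterior_lateral_malleolus
        or exam.tender_posterior_medial_malleolus
        or exam.unable_to_bear_weight_four_steps
    )

# Example: pain alone, no tenderness, able to bear weight -> no imaging per the rule.
print(ankle_xray_indicated(AnkleExam(True, False, False, False)))  # False
```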