
    Evolution of an epidemic: Understanding the opioid epidemic in the United States and the impact of the COVID-19 pandemic on opioid-related mortality

    We conduct this research with a two-fold aim: providing a quantitative analysis of the opioid epidemic in the United States (U.S.), and exploring the impact of the COVID-19 pandemic on opioid-related mortality. The duration and persistence of the opioid epidemic call for an overarching analysis with extensive scope. Additionally, studying the ramifications of these concurrent severe public health crises is vital for informing policies to avoid preventable mortality. Using data from CDC WONDER, we consider opioid-related deaths grouped by Census Region spanning January 1999 to October 2022 inclusive, and later add a demographic component through gender stratification. Through the lens of key events in the opioid epidemic, we build an interrupted time series model to reveal statistically significant drivers of opioid-related mortality. We then employ a counterfactual to approximate trends in the absence of COVID-19, and estimate excess opioid-related deaths (defined as observed opioid-related deaths minus projected opioid-related deaths) associated with the pandemic. According to our model, the proliferation of fentanyl contributed to sustained increases in opioid-related death rates across three of the four U.S. census regions, corroborating existing knowledge in the field. Critically, each region saw an immediate increase in its monthly opioid-related death rate of at least 0.31 deaths per 100,000 persons at the start of the pandemic, highlighting the nationwide knock-on effects of COVID-19. There are consistent positive deviations from the expected monthly opioid-related death rate and a sizable burden from cumulative excess opioid-related deaths, surpassing 60,000 additional deaths nationally from March 2020 to October 2022, ∼70% of which were male. These results suggest that robust, multi-faceted measures are even more important in light of the COVID-19 pandemic to prevent overdoses and to educate users on the risks associated with potent synthetic opioids such as fentanyl.
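
    As a minimal sketch of the counterfactual logic described above (fit a pre-pandemic trend, project it forward, and subtract it from observations), the snippet below estimates excess deaths on synthetic monthly death rates. The linear trend, the assumed level shift at the pandemic, and every number are illustrative stand-ins, not the paper's interrupted time series model or its CDC WONDER data.

```python
# Illustrative sketch (not the authors' code): excess deaths via a projected
# pre-pandemic counterfactual. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly death rates: linear pre-pandemic trend plus a pandemic jump.
months = np.arange(120)
pandemic_start = 86                       # hypothetical index of March 2020
true_rate = 0.5 + 0.01 * months
true_rate[pandemic_start:] += 0.35        # assumed level shift, for illustration
observed = true_rate + rng.normal(0, 0.03, months.size)

# Fit the counterfactual trend on pre-pandemic data only (ordinary least squares).
pre = months < pandemic_start
slope, intercept = np.polyfit(months[pre], observed[pre], deg=1)
projected = intercept + slope * months

# Excess = observed minus projected, accumulated over the pandemic period.
excess = observed[~pre] - projected[~pre]
print(f"cumulative excess: {excess.sum():.2f} deaths per 100,000 person-months")
```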

    The contribution of badgers to confirmed tuberculosis in cattle in high-incidence areas in England

    The role of badgers in the transmission and maintenance of bovine tuberculosis (TB) in British cattle is much debated as part of wider discussions on whether badger culling and/or badger vaccination should play a role in the government’s strategy to eradicate cattle TB. The key source of information on the contribution from badgers within high-cattle-TB-incidence areas of England is the Randomised Badger Culling Trial (RBCT), with two analyses providing estimates of the average overall contribution of badgers to confirmed cattle TB in these areas. We use a dynamical model, characterizing the association between the estimated prevalence of Mycobacterium bovis (the causative agent of bovine TB) among badgers culled in the initial RBCT proactive culls and the incidence among sympatric cattle herds prior to culling, to estimate the average overall contribution of badgers to confirmed TB herd breakdowns among proactively culled areas. The resulting estimate based on all data (52%) has considerable uncertainty (bootstrap 95% confidence interval (CI): 9.1-100%). Separate analyses of experimental data indicated that the largest estimated reduction in confirmed cattle TB achieved inside the proactive culling areas was 54% (overdispersion-adjusted 95% CI: 38-66%), providing a lower bound for the average overall contribution of badgers to confirmed cattle TB. Thus, taking both results into account, the best estimate of the average overall contribution of badgers is roughly half, with 38% being a robustly estimated lower bound. However, the dynamical model also suggested that only 5.7% (bootstrap 95% CI: 0.9-25%) of the transmission to cattle herds is badger-to-cattle, with the remainder of the average overall contribution from badgers being in the form of onward cattle-to-cattle transmission. These estimates, confirming that badgers do play a role in bovine TB transmission, inform debate even if they do not point to a single way forward.
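
    The bootstrap confidence intervals quoted above can be made concrete with a toy version of the attribution problem: regress cattle incidence on badger prevalence across areas, derive an "average contribution" from the fit, and bootstrap over areas. The data and the simple linear attribution model below are invented for demonstration; this is not the RBCT dynamical model.

```python
# Hedged illustration (not the RBCT analysis): bootstrapping the uncertainty of
# an "average contribution" estimate from paired area-level data.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-area badger M. bovis prevalence and cattle TB incidence.
prevalence = np.array([0.05, 0.11, 0.16, 0.22, 0.26, 0.31, 0.35, 0.40, 0.12, 0.28])
incidence = 0.04 + 0.5 * prevalence + rng.normal(0, 0.02, prevalence.size)

def contribution(prev, inc):
    """Fraction of mean incidence attributed to badgers under a linear model
    inc = baseline + beta * prev, i.e. beta * mean(prev) / mean(inc)."""
    beta, baseline = np.polyfit(prev, inc, deg=1)
    return beta * prev.mean() / inc.mean()

point = contribution(prevalence, incidence)

# Nonparametric bootstrap over areas, mirroring the abstract's bootstrap CIs.
boot = []
for _ in range(5000):
    idx = rng.integers(0, prevalence.size, prevalence.size)
    boot.append(contribution(prevalence[idx], incidence[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"estimated contribution: {point:.0%} (bootstrap 95% CI: {lo:.0%}-{hi:.0%})")
```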

    Factors determining the pattern of the variant Creutzfeldt-Jakob disease (vCJD) epidemic in the UK.

    Following the emergence of a new variant of Creutzfeldt-Jakob disease (vCJD) 6 years ago, and the gradual rise in clinical cases, there has been increased speculation regarding the overall magnitude of this epidemic in Great Britain. In this paper, we explore the epidemiological factors and uncertainties determining the scale of this epidemic in light of the most recent data on reported vCJD mortality. Our results demonstrate that, while the magnitude of the uncertainty has decreased dramatically since 1996, it is still not possible to predict with any degree of accuracy the final magnitude of this epidemic, with the 95% confidence interval for future deaths ranging from 10 to 7000. However, short-term projections show that a dramatic increase in case numbers is unlikely to be observed in the next 2-5 years (95% confidence interval for 2 years: 10-80 cases; for 5 years: 10-200 cases). The results confirm significant age-dependent susceptibility/exposure to infection, with the likelihood profile demonstrating that those aged between 10 and 20 years are at highest risk of infection. We also demonstrate how projections based on onset data may be substantially biased, and explore the sensitivity of results to assumptions concerning exposure to bovine spongiform encephalopathy (BSE) and the incubation-period distribution.
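
    A hedged sketch of the projection machinery such analyses rest on: expected case onsets are the convolution of an (uncertain) infection curve with the incubation-period distribution, which is why projections are so sensitive to assumptions about BSE exposure and incubation. Every curve and parameter below is hypothetical, not an estimate from the paper.

```python
# Illustrative projection sketch: convolve an assumed infection (exposure)
# curve with an incubation-period distribution to get expected case onsets.
import numpy as np
from scipy import stats

years = np.arange(1980, 2011)

# Hypothetical BSE-exposure curve: rises through the 1980s, falls after controls.
exposure = stats.norm.pdf(years, loc=1989, scale=3)
exposure *= 500 / exposure.sum()          # assume 500 infections in total

# Assumed incubation-period distribution (gamma, mean ~12 years).
lags = np.arange(0, 31)
incubation = stats.gamma.pdf(lags, a=9, scale=12 / 9)
incubation /= incubation.sum()

# Expected onsets = convolution of infections with the incubation distribution.
onsets = np.convolve(exposure, incubation)[: years.size]
for y, n in zip(years, onsets):
    if n >= 1:
        print(y, round(n, 1))
```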

    The Duration of the Effects of Repeated Widespread Badger Culling on Cattle Tuberculosis Following the Cessation of Culling

    Background: In the British Isles, control of cattle tuberculosis (TB) is hindered by persistent infection of wild badger (Meles meles) populations. A large-scale field trial—the Randomised Badger Culling Trial (RBCT)—previously showed that widespread badger culling produced modest reductions in cattle TB incidence during culling, which were offset by elevated TB risks for cattle on adjoining lands. Once culling was halted, beneficial effects inside culling areas increased, while detrimental effects on adjoining lands disappeared. However, a full assessment of the utility of badger culling requires information on the duration of culling effects. Methodology/Principal Findings: We monitored cattle TB incidence in and around RBCT areas after culling ended. We found that benefits inside culled areas declined over time, and were no longer detectable by three years post-culling. On adjoining lands, a trend suggesting beneficial effects immediately after the end of culling was not statistically significant, and disappeared after 18 months post-culling. From completion of the first cull to the loss of detectable effects (an average five-year culling period plus 2.5 years post-culling), cattle TB incidence was 28.7% lower (95% confidence interval [CI]: 20.7 to 35.8% lower) inside ten 100 km² culled areas than inside ten matched no-culling areas, and comparable (11.7% higher, 95% CI: 13.0% lower to 43.4% higher, p = 0.39) on lands ≤2 km outside culled and no-culling areas. The financial costs of culling an idealized 150 km² area would exceed the savings achieved through reduced cattle TB by factors of 2 to 3.5. Conclusions/Significance: Our findings show that the reductions in cattle TB incidence achieved by repeated badger culling were not sustained in the long term after culling ended and did not offset the financial costs of culling. These results, combined with evaluation of alternative culling methods, suggest that badger culling is unlikely to contribute effectively to the control of cattle TB in Britain.

    Are epidemic growth rates more informative than reproduction numbers?

    Summary statistics, often derived from simplified models of epidemic spread, inform public health policy in real time. The instantaneous reproduction number, Rt, is predominant among these statistics, measuring the average ability of an infection to multiply. However, Rt encodes no temporal information and is sensitive to modelling assumptions. Consequently, some have proposed the epidemic growth rate, rt, i.e., the rate of change of the log-transformed case incidence, as a more temporally meaningful and model-agnostic policy guide. We examine this assertion, identifying if and when estimates of rt are more informative than those of Rt. We assess their relative strengths both for learning about pathogen transmission mechanisms and for guiding public health interventions in real time.
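
    One standard bridge between the two statistics, useful for making the comparison concrete: the Wallinga-Lipsitch relation R = 1/M(-r), where M is the moment generating function of the generation-interval distribution, has a closed form when the generation interval is gamma-distributed. The parameter values below are assumptions, not estimates from the paper.

```python
# Convert a growth rate r to a reproduction number R assuming a gamma
# generation interval: R = 1/M(-r) = (1 + r*scale)^shape.
def growth_rate_to_R(r, mean_gi=5.0, sd_gi=2.5):
    """r is per day; mean_gi and sd_gi parameterise the assumed gamma
    generation-interval distribution (here mean 5 days, sd 2.5 days)."""
    shape = (mean_gi / sd_gi) ** 2
    scale = sd_gi ** 2 / mean_gi
    return (1.0 + r * scale) ** shape

for r in [-0.05, 0.0, 0.05, 0.10]:
    print(f"r = {r:+.2f}/day  ->  R = {growth_rate_to_R(r):.2f}")
```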

    Stochastic modelling of African swine fever in wild boar and domestic pigs: epidemic forecasting and comparison of disease management strategies

    African swine fever (ASF), caused by the African swine fever virus (ASFV), is highly virulent in domestic pigs and wild boar (Sus scrofa), causing up to 100% mortality. The recent epidemic of ASF in Europe has had a serious economic impact and poses a threat to global food security. Unfortunately, there is no effective treatment or vaccine against ASFV, limiting the available disease management strategies. Mathematical models allow us to further our understanding of infectious disease dynamics and to evaluate the efficacy of disease management strategies. The ASF Challenge, organised by the French National Research Institute for Agriculture, Food, and the Environment, aimed to expand the development of ASF transmission models to inform policy makers in a timely manner. Here, we present the model and associated projections produced by our team during the challenge. We developed a stochastic model combining transmission between wild boar and domestic pigs, which was calibrated to synthetic data corresponding to different phases of the epidemic's progression. The model was then used to produce forward projections describing the likely temporal evolution of the epidemic under various disease management scenarios. Despite the interventions implemented, long-term projections forecasted persistence of ASFV in wild boar, and hence repeated outbreaks in domestic pigs. A key finding was the importance of the timescale over which different measures are evaluated: interventions with only limited effectiveness in the short term may yield substantial long-term benefits. Our model has several limitations, partly because it was developed in real time. Nonetheless, it can inform understanding of the likely development of ASF epidemics and of the efficacy of disease management strategies, should the virus continue its spread in Europe.
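
    To make the two-host structure concrete, here is a heavily simplified discrete-time stochastic sketch with wild boar, domestic pigs, and boar-to-pig spillover, in the spirit of, but not equivalent to, the challenge model. All rates, population sizes, and the chain-binomial formulation are assumptions chosen for illustration.

```python
# Hedged sketch: discrete-time stochastic SIR with two host populations
# (wild boar "b", domestic pigs "p") and boar-to-pig spillover.
import numpy as np

rng = np.random.default_rng(2)

# Initial states: (S, I, R) for boar and pigs. Numbers are invented.
Sb, Ib, Rb = 5000, 10, 0
Sp, Ip, Rp = 20000, 0, 0
beta_bb, beta_bp, beta_pp = 0.3, 0.01, 0.4   # assumed daily transmission rates
gamma = 1 / 7                                # ~7-day infectious period

for day in range(365):
    Nb, Np = Sb + Ib + Rb, Sp + Ip + Rp
    # Infection probabilities from within-species and spillover pressure.
    p_b = 1 - np.exp(-beta_bb * Ib / Nb)
    p_p = 1 - np.exp(-(beta_pp * Ip / Np + beta_bp * Ib / Nb))
    new_Ib = rng.binomial(Sb, p_b)
    new_Ip = rng.binomial(Sp, p_p)
    rec_b = rng.binomial(Ib, 1 - np.exp(-gamma))
    rec_p = rng.binomial(Ip, 1 - np.exp(-gamma))
    Sb, Ib, Rb = Sb - new_Ib, Ib + new_Ib - rec_b, Rb + rec_b
    Sp, Ip, Rp = Sp - new_Ip, Ip + new_Ip - rec_p, Rp + rec_p
    if day % 30 == 0:
        print(f"day {day:3d}: boar I = {Ib:5d}, pig I = {Ip:5d}")
```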

    Leaping through tree space: continuous phylogenetic inference for rooted and unrooted trees

    Phylogenetics is now fundamental in the life sciences, providing insights into the earliest branches of life and the origins and spread of epidemics. However, finding suitable phylogenies from the vast space of possible trees remains challenging. To address this problem, for the first time, we perform both tree exploration and inference in a continuous space where the computation of gradients is possible. This continuous relaxation allows for major leaps across tree space in both rooted and unrooted trees, and is less susceptible to convergence to local minima. Our approach outperforms the current best methods for inference on unrooted trees and, in simulation, accurately infers the tree and root in ultrametric cases. The approach is also effective on empirical datasets with very limited data, which we demonstrate on the phylogeny of jawed vertebrates. Indeed, only a few genes with an ultrametric signal were generally sufficient for resolving the major lineages of vertebrates. With cubic-time complexity and efficient optimisation via automatic differentiation, our method presents an effective way forward for exploring the most difficult, data-deficient phylogenetic questions.
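
    The central idea, inference by gradients in a continuous space, can be shown in miniature: fit the five branch lengths of a fixed unrooted four-taxon topology ((A,B),(C,D)) to pairwise distances by gradient descent on log branch lengths. The paper additionally moves across topologies and uses automatic differentiation; this toy keeps the topology fixed, uses hand-derived gradients, and runs on invented distances.

```python
# Toy continuous phylogenetic fit: least-squares branch lengths for a fixed
# unrooted 4-taxon topology ((A,B),(C,D)) by gradient descent.
import numpy as np

# Observed pairwise distances for taxon pairs (AB, CD, AC, AD, BC, BD).
obs = np.array([0.30, 0.50, 0.90, 1.00, 0.80, 0.90])

# Design matrix: which branches (a, b, c, d, internal m) each path crosses.
X = np.array([
    [1, 1, 0, 0, 0],   # A-B: a + b
    [0, 0, 1, 1, 0],   # C-D: c + d
    [1, 0, 1, 0, 1],   # A-C: a + c + m
    [1, 0, 0, 1, 1],   # A-D: a + d + m
    [0, 1, 1, 0, 1],   # B-C: b + c + m
    [0, 1, 0, 1, 1],   # B-D: b + d + m
], dtype=float)

theta = np.full(5, -2.0)                  # log branch lengths (keeps them > 0)
for _ in range(5000):
    lengths = np.exp(theta)
    resid = X @ lengths - obs             # residuals of the 0.5*||resid||^2 loss
    grad = (X.T @ resid) * lengths        # chain rule through exp()
    theta -= 0.2 * grad                   # plain gradient descent
print("fitted branch lengths (a, b, c, d, m):", np.round(np.exp(theta), 3))
```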

    Modelling the influence of naturally acquired immunity from subclinical infection on outbreak dynamics and persistence of rabies in domestic dogs

    A number of mathematical models have been developed for canine rabies to explore dynamics and inform control strategies. A common assumption of these models is that naturally acquired immunity plays no role in rabies dynamics. However, empirical studies have detected rabies-specific antibodies in healthy, unvaccinated domestic dogs, potentially due to immunizing, non-lethal exposure. We developed a stochastic model for canine rabies, parameterised for Laikipia County, Kenya, to explore the implications of different scenarios for naturally acquired immunity to rabies in domestic dogs. Simulating these scenarios using a non-spatial model indicated that low levels of immunity can act to limit rabies incidence and prevent depletion of the domestic dog population, increasing the probability of disease persistence. However, incorporating spatial structure and human responses to high rabies incidence allowed the virus to persist in the absence of immunity. While low levels of immunity therefore had limited influence under a more realistic approximation of rabies dynamics, high rates of immunizing, non-lethal exposure were required to produce population-level seroprevalences comparable with those reported in empirical studies. False positives and/or spatial variation may contribute to the high empirical seroprevalences. However, if high seroprevalences do reflect high exposure rates, these findings support the need for high vaccination coverage to control this disease effectively.
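
    A reduced, non-spatial sketch of the mechanism under study: each exposure is either lethal (the dog incubates and becomes rabid) or immunizing (the dog seroconverts and is protected), so immunity accumulates alongside transmission and can cap incidence. All parameters and the chain-binomial structure are assumptions, not the fitted Laikipia County model.

```python
# Hedged sketch: weekly stochastic dog-rabies model where exposures split into
# lethal (to incubation) and immunizing (to immune) outcomes.
import numpy as np

rng = np.random.default_rng(3)

S, E, I, Im = 50000, 0, 20, 0   # susceptible, incubating, rabid, immune
beta = 2.0                      # assumed exposures per rabid dog per week
p_lethal = 0.5                  # assumed probability an exposure is lethal
sigma, mu = 1 / 3, 1.0          # ~3-week incubation; rabid dogs die in ~1 week

for week in range(200):
    N = S + E + I + Im
    exposures = rng.binomial(S, 1 - np.exp(-beta * I / N))
    lethal = rng.binomial(exposures, p_lethal)   # progress to incubation
    immunized = exposures - lethal               # survive with antibodies
    onsets = rng.binomial(E, 1 - np.exp(-sigma))
    deaths = rng.binomial(I, 1 - np.exp(-mu))    # rabid dogs die, leaving N
    S -= exposures
    E += lethal - onsets
    I += onsets - deaths
    Im += immunized
    if week % 20 == 0:
        print(f"week {week:3d}: rabid = {I:4d}, seroprevalence = {Im / N:.1%}")
```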