
    Spatial heterogeneity of habitat suitability for Rift Valley fever occurrence in Tanzania: an ecological niche modelling approach

    Despite the long history of Rift Valley fever (RVF) in Tanzania, the extent of its suitable habitat in the country remains unclear. In this study we investigated the potential effects of temperature, precipitation, elevation, soil type, livestock density, rainfall pattern, and proximity to wild animals, protected areas and forest on habitat suitability for RVF occurrence in Tanzania. Presence-only records of 193 RVF outbreak locations from 1930 to 2007, together with the candidate predictor variables, were used to model and map suitable habitats for RVF occurrence using ecological niche modelling. Ground-truthing of the model outputs was conducted by comparing levels of RVF virus-specific antibodies in cattle, sheep and goats sampled from locations in Tanzania with different predicted habitat suitability values. Habitat suitability values for RVF occurrence were higher in the northern and central-eastern regions of Tanzania than in the rest of the country. Soil type and precipitation of the wettest quarter contributed equally to habitat suitability (32.4% each), followed by livestock density (25.9%) and rainfall pattern (9.3%). Ground-truthing revealed that the odds of an animal being seropositive for RVFV when sampled from areas predicted to be most suitable for RVF occurrence were twice the odds for an animal sampled from areas least suitable for RVF occurrence (95% CI: 1.43, 2.76; p < 0.001). Regions in northern and central-eastern Tanzania were thus more suitable for RVF occurrence than the rest of the country. The modelled suitable habitat is characterised by impermeable soils, moderate precipitation in the wettest quarter, high livestock density and a bimodal rainfall pattern. The findings of this study should guide the design of RVF surveillance, prevention and control strategies that target areas with these characteristics.
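
    The ground-truthing step reduces to comparing seropositivity odds between animals sampled from high- and low-suitability areas. A minimal sketch of that odds-ratio calculation in Python, using purely hypothetical 2x2 counts (the study's raw counts are not reproduced here):

```python
import math

# Hypothetical counts: rows = predicted suitability stratum, columns = serostatus.
# These numbers are illustrative only, not the study's data.
a, b = 120, 380   # most-suitable areas: seropositive, seronegative
c, d = 60, 440    # least-suitable areas: seropositive, seronegative

odds_ratio = (a * d) / (b * c)

# Wald 95% confidence interval on the log odds-ratio scale
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```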

    Association between herd management practices and antimicrobial resistance in Salmonella spp. from cull dairy cattle in Central California.

    Background: In this study cull dairy cows from six California dairy herds were sampled seasonally over the course of a year. The objectives were to determine the prevalence of antimicrobial-resistant (AMR) Salmonella spp. shed in cull cow feces, and the factors associated with fecal shedding of AMR and multidrug-resistant (MDR) Salmonella. Methods: Six dairy farms located in the San Joaquin Valley of California were identified and enrolled as a convenience sample. On each dairy, and once during each of the four seasons, 10 cull cows were randomly selected for fecal sampling on the day of their removal from the herd. In addition, study personnel completed a survey based on the herd manager's responses to questions about the previous 4 months' herd management and the specific cattle sampled. Fecal samples were submitted to the California Animal Health and Food Safety laboratory for Salmonella isolation. Antimicrobial resistance was evaluated using the broth microdilution method and a gram-negative assay plate, following Clinical and Laboratory Standards Institute (CLSI) guidelines and breakpoint references. All statistical models were survey-adjusted for the number of animals on the sampling day. Results: A total of 62 Salmonella isolates were recovered from 60 of the 239 fecal samples collected. A multidrug-resistant Salmonella was isolated from 12% (95% confidence interval (CI) [3-20]) of fecal samples. The survey-weighted results for the two most common drug classes to which isolates were resistant were tetracycline (39%; 95% CI [27-51]) and ampicillin (18%; 95% CI [9-27]). An important finding was the identification of cephalosporins as the third most common drug class to which isolates were resistant, with ceftriaxone (10%; 95% CI [2-17]) the most common drug associated with resistance in that class. At the cow level, culling because of prior treatment with antimicrobial drugs was associated with higher odds of isolating an AMR Salmonella isolate. At the herd level, the percentage of animals culled monthly on the farm and the number of milking cows in the herd were associated with isolation of antimicrobial-resistant Salmonella in cull cows. Discussion: Salmonella isolated from fecal samples from cull cows were resistant to important antimicrobials, such as ceftriaxone. The most common drug classes to which isolates were resistant were tetracyclines and beta-lactams, with ampicillin, ceftriaxone and ceftiofur the three most common drugs within the latter. Cow- and herd-level factors associated with isolation of antimicrobial-resistant Salmonella should be investigated further for their potential role in promoting the occurrence of AMR Salmonella. Our results also highlight the importance of monitoring dairy cattle sent to slaughter for shedding of Salmonella resistant to medically important antimicrobial drugs.
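
    The resistance-profiling step pairs each measured minimum inhibitory concentration (MIC) with a breakpoint to classify the isolate as susceptible, intermediate or resistant. A minimal sketch of that classification in Python; the breakpoint values and the MDR rule below are illustrative placeholders, not the actual CLSI tables used in the study:

```python
# Illustrative breakpoints, in ug/mL: (susceptible <= S, resistant >= R).
# Placeholder values only; consult current CLSI documents for real breakpoints.
BREAKPOINTS = {
    "tetracycline": (4, 16),
    "ampicillin": (8, 32),
    "ceftriaxone": (1, 4),
}

def classify(drug: str, mic: float) -> str:
    """Return S / I / R for a measured MIC against the breakpoint table."""
    s_max, r_min = BREAKPOINTS[drug]
    if mic <= s_max:
        return "S"
    if mic >= r_min:
        return "R"
    return "I"

# Hypothetical isolate with MICs from a broth microdilution panel.
isolate = {"tetracycline": 32.0, "ampicillin": 2.0, "ceftriaxone": 8.0}
profile = {drug: classify(drug, mic) for drug, mic in isolate.items()}
print(profile)  # {'tetracycline': 'R', 'ampicillin': 'S', 'ceftriaxone': 'R'}

# MDR is commonly defined as resistance to agents in three or more drug classes;
# here each drug stands in for one class purely for illustration.
is_mdr = sum(category == "R" for category in profile.values()) >= 3
print("MDR" if is_mdr else "not MDR")
```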

    Bayesian data assimilation provides rapid decision support for vector-borne diseases

    Predicting the spread of vector-borne diseases in response to incursions requires knowledge of both host and vector demographics in advance of an outbreak. Whereas host population data are typically available, for novel disease introductions there is a high chance of the pathogen utilising a vector for which data are unavailable. This presents a barrier to estimating the parameters of dynamical models representing host-vector-pathogen interaction, and hence limits their ability to provide quantitative risk forecasts. The Theileria orientalis (Ikeda) outbreak in New Zealand cattle demonstrates this problem: even though the vector has received extensive laboratory study, a high degree of uncertainty persists over its national demographic distribution. Addressing this, we develop a Bayesian data assimilation approach whereby indirect observations of vector activity inform a seasonal spatio-temporal risk surface within a stochastic epidemic model. We provide quantitative predictions for the future spread of the epidemic, quantifying uncertainty in the model parameters, case infection times, and the disease status of undetected infections. Importantly, we demonstrate how our model learns sequentially as the epidemic unfolds, and provides evidence for changing epidemic dynamics through time. Our approach therefore provides a significant advance in rapid decision support for novel vector-borne disease outbreaks.
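
    The core mechanism, updating the posterior over transmission parameters each time new case data arrive, can be illustrated with a far simpler model than the paper's spatio-temporal one. A minimal sketch in Python, assuming a single transmission rate, a Poisson observation model and synthetic weekly data (none of these values come from the study):

```python
import numpy as np

# Grid of candidate transmission rates with a flat prior over them.
beta_grid = np.linspace(0.05, 2.0, 400)
log_post = np.zeros_like(beta_grid)          # log-prior (uniform, up to a constant)

# Synthetic weekly case counts and an illustrative exposure term per week.
weekly_cases = [2, 5, 9, 14, 22]
infectious_pressure = [10, 18, 30, 45, 70]

for cases, pressure in zip(weekly_cases, infectious_pressure):
    # Poisson observation model: expected cases = beta * pressure.
    lam = beta_grid * pressure
    log_post += cases * np.log(lam) - lam    # add this week's log-likelihood

    # Normalise to obtain the sequentially updated posterior after this week.
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    mean = np.sum(beta_grid * post)
    sd = np.sqrt(np.sum((beta_grid - mean) ** 2 * post))
    print(f"week with {cases:2d} cases: beta ~ {mean:.3f} +/- {sd:.3f}")
```

    The posterior tightens as each week of data is assimilated, which is the same sequential-learning behaviour the abstract describes, albeit without the spatial risk surface or the data augmentation over unobserved infection times.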

    Bovine tuberculosis in Swedish farmed deer

    Bovine tuberculosis (BTB) was introduced into Swedish farmed deer herds in 1987. Epidemiological investigations showed that 10 deer herds had become infected (July 1994), and a common source of infection, a consignment of 168 imported farmed fallow deer, was identified (I). As trace-back of all imported and in-contact deer was not possible, a control program based on tuberculin testing was implemented in July 1994. As Sweden has been free from BTB since 1958, few practising veterinarians had experience in tuberculin testing. In this test, the result relies on the skill, experience and conscientiousness of the testing veterinarian, and deficiencies in performing the test may adversely affect the results and thereby compromise a control program. Quality indicators may identify possible deficiencies in testing procedures. For that purpose, reference values for the measured skin fold thickness (prior to injection of the tuberculin) were established (II), intended mainly for less experienced veterinarians to identify unexpected measurements. Furthermore, the within-veterinarian variation of the measured skin fold thickness was estimated by fitting general linear models to the skin fold measurements (III), using the mean square error as an estimator of the within-veterinarian variation. With this method, four (6%) veterinarians were considered to have unexpectedly large variation in their measurements. On certain large extensive deer farms, where mustering of all animals was difficult, meat inspection was suggested as an alternative to tuberculin testing. The efficiency of such a control was estimated in papers IV and V. A Reed-Frost model was fitted to data from seven BTB-infected deer herds and the spread of infection was estimated at < 0.6 effective contacts per deer per year (IV). These results were used to model the efficiency of meat inspection in an average extensive Swedish deer herd. Given 20% annual slaughter with meat inspection, the model predicted that BTB would be either detected or eliminated in most herds (90%) within 15 years of the introduction of one infected deer. In 2003, an alternative control for BTB in extensive Swedish deer herds, based on the results of paper V, was implemented.
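
    The Reed-Frost chain-binomial model of paper IV is straightforward to sketch: each susceptible deer escapes infection in a time step only if it avoids effective contact with every currently infectious animal. A minimal Python simulation, with an illustrative herd size and a hypothetical per-contact probability chosen to be roughly consistent with the reported < 0.6 effective contacts per deer per year:

```python
import random

def reed_frost(n_susceptible: int, n_infectious: int, p_contact: float, steps: int):
    """Simulate a Reed-Frost chain-binomial epidemic.

    p_contact is the probability that one infectious animal makes effective
    contact with a given susceptible animal during one time step.
    """
    s, i = n_susceptible, n_infectious
    history = [(s, i)]
    for _ in range(steps):
        p_escape = (1.0 - p_contact) ** i            # escape all i infectious animals
        new_cases = sum(1 for _ in range(s) if random.random() > p_escape)
        s -= new_cases
        i = new_cases                                # previous cases are removed
        history.append((s, i))
        if i == 0:
            break
    return history

# Illustrative run: 50 susceptible deer, one introduced case, yearly time steps.
# p_contact = 0.01 is hypothetical (~0.5 effective contacts per infectious deer per year).
random.seed(1)
print(reed_frost(50, 1, 0.01, steps=15))
```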

    Assessing the impact of tailored biosecurity advice on farmer behaviour and pathogen presence in beef herds in England and Wales

    The term ‘biosecurity’ encompasses many measures farmers can take to reduce the risk of pathogen incursion or spread. As the best strategy will vary between settings, veterinarians play an important role in assessing risk and providing advice, but effectiveness requires farmer acceptance and implementation. The aim of this study was to assess the effectiveness of specifically tailored biosecurity advice packages in reducing endemic pathogen presence on UK beef suckler farms. One hundred and sixteen farms recruited by 10 veterinary practices were followed for three years. Farms were randomly allocated to intervention (receiving specifically tailored advice, with veterinarians and farmers collaborating to develop an improved biosecurity strategy) or control (receiving general advice) groups. A spreadsheet-based tool was used annually to assign each farm a score reflecting the risk of entry or spread of bovine viral diarrhoea virus (BVDV), bovine herpesvirus-1 (BHV1), Mycobacterium avium subsp. paratuberculosis (MAP), Leptospira interrogans serovar hardjo (L. hardjo) and Mycobacterium bovis (M. bovis). The objectives of these analyses were to identify evidence of reductions in risk behaviours during the study, as well as reductions in pathogen presence, as indications of effectiveness. Risk behaviours and pathogen prevalences were examined across study years, and on intervention compared with control farms, using descriptive statistics and multilevel regression. There were significant reductions in risk scores for all five pathogens, regardless of intervention status, in every study year compared with the outset. Animals on intervention farms were significantly less likely than those on control farms to be seropositive for BVDV in years 2 and 3 and for L. hardjo in year 3 of the study. Variations by study year in animal-level odds of seropositivity to BHV1 or MAP were not associated with farm intervention status. All farms had significantly reduced odds of BHV1 seropositivity in year 2 compared with the outset. Variations in farm-level MAP seropositivity were not associated with intervention status. There were increased odds of M. bovis on intervention farms compared with control farms at the end of the study. Results suggest that a structured annual risk assessment process, conducted as a collaboration between veterinarian and farmer, is valuable in encouraging improved biosecurity practices. There were some indications, but not conclusive evidence, that tailored biosecurity advice packages have the potential to reduce pathogen presence. These findings will inform the development of a collaborative approach to biosecurity between veterinarians and farmers, including the adoption of cost-effective strategies that are effective across pathogens.
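
    The animal-level analyses have to account for animals being clustered within farms. A minimal sketch of that structure on synthetic data, using a GEE logistic model with farm-level clustering as a simple stand-in for the multilevel models in the paper (all variable names and effect sizes are assumptions):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic data: animals nested within farms, with an intervention indicator.
rng = np.random.default_rng(0)
n_farms, animals_per_farm = 40, 25
farm = np.repeat(np.arange(n_farms), animals_per_farm)
intervention = np.repeat(rng.integers(0, 2, n_farms), animals_per_farm)
year = rng.integers(1, 4, n_farms * animals_per_farm)

# Simulated serostatus: lower odds on intervention farms in later years.
farm_effect = np.repeat(rng.normal(0, 0.5, n_farms), animals_per_farm)
logit = -0.5 + farm_effect - 0.4 * intervention * (year - 1)
seropositive = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"farm": farm, "intervention": intervention,
                   "year": year, "seropositive": seropositive})

# Logistic regression with clustering by farm (GEE stand-in for a multilevel model).
model = sm.GEE.from_formula("seropositive ~ C(year) * intervention",
                            groups="farm", data=df,
                            family=sm.families.Binomial())
print(model.fit().summary())
```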

    Intensified control of Salmonella and Campylobacter in fresh meat - case-by-case based risk assessment

