
    Examining Seasonal Anthrax Risk in Wildlife: Comparing Home Ranges and Site Fidelity in Sero-Positive and Sero-Negative Ungulates

    Anthrax is frequently reported from wildlife and livestock in the US. While useful for reducing risk in livestock, vaccination, the primary method of prevention, is untenable for free-ranging wildlife. Because of this, accurate surveillance and carcass clean-up are the most efficacious control measures for wildlife. However, surveillance is expensive and requires significant personnel across large landscapes. Likewise, the transmission pathways are poorly understood in most species. Wildlife telemetry improves our understanding of movement patterns during risk periods, while serological surveys provide data on host exposure. Together, such data allow us to test hypotheses about host-pathogen interactions on the landscape. Starting in 2010, we initiated GPS telemetry and sero-prevalence studies of managed bison (Bison bison bison) and free-ranging elk (Cervus elaphus) in Montana. Here we evaluate summertime home ranges of bulls of both species in western Montana, comparing home-range and site-fidelity metrics in sero-positive and sero-negative animals. Serological tests indicated that ~30% of bull elk and ~27% of unvaccinated bison were sero-positive for anthrax exposure, suggesting that low-level exposure is frequent on this landscape. Seasonal ranges can be useful for delineating areas where animals face an increased likelihood of anthrax exposure by comparing those ranges to niche-based estimates of B. anthracis distribution. Fidelity metrics suggest both species spent considerable time in niche-based high-risk areas, and inter-annual data from elk suggest long-term range fidelity and overlap with high-risk areas. These data can be used to prioritize surveillance efforts in those areas, maximizing disease control while managing search costs.
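A common starting point for the home-range comparisons described above is the 100% minimum convex polygon (MCP) around an animal's GPS fixes. The sketch below computes an MCP area from scratch (convex hull via Andrew's monotone chain, area via the shoelace formula); the coordinates are hypothetical, not data from this study, and real pipelines would use projected coordinates from collar downloads.

```python
def convex_hull(points):
    """Andrew's monotone chain; points are (x, y) in projected coords (e.g. UTM metres)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def mcp_area_km2(fixes):
    """Area of the 100% minimum convex polygon, via the shoelace formula."""
    hull = convex_hull(fixes)
    area = 0.0
    for (x1, y1), (x2, y2) in zip(hull, hull[1:] + hull[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0 / 1e6  # m^2 -> km^2

# Hypothetical GPS fixes (metres): a 2 km x 1 km rectangle of use -> 2 km^2
fixes = [(0, 0), (2000, 0), (2000, 1000), (0, 1000), (1000, 500)]
print(mcp_area_km2(fixes))  # 2.0
```

Overlap between such polygons and niche-based risk surfaces is then what the fidelity comparison quantifies.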

    Coupling remote sensing and eDNA to monitor environmental impact: A pilot to quantify the environmental benefits of sustainable agriculture in the Brazilian Amazon

    Monitoring is essential to ensure that environmental goals are being achieved, including those of sustainable agriculture. Growing interest in environmental monitoring provides an opportunity to improve monitoring practices. Approaches that directly monitor land cover change and biodiversity annually, coupling the wall-to-wall coverage of remote sensing with the site-specific community composition from environmental DNA (eDNA), can provide timely, relevant results for parties interested in the success of sustainable agricultural practices. To ensure that the measured impacts are due to the environmental projects and not exogenous factors, sites where projects have been implemented should be benchmarked against counterfactual (no project) and control (natural habitat) sites. Results can then be used to calculate diverse sets of indicators customized to monitor different projects. Here, we report on our experience developing and applying one such approach to assess the impact of shaded cocoa projects implemented by the Instituto de Manejo e Certificação Florestal e Agrícola (IMAFLORA) near São Félix do Xingu, in Pará, Brazil. We used the Continuous Degradation Detection (CODED) and LandTrendr algorithms to create a remote sensing-based assessment of forest disturbance and regeneration, estimate carbon sequestration, and detect changes in essential habitats. We coupled these remote sensing methods with eDNA analyses using arthropod-targeted primers, collecting soil samples from intervention and counterfactual pasture field sites and a control secondary forest. We used a custom set of indicators from the pilot application of a coupled monitoring framework called TerraBio. Our results suggest that, due to IMAFLORA's shaded cocoa projects, over 400 acres were restored in the intervention area, and that the community composition of arthropods in shaded cocoa is closer to that of second-growth forests than to that of pastures. In reviewing the coupled approach, we found that multiple aspects worked well, and we conclude by presenting lessons learned.
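Comparisons of community composition like the cocoa-vs-forest-vs-pasture contrast above are often made with an abundance-based dissimilarity such as Bray-Curtis. A minimal sketch, using invented OTU read counts (not IMAFLORA data) purely to illustrate the comparison:

```python
def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two abundance dicts (0 = identical, 1 = disjoint)."""
    taxa = set(a) | set(b)
    num = sum(abs(a.get(t, 0) - b.get(t, 0)) for t in taxa)
    den = sum(a.get(t, 0) + b.get(t, 0) for t in taxa)
    return num / den if den else 0.0

# Hypothetical arthropod OTU read counts per site type
cocoa   = {"Formicidae": 30, "Araneae": 12, "Coleoptera": 8}
forest  = {"Formicidae": 28, "Araneae": 15, "Coleoptera": 10}
pasture = {"Formicidae": 55, "Araneae": 2, "Orthoptera": 20}

print(bray_curtis(cocoa, forest))   # small: cocoa community resembles forest
print(bray_curtis(cocoa, pasture))  # larger: cocoa community diverges from pasture
```

A smaller cocoa-forest distance than cocoa-pasture distance is the pattern the abstract reports.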

    Modeling Indirect Transmission Disease Risk: Anthrax in Bison in Southwestern Montana

    The patterns of movement and space use by animals as they traverse landscapes can affect their chance of encountering pathogens in the environment, and therefore can alter their risk of disease transmission. This is particularly true when the pathogen can persist in the environment for long periods of time. One such disease, anthrax, caused by the bacterium Bacillus anthracis, is a zoonotic disease of worldwide concern. The objectives of this research are to investigate bison space use in relation to the risk of anthrax transmission, to develop guidelines for creating resource selection functions, and to create a compartmental model of anthrax as an indirectly transmitted disease in bison to estimate the basic reproductive number, a key epidemiological metric. My study area was primarily based in a re-emerging anthrax disease system in southwestern Montana, where a multi-species anthrax outbreak occurred on a ranch in 2008. No cases had been reported in this region in decades; however, positive serology results in later years suggest continued exposure to Bacillus anthracis. I investigated the movement patterns of GPS-collared bison on this ranch using home-range estimators and subsequent analyses of the resulting polygons. I found individual variation in the potential risk of home ranges, and the bison whose serology indicated recent prior exposure to B. anthracis was the one with the riskiest patterns. A comparative study of variations in resource selection function (RSF) methods allowed for the development of general guidelines for selecting the available area given the movement patterns of the species. 
    Species with movements like those of central-place foragers or nomads benefit from broader area definitions such as minimum convex polygons (MCP) when generating RSFs, while models built on territorial-like movements perform better with more modern and conservative methods such as localized convex hulls (LoCoH) or potential path areas (PPA). Finally, I created a model of anthrax in bison by adapting compartmental modeling to indirect disease transmission: (S) Susceptible, (M) Immune, (I) Infected, (L) Local infectious zone, (E) Environment. I found that including a phenomenological factor to account for spatial aggregation improved model simulation behavior.
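The S-M-I-L-E structure described above can be sketched as a simple set of difference equations in which susceptibles are infected indirectly through local infectious zones and the wider environment, and carcasses of infected animals seed new zones. All rate values below are illustrative placeholders, not fitted parameters from the dissertation, and the flow assumptions (static immune class, zones decaying into the environment) are this sketch's simplifications.

```python
def smile_step(state, dt, beta_L=0.004, beta_E=0.0005, gamma=0.5,
               decay_L=0.2, decay_E=0.01):
    """One Euler step of an illustrative S-M-I-L-E indirect-transmission model."""
    S, M, I, L, E = state
    infection = (beta_L * L + beta_E * E) * S  # indirect transmission from L and E
    dS = -infection
    dM = 0.0                                   # immune class held static in this sketch
    dI = infection - gamma * I                 # infected animals die at rate gamma
    dL = gamma * I - decay_L * L               # carcasses seed local infectious zones
    dE = decay_L * L - decay_E * E             # zones decay into the wider environment
    return tuple(x + dt * d for x, d in zip(state, (dS, dM, dI, dL, dE)))

state = (100.0, 20.0, 0.0, 1.0, 0.0)  # start from one local infectious zone
for _ in range(1000):                  # 100 days at dt = 0.1
    state = smile_step(state, 0.1)
S, M, I, L, E = state
print(round(S, 1), round(E, 1))  # susceptibles depleted by indirect exposure
```

Because transmission is driven by L and E rather than by infected animals directly, epidemics here can continue well after prevalence in the host drops, which is the qualitative point of modeling anthrax indirectly.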

    Bartonella-associated endothelial proliferation depends on inhibition of apoptosis

    Bartonella is a Gram-negative pathogen that is unique among bacteria in being able to induce angioproliferative lesions. Cultured human endothelial cells have provided an in vitro system in which to study the basis of angioproliferation. Previous studies have attributed the organism's ability to induce angioproliferative lesions to direct mitotic stimulation of endothelial cells by these bacteria. Here we show that Bartonella inhibits apoptosis of endothelial cells in vitro, and that its ability to stimulate proliferation of endothelial cells depends to a large extent on its antiapoptotic activity. Bartonella suppresses both early and late events in apoptosis, namely caspase activation and DNA fragmentation, respectively. Its ability to inhibit death of endothelial cells after serum starvation can be recapitulated by media conditioned by bacteria, indicating that direct cell contact is not necessary. Among tested strains, the activity is produced only by Bartonella species that are significant human pathogens and are associated with angioproliferative lesions. We suggest that endothelial cells normally respond to infection by undergoing apoptosis and that Bartonella evolved the antiapoptotic activity to enhance survival of the host cells and therefore itself. We propose that Bartonella's antiapoptotic mechanism accounts at least in part for its ability to induce vascular proliferation in vivo.

    Historical trends of degradation, loss, and recovery in the tropical forest reserves of Ghana

    The Upper Guinean Forest region of West Africa, a globally significant biodiversity hotspot, is among the driest and most human-impacted tropical ecosystems. We used Landsat to study forest degradation, loss, and recovery in the forest reserves of Ghana from 2003 to 2019. Annual canopy cover maps were generated using random forests, and results were temporally segmented using the LandTrendr algorithm. Canopy cover was predicted with a predicted-observed r2 of 0.76, a mean absolute error of 12.8%, and a mean error of 1.3%. Forest degradation, loss, and recovery were identified as transitions between closed (>60% cover), open (15–60% cover), and low (<15% cover) tree-cover classes. Change was relatively slow from 2003 to 2015, but disturbance exceeded recovery, resulting in a gradual decline in closed-canopy forests. In 2016, widespread fires associated with El Niño drought caused forest loss and degradation across more than 12% of the moist semi-deciduous and upland evergreen forest types. The workflow was implemented in Google Earth Engine, allowing stakeholders to visualize the results and download summaries. Information about historical disturbances will help to prioritize locations for future studies and target forest protection and restoration activities aimed at increasing resilience.
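The class-transition logic above can be made concrete with a small sketch. The cover thresholds come from the abstract (>60%, 15–60%, <15%); the mapping of downward transitions to "loss" versus "degradation" is a plausible simplification of the paper's definitions, not the authors' exact rule set.

```python
def cover_class(pct):
    """Map predicted canopy cover (%) to the three classes used in the study."""
    if pct > 60:
        return "closed"
    if pct >= 15:
        return "open"
    return "low"

def transition(start_pct, end_pct):
    """Label a pixel's trajectory between two dates (simplified rule set)."""
    a, b = cover_class(start_pct), cover_class(end_pct)
    order = {"low": 0, "open": 1, "closed": 2}
    if a == b:
        return "stable"
    if order[b] < order[a]:
        # full drop to low cover is loss; a partial drop is degradation
        return "loss" if b == "low" else "degradation"
    return "recovery"

print(transition(75, 40))  # degradation: closed -> open
print(transition(75, 5))   # loss: closed -> low
print(transition(10, 35))  # recovery: low -> open
```

Applied per pixel to the LandTrendr-segmented cover series, rules like these yield the annual degradation/loss/recovery maps described.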

    Decoupling environmental effects and host population dynamics for anthrax, a classic reservoir-driven disease.

    Quantitative models describing environmentally mediated disease transmission rarely focus on the independent contributions of host recruitment and the environment to the force of infection driving outbreaks. In this study we investigate the interaction between external factors and host population dynamics in determining outbreaks of indirectly transmitted diseases. We first built deterministic and stochastic compartmental models of anthrax, parameterized using information from the literature and complemented with field observations. Our force-of-infection function was derived by modeling the number of successful transmission encounters as a pure birth process that depends on the pathogen's dispersion effort. After accounting for individual heterogeneity in dispersion effort, we allowed the force of infection to vary seasonally with external factors, recreating a scenario in which disease transmission increases in response to an environmental variable. Using simulations, we demonstrate that anthrax disease dynamics in mid-latitude grasslands are decoupled from host population dynamics. When seasonal forcing was ignored, outbreaks matched host reproductive events, a scenario that is not realistic in nature. Instead, when the force of infection was allowed to vary seasonally, outbreaks occurred only in years when environmental conditions were appropriate for outbreaks to develop. We used the stochastic formulation of the force of infection to derive R0 under scenarios with different assumptions. The derivation of R0 allowed us to conclude that during epizootic years, the pathogen's contribution to disease persistence is nearly independent of dispersion; in endemic years, only pathogens with high dispersion significantly prevent disease extinction. Finally, we used our model in a maximum likelihood framework to estimate the parameters behind a significant anthrax outbreak in Montana in 2008. 
    Our study highlights the importance of the environment in determining anthrax outbreak intensity and could be useful for predicting future events that could result in significant wildlife and domestic livestock losses.
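The qualitative effect of seasonal forcing on the force of infection can be sketched with a toy simulation: under a sinusoidally forced force of infection, new cases cluster around the environmental peak, whereas an unforced model simply tracks the declining susceptible pool. The functional form and all parameter values below are illustrative assumptions, not the study's fitted pure-birth-process formulation.

```python
import math

def seasonal_foi(t_days, base=0.002, amplitude=0.9, peak_day=200):
    """Illustrative force of infection with sinusoidal seasonal forcing."""
    phase = 2 * math.pi * (t_days - peak_day) / 365.0
    return base * (1 + amplitude * math.cos(phase))

def simulate(days=365, S0=500.0, forced=True):
    """Daily new cases in a toy susceptible-depletion model."""
    S, cases = S0, []
    for t in range(days):
        lam = seasonal_foi(t) if forced else 0.002  # constant FOI when unforced
        new = lam * S
        S -= new
        cases.append(new)
    return cases

forced = simulate(forced=True)
flat = simulate(forced=False)
peak_day = max(range(365), key=lambda t: forced[t])
print(peak_day)  # forced cases peak near the seasonal maximum, not at t = 0
```

In the unforced run, cases are highest on day 0 and decline monotonically; only the forced run concentrates the outbreak in the environmentally favorable window, mirroring the decoupling argument above.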

    Trend analysis of malaria in urban settings in Ethiopia from 2014 to 2019

    Background: Urbanization generally improves the health outcomes of residents and is one of the potential factors that might contribute to reducing malaria transmission. However, the expansion of Anopheles stephensi, an urban malaria vector, poses a threat to malaria control and elimination efforts in Africa. In this paper, malaria trends in urban settings in Ethiopia from 2014 to 2019 are reported, with a focus on towns and cities where An. stephensi surveys were conducted. Methods: A retrospective study was conducted to determine malaria trends in urban districts using passive surveillance data collected at health facilities from 2014 to 2019. Data from 25 towns surveyed for An. stephensi were used in the trend analysis. Robust linear models were used to identify outliers and impute missing and anomalous data. The seasonal Mann-Kendall test was used to test for monotonic increasing or decreasing trends. Results: A total of 9,468,970 malaria cases were reported between 2014 and 2019 through the Public Health Emergency Management (PHEM) system. Of these, 1.45 million (15.3%) cases were reported from urban settings. The incidence of malaria declined by 62% between 2014 and 2018. In 2019, incidence increased to 15 per 1000 population, from 11 per 1000 in 2018. Both confirmed (microscopy or RDT) Plasmodium falciparum (67%) and Plasmodium vivax (28%) infections were reported, with a higher proportion of P. vivax infections in urban areas. In 2019, An. stephensi was detected in 17 towns where more than 19,804 malaria cases were reported, with most of the cases (56%) being P. falciparum. Trend analysis revealed that malaria cases increased in five towns in the Afar and Somali administrative regions, decreased in nine towns, and showed no obvious trend in the remaining three towns. Conclusion: The contribution of malaria in urban settings is not negligible in Ethiopia. With the rapid expansion of An. stephensi in the country, receptivity to malaria is likely to be higher. 
    Although the evidence presented in this study does not demonstrate a direct linkage between An. stephensi detection and an increase in urban malaria throughout the country, An. stephensi might contribute to an increase in malaria unless control measures are implemented as soon as possible. Targeted surveillance and effective response are needed to assess the contribution of this vector to malaria transmission and curb potential outbreaks.
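The seasonal Mann-Kendall test used above is distribution-free: it computes the Mann-Kendall S statistic within each calendar month's sub-series and sums across months, so seasonal cycles do not masquerade as trends. A minimal sketch of the statistic itself (omitting the variance correction and p-value, which a full test would add), using invented monthly counts rather than PHEM data:

```python
from itertools import combinations

def mk_s(series):
    """Mann-Kendall S statistic: positive -> increasing, negative -> decreasing."""
    return sum((b > a) - (b < a) for a, b in combinations(series, 2))

def seasonal_mk_s(monthly, period=12):
    """Seasonal Mann-Kendall: sum S over each calendar month's sub-series."""
    return sum(mk_s(monthly[m::period]) for m in range(period))

# Hypothetical monthly case counts over three years (not PHEM data):
# a within-year seasonal ramp plus a year-on-year increase of 5 cases/month.
rising = [10 + year * 5 + month for year in range(3) for month in range(12)]
falling = list(reversed(rising))
print(seasonal_mk_s(rising), seasonal_mk_s(falling))  # 36 -36
```

Comparing each January only with other Januaries (and so on) is what lets the test flag the town-level monotonic trends reported above despite strong malaria seasonality.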