
    Where to Forage When Afraid: Does Perceived Risk Impair Use of the Foodscape?

    The availability and quality of forage on the landscape constitute the foodscape within which animals make behavioral decisions to acquire food. Novel changes to the foodscape, such as human disturbance, can alter behavioral decisions that favor avoidance of perceived risk over food acquisition. Although behavioral changes and population declines often coincide with the introduction of human disturbance, the link(s) between behavior and population trajectory are difficult to elucidate. To identify a pathway by which human disturbance may affect ungulate populations, we tested the Behaviorally Mediated Forage-Loss Hypothesis, wherein behavioral avoidance is predicted to reduce use of available forage adjacent to disturbance. We used GPS collar data collected from migratory mule deer (Odocoileus hemionus) to evaluate habitat selection, movement patterns, and time-budgeting behavior in response to varying levels of forage availability and human disturbance in three populations exposed to a gradient of energy development. Subsequently, we linked animal behavior with measured use of forage relative to human disturbance, forage availability, and quality. Mule deer avoided human disturbance at both home range and winter range scales, but showed negligible differences in vigilance rates at the site level. Use of the primary winter forage, sagebrush (Artemisia tridentata), increased as production of new annual growth increased, but use decreased with proximity to disturbance. Consequently, avoidance of human disturbance prompted loss of otherwise available forage, resulting in indirect habitat loss that was 4.6 times greater than direct habitat loss from roads, well pads, and other infrastructure. The multiplicative effects of indirect habitat loss, as mediated by behavior, impaired use of the foodscape by reducing the amount of available forage for mule deer, a consequence of which may be winter ranges that support fewer animals than they did before development.
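
    A minimal sketch of the kind of analysis the abstract describes, not the authors' actual workflow: a resource selection function fit as a used-versus-available logistic regression with distance-to-disturbance and forage covariates, followed by a toy comparison of direct versus behaviorally mediated (indirect) habitat loss. All variable names, the simulated data, and the placeholder areas below are assumptions for illustration; the areas are chosen only to mirror the reported 4.6:1 ratio, not taken from the study.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 5000

        # Simulated covariates for pooled used (1) and available (0) GPS locations.
        dist_to_disturbance = rng.uniform(0, 3000, n)   # metres to nearest well pad or road (hypothetical)
        sagebrush_growth = rng.uniform(0, 1, n)         # index of new annual sagebrush growth (hypothetical)

        # Simulated selection: deer favour forage but avoid locations near disturbance.
        logit = -1.0 + 2.0 * sagebrush_growth + 1.5 * (dist_to_disturbance / 3000)
        used = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

        # Fit the resource selection function (used vs. available logistic regression).
        X = np.column_stack([dist_to_disturbance / 3000, sagebrush_growth])
        rsf = LogisticRegression().fit(X, used)
        print("RSF coefficients (distance to disturbance, forage):", rsf.coef_[0])

        # Toy habitat-loss accounting on a hypothetical winter range:
        # direct loss = infrastructure footprint; indirect loss = otherwise suitable
        # area where predicted use is depressed by avoidance. Placeholder values only.
        direct_loss_km2 = 2.0
        indirect_loss_km2 = 9.2
        print("indirect:direct habitat loss ratio =", indirect_loss_km2 / direct_loss_km2)

    In this kind of design, a positive coefficient on distance to disturbance indicates avoidance, and the indirect-loss term is whatever otherwise suitable, forage-bearing area falls below a use threshold because of that avoidance.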

    Estimating loss of Brucella abortus antibodies from age-specific serological data in elk

    Serological data are one of the primary sources of information for disease monitoring in wildlife. However, the duration of the seropositive status of exposed individuals is almost always unknown for many free-ranging host species. Directly estimating rates of antibody loss typically requires difficult longitudinal sampling of individuals following seroconversion. Instead, we propose a Bayesian statistical approach linking age and serological data to a mechanistic epidemiological model to infer brucellosis infection, the probability of antibody loss, and recovery rates of elk (Cervus canadensis) in the Greater Yellowstone Ecosystem. We found that seroprevalence declined above the age of ten, with no evidence of disease-induced mortality. The probability of antibody loss was estimated at 0.70 per year after a five-year period of seropositivity, and the basic reproduction number for brucellosis was estimated at 2.13. Our results suggest that individuals are unlikely to become re-infected, because models with this mechanism were unable to reproduce the significant decline in seroprevalence in older individuals. This study highlights the possible implications of antibody loss, which could bias estimates of critical epidemiological parameters for wildlife disease management based on serological data.
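
    To illustrate how antibody loss enters age-seroprevalence inference, here is a simplified sketch, not the authors' Bayesian mechanistic model: a reversible catalytic model in which seroprevalence rises with age through a constant force of infection and falls through a seroreversion (antibody loss) rate, fit by maximum likelihood to made-up age-specific counts. The parameter names, the data, and the maximum-likelihood fitting are all assumptions for demonstration; this simple model cannot by itself reproduce a decline in seroprevalence at old ages, which is part of why the study required a richer mechanistic model.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import binom

        # Reversible catalytic model:
        # p(a) = lam / (lam + omega) * (1 - exp(-(lam + omega) * a))
        # lam   = force of infection (per year), omega = antibody loss rate (per year).
        def seroprevalence(age, lam, omega):
            rate = lam + omega
            return lam / rate * (1.0 - np.exp(-rate * age))

        ages = np.arange(1, 16)                      # years (hypothetical sampling design)
        n_sampled = np.full_like(ages, 40)           # hypothetical elk sampled per age class
        n_seropos = np.array([3, 6, 9, 12, 14, 16, 17, 17, 16, 15, 13, 12, 11, 10, 9])

        # Binomial negative log-likelihood; parameters kept positive via log scale.
        def neg_log_lik(log_params):
            lam, omega = np.exp(log_params)
            p = seroprevalence(ages, lam, omega)
            return -binom.logpmf(n_seropos, n_sampled, p).sum()

        fit = minimize(neg_log_lik, x0=np.log([0.2, 0.1]), method="Nelder-Mead")
        lam_hat, omega_hat = np.exp(fit.x)
        print(f"force of infection ~ {lam_hat:.2f}/yr, antibody loss ~ {omega_hat:.2f}/yr")

    The point of the sketch is the likelihood structure: age-specific seropositive counts are binomial draws from a model-predicted seroprevalence curve, so the antibody loss rate is identifiable from cross-sectional age data without longitudinal resampling of individuals.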