
    Impact of herbivores on nitrogen cycling: contrasting effects of small and large species

    Herbivores are reported to slow down as well as enhance nutrient cycling in grasslands. These conflicting results may be explained by differences in herbivore type. In this study we focus on herbivore body size as a factor that causes differences in herbivore effects on N cycling. We used an exclosure set-up in a floodplain grassland grazed by cattle, rabbits and common voles, in which we subsequently excluded cattle and rabbits. Exclusion of cattle led to an increase in vole numbers and a 1.5-fold increase in net annual N mineralization at similar herbivore densities (corrected to metabolic weight). The timing and height of the spring mineralization peak were the same in all treatments, but mineralization in the vole-grazed treatment showed a second peak in autumn, when mineralization had already declined under cattle grazing. This autumn peak coincides with a peak in vole density and high levels of N input through vole faeces distributed at a fine spatial scale, whereas under cattle grazing only a few patches receive all the N and most patches experience net nutrient removal. The other parameters we measured, including potential N mineralization rates under standardized laboratory conditions as well as soil parameters, plant biomass and plant nutrient content in the field, were the same in all three grazing treatments and therefore cannot explain the observed difference. When cattle were excluded, more litter accumulated in the vegetation. The formation of this litter layer may have contributed to the higher mineralization rates under vole grazing, through enhanced nutrient return via litter or through modification of the microclimate. We conclude that different-sized herbivores have different effects on N cycling within the same habitat: exclusion of large herbivores resulted in increased annual N mineralization under small-herbivore grazing.
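    The abstract above compares grazing treatments at herbivore densities "corrected to metabolic weight", which puts animals of very different body sizes on a common scale. A minimal sketch of that correction, assuming the conventional Kleiber exponent of 0.75 (metabolic weight = body mass^0.75); the body masses and counts below are illustrative, not the study's data:

    ```python
    # Hypothetical illustration: herbivore density expressed as metabolic
    # weight per hectare, assuming the Kleiber exponent of 0.75.

    def metabolic_density(count_per_ha: float, body_mass_kg: float,
                          exponent: float = 0.75) -> float:
        """Density expressed as metabolic weight (kg**0.75) per hectare."""
        return count_per_ha * body_mass_kg ** exponent

    # Illustrative numbers only: a few cattle can match thousands of voles.
    cattle = metabolic_density(count_per_ha=1.5, body_mass_kg=500)    # ~159
    voles = metabolic_density(count_per_ha=4000, body_mass_kg=0.025)  # ~252
    print(f"cattle: {cattle:.0f}, voles: {voles:.0f} (kg^0.75 per ha)")
    ```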

    The role of acute cortisol and DHEAS in predicting acute and chronic PTSD symptoms

    Background: Decreased activation of the hypothalamic-pituitary-adrenal (HPA) axis in response to stress is suspected to be a vulnerability factor for posttraumatic stress disorder (PTSD). Previous studies have reported inconsistent findings on the role of cortisol in predicting PTSD, and no prospective studies have examined the role of dehydroepiandrosterone (DHEA), its sulfate form DHEAS, or the cortisol-to-DHEA(S) ratio in predicting PTSD. In this study, we tested whether acute plasma cortisol, DHEAS and the cortisol-to-DHEAS ratio predicted PTSD symptoms at 6 weeks and 6 months post-trauma. Methods: Blood samples from 397 adult Level-1 trauma center patients, taken in the trauma resuscitation room within hours of the injury, were analyzed for cortisol and DHEAS levels. PTSD symptoms were assessed at 6 weeks and 6 months post-trauma with the Clinician-Administered PTSD Scale. Results: Multivariate linear regression analyses showed that lower cortisol predicted PTSD symptoms at both 6 weeks and 6 months, controlling for age, gender, time of blood sampling, injury, trauma history, and admission to intensive care. Higher DHEAS and a smaller cortisol-to-DHEAS ratio predicted PTSD symptoms at 6 weeks, but not after controlling for the same covariates, and not at 6 months. Conclusions: By showing that acute cortisol and DHEAS levels predict PTSD symptoms in survivors of recent trauma, our study provides important new evidence on the crucial role of the HPA axis in the response to trauma.
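    The Results section describes a multivariate linear regression of PTSD symptom scores on acute cortisol with the listed covariates. A minimal sketch of that kind of model using statsmodels; the DataFrame `df` and its column names are hypothetical stand-ins for the study's variables, not its actual data:

    ```python
    # Hypothetical sketch: OLS regression of 6-week CAPS scores on acute
    # cortisol, controlling for the covariates named in the abstract.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("trauma_cohort.csv")  # hypothetical dataset

    model = smf.ols(
        "caps_6wk ~ cortisol + age + gender + sampling_time"
        " + injury + trauma_history + icu_admission",
        data=df,
    ).fit()
    print(model.summary())
    ```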