
    Patterns of ash (Fraxinus excelsior L.) colonization in mountain grasslands: the importance of management practices

    Woody colonization of grasslands is often associated with changes in abiotic conditions, biotic conditions, or both. Widely used for fodder and litter in the traditional agro-pastoral system of the past, ash (Fraxinus excelsior L.) has now become a colonizing species of mountain grasslands in the French Pyrenees. Its present distribution depends on past human activities and is locally controlled by propagule pressure and abiotic conditions. However, even when all favourable conditions are met, not all potentially colonizable grasslands are invaded. We hypothesize that management practices play a crucial role in controlling ash colonization. Using empirical field surveys, we compared the botanical composition of a set of present and former grasslands differing in management practices and level of ash colonization. We identified a successional gradient positively linked to both ash cover and height, but not to the age of trees. We tested the relationships between ash presence in grassland and management type (cutting and/or grazing), management intensity, and several features of the grassland communities (total and local species richness and species heterogeneity). Mixed use (cutting and grazing) is negatively linked to ash presence in grassland, whereas grazing alone is positively linked. Mixed use and high grazing intensity directly prevent ash seedling establishment, whereas low grazing intensity indirectly allows establishment through herbaceous vegetation neglected by livestock. Our results show a limit between grasslands with and without established ash, corresponding to a threshold in the intensity of use. Below this threshold, once ash is established, the colonization process appears irreversible. Ash is capable of compensatory growth and, under high grazing intensity, therefore develops subterranean vegetative reproduction. The question remains, however, at which stage of seedling development and at which grazing intensity these strategies occur.
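
    As an illustration of the kind of use-intensity threshold the authors describe, the sketch below fits a logistic curve of ash presence against a hypothetical management-intensity index. The data, the index, and the 0.4 threshold are invented for illustration; this is not the study's analysis.

```python
# Illustrative sketch only, not the study's analysis: invented data for
# ash presence against a hypothetical management-intensity index, with a
# logistic fit used to estimate the use-intensity threshold.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
intensity = rng.uniform(0.0, 1.0, 200)                  # grazing/cutting intensity index
p_ash = 1.0 / (1.0 + np.exp(12.0 * (intensity - 0.4)))  # establishment drops above ~0.4
ash_present = rng.binomial(1, p_ash)

model = LogisticRegression().fit(intensity.reshape(-1, 1), ash_present)
# The decision boundary (P = 0.5) approximates the establishment threshold.
threshold = -model.intercept_[0] / model.coef_[0, 0]
print(f"estimated use-intensity threshold: {threshold:.2f}")
```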

    Impacts of 1.5 versus 2.0°C on West African cereal yields

    With the Paris Agreement, governments around the world agreed to limit global temperature rise to less than 2.0°C above pre-industrial levels, with the ambition to keep warming to 1.5°C. Designing appropriate mitigation responses requires weighing the costs of mitigation against the associated damages at the two levels of warming, with particular consideration of the implications for regions already challenged by food insecurity. This study assessed the impacts of 1.5°C versus 2.0°C warming on yields of maize, pearl millet and sorghum in the West African Sudan Savanna. Two crop models were used, calibrated with common varieties from experiments in the region. To capture a range of realistic management, early, typical and late sowing dates were assessed. Simulations were further conducted both for current fertilizer rates and for an intensification case in which fertility was assumed not to be limiting, in an attempt to capture the extremes of possible economic development scenarios for current cropping systems. With current fertilizer use, results indicated losses 2 percentage points higher for maize and sorghum under 2.0°C than under 1.5°C warming, with no change in millet yields under either scenario. In the intensification case, yield losses due to climate change were larger than with current fertilizer levels. However, despite the larger losses, yields were always 2-3 times higher with intensification, irrespective of the warming scenario. Though yield variability increased with intensification, there was no interaction with the warming scenario. Risk and market analyses are needed to extend these results and understand the implications for food security.
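
    The design described above is a full factorial of models, crops, sowing dates, fertilizer levels and warming scenarios. A minimal sketch of that combinatorics, with hypothetical placeholder names standing in for the two calibrated crop models:

```python
# Hypothetical sketch of the simulation factorial described above; the
# placeholder model names stand in for the two calibrated crop models.
from itertools import product

models = ["crop_model_1", "crop_model_2"]
crops = ["maize", "pearl_millet", "sorghum"]
sowing = ["early", "typical", "late"]
fertilizer = ["current", "intensified"]
warming = ["1.5C", "2.0C"]

runs = list(product(models, crops, sowing, fertilizer, warming))
print(len(runs), "scenario combinations")  # 2 * 3 * 3 * 2 * 2 = 72
```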

    Effect of Intensification of Crop Management on Cereal Crop Yields under 1.5°C and 2.0°C Global Warming in the West African Sudan Savannah

    Rainfed cereals are the main staple food crops in the West African Sudan Savannah. While current yield levels are low, due in large part to the limited use of fertilisers, sustainable intensification of cropping systems is widely promoted in the region to improve food security and drive regional economic development. However, an important consideration is increased interannual yield variability and possible interactions with climate change, as this represents an important source of risk for farmers. This study assessed the effect of intensification on maize, pearl millet and sorghum yields under 1.5°C and 2.0°C global warming in the West African Sudan Savannah. Simulations were conducted with two crop models (DSSAT and Lintul5 embedded in the SIMPLACE modelling framework) at a spatial resolution of 0.25°, under both current fertiliser use and optimum fertiliser application (the intensification case). The models were calibrated with local varieties from field experiments in the region, with management reflecting a range of typical sowing windows. Results indicated that yields simulated under intensification were two to three times higher than yields simulated under current fertiliser use, irrespective of the warming scenario. However, yield losses under climate change were slightly higher with intensification: 2 percentage points higher for maize and sorghum with 2.0°C compared to 1.5°C warming, with no change in millet yields under either scenario. As expected, interannual variability increased with intensification compared to current fertiliser use, though there was no interaction with the climate change scenario. In summary, the results suggest that intensification would much more than offset the negative impacts of climate change, though economic analysis is required to understand the implications for risk and the constraints on market development needed to support intensification.
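
    A back-of-envelope check of the headline claim, using the round numbers reported above (a 2-3x yield gain from intensification against roughly 2 percentage points of extra warming loss); the specific values are illustrative midpoints, not model output:

```python
# Illustrative arithmetic only: midpoint values read off the abstract,
# not model output.
baseline_yield = 1.0        # current-fertiliser yield (normalised)
gain = 2.5                  # midpoint of the reported 2-3x intensification gain
extra_loss_at_2C = 0.02     # ~2 percentage points more loss at 2.0C vs 1.5C

intensified_at_2C = baseline_yield * gain * (1 - extra_loss_at_2C)
print(intensified_at_2C)    # ~2.45: still far above the baseline of 1.0
```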

    Angiographic versus functional severity of coronary artery stenoses in the FAME study: Fractional flow reserve versus angiography in multivessel evaluation

    Background: It can be difficult to determine on the coronary angiogram which lesions cause ischemia. Revascularization of coronary stenoses that induce ischemia improves a patient's functional status and outcome. For stenoses that do not induce ischemia, however, the benefit of revascularization is less clear. Methods: In the FAME study, routine measurement of the fractional flow reserve (FFR) was compared with angiography for guiding percutaneous coronary intervention in patients with multivessel coronary artery disease. The use of the FFR in addition to angiography significantly reduced the rate of all major adverse cardiac events at 1 year. Of the 1,414 lesions (509 patients) in the FFR-guided arm of the FAME study, 1,329 were successfully assessed by the FFR and are included in this analysis. Results: Before FFR measurement, these lesions were categorized by visual assessment into 50% to 70% (47% of all lesions), 71% to 90% (39% of all lesions), and 91% to 99% (15% of all lesions) diameter stenosis. In the category of 50% to 70% stenosis, 35% were functionally significant (FFR ≤0.80) and 65% were not (FFR >0.80). In the category of 71% to 90% stenosis, 80% were functionally significant and 20% were not. In the category of subtotal stenoses, 96% were functionally significant. Of all 509 patients with angiographically defined multivessel disease, only 235 (46%) had functional multivessel disease (≥2 coronary arteries with an FFR ≤0.80). Conclusions: Angiography is inaccurate in assessing the functional significance of a coronary stenosis when compared with the FFR, not only in the 50% to 70% category but also in the 71% to 90% angiographic severity category.
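
    A minimal sketch of the classification logic implied above, assuming the FAME ischemia cutoff of FFR ≤0.80 for functional significance; the example lesion at the end is hypothetical:

```python
# Minimal sketch, assuming the FAME ischemia cutoff of FFR <= 0.80;
# the example lesion below is hypothetical.
def angiographic_category(diameter_stenosis_pct: int) -> str:
    """Visual severity categories used in the analysis above."""
    if 50 <= diameter_stenosis_pct <= 70:
        return "50-70%"
    if 71 <= diameter_stenosis_pct <= 90:
        return "71-90%"
    if 91 <= diameter_stenosis_pct <= 99:
        return "91-99% (subtotal)"
    return "not graded"

def functionally_significant(ffr: float) -> bool:
    """A lesion is considered ischemia-inducing if FFR is at or below 0.80."""
    return ffr <= 0.80

# A 60% stenosis with FFR 0.85: angiographically intermediate, but not
# functionally significant, so revascularization would be deferred.
print(angiographic_category(60), functionally_significant(0.85))
```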

    Impact of early (<24h) versus delayed (>24h) intervention in patients with non-ST-segment elevation myocardial infarction (an observational study of 20,882 patients)

    Background: In patients presenting with non-ST-segment elevation acute coronary syndromes (NSTE-ACS), an invasive approach has been shown to be superior to conservative management. Purpose: We aimed to investigate the optimal timing of invasive coronary angiography and subsequent intervention. Methods: We examined the impact of early (≤24h) versus delayed (>24h) intervention in a large observational cohort of 20,882 consecutive patients with acute NSTE myocardial infarction (NSTEMI) treated with PCI between 2005 and 2015 at 9 tertiary cardiac centers in London (UK), using Cox regression analysis and propensity matching. Results: Mean age was 64.5±12.7 years and 26.1% were female. A quarter (27.6%) were treated within 24h. Patients treated within 24h were slightly younger (62.8±12.8 vs. 65.2±12.6, p<0.001), more commonly male (76% vs. 72.9%, p<0.001), and were more frequently ventilated (2.3% vs. 1.4%, p<0.001) and in cardiogenic shock (3.6% vs. 1.4%, p<0.001), with dynamic changes on their ECG (84.5% vs. 76.1%, p<0.001). At a median follow-up of 4.2 years (interquartile range 1.8 to 7), 17.7% of patients had died. Estimated 5-year survival was 84.6% in patients treated within 24h vs. 81% in those treated >24h after presentation (p<0.001). This survival benefit remained after adjustment for confounders: HR (delayed vs. early management) 1.11 (95% CI 1.003 to 1.23, p=0.046). In the propensity-matched cohort of 4,356 patients in each group, there remained a trend towards higher survival in the early intervention group (p=0.061). Conclusions: Notwithstanding the limitations of the retrospective design, this real-world cohort of NSTEMI patients suggests that early intervention (≤24h) may improve mid-term survival.
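
    A minimal sketch of the type of analysis reported above (Cox regression of early versus delayed intervention on survival), built on synthetic data since the registry data are not public; the variable names and effect sizes are only loosely modelled on the numbers in the abstract:

```python
# Minimal sketch of the analysis type reported above (Cox regression of
# early vs. delayed intervention), on synthetic data; nothing here is
# registry data, and the effect sizes are illustrative.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
early = rng.binomial(1, 0.276, n)            # ~27.6% treated within 24h
age = rng.normal(64.5, 12.7, n)

# Synthetic exponential survival times with a modest hazard increase
# for delayed intervention and for older age.
hazard = 0.03 * np.exp(0.11 * (1 - early) + 0.02 * (age - 64.5))
time = rng.exponential(1.0 / hazard)
observed = (time < 5.0).astype(int)          # administrative censoring at 5 years

df = pd.DataFrame({"time": np.minimum(time, 5.0), "event": observed,
                   "early": early, "age": age})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary()  # expect HR < 1 for 'early', mirroring the reported benefit
```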