
    Genetic variation of Fraxinus excelsior half-sib families in response to ash dieback disease following simulated spring frost and summer drought treatments

    Ten juvenile Fraxinus excelsior half-sib families from two Lithuanian populations were tested in a controlled environment for their response to ash dieback disease caused by Hymenoscyphus fraxineus. Changes in genetic variation and heritability were assessed, and genotype by environment (G×E) interaction and phenotypic plasticity were estimated following artificial spring frost and summer drought treatments. In 2014, a batch of 200 four-year-old ash seedlings was used for each treatment and for the control (no treatment). Health condition, bud flushing phenology and height were assessed for each seedling, and disease incidence and survival ratios were assessed for each family both before the treatments (at the beginning of the vegetation season) and after them (at the end of the vegetation season). The disease incidence ratio increased from 0.77-0.80 to 0.90-0.95. Tree mortality rates during one vegetation season were significantly lower in the frost treatment (21%) than in the drought treatment (25%) or the control (31%). None of the tested F. excelsior families was completely resistant to ash dieback, although significant among-family differences in disease incidence and damage rates suggest an additive mode of gene action and thus quantitative resistance to the disease. Neither disease incidence rates nor tree health condition scores differed significantly among the applied treatments (including the control), indicating a generally negligible effect of the simulated adverse conditions on the health status of the ash seedlings. However, the G×E interaction was significant (at P < 0.05) for disease incidence, length of necrotic shoots and tree survival, implying that the susceptibility of ash families to the dieback disease depends unequally on environmental conditions and indicating genetic variation in the plasticity and reaction norms of the tested families across environments (treatments). Substantially increased coefficients of additive genetic variation and heritability in health condition following both the frost and drought treatments, compared to the control, showed that simulated stress conditions may noticeably strengthen the expression of differences among the tested F. excelsior families in resistance traits, thus enabling a better evaluation of family performance, effective family selection for resistance, and a marked genetic gain.
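    The family-level genetic parameters mentioned above (heritability and the coefficient of additive genetic variation) can be illustrated with standard half-sib quantitative-genetics formulas. The sketch below is a minimal illustration, not the authors' analysis; it assumes a balanced one-way family layout with hypothetical trait values, whereas the study's estimates would normally come from mixed-model (REML) analysis of the actual trial data.

    # Minimal sketch (hypothetical, not the study's code): narrow-sense
    # heritability and coefficient of additive genetic variation from
    # half-sib family data, assuming a balanced one-way family layout.
    import numpy as np

    def halfsib_genetic_parameters(values_by_family):
        # values_by_family: list of equal-length 1-D arrays, one per family
        n = len(values_by_family[0])                       # seedlings per family
        all_values = np.concatenate(values_by_family)
        family_means = np.array([v.mean() for v in values_by_family])
        ms_within = np.mean([v.var(ddof=1) for v in values_by_family])
        ms_between = n * family_means.var(ddof=1)
        var_family = max((ms_between - ms_within) / n, 0.0)   # among-family component
        var_additive = 4.0 * var_family      # half-sibs share 1/4 of the additive variance
        var_phenotypic = var_family + ms_within
        h2 = var_additive / var_phenotypic                 # narrow-sense heritability
        cv_a = 100.0 * np.sqrt(var_additive) / all_values.mean()
        return h2, cv_a

    With unbalanced families or additional treatment effects the variance components would instead be estimated with a mixed model, but the formulas above show why a larger among-family variance after stress treatments translates into higher heritability and a higher coefficient of additive genetic variation.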

    Effects of season and region on sapstain and wood degrade following simulated storm damage in Pinus radiata plantations

    Storms causing windthrow are major natural disturbance events and an unpredictable hazard to forest planning. Knowledge of regional and seasonal climatic effects on sapstain and decay fungi will allow forest managers to minimise losses from wood deterioration during salvage operations. A study was conducted monitoring sapstain in trees that were experimentally felled to simulate storm breakage at up to four times during the year in Pinus radiata plantations across six locations in different climatic zones throughout New Zealand. It was found that drying of sapwood and development of sapstain depended more on the season in which the storm occurred than on the time since felling. Sapstain appeared almost immediately in stems felled during summer, at some locations reaching more than 20% mean cross-sectional cover inside logs within 3 months, whereas in stems felled during winter an initial lag phase during the cooler months preceded a more rapid rise during spring and summer. Rates varied substantially between locations, with a tendency for faster deterioration where average temperatures were higher. For trees damaged during winter, it was predicted that a P. radiata butt log with a mid-length diameter of c. 16-23 cm will take from 2 to 8 months, depending on climate, to reach an economic damage benchmark threshold of 10% cross-sectional sapstain cover. However, for storms in spring or summer this period reduces to less than 1 month at warmer locations. Development of sapstain was uniform or increased slightly with height along the felled stem, but was greatest close to the felling cut in the basal section that would normally be removed during log retrieval. The results of this study provide new information about the temporal and regional variation in the dynamics of sapstain fungi that will assist forest managers during timber recovery following storms in regions with similar climates and tree species.
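    The time-to-threshold figures quoted above come from the study's own fitted models. As a minimal sketch only, assuming hypothetical repeated assessments of mean cross-sectional sapstain cover, the idea of estimating when a log crosses the 10% benchmark can be illustrated by simple interpolation between monitoring dates.

    # Minimal sketch (hypothetical data and approach, not the study's model):
    # estimate the time for sapstain cover to reach an economic threshold by
    # linear interpolation between successive monitoring assessments.
    def months_to_threshold(months, cover_pct, threshold=10.0):
        # months: assessment times since felling; cover_pct: mean cross-sectional
        # sapstain cover (%) at each assessment; both lists in ascending order
        for t0, t1, c0, c1 in zip(months, months[1:], cover_pct, cover_pct[1:]):
            if c0 < threshold <= c1:
                return t0 + (threshold - c0) * (t1 - t0) / (c1 - c0)
        return None   # threshold not reached within the monitoring period

    # Example with made-up winter-felling data: a lag phase, then a spring rise
    print(months_to_threshold([0, 3, 6, 9], [0.0, 1.5, 8.0, 22.0]))  # about 6.4 months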

    Adapting forest health assessments to changing perspectives on threats – a case example from Sweden

    A revised Swedish forest health assessment system is presented. The assessment system is composed of several interacting components that target the information needs of strategic and operational decision making and accommodate a continuously expanding knowledge base. The main motivation for separating information for strategic and operational decision making is that major damage outbreaks are often scattered throughout the landscape. Large-scale inventories (such as national forest inventories) generally cannot provide adequate information for mitigation measures. In addition to broad monitoring programs that provide time-series information on known damaging agents and their effects, there is also a need for local and regional inventories adapted to specific damage events. While information for decision making is the major focus of the health assessment system, the system also contributes to expanding the knowledge base on forest conditions. For example, the integrated monitoring programs provide a better understanding of ecological processes linked to forest health. The new health assessment system should be able to respond to the need for quick and reliable information and will thus be an important part of the future monitoring of Swedish forests.