
    Optimal management strategies for a cascade reservoir system

    Water reservoirs have long been used throughout Australia and the globe both as a means of providing communities with water security during periods of limited rainfall and as a form of defence against severe flooding. In recent times, the effective management of these water reservoirs has been questioned and is now, more than ever, under scrutiny. To address the issue of reservoir mismanagement, this thesis demonstrates the development, formulation and application of two Mixed Integer Linear Programming (MILP) models that can determine strategies for the optimal management of a cascade reservoir system under the two extreme environmental conditions of drought and flood. For the purposes of this thesis, the unique cascade configuration of a reservoir system was primarily considered, where cascade refers to a multiple reservoir system in which the spill from earlier reservoirs becomes a source of inflows to subsequent reservoirs. Many physical reservoir systems exhibit this type of layout, including the Perseverance and Cressbrook system located near Toowoomba, which has been considered as a case study throughout this thesis. By applying the drought and flood models to the case study of the Perseverance and Cressbrook cascade reservoir system, it was found that both models provided comprehensive approximations of the system behaviours under the differing extreme conditions considered by each model. However, in order to conduct a meaningful comparison of the management strategies employed by the drought and flood models, a common set of inflow records upon which both models could be considered was required. Rather than using a portion of the historic inflow records sourced for the case study, time series analysis was employed to select a time series model that suitably represented the historic records, and from this model an alternative set of inflows was simulated. Using the simulated set of inflows, a comparison of the management strategies employed by the two MILP models for a drought and flood was conducted, demonstrating both similarities and differences between the optimal strategies employed for the management of the cascade reservoir system. The comparison also revealed that although 'common sense' practices could be employed to operate the cascade reservoir system, these practices were not optimal and thus did not result in the effective management of the system. Therefore, models like those developed, formulated and utilised in this thesis are necessary to ensure that a commodity as heavily relied upon, and sometimes as potentially dangerous, as water is optimally governed and regulated into the future.
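
    As a rough illustration of the modelling approach described above, the sketch below sets up a toy two-reservoir cascade as a MILP in Python with PuLP. All names, capacities, inflows and the restriction rule are hypothetical assumptions for illustration, not the thesis models themselves.

```python
# A minimal two-reservoir cascade sketch in PuLP. All names, capacities,
# inflows and the restriction rule are hypothetical illustrations, not the
# thesis models themselves.
import pulp

T = range(6)                            # planning periods (hypothetical)
inflow_up = [50, 20, 10, 5, 15, 30]     # inflows to the upstream reservoir
demand = [25] * 6                       # downstream demand per period
cap_up, cap_down = 120, 200             # storage capacities
s0_up, s0_down = 80, 150                # initial storages

m = pulp.LpProblem("cascade_drought_sketch", pulp.LpMaximize)

# Continuous decisions: storages, controlled release, spill, supplied water
s_up = pulp.LpVariable.dicts("s_up", T, 0, cap_up)
s_down = pulp.LpVariable.dicts("s_down", T, 0, cap_down)
release = pulp.LpVariable.dicts("release", T, 0)
spill = pulp.LpVariable.dicts("spill", T, 0)
supply = pulp.LpVariable.dicts("supply", T, 0)

# Binary decision (what makes this a MILP): impose restrictions in period t
restrict = pulp.LpVariable.dicts("restrict", T, cat="Binary")

for t in T:
    prev_up = s0_up if t == 0 else s_up[t - 1]
    prev_down = s0_down if t == 0 else s_down[t - 1]
    # Cascade mass balance: upstream release and spill feed the downstream reservoir
    m += s_up[t] == prev_up + inflow_up[t] - release[t] - spill[t]
    m += s_down[t] == prev_down + release[t] + spill[t] - supply[t]
    # If downstream storage falls below 30 units, restrictions must switch on
    m += s_down[t] >= 30 - 30 * restrict[t]
    # Restrictions cap supply at 60% of demand
    m += supply[t] <= demand[t] - 0.4 * demand[t] * restrict[t]

# Maximise total supply, lightly penalising periods under restriction
m += pulp.lpSum(supply[t] for t in T) - 0.1 * pulp.lpSum(restrict[t] for t in T)
m.solve()
print(pulp.LpStatus[m.status], [pulp.value(supply[t]) for t in T])
```

    The cascade structure appears in the two mass-balance constraints: whatever the upstream reservoir releases or spills becomes an inflow to the downstream reservoir in the same period.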

    Using yield response curves to measure variation in the tolerance and resistance of wheat cultivars to Fusarium crown rot

    The disease crown rot, caused predominantly by the fungal pathogen Fusarium pseudograminearum (Fp), is a major disease of winter cereals in many regions of the world, including Australia. A methodology is proposed, using response curves, to robustly estimate the relationship between grain yield and increasing crown rot pathogen burdens. Using data from a field experiment conducted in northern New South Wales, Australia in 2016, response curves were derived for five commercial wheat cultivars exposed to six increasing rates of crown rot inoculum, where the rates served to establish a range of crown rot pathogen burdens. In this way, the response curve methodology is fundamentally different from alternative approaches that rely on genetic or environmental variation to establish a range in pathogen burdens over which yield loss relationships are estimated. By manipulating only the rates of crown rot inoculum, and thus pathogen burden directly, the number of additional confounding factors and interactions is minimised, enabling the robust estimation of the rate of change in yield due to increasing crown rot pathogen burdens for each cultivar. The methodology revealed variation between cultivars in the rate of change in yield, along with variation in the extent of crown rot symptoms expressed by the cultivars. Variation in the rate of change in yield between cultivars provides definitive evidence of differences in the tolerance of commercial Australian wheat cultivars to crown rot caused by Fp, while variation in the extent of crown rot symptoms signifies differences in the resistance of the cultivars to this disease. The response curve methodology also revealed variation in how the different mechanisms of tolerance and resistance act to limit yield losses due to crown rot for different cultivars.
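
    To illustrate the response-curve idea, the hypothetical Python sketch below regresses yield against inoculum rate for two invented cultivars; the fitted slope estimates the rate of yield change with increasing crown rot burden, the tolerance measure described above. The cultivar names, rates and yields are assumptions, not data from the study, and the paper's actual statistical model may differ.

```python
# Hypothetical sketch of the response-curve idea: regress yield against
# inoculum rate for each cultivar; the fitted slope estimates the rate of
# yield change with increasing crown rot burden (a tolerance measure).
# All data below are invented for illustration.
import numpy as np
from scipy import stats

rates = np.array([0, 1, 2, 4, 8, 16])  # six increasing inoculum rates (hypothetical units)

yields = {  # grain yield (t/ha) per cultivar at each rate -- invented numbers
    "CultivarA": np.array([4.1, 4.0, 3.8, 3.5, 3.0, 2.2]),
    "CultivarB": np.array([3.9, 3.9, 3.8, 3.7, 3.5, 3.2]),
}

for name, y in yields.items():
    fit = stats.linregress(rates, y)
    print(f"{name}: slope={fit.slope:.3f} t/ha per unit rate, r^2={fit.rvalue**2:.2f}")
    # A flatter (less negative) slope indicates greater tolerance:
    # yield declines more slowly as pathogen burden increases.
```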

    Stubble Olympics: the cereal pathogen 10cm sprint – growth patterns of fungi causing crown rot, common root rot and yellow leaf spot in post-harvest cereal stubble

    Take home messages
    • Wetter is better (for cereal pathogens): moist conditions promoted growth of pathogenic fungi (Fusarium pseudograminearum, Bipolaris sorokiniana and Pyrenophora tritici-repentis) within post-harvest cereal stubble, meaning inoculum levels of crown rot, common root rot and yellow spot may increase if wet weather is experienced after harvest
    • Not all cereal stubble is created equal: some pathogens progressed further in oat than in bread wheat stubble. Additionally, there are indications that the resistance ratings of varieties and crops do not reflect the extent of saprophytic growth post-harvest
    • Each cereal pathogen had a unique stubble-colonisation pattern: the crown rot fungus was the quickest to progress within all stubble types and the yellow spot pathogen was the slowest. This is likely to influence which pathogen dominates in following seasons if mixed infections have occurred in the same crop
    • Reducing cereal stubble biomass may limit the post-harvest progression of pathogenic fungi in stubble, thereby reducing the amount of inoculum carried forward. Options could include selection of low-biomass varieties, low harvest heights or cutting for hay; however, field validation is required

    “Killer Joules”: spores of Bipolaris sorokiniana and Fusarium species are susceptible to microwave radiation

    Cereal production in Australia is severely impacted by diseases such as Fusarium crown rot (caused predominantly by Fusarium pseudograminearum) and common root rot (caused by Bipolaris sorokiniana). These diseases are particularly difficult to manage because inoculum can survive at least three years within cereal stubble, or four years in soil in the case of B. sorokiniana. Microwave radiation may be able to reduce or eliminate inoculum within stubble and soil. Several cereal pathogens have previously been shown to be susceptible to microwave radiation, but the energy requirements to achieve a significant decrease in pathogen populations were not defined. Laboratory-based microwave dose-response experiments on conidia of B. sorokiniana and macroconidia of F. pseudograminearum and F. cerealis revealed that all three pathogens are susceptible to microwave radiation, with lethal dose (LD) thresholds estimated for each pathogen. Bipolaris sorokiniana conidia required 103.8 J g⁻¹ and 236.6 J g⁻¹ of microwave radiation energy for LD50 and LD99, respectively, whilst F. pseudograminearum required 78.4 J g⁻¹ and 300.8 J g⁻¹ and F. cerealis required 95.3 J g⁻¹ and 152.7 J g⁻¹ for LD50 and LD99, respectively. These results were derived from experiments in which samples were microwaved for up to 10 s using a domestic 1100 W microwave oven. These timing and energy requirements serve as a starting point for the further development of microwave radiation treatments under field conditions.
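
    The sketch below illustrates how LD50 and LD99 thresholds of this kind can be estimated from dose-response data by fitting a two-parameter log-logistic mortality curve; the doses, kill fractions and model form are assumptions for illustration, not the study's actual data or analysis.

```python
# Sketch of estimating LD50/LD99 from microwave dose-response data with a
# two-parameter log-logistic mortality model; the doses and kill fractions
# below are invented, and the paper's actual model may differ.
import numpy as np
from scipy.optimize import curve_fit

def mortality(dose, ld50, slope):
    """Log-logistic mortality curve: equals 0.5 at dose = ld50."""
    return 1.0 / (1.0 + (ld50 / dose) ** slope)

dose = np.array([20, 40, 60, 80, 100, 150, 200, 250])   # J/g (hypothetical)
kill = np.array([0.02, 0.08, 0.20, 0.38, 0.49, 0.78, 0.93, 0.99])

(ld50, slope), _ = curve_fit(mortality, dose, kill, p0=[100, 3])

def ld(p, ld50, slope):
    # Invert the fitted model for an arbitrary lethal-dose percentile p
    return ld50 * (p / (1 - p)) ** (1 / slope)

print(f"LD50 ~ {ld50:.1f} J/g, LD99 ~ {ld(0.99, ld50, slope):.1f} J/g")
```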

    Is there a disease downside to stripper fronts? Harvest height implications for Fusarium crown rot management

    Take home messages
    • Taller standing stubble allowed vertical progression of the Fusarium crown rot fungus within the stubble after harvest, whilst short stubble prevented further growth (i.e. vertical growth was limited to the height of the cut stubble).
    • Stripper fronts, which leave higher standing stubble, may increase stubble-borne disease inoculum after harvest of an infected crop, especially if wet fallow conditions are experienced.
    • In high-risk situations, such as an infected crop with high biomass, cutting the crop shorter at harvest will limit further inoculum development within the stubble after harvest (beyond the levels already present at harvest).
    • Cutting infected cereal stubble shorter prior to rotation with shorter-stature crops such as chickpea or lentils also prevents the dispersal of infected stubble when harvesting these shorter break crops.

    Stubble trouble! Moisture, pathogen fitness and cereal type drive colonisation of cereal stubble by three fungal pathogens

    Stubble-borne cereal diseases are a major constraint to production in Australia, with associated costs rising as a result of increased adoption of conservation agriculture systems. The fungal pathogens that cause these diseases can saprotrophically colonise retained cereal residues, which may further increase inoculum levels post-harvest. Hence, saprotrophic colonisation by the stubble-borne fungal pathogens Fusarium pseudograminearum, Pyrenophora tritici-repentis and Bipolaris sorokiniana was compared under a range of moisture conditions for stubble of six cereal varieties (two bread wheat, two barley, one durum wheat and one oat). Sterile cereal stubble was inoculated separately with two isolates of each pathogen and placed, standing, under constant relative humidity conditions (90, 92.5, 95, 97.5 and 100%) for 7 days at 25 °C. Stubble was then cultured in increments of 1 cm to determine the percentage colonisation height of each tiller. Fusarium pseudograminearum colonised farther within tillers, leaving a greater proportion of the standing stubble colonised compared with B. sorokiniana and P. tritici-repentis, suggesting F. pseudograminearum has higher saprotrophic fitness. Saprotrophic colonisation also increased with increasing relative humidity for all pathogens and varied by cereal type. Disease management strategies, such as reduced cereal harvest height, may limit saprotrophic colonisation and improve stubble-borne disease management in conservation agriculture systems.
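
    As a small illustration of the metric described above, the hypothetical sketch below derives a percentage colonisation height for a tiller cultured in 1 cm increments; the scoring rule and numbers are assumptions, not the study's protocol.

```python
# Sketch of deriving a percentage colonisation height: each tiller is
# cultured in 1 cm increments and scored for pathogen recovery; the highest
# positive increment, relative to tiller height, gives the percentage
# colonised. Data here are invented.

def pct_colonised(increments_positive, tiller_height_cm):
    """increments_positive: pathogen recovered (True/False) per 1 cm segment,
    ordered from the base of the tiller upward."""
    highest = 0
    for i, positive in enumerate(increments_positive, start=1):
        if positive:
            highest = i  # top of the highest colonised 1 cm segment
    return 100.0 * highest / tiller_height_cm

# Example: a 20 cm tiller colonised in its lowest 7 segments
segments = [True] * 7 + [False] * 13
print(pct_colonised(segments, 20))  # -> 35.0
```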

    The secret life of crown rot: what happens after harvest?

    Take home messages
    • A preliminary survey of cereal stubble from 2017 showed that in the northern region (NSW and Qld) the crown rot fungus is commonly present from the crown up to 18 cm, with detection up to 33 cm within tillers at harvest. However, moist conditions can promote further growth of the crown rot fungus post-harvest in inoculated cereal stubble (increasing by almost 1 cm up from the crown per day at 100% humidity)
    • Inoculum levels in post-harvest stubble are not static and may fluctuate as different weather patterns are experienced
    • Planting different bread wheat, durum wheat and barley varieties may not be useful for suppressing inoculum growth in stubble after harvest
    • Reducing cereal stubble height may limit inoculum build-up in crown rot affected paddocks by restricting the capacity for further fungal growth post-harvest. This could also help reduce dispersal of infected residues when harvesting shorter break crops such as chickpea, but field validation of this management option is required