    Field evaluation of tolerance to Tobacco streak virus in sunflower germplasm, and observations of seasonal disease spread

    Strong statistical evidence was found for differences in tolerance to natural infections of Tobacco streak virus (TSV) in sunflower hybrids. Data from 470 plots involving 23 different sunflower hybrids tested in multiple trials over 5 years in Australia were analysed. A Bayesian hierarchical logistic regression model provided: (i) a rigorous method for investigating the relative effects of hybrid, seasonal rainfall and proximity to inoculum source on the incidence of severe TSV disease; (ii) a natural method for estimating the probability distributions of disease incidence in different hybrids under historical rainfall conditions; and (iii) a method for undertaking all pairwise comparisons of disease incidence between hybrids whilst controlling the familywise error rate without any drastic reduction in statistical power. The tolerance identified in field trials was effective against the main TSV strain associated with disease outbreaks, TSV-parthenium. Glasshouse tests indicate that this tolerance is also effective against the other TSV strain found in central Queensland, TSV-crownbeard. The use of tolerant germplasm is critical to minimise the risk of TSV epidemics in sunflower in this region. We also found strong statistical evidence that rainfall during the early growing months of March and April had a negative effect on the incidence of severe infection, with greatly reduced disease incidence in years of high rainfall during this period.
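
    As a minimal sketch of the kind of Bayesian hierarchical logistic regression described above, the Python/PyMC snippet below fits hybrid effects plus rainfall and proximity covariates to plot-level counts of severe disease. The data, priors and variable names (hybrid_idx, rainfall, proximity) are placeholders, not the authors' actual model specification.

        import numpy as np
        import pymc as pm

        # Hypothetical plot-level data: counts of severely diseased plants out of
        # n_plants per plot, the hybrid grown, and two standardised covariates.
        rng = np.random.default_rng(1)
        n_plots, n_hybrids = 470, 23
        hybrid_idx = rng.integers(0, n_hybrids, n_plots)      # hybrid grown in each plot
        rainfall = rng.normal(size=n_plots)                   # early-season rainfall
        proximity = rng.normal(size=n_plots)                  # proximity to inoculum source
        n_plants = np.full(n_plots, 30)
        severe = rng.binomial(n_plants, 0.2)                  # placeholder response

        with pm.Model() as tsv_model:
            mu = pm.Normal("mu", 0.0, 2.0)                    # overall intercept (logit scale)
            sigma = pm.HalfNormal("sigma", 1.0)               # between-hybrid spread
            hybrid = pm.Normal("hybrid", 0.0, sigma, shape=n_hybrids)
            b_rain = pm.Normal("b_rain", 0.0, 1.0)
            b_prox = pm.Normal("b_prox", 0.0, 1.0)
            p = pm.math.invlogit(mu + hybrid[hybrid_idx] + b_rain * rainfall + b_prox * proximity)
            pm.Binomial("severe", n=n_plants, p=p, observed=severe)
            idata = pm.sample(1000, tune=1000, chains=2)      # posterior draws

    Pairwise comparisons between hybrids would then be read off the posterior draws of the hybrid effects.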

    Statistical Estimation of Total Discharge Volumes

    Calculating the volumes of water discharged by streams is becoming increasingly important in water accounting and in deciding how much water to allocate to competing uses. Water accounting is particularly important in Australia, the driest inhabited continent, and also in the face of the potential impacts of a changing climate. Stream networks all over the world are dotted with gauging stations, which take regular measurements of stream flow to help natural resource managers make decisions about water allocation. Estimating total discharge volumes is also of utmost importance when estimating pollutant loads from catchments. To calculate the total discharge volume, one must integrate the hydrograph (the graph of stream flow against time) over the period of interest. The simplest method of integration is a trapezoidal scheme; however, this fails to account for several sources of uncertainty inherent in the hydrograph, namely: (i) what happens between the discrete observations; (ii) gauging stations measure water height, and flow is estimated using a rating curve relating height to flow; and (iii) there are measurement errors associated with the height data recorded at gauging stations. We present a Monte Carlo method that employs: (i) nonparametric stochastic differential equations (SDEs) to bridge the gaps between discrete observations; and (ii) the Weighted Nadaraya-Watson estimator to estimate the conditional distribution of log-flow given water height. The output of the method is an ensemble of hydrographs that are faithful to the observed data while incorporating these uncertainties and errors. Integrating the members of this ensemble gives rise to a distribution for the total discharge volume that properly accounts for the imperfect information available. We demonstrate the methods using hydrographic data from Obi Obi Creek in the Mary River catchment, Queensland, Australia, and examine the uncertainty inherent in total discharges when integrating over a single month and over an entire year. We also introduce an artificial gap of 375 days into the hydrograph and demonstrate how well our simulated diffusions replicate the dynamics of stream flow. Whilst our Monte Carlo method is useful for estimating total discharge volumes, the nonparametric SDEs used also appear to have good potential as stochastic rainfall-runoff models in their own right.
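
    As a toy illustration of the Monte Carlo integration step, the Python sketch below integrates an ensemble of perturbed hydrographs with the trapezoidal rule to obtain a distribution of total discharge volumes. The multiplicative lognormal noise is a crude stand-in for the paper's nonparametric SDE bridges and Weighted Nadaraya-Watson rating-curve distribution, and all numbers are fabricated.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(0.0, 30.0, 0.25)                       # observation times (days)
        q_obs = 5.0 + 4.0 * np.exp(-0.5 * (t - 10.0) ** 2)   # synthetic hydrograph (m3/s)

        # Ensemble of plausible hydrographs: multiplicative lognormal noise stands in
        # for rating-curve and interpolation uncertainty around the recorded flows.
        ensemble = q_obs * rng.lognormal(mean=0.0, sigma=0.15, size=(1000, t.size))

        # Trapezoidal integration of each ensemble member (convert days to seconds).
        dt = np.diff(t) * 86400.0
        volumes = 0.5 * ((ensemble[:, 1:] + ensemble[:, :-1]) * dt).sum(axis=1)

        print(np.percentile(volumes, [5, 50, 95]))           # distribution of total volume (m3)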

    Optimal design of experimental epidemics

    We consider the optimal design of controlled experimental epidemics or transmission experiments, whose purpose is to inform the practitioner about disease transmission and recovery rates. Our methodology employs Gaussian diffusion approximations, applicable to epidemics that can be modelled as density-dependent Markov processes and that involve relatively large numbers of organisms. We focus on finding: (i) the optimal times at which to collect data about the state of the system for a small number of discrete observations; (ii) the optimal numbers of susceptible and infective individuals with which to begin an experiment; and (iii) the optimal number of replicate epidemics to use. We adopt the popular D-optimality criterion as providing an appropriate objective function for designing our experiments, since this leads to estimates with maximum precision, subject to valid assumptions about parameter values. We demonstrate the broad applicability of our methodology using a diverse array of compartmental epidemic models: a time-homogeneous SIS epidemic, a time-inhomogeneous SI epidemic with exponentially decreasing transmission rates, and a partially observed SIR epidemic in which the infectious period for an individual has a gamma distribution.
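
    The D-optimality idea can be sketched in Python as below, using a deterministic SIS mean with independent Gaussian observation error as a stand-in for the Gaussian diffusion approximation; the rates, population size and candidate time grid are invented for illustration, and only the choice of observation times (not initial conditions or replication) is searched.

        import itertools
        import numpy as np
        from scipy.integrate import solve_ivp

        def infectives(t_obs, theta, N=100, I0=5):
            """Deterministic mean of a simple SIS epidemic at the observation times."""
            beta, gamma = theta
            rhs = lambda t, I: beta * I * (N - I) / N - gamma * I
            return solve_ivp(rhs, (0.0, max(t_obs)), [I0], t_eval=t_obs).y[0]

        def fisher_information(t_obs, theta, obs_sd=2.0, eps=1e-4):
            # Finite-difference sensitivities of the mean trajectory w.r.t. theta,
            # assuming independent Gaussian observation errors with sd obs_sd.
            J = np.column_stack([
                (infectives(t_obs, theta + eps * np.eye(2)[k])
                 - infectives(t_obs, theta - eps * np.eye(2)[k])) / (2 * eps)
                for k in range(2)
            ])
            return J.T @ J / obs_sd**2

        theta = np.array([0.8, 0.3])              # assumed transmission and recovery rates
        candidates = np.linspace(1.0, 20.0, 20)   # candidate observation times
        # D-optimal design: the set of three observation times maximising log det(FIM).
        best = max(itertools.combinations(candidates, 3),
                   key=lambda ts: np.linalg.slogdet(fisher_information(np.array(ts), theta))[1])
        print("D-optimal observation times:", best)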

    Optimal sampling and problematic likelihood functions in a simple population model

    Markov chains provide excellent statistical models for studying many natural phenomena that evolve with time. One particular class of continuous-time Markov chains, birth-death processes, can be used for modelling population dynamics in fields such as ecology and microbiology. The challenge for the practitioner when fitting these models is to take measurements of population size over time in order to estimate the model parameters, such as per capita birth and death rates. In many biological contexts it is impractical to follow the fate of each individual in a population continuously in time, so the researcher is often limited to a fixed number of measurements of population size over the duration of the study. We show that for a simple birth-death process with positive Malthusian growth rate, subject to common practical constraints (such as the number of samples and the timeframe), there is an optimal schedule for measuring the population size that minimises the expected confidence region of the parameter estimates. This type of experimental design results in a more efficient use of experimental resources, which is often an important consideration.
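
    The sampling constraint is easy to sketch in Python with a Gillespie simulation of a linear birth-death process observed only at a fixed schedule of measurement times. The rates, initial population and candidate schedules below are arbitrary, and the expected-information calculation needed to rank schedules optimally is not shown.

        import numpy as np

        def birth_death_gillespie(x0, lam, mu, t_obs, rng):
            """Simulate a linear birth-death process (per capita rates lam, mu)
            and record the population size at the scheduled observation times."""
            t, x, samples = 0.0, x0, []
            for t_next in t_obs:
                while t < t_next and x > 0:
                    rate = (lam + mu) * x
                    dt = rng.exponential(1.0 / rate)
                    if t + dt > t_next:
                        break
                    t += dt
                    x += 1 if rng.random() < lam / (lam + mu) else -1
                samples.append(x)
                t = t_next
            return np.array(samples)

        rng = np.random.default_rng(0)
        # Two candidate schedules with the same number of samples over the same window.
        uniform = np.linspace(1.0, 10.0, 5)
        front_loaded = np.array([0.5, 1.0, 2.0, 4.0, 10.0])
        print(birth_death_gillespie(10, 0.5, 0.3, uniform, rng))
        print(birth_death_gillespie(10, 0.5, 0.3, front_loaded, rng))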

    Streamflow rating uncertainty: Characterisation and impacts on model calibration and performance

    Common streamflow gauging procedures require assumptions about the stage-discharge relationship (the 'rating curve') that can introduce considerable uncertainty into streamflow records. These rating uncertainties are not usually considered fully in hydrological model calibration and evaluation, yet they can have potentially important impacts. We analysed streamflow gauge data and conducted two modelling experiments to assess rating uncertainty in operational rating curves, its impacts on modelling, and possible ways to reduce those impacts. We found clear evidence of variance heterogeneity (heteroscedasticity) in streamflow estimates, with larger residuals at higher stage values. In addition, we confirmed the occurrence of streamflow extrapolation beyond the highest or lowest stage measurement in many operational rating curves, even when these were previously flagged as not extrapolated. The first experiment investigated the impact on regional calibration/evaluation of: (i) using two streamflow data transformations (logarithmic and square-root), compared to using non-transformed streamflow data, in an attempt to reduce heteroscedasticity; and (ii) censoring the extrapolated flows, compared to no censoring. Results of calibration/evaluation showed that using square-root transformed streamflow (thus compromising between weight on high and low streamflow) performed better than using non-transformed or log-transformed streamflow. Also, surprisingly, censoring extrapolated streamflow reduced rather than improved model performance. The second experiment investigated the impact of rating curve uncertainty on catchment calibration/evaluation and parameter estimation. A Monte Carlo approach and the nonparametric Weighted Nadaraya-Watson (WNW) estimator were used to derive streamflow uncertainty bounds. These were later used in calibration/evaluation with a standard Nash-Sutcliffe Efficiency (NSE) objective function (OBJ) and a modified NSE OBJ that penalised uncertain flows. Using square-root transformed flows and the modified NSE OBJ considerably improved calibration and predictions, particularly for mid and low flows, and there was an overall reduction in parameter uncertainty.
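
    For concreteness, the short Python sketch below computes a Nash-Sutcliffe Efficiency objective on transformed flows, as in the first experiment; the flow values are fabricated and the modified, uncertainty-penalising objective from the second experiment is not reproduced.

        import numpy as np

        def nse(obs, sim, transform=np.sqrt):
            """Nash-Sutcliffe efficiency on (optionally) transformed flows.
            transform=np.sqrt compromises between weighting high and low flows."""
            o, s = transform(np.asarray(obs)), transform(np.asarray(sim))
            return 1.0 - np.sum((o - s) ** 2) / np.sum((o - o.mean()) ** 2)

        obs = np.array([1.2, 3.4, 10.0, 55.0, 7.5])     # hypothetical daily flows (m3/s)
        sim = np.array([1.0, 4.0, 12.0, 48.0, 6.0])
        print(nse(obs, sim, transform=lambda q: q))     # untransformed flows
        print(nse(obs, sim))                            # square-root transformed flows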

    A bead-based suspension array for the multiplexed detection of begomoviruses and their whitefly vectors

    Bead-based suspension array systems enable simultaneous fluorescence-based identification of multiple nucleic acid targets in a single reaction. This study describes the development of a novel approach to plant virus and vector diagnostics: a multiplexed 7-plex array comprising a hierarchical set of assays for the simultaneous detection of begomoviruses and Bemisia tabaci from both plant and whitefly samples. The multiplexed array incorporates genus-, species- and strain-specific assays, offering a unique approach for identifying both known and unknown viruses and B. tabaci species. When tested against a large panel of sequence-characterized begomovirus and whitefly samples, the array was shown to be 100% specific to the homologous target. Additionally, the multiplexed array was highly sensitive, efficiently and concurrently determining both virus and whitefly identity from single viruliferous whitefly samples. The detection limit for one assay within the multiplexed array that specifically detects Tomato yellow leaf curl virus-Israel (TYLCV-IL) was quantified as 200 fg of TYLCV-IL DNA, directly equivalent to that of TYLCV-specific qPCR. Highly reproducible results were obtained over multiple tests. The flexible multiplexed array described in this study has great potential for use in plant quarantine, biosecurity and disease management programs worldwide.

    Spatio-temporal assimilation of modelled catchment loads with monitoring data in the Great Barrier Reef

    Soil erosion and sediment transport into waterways and the ocean can adversely affect water clarity, leading to the deterioration of marine ecosystems such as the iconic Great Barrier Reef (GBR) in Australia. Quantifying a sediment load and its associated uncertainty is an important task in delineating how changes in management practices can contribute to improvements in water quality, and therefore to the continued sustainability of the GBR. However, monitoring data are spatially (and often temporally) sparse, making load estimation complicated, particularly when there are lengthy periods between sampling or during the peak flow periods of major events when samples cannot be safely taken.

    We develop a spatio-temporal statistical model that is mechanistically motivated by a process-based deterministic model called Dynamic SedNet. The model is developed within a Bayesian hierarchical modelling framework that uses dimension reduction to accommodate seasonal and spatial patterns, assimilating monitored sediment concentration and flow data with output from Dynamic SedNet. The approach is applied in the Upper Burdekin catchment in Queensland, Australia, where we obtain daily estimates of sediment concentrations, stream discharge volumes and sediment loads at 411 spatial locations across 20 years. Our approach provides a method for assimilating both monitoring data and modelled output, giving a statistically rigorous quantification of uncertainty through space and time that was previously unavailable from process-based models.
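
    As a highly simplified Python stand-in for the assimilation idea, the sketch below regresses sparse monitoring observations on the process model's output plus a low-dimensional seasonal basis, yielding daily estimates at all time points. The data, basis choice and ordinary least squares fit are illustrative assumptions only; the paper's approach is a full Bayesian hierarchical spatio-temporal model.

        import numpy as np

        rng = np.random.default_rng(2)
        days = np.arange(365 * 2)
        sednet_log_c = np.sin(2 * np.pi * days / 365) + 0.002 * days        # stand-in for Dynamic SedNet output
        obs_days = np.sort(rng.choice(days, size=40, replace=False))        # sparse monitoring dates
        obs_log_c = sednet_log_c[obs_days] + rng.normal(0.0, 0.3, size=40)  # noisy observed log-concentration

        def basis(t):
            """Low-dimensional basis: intercept, model output, and an annual harmonic."""
            w = 2 * np.pi * t / 365
            return np.column_stack([np.ones(len(t)), sednet_log_c[t], np.sin(w), np.cos(w)])

        coef, *_ = np.linalg.lstsq(basis(obs_days), obs_log_c, rcond=None)
        assimilated = basis(days) @ coef    # daily log-concentration estimates at every time point
        print(coef, assimilated[:3])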