34 research outputs found

    Field evaluation of tolerance to Tobacco streak virus in sunflower germplasm, and observations of seasonal disease spread

    Strong statistical evidence was found for differences in tolerance to natural infections of Tobacco streak virus (TSV) in sunflower hybrids. Data from 470 plots involving 23 different sunflower hybrids tested in multiple trials over 5 years in Australia were analysed. Using a Bayesian hierarchical logistic regression model provided: (i) a rigorous method for investigating the relative effects of hybrid, seasonal rainfall and proximity to inoculum source on the incidence of severe TSV disease; (ii) a natural method for estimating the probability distributions of disease incidence in different hybrids under historical rainfall conditions; and (iii) a method for undertaking all pairwise comparisons of disease incidence between hybrids whilst controlling the familywise error rate without any drastic reduction in statistical power. The tolerance identified in field trials was effective against the main TSV strain associated with disease outbreaks, TSV-parthenium. Glasshouse tests indicate that this tolerance is also effective against the other TSV strain found in central Queensland, TSV-crownbeard. The use of tolerant germplasm is critical to minimising the risk of TSV epidemics in sunflower in this region. We found strong statistical evidence that rainfall during the early growing months of March and April had a negative effect on the incidence of severe infection, with greatly reduced disease incidence in years of high rainfall during this period.
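    The logistic link at the heart of such a model can be sketched in a few lines. The coefficients below are purely illustrative, not the fitted values from the trials; the negative rainfall coefficient simply encodes the reported direction of the rainfall effect.

```python
import math

def inv_logit(x):
    """Inverse-logit link: maps a linear predictor to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

def severe_incidence_prob(hybrid_effect, rainfall_mm, proximity_m,
                          beta_rain=-0.01, beta_prox=-0.002):
    """Probability of severe TSV disease for one plot under a logistic
    regression. All coefficient values are hypothetical placeholders,
    not estimates from the sunflower trials."""
    eta = hybrid_effect + beta_rain * rainfall_mm + beta_prox * proximity_m
    return inv_logit(eta)

# Higher March-April rainfall implies lower predicted severe incidence
wet = severe_incidence_prob(hybrid_effect=1.0, rainfall_mm=200, proximity_m=50)
dry = severe_incidence_prob(hybrid_effect=1.0, rainfall_mm=20, proximity_m=50)
```

    In the full Bayesian hierarchical model, the hybrid effect would itself be drawn from a shared prior across hybrids, which is what enables the pairwise comparisons with controlled familywise error.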

    Re-thinking soil carbon modelling: a stochastic approach to quantify uncertainties

    The benefits of sequestering carbon are many, including improved crop productivity, reductions in greenhouse gases, and financial gains through the sale of carbon credits. Achieving better understanding of the sequestration process has motivated many deterministic models of soil carbon dynamics, but none of these models addresses uncertainty in a comprehensive manner. Uncertainty arises in many ways: around the model inputs, parameters, and dynamics, and consequently around model predictions. In this paper, these uncertainties are addressed in concert by incorporating a physical-statistical model for carbon dynamics within a Bayesian hierarchical modelling framework. This comprehensive approach to accounting for uncertainty in soil carbon modelling has not been attempted previously. This paper demonstrates proof-of-concept based on a one-pool model and identifies requirements for extension to multi-pool carbon modelling. Our model is based on the soil carbon dynamics in Tarlee, South Australia. We specify the model conditionally through its parameters, soil carbon input and decay processes, and observations of those processes. We use a particle marginal Metropolis-Hastings approach specified using the LibBi modelling language. We highlight how samples from the posterior distribution can be used to summarise our knowledge about model parameters, to estimate the probabilities of sequestering carbon, and to forecast changes in carbon stocks under crop rotations not represented explicitly in the original field trials.
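    A one-pool carbon model with stochastic dynamics can be sketched as a discretised birth-death balance: inputs add carbon, first-order decay removes it, and process noise captures dynamic uncertainty. The parameter values here are illustrative only, not the fitted Tarlee values, and the simple Euler scheme stands in for the state process that the particle marginal Metropolis-Hastings sampler would target.

```python
import random

def simulate_one_pool(c0, k, inputs, sigma, seed=42):
    """Euler discretisation of a one-pool soil carbon model,
    dC/dt = I_t - k*C, with additive Gaussian process noise.
    c0: initial stock; k: decay rate per step; inputs: carbon
    added each step; sigma: process-noise standard deviation."""
    rng = random.Random(seed)
    c, path = c0, [c0]
    for i_t in inputs:
        c = c + i_t - k * c + rng.gauss(0.0, sigma)
        c = max(c, 0.0)  # a carbon stock cannot go negative
        path.append(c)
    return path
```

    With sigma set to zero the path converges to the deterministic equilibrium input/k; with sigma positive, repeated simulations give the spread that a Bayesian treatment propagates into forecasts.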

    Optimal design of measurements on queueing systems

    We examine the optimal design of measurements on queues, with particular reference to the M/M/1 queue. Using the statistical theory of design of experiments, we calculate numerically the Fisher information matrix for an estimator of the arrival rate and the service rate, to find optimal times to measure the queue when the number of measurements is limited, for both interfering and non-interfering measurements. We prove that in the non-interfering case, the optimal design is equally spaced. For the interfering case, optimal designs are not necessarily equally spaced. We compute optimal designs for a variety of queueing situations and give results obtained under the D- and D_s-optimality criteria.
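    The D-optimality criterion can be illustrated with a deliberately simplified stand-in for the queue's information matrix: suppose inter-arrival and service times are observed directly and independently, so the Fisher information about each exponential rate is n/rate^2 and the information matrix is diagonal. This is not the paper's numerically computed M/M/1 matrix, just a toy showing how a design (here, a split of the measurement budget) is chosen to maximise the determinant.

```python
def exp_fisher_info(n, rate):
    """Fisher information about the rate of an exponential
    distribution from n i.i.d. observations: n / rate**2."""
    return n / rate ** 2

def d_criterion(n_arrival, n_service, lam, mu):
    """D-criterion: determinant of the (diagonal) information matrix
    when n_arrival inter-arrival times and n_service service times
    are observed independently. A simplified illustration only."""
    return exp_fisher_info(n_arrival, lam) * exp_fisher_info(n_service, mu)

# With a fixed budget of N observations, search for the best split
N, lam, mu = 10, 0.8, 1.2
best = max(range(1, N), key=lambda k: d_criterion(k, N - k, lam, mu))
# The determinant k*(N-k)/(lam*mu)**2 is maximised by an even split
```

    In the real interfering-measurement setting the information contributions are not independent, which is precisely why the optimal designs there need not be equally spaced or evenly allocated.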

    Optimal use of GPS transmitter for estimating species migration rate

    No full text
    A common approach to learning about species movements is to tag individuals with a GPS transmitter. Here we provide methodology which determines the optimal programming of the times for such a device, and in doing so allow an assessment of the benefit provided over equidistant sampling schedules. We provide an algorithm (and MATLAB code) that computes the optimal patch in which to tag an individual, in addition to the optimal timing and number of samples in order to best estimate three parameters describing species-habitat migration rate (assuming a common form of migration). We use this algorithm to identify some basic conditions of a network that ensure identifiability of model parameters: at least four distinct inter-patch distances. We subsequently apply our algorithm to a number of randomly-generated networks, and demonstrate the efficiency gains from optimising various components of the sampling schedule. Finally, we determine the optimal sampling schedule for a real network: the spotted owl (Strix occidentalis occidentalis) in Southern California. The comparison of random and real networks demonstrates the improvement in efficiency as the size and heterogeneity of the underlying network increases. We hope this methodology is used to make better use of existing technologies, leading to improved understanding and conservation of species. © 2012 Elsevier B.V. D.E. Pagendam, J.V. Ross
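    The identifiability condition stated in the abstract (at least four distinct inter-patch distances) is easy to check for any candidate patch network. A minimal sketch, assuming patches are given as planar coordinates:

```python
import math
from itertools import combinations

def distinct_interpatch_distances(coords, tol=1e-9):
    """Count the distinct pairwise distances between habitat patches.
    coords: list of (x, y) patch locations, at least two of them.
    Distances closer than tol are treated as equal."""
    dists = sorted(math.dist(a, b) for a, b in combinations(coords, 2))
    distinct = 1
    for d0, d1 in zip(dists, dists[1:]):
        if d1 - d0 > tol:
            distinct += 1
    return distinct

# A unit square has only two distinct distances (side and diagonal),
# so it would fail the four-distance identifiability condition.
square_ok = distinct_interpatch_distances([(0, 0), (0, 1), (1, 0), (1, 1)]) >= 4
```

    Irregular networks, such as patches at positions 0, 1, 3 and 6 along a line, easily satisfy the condition, which matches the abstract's observation that heterogeneous networks are the favourable case.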

    Statistical Estimation of Total Discharge Volumes

    No full text
    Calculating the volumes of water discharged by streams is becoming increasingly important in water accounting and deciding how much water to allocate to competing uses. Water accounting is particularly important in Australia, as the driest inhabited continent and also in the face of potential impacts of a changing climate. Stream networks all over the world are littered with gauging stations, which take regular measurements of stream flow in order to help natural resource managers make decisions regarding water allocation. Estimating total discharge volumes is also of utmost importance when estimating pollutant loads from catchments. In order to calculate the total discharge volume, one must integrate the hydrograph (the graph of stream flow with time) over the period of interest. The simplest method to perform the integration is a trapezoidal scheme; however, this fails to account for a number of sources of uncertainty inherent in the hydrograph, namely: (i) what happens between the discrete observations; (ii) gauging stations measure water height, and flow is estimated using a rating curve between height and flow; and (iii) there are measurement errors associated with the height data recorded at gauging stations. We present a Monte Carlo method that employs: (i) nonparametric stochastic differential equations (SDEs) to bridge the gaps between discrete observations; and (ii) the Weighted Nadaraya-Watson estimator to estimate the conditional distribution of log-flow given water height. The output of the method is an ensemble of hydrographs that are faithful to the observed data, but incorporating these uncertainties/errors. Integrating the members of this ensemble gives rise to a distribution for the total discharge volumes and properly accounts for the imperfect information available.
    We demonstrate the methods using hydrographic data from Obi Obi Creek in the Mary River catchment, Queensland, Australia, and examine the uncertainty inherent in total discharges when integrating over a single month and over an entire year. We also introduce an artificial gap of 375 days into the hydrograph and demonstrate how well our simulated diffusions replicate the dynamics of stream flow. Whilst our Monte Carlo method is useful for estimating total discharge volumes, the nonparametric SDEs used also appear to have good potential as stochastic rainfall-runoff models in their own right.
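    The basic workflow can be sketched in a few lines: trapezoidal integration of a hydrograph, then integration of an ensemble of perturbed hydrographs to obtain a distribution of volumes. The multiplicative noise below is a crude stand-in for the paper's SDE bridges and rating-curve error model, included only to show how a volume distribution arises from an ensemble.

```python
import random

def trapezoid_volume(times_s, flows_m3s):
    """Total discharge volume (m^3) by trapezoidal integration of a
    hydrograph given as times (s) and flows (m^3/s)."""
    return sum(0.5 * (flows_m3s[i] + flows_m3s[i + 1])
               * (times_s[i + 1] - times_s[i])
               for i in range(len(times_s) - 1))

def ensemble_volumes(times_s, flows_m3s, n=1000, rel_err=0.05, seed=1):
    """Integrate n perturbed copies of the hydrograph. The simple
    multiplicative Gaussian noise is a hypothetical error model, not
    the nonparametric SDE / Nadaraya-Watson machinery of the paper."""
    rng = random.Random(seed)
    vols = []
    for _ in range(n):
        noisy = [f * (1.0 + rng.gauss(0.0, rel_err)) for f in flows_m3s]
        vols.append(trapezoid_volume(times_s, noisy))
    return vols
```

    Quantiles of the returned ensemble then summarise the uncertainty in the total discharge volume, rather than reporting a single deterministic figure.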

    Optimal GPS tracking for estimating species movements

    No full text
    A common approach to learning about species movements is to tag individuals with a GPS transmitter. Here we provide methodology which determines the optimal programming of the times for such a device, and in doing so allow an assessment of the benefit provided over equidistant sampling schedules. We provide an algorithm (and MATLAB code) that computes the optimal patch in which to tag an individual, in addition to the optimal timing and number of samples in order to best estimate three parameters describing the species-habitat migration rate (assuming a common form of migration). We use this algorithm to identify some basic conditions of a network that ensure identifiability of model parameters: at least four distinct inter-patch distances. We subsequently apply our algorithm to a number of randomly-generated networks, and demonstrate the efficiency gains from optimising various components of the sampling schedule. Finally, we determine the optimal sampling schedule for a real network: the spotted owl (Strix occidentalis occidentalis) in Southern California (Lahaye et al., 1994; Shuford and Gardali (editors), 2008). The comparison of random and real networks demonstrates the improvement in efficiency as the size and heterogeneity of the underlying network increases. This is believed to be the first methodology to determine the optimal design for monitoring species movements. Our study also differs from previous optimal design methodology for stochastic models in that we evaluate the Fisher Information Matrix exactly (to computational precision) rather than adopting an approximation (Pagendam and Pollett, 2009, 2010b). Furthermore, we provide code to implement EID-optimality, which more naturally aligns with the motivation of classical D-optimality, but in the situation of prior uncertainties on parameter values, as is common to the problems of interest to us here (Walter and Pronzato, 1987). D. E. Pagendam and J. V. Ross

    Visual comparison of spatial patterns of annual suspended sediment loads estimated by two water quality modelling approaches

    No full text
    The Queensland Department of Environment and Resource Management is using the SedNet and E2 water quality modelling approaches to support government policy and natural resource managers in improving water quality. SedNet is designed to determine the long-term average annual sediment load, and does not deal with temporal variability. It includes hillslope erosion, gully erosion, and riverbank erosion, which enables land managers to undertake on-ground works in areas of the landscape that generate disproportionate quantities of sediment. E2 is a daily time step model capable of modelling temporal variability in water quality as a result of management and/or climate changes. However, hillslope erosion, gully erosion, and riverbank erosion are not currently explicitly represented in E2, in which sediment generation is based on the concept of Event Mean Concentration (EMC) and Dry Weather Concentration (DWC), with user-assigned values depending on factors such as land use, soil type, and topography. As a modelling framework, E2 is capable of housing alternative models for the same process. Both SedNet and E2 modelling approaches are based on a node-link configuration of the stream network generated from pit-filled digital elevation models. This configuration allows the user to determine outputs from either model at any point of interest within the catchment.
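    The EMC/DWC concept described above reduces, for a single day and node, to concentration times flow volume for the event and dry-weather flow components. A minimal sketch, with a hypothetical function name and illustrative values (1 mg/L equals 1 g/m^3, hence the division to obtain kilograms):

```python
def daily_sediment_load_kg(event_flow_m3, baseflow_m3,
                           emc_mg_per_l, dwc_mg_per_l):
    """Daily sediment load under the EMC/DWC concept: event (quick)
    flow carries the Event Mean Concentration and baseflow carries
    the Dry Weather Concentration. Since 1 mg/L == 1 g/m^3, the sum
    is in grams and is divided by 1000 to give kilograms."""
    grams = event_flow_m3 * emc_mg_per_l + baseflow_m3 * dwc_mg_per_l
    return grams / 1000.0
```

    In E2 the EMC and DWC values would be assigned per land use, soil type and topography; summing such daily loads along the node-link network is what produces loads at any point of interest in the catchment.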

    Optimal design of experimental epidemics

    No full text
    We consider the optimal design of controlled experimental epidemics or transmission experiments, whose purpose is to inform the practitioner about disease transmission and recovery rates. Our methodology employs Gaussian diffusion approximations, applicable to epidemics that can be modeled as density-dependent Markov processes and involving relatively large numbers of organisms. We focus on finding (i) the optimal times at which to collect data about the state of the system for a small number of discrete observations, (ii) the optimal numbers of susceptible and infective individuals to begin an experiment with, and (iii) the optimal number of replicate epidemics to use. We adopt the popular D-optimality criterion as providing an appropriate objective function for designing our experiments, since this leads to estimates with maximum precision, subject to valid assumptions about parameter values. We demonstrate the broad applicability of our methodology using a diverse array of compartmental epidemic models: a time-homogeneous SIS epidemic, a time-inhomogeneous SI epidemic with exponentially decreasing transmission rates, and a partially observed SIR epidemic where the infectious period for an individual has a gamma distribution.
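    The mean dynamics of the time-homogeneous SIS model, which form the drift of the Gaussian diffusion approximation mentioned above, can be sketched with a simple Euler solver. The parameter values are illustrative only:

```python
def sis_mean_path(n_pop, i0, beta, gamma, dt, steps):
    """Euler solution of the deterministic SIS mean dynamics
    dI/dt = beta * I * (N - I) / N - gamma * I,
    i.e. the drift of the density-dependent diffusion approximation.
    n_pop: population size N; i0: initial infectives; beta: transmission
    rate; gamma: recovery rate; dt: step size; steps: number of steps."""
    i = float(i0)
    path = [i]
    for _ in range(steps):
        i += dt * (beta * i * (n_pop - i) / n_pop - gamma * i)
        path.append(i)
    return path
```

    For beta greater than gamma the path approaches the endemic level N * (1 - gamma / beta); in an optimal-design calculation, candidate observation times would be scored by the Fisher information of the Gaussian approximation around such a mean path.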

    On parameter estimation in population models II: multi-dimensional processes and transient dynamics

    No full text
    Recently, a computationally-efficient method was presented for calibrating a wide class of Markov processes from discrete-sampled abundance data. The method was illustrated with respect to one-dimensional processes and required the assumption of stationarity. Here we demonstrate that the approach may be directly extended to multi-dimensional processes, and two analogous computationally-efficient methods for non-stationary processes are developed. These methods are illustrated with respect to disease and population models, including application to infectious count data from an outbreak of "Russian influenza" (A/USSR/1977 H1N1) in an educational institution. The methodology is also shown to provide an efficient, simple and yet rigorous approach to calibrating disease processes with a gamma-distributed infectious period. J. V. Ross, D. E. Pagendam and P. K. Pollett