421 research outputs found

    runjags: an R package providing interface utilities, model templates, parallel computing methods and additional distributions for MCMC models in JAGS

    The runjags package provides a set of interface functions to facilitate running Markov chain Monte Carlo models in JAGS from within R. Automated calculation of appropriate convergence and sample length diagnostics, user-friendly access to commonly used graphical outputs and summary statistics, and parallelized methods of running JAGS are provided. Template model specifications can be generated using a standard lme4-style formula interface to assist users less familiar with the BUGS syntax. Automated simulation study functions are implemented to facilitate model performance assessment, as well as drop-k type cross-validation studies, using high performance computing clusters such as those provided by parallel. A module extension for JAGS is also included within runjags, providing the Pareto family of distributions and a series of minimally informative priors including the DuMouchel and half-Cauchy priors. This paper outlines the primary functions of this package, and gives an illustration of a simulation study to assess the sensitivity of two equivalent model formulations to different prior distributions.
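Among the convergence diagnostics that runjags automates is the Gelman-Rubin potential scale reduction factor, computed from parallel chains. As a language-neutral illustration of that diagnostic (a minimal numpy sketch, not the package's own R code), the within- and between-chain variance split can be written as:

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) for one scalar parameter.

    chains: 2-D array of shape (m, n) -- m parallel MCMC chains,
    each holding n samples of the same parameter.
    """
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)            # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()      # mean within-chain variance
    var_hat = (n - 1) / n * W + B / n          # pooled variance estimate
    return np.sqrt(var_hat / W)

# Two well-mixed chains drawn from the same distribution: R-hat near 1
rng = np.random.default_rng(0)
chains = rng.normal(size=(2, 1000))
print(round(gelman_rubin(chains), 2))  # close to 1.0 for converged chains
```

Values substantially above 1 indicate that the chains have not yet mixed and a longer run is needed, which is the kind of decision runjags automates.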

    Describing temporal variation in reticuloruminal pH using continuous monitoring data

    Reticuloruminal pH has been linked to subclinical disease in dairy cattle, leading to considerable interest in identifying pH observations below a given threshold. The relatively recent availability of continuously monitored data from pH boluses gives new opportunities for characterizing the normal patterns of pH over time and distinguishing these from abnormal patterns using more sensitive and specific methods than simple thresholds. We fitted a series of statistical models to continuously monitored data from 93 animals on 13 farms to characterize normal variation within and between animals. We used a subset of the data to relate deviations from the normal pattern to the productivity of 24 dairy cows from a single herd. Our findings show substantial variation in pH characteristics between animals, although animals within the same farm tended to show more consistent patterns. There was strong evidence for a predictable diurnal variation in all animals, and up to 70% of the observed variation in pH could be explained using a simple statistical model. For the 24 animals with available production information, there was also a strong association between productivity (as measured by both milk yield and dry matter intake) and deviations from the expected diurnal pattern of pH 2 d before the productivity observation. In contrast, there was no association between productivity and the occurrence of observations below a threshold pH. We conclude that statistical models can be used to account for a substantial proportion of the observed variability in pH and that future work with continuously monitored pH data should focus on deviations from a predictable pattern rather than the frequency of observations below an arbitrary pH threshold.
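A predictable diurnal pattern of the kind described can be captured with harmonic regression on a 24 h cycle. The sketch below fits such a model to hypothetical simulated bolus data (it illustrates the general idea of extracting deviations from an expected diurnal pattern, not the paper's actual models or data):

```python
import numpy as np

def fit_diurnal(t_hours, ph):
    """Least-squares fit of a mean level plus a 24 h sinusoid:
    pH(t) ~ a0 + a1*cos(2*pi*t/24) + a2*sin(2*pi*t/24)."""
    w = 2 * np.pi / 24.0
    X = np.column_stack([np.ones_like(t_hours),
                         np.cos(w * t_hours),
                         np.sin(w * t_hours)])
    coef, *_ = np.linalg.lstsq(X, ph, rcond=None)
    return coef, X @ coef

# Hypothetical bolus record: baseline pH 6.4 with a 0.3-unit diurnal swing
rng = np.random.default_rng(1)
t = np.arange(0, 72, 0.25)  # 3 days sampled at 15 min intervals
ph = 6.4 + 0.3 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.05, t.size)

coef, fitted = fit_diurnal(t, ph)
residuals = ph - fitted  # deviations from the expected diurnal pattern
```

The residuals, rather than threshold crossings of the raw trace, are the quantity the abstract argues should be related to productivity.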

    A quantitative approach to improving the analysis of faecal worm egg count data

    Analysis of Faecal Egg Count (FEC) and Faecal Egg Count Reduction Test (FECRT) datasets is frequently complicated by a high degree of variability between observations and relatively small sample sizes. In this thesis, statistical issues pertaining to the analysis of FEC and FECRT data are examined, and improved methods of analysis using Bayesian Markov chain Monte Carlo (MCMC) are developed. Simulated data were used to compare the accuracy of MCMC methods to existing maximum likelihood methods. The potential consequences of model selection based on empirical fit were also examined by comparing inference made from simulated data using different distributional assumptions. The novel methods were then applied to FEC data obtained from sheep and horses. Several syntactic variations of FECRT models were also developed, incorporating various different distributional assumptions including meta-population models. The inference made from simulated data and FECRT data taken from horses was compared to that made using the currently most widely used methods. Multi-level hierarchical models were then used to partition the source of the observed variability in FEC using data intensively sampled from a small group of horses. The MCMC methods out-performed other methods for analysis of simulated FEC and FECRT datasets, particularly in terms of the usefulness of 95% confidence intervals produced. There was no consistent difference in model fit to the gamma-Poisson or lognormal-Poisson distributions from the available data. However, there was evidence for the existence of bi-modality in the datasets. Although the majority of the observed variation in equine FEC is likely a consequence of variability between animals, a considerable proportion of the variability is due to the variability in true FEC between faecal piles and the aggregation of eggs on a local scale within faeces.
The methods currently used for analysis of FEC and FECRT data perform poorly compared to MCMC methods, and produce 95% confidence intervals which are unreliable for datasets likely to be encountered in clinical parasitology. MCMC analysis is therefore to be preferred for these types of data, and also allows multiple samples taken from each animal to be incorporated into the analysis. Analysing the statistical processes underlying FEC data also revealed simple methods of reducing the observed variability, such as increasing the size of individual samples of faeces. Modelling the variability structure of FEC data, and use of the inferred parameter values in precision analysis and power analysis calculations, allows the usefulness of a study to be quantified before the data are collected. Given the demonstrated difficulties with analysing FEC and FECRT data, it is essential that the statistical issues pertaining to the collection and analysis of such data are considered in future parasitological studies.
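The gamma-Poisson structure discussed in the thesis can be illustrated with a short simulation: true eggs-per-gram varies between animals following a gamma distribution, and the counting process adds Poisson noise scaled by the counting-technique multiplier. The FECRT interval below is a naive bootstrap, shown only as a simplified stand-in for comparison purposes, not the MCMC methods the thesis develops; all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_fec(n, mean, k, multiplier=50):
    """Simulate faecal egg counts under a gamma-Poisson (negative
    binomial) model: true epg is gamma-distributed between animals
    with shape k, and counting observes epg/multiplier eggs."""
    true_epg = rng.gamma(shape=k, scale=mean / k, size=n)
    return rng.poisson(true_epg / multiplier) * multiplier

def fecrt_bootstrap(pre, post, n_boot=2000):
    """Percent reduction in mean FEC with a naive bootstrap 95% CI."""
    reductions = []
    for _ in range(n_boot):
        p1 = rng.choice(pre, size=pre.size, replace=True).mean()
        p2 = rng.choice(post, size=post.size, replace=True).mean()
        reductions.append(100 * (1 - p2 / p1))
    return np.percentile(reductions, [2.5, 50, 97.5])

pre = simulate_fec(20, mean=500, k=1.0)   # overdispersed pre-treatment counts
post = simulate_fec(20, mean=25, k=1.0)   # ~95% true reduction
lo, mid, hi = fecrt_bootstrap(pre, post)
```

Even with a known 95% true reduction, the width of the resulting interval under small samples and heavy overdispersion illustrates why interval reliability, rather than the point estimate alone, is the focus of the comparison in the thesis.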

    Target Element Sizes For Finite Element Tidal Models From A Domain-wide, Localized Truncation Error Analysis Incorporating Botto

    A new methodology for the determination of target element sizes for the construction of finite element meshes applicable to the simulation of tidal flow in coastal and oceanic domains is developed and tested. The methodology is consistent with the discrete physics of tidal flow, and includes the effects of bottom stress. The method enables the estimation of the localized truncation error of the nonconservative momentum equations throughout a triangulated data set of water surface elevation and flow velocity. The method's domain-wide applicability is due in part to the formulation of a new localized truncation error estimator in terms of complex derivatives. More conventional criteria that are often used to determine target element sizes are limited to certain bathymetric conditions. The methodology developed herein is applicable over a broad range of bathymetric conditions, and can be implemented efficiently. Since the methodology permits the determination of target element size at points up to and including the coastal boundary, it is amenable to coastal domain applications including estuaries, embayments, and riverine systems. These applications require consideration of spatially varying bottom stress and advective terms, addressed herein. The new method, called LTEA-CD (localized truncation error analysis with complex derivatives), is applied to model solutions over the Western North Atlantic Tidal model domain (the bodies of water lying west of the 60° W meridian). The convergence properties of LTEA-CD are also analyzed. It is found that LTEA-CD may be used to build a series of meshes that produce converging solutions of the shallow water equations.
An enhanced version of the new methodology, LTEA+CD (which accounts for locally variable bottom stress and Coriolis terms) is used to generate a mesh of the WNAT model domain having 25% fewer nodes and elements than an existing mesh upon which it is based; performance of the two meshes, in an average sense, is indistinguishable when considering elevation tidal signals. Finally, LTEA+CD is applied to the development of a mesh for the Loxahatchee River estuary; it is found that application of LTEA+CD provides a target element size distribution that, when implemented, outperforms a high-resolution semi-uniform mesh as well as a manually constructed, existing, documented mesh.
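The mesh-sizing step common to methods of this kind can be sketched generically: if the local truncation error of an element of size h scales as C·h^p, a target size meeting a uniform error tolerance follows by rescaling each element. This is a textbook a-priori scaling argument for illustration only, not the LTEA-CD estimator itself, and the numbers are hypothetical:

```python
import numpy as np

def target_element_size(h, local_error, tol, order=2):
    """Rescale each element so its local truncation error meets a
    uniform tolerance, assuming error ~ C * h**order (a generic
    a-priori scaling, not the LTEA-CD estimator)."""
    return h * (tol / local_error) ** (1.0 / order)

h = np.array([1000.0, 1000.0, 1000.0])        # current element sizes (m)
err = np.array([1e-4, 4e-4, 2.5e-5])          # local error estimates
print(target_element_size(h, err, tol=1e-4))  # [1000.  500. 2000.]
```

Elements whose error exceeds the tolerance shrink and over-resolved elements grow, which is how a domain-wide error estimate translates into a target element size distribution.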

    Systems approaches to animal disease surveillance and resource allocation: methodological frameworks for behavioral analysis

    While demands for animal disease surveillance systems are growing, there has been little applied research that has examined the interactions between resource allocation, cost-effectiveness, and behavioral considerations of actors throughout the livestock supply chain in a surveillance system context. These interactions are important as feedbacks between surveillance decisions and disease evolution may be modulated by their contextual drivers, influencing the cost-effectiveness of a given surveillance system. This paper identifies a number of key behavioral aspects involved in animal health surveillance systems and reviews some novel methodologies for their analysis. A generic framework for analysis is discussed, with exemplar results provided to demonstrate the utility of such an approach in guiding better disease control and surveillance decisions.

    Landscape Perceptions in the Lake District: Distant and Close Reading in Participatory GIS

    As the use of Participatory Mapping becomes increasingly prevalent in decision making, it is vital to consider how analysis is conducted as well as data collection, in order to maximise the utility of the data that we collect from participants. This research explores the value in the free-text data that is commonly collected alongside participatory spatial data, but often overlooked or under-utilised. Here we use a case study in the Lake District National Park, UK to demonstrate how computational methods from literary research can provide a deeper understanding of participants' spatial thoughts and feelings.
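The "distant reading" side of such computational literary methods can be sketched, at its simplest, as aggregate word-frequency analysis over the free-text responses rather than close reading of each one. The comments below are hypothetical, not data from the study:

```python
from collections import Counter
import re

STOPWORDS = frozenset({"the", "a", "and", "of", "is", "it", "too", "very"})

def distant_read(responses):
    """A minimal 'distant reading': aggregate word frequencies across
    all free-text responses instead of reading each one closely."""
    words = []
    for text in responses:
        words += [w for w in re.findall(r"[a-z']+", text.lower())
                  if w not in STOPWORDS]
    return Counter(words)

# Hypothetical participant comments attached to mapped points
responses = [
    "The view of the fells is stunning",
    "Too many tourists near the lake shore",
    "Quiet fells and open views, very peaceful",
]
print(distant_read(responses).most_common(3))
```

In practice such counts would feed richer techniques (collocation, sentiment, keyness), but the principle of treating the corpus of responses as a whole is the same.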

    Paper2GIS: Going postal in the midst of a pandemic

    It is widely agreed that using local knowledge and opinions can prove beneficial in the decision-making process, with various forms of Participatory Mapping being used to capture responses to spatial questions. However, remote participatory research is increasingly carried out using digital methods which can limit the involvement of those affected by digital divides. This research uses a novel, automatic self-digitising paper-based Participatory Mapping method to explore whether the accessibility needs of participants can be met whilst maintaining the potential for effective spatial analysis on the part of the researcher. As a paper-based, geographically specific approach this research could be conducted during the pandemic by post, with residents of the Outer Hebrides, UK.