    Influence on disease spread dynamics of herd characteristics in a structured livestock industry

    Studies of between-herd contacts may provide important insight into disease transmission dynamics. By comparing the results from models with different levels of detail in the description of animal movement, we studied how herd characteristics influence the final epidemic size as well as the dynamic behaviour of an outbreak. We investigated the effect of contact heterogeneity among pig herds in Sweden due to herd size, between-herd distance and production type. Our comparative study suggests that the production-type structure is the most influential factor. Hence, our results imply that production type is the most important factor to obtain valid data for and to include when modelling and analysing this system. The study also revealed that all included factors reduce the final epidemic size, while having more diverse effects on the initial rate of disease spread. This implies that a large set of factors ought to be included to obtain relevant predictions when modelling disease spread between herds. Furthermore, our results show that a more detailed model changes predictions regarding the variability in outbreak dynamics, and we conclude that this variability is an important factor to consider in risk assessment.
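    As a toy illustration of this kind of contact heterogeneity, the sketch below (not the paper's model) simulates an outbreak over herds whose pairwise daily transmission probability combines a distance kernel with a production-type mixing matrix; all herd positions, types, rates and weights are invented.

```python
# Toy between-herd outbreak with heterogeneous contacts: transmission
# probability depends on between-herd distance and production type.
# All numbers below are illustrative assumptions, not the paper's data.
import numpy as np

rng = np.random.default_rng(6)

n = 200
pos = rng.uniform(0.0, 100.0, (n, 2))     # herd coordinates (km)
ptype = rng.integers(0, 3, n)             # production types 0, 1, 2
mix = np.array([[1.0, 0.5, 0.1],          # type-to-type contact weights
                [0.5, 1.0, 0.3],
                [0.1, 0.3, 1.0]])

dist = np.linalg.norm(pos[:, None] - pos[None, :], axis=2)
p = 0.01 * mix[ptype[:, None], ptype[None, :]] * np.exp(-dist / 20.0)
np.fill_diagonal(p, 0.0)                  # no self-infection

infected = np.zeros(n, dtype=bool)
infected[rng.integers(n)] = True          # single index herd
for day in range(100):
    # Probability of at least one infectious contact from infected herds.
    exposure = 1.0 - np.prod(1.0 - p[infected], axis=0)
    infected |= rng.uniform(size=n) < exposure
print("final epidemic size:", int(infected.sum()))
```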

    Tests of Bayesian Model Selection Techniques for Gravitational Wave Astronomy

    The analysis of gravitational wave data involves many model selection problems. The most important example is the detection problem of selecting between the data being consistent with instrument noise alone, or instrument noise and a gravitational wave signal. The analysis of data from ground based gravitational wave detectors is mostly conducted using classical statistics, and methods such as the Neyman-Pearson criterion are used for model selection. Future space based detectors, such as the Laser Interferometer Space Antenna (LISA), are expected to produce rich data streams containing the signals from many millions of sources. Determining the number of sources that are resolvable, and the most appropriate description of each source, poses a challenging model selection problem that may best be addressed in a Bayesian framework. An important class of LISA sources is the millions of low-mass binary systems within our own galaxy, tens of thousands of which will be detectable. Not only is the number of sources unknown, but so is the number of parameters required to model the waveforms. For example, a significant subset of the resolvable galactic binaries will exhibit orbital frequency evolution, while a smaller number will have measurable eccentricity. In the Bayesian approach to model selection one needs to compute the Bayes factor between competing models. Here we explore various methods for computing Bayes factors in the context of determining which galactic binaries have measurable frequency evolution. The methods explored include a Reversible Jump Markov Chain Monte Carlo (RJMCMC) algorithm, Savage-Dickey density ratios, the Schwarz-Bayes Information Criterion (BIC), and the Laplace approximation to the model evidence. We find good agreement between all of the approaches.
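    For the nested comparison described here (a binary with or without frequency evolution), the Savage-Dickey density ratio reduces the Bayes factor to the ratio of posterior to prior density at the nested point. A minimal sketch, assuming stand-in posterior samples of the frequency derivative and a Gaussian prior (both invented for illustration):

```python
# Savage-Dickey Bayes factor for a nested parameter: here, the frequency
# derivative fdot of a galactic binary, where fdot = 0 recovers the
# simpler non-evolving model.  Samples and prior width are stand-ins.
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(0)

# Pretend these came from an MCMC run over the evolving-binary model.
fdot_samples = rng.normal(loc=2e-17, scale=1e-17, size=5000)
prior_sigma = 5e-17                       # assumed N(0, sigma^2) prior

# B(static / evolving) = p(fdot=0 | data) / p(fdot=0)
posterior_at_zero = gaussian_kde(fdot_samples)(0.0)[0]
prior_at_zero = norm.pdf(0.0, scale=prior_sigma)
print(f"Bayes factor (static/evolving) ~ {posterior_at_zero / prior_at_zero:.2f}")
```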

    A Solution to the Galactic Foreground Problem for LISA

    Low frequency gravitational wave detectors, such as the Laser Interferometer Space Antenna (LISA), will have to contend with large foregrounds produced by millions of compact binaries in our galaxy. While these galactic signals are interesting in their own right, the unresolved component can obscure other sources. The science yield for the LISA mission can be improved if the brighter and more isolated foreground sources can be identified and regressed from the data. Since the signals overlap with one another, we are faced with a "cocktail party" problem of picking out individual conversations in a crowded room. Here we present and implement an end-to-end solution to the galactic foreground problem that is able to resolve tens of thousands of sources from across the LISA band. Our algorithm employs a variant of the Markov Chain Monte Carlo (MCMC) method, which we call the Blocked Annealed Metropolis-Hastings (BAM) algorithm. Following a description of the algorithm and its implementation, we give several examples ranging from searches for a single source to searches for hundreds of overlapping sources. Our examples include data sets from the first round of Mock LISA Data Challenges.
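    The two ingredients named in the abstract, block updates and annealing, can be shown in a toy form: propose a joint update to one source's parameter block at a time, and accept with a likelihood tempered by a cooling schedule. A sketch on a made-up two-sinusoid "data set" (none of the numbers come from the paper):

```python
# Toy blocked, annealed Metropolis-Hastings in the spirit of BAM:
# overlapping sinusoidal "sources" are fit by updating one source's
# (amplitude, frequency, phase) block per step under a cooling schedule.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 512)

def signal(params):
    return sum(a * np.sin(2 * np.pi * f * t + p) for a, f, p in params)

true = np.array([[1.0, 5.0, 0.3], [0.8, 5.4, 1.1]])  # overlapping sources
data = signal(true) + rng.normal(0.0, 0.5, t.size)

def log_like(params):
    r = data - signal(params)
    return -0.5 * np.sum(r**2) / 0.5**2   # white noise, sigma = 0.5

params = np.array([[0.5, 4.8, 0.0], [0.5, 5.6, 0.0]])
for step in range(20000):
    T = max(1.0, 10.0 * (1.0 - step / 15000))       # annealing schedule
    k = rng.integers(len(params))                   # choose a source block
    prop = params.copy()
    prop[k] += rng.normal(0.0, [0.02, 0.01, 0.05])  # joint block proposal
    if np.log(rng.uniform()) < (log_like(prop) - log_like(params)) / T:
        params = prop
print(params)   # should approach the true source parameters
```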

    Nonparametric Reconstruction of the Dark Energy Equation of State from Diverse Data Sets

    The cause of the accelerated expansion of the Universe poses one of the most fundamental questions in physics today. In the absence of a compelling theory to explain the observations, a first task is to develop a robust phenomenology. If the acceleration is driven by some form of dark energy, then the phenomenology is determined by the dark energy equation of state w. A major aim of ongoing and upcoming cosmological surveys is to measure w and its time dependence to high accuracy. Since w(z) is not directly accessible to measurement, powerful reconstruction methods are needed to extract it reliably from observations. We have recently introduced a new reconstruction method for w(z) based on Gaussian process modeling. This method can capture nontrivial time dependences in w(z) and, most importantly, it yields controlled and unbiased error estimates. In this paper we extend the method to include a diverse set of measurements: baryon acoustic oscillations, cosmic microwave background measurements, and supernova data. We analyze currently available data sets and present the resulting constraints on w(z), finding that current observations are in very good agreement with a cosmological constant. In addition we explore how well our method captures nontrivial behavior of w(z) by analyzing simulated data assuming high-quality observations from future surveys. We find that the baryon acoustic oscillation measurements by themselves already lead to remarkably good reconstruction results and that the combination of different high-quality probes allows us to reconstruct w(z) very reliably with small error bounds.

    Nonparametric Reconstruction of the Dark Energy Equation of State

    A basic aim of ongoing and upcoming cosmological surveys is to unravel the mystery of dark energy. In the absence of a compelling theory to test, a natural approach is to better characterize the properties of dark energy in search of clues that can lead to a more fundamental understanding. One way to view this characterization is the improved determination of the redshift dependence of the dark energy equation of state parameter, w(z). Doing this requires a robust and bias-free method for reconstructing w(z) from data that does not rely on restrictive expansion schemes or assumed functional forms for w(z). We present a new nonparametric reconstruction method that solves for w(z) as a statistical inverse problem, based on a Gaussian process representation. This method reliably captures nontrivial behavior of w(z) and provides controlled error bounds. We demonstrate the power of the method on different sets of simulated supernova data; the approach can be easily extended to include diverse cosmological probes.
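    The core machinery, a Gaussian process posterior conditioned on noisy observations, can be sketched in a few lines. Below is generic GP regression on mock w(z)-like data; the squared-exponential kernel, hyperparameters and mock points are all assumed for illustration, and the paper's actual method solves an inverse problem through the cosmological observables rather than smoothing w(z) directly.

```python
# Generic Gaussian-process regression on mock w(z)-like data, as a
# stand-in for the nonparametric reconstruction idea.  Kernel choice,
# hyperparameters and the mock data are assumptions for this sketch.
import numpy as np

rng = np.random.default_rng(2)

def sq_exp(x1, x2, amp=0.5, ell=0.5):
    # Squared-exponential covariance between two input grids.
    return amp**2 * np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / ell**2)

# Mock noisy measurements scattered around a cosmological constant, w = -1.
z_obs = np.linspace(0.05, 1.5, 30)
w_obs = -1.0 + rng.normal(0.0, 0.05, z_obs.size)
noise = 0.05

# GP posterior mean and variance on a grid, with prior mean w = -1.
z_grid = np.linspace(0.0, 1.6, 100)
K = sq_exp(z_obs, z_obs) + noise**2 * np.eye(z_obs.size)
Ks = sq_exp(z_grid, z_obs)
w_mean = -1.0 + Ks @ np.linalg.solve(K, w_obs + 1.0)
w_var = np.diag(sq_exp(z_grid, z_grid) - Ks @ np.linalg.solve(K, Ks.T))
print(w_mean[:3], np.sqrt(w_var[:3]))   # reconstruction with error bounds
```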

    A Bayesian Approach to the Detection Problem in Gravitational Wave Astronomy

    The analysis of data from gravitational wave detectors can be divided into three phases: search, characterization, and evaluation. The evaluation of the detection - determining whether a candidate event is astrophysical in origin or some artifact created by instrument noise - is a crucial step in the analysis. The on-going analyses of data from ground based detectors employ a frequentist approach to the detection problem. A detection statistic is chosen, for which background levels and detection efficiencies are estimated from Monte Carlo studies. This approach frames the detection problem in terms of an infinite collection of trials, with the actual measurement corresponding to some realization of this hypothetical set. Here we explore an alternative, Bayesian approach to the detection problem that considers prior information and the actual data in hand. Our particular focus is on the computational techniques used to implement the Bayesian analysis. We find that the Parallel Tempered Markov Chain Monte Carlo (PTMCMC) algorithm is able to address all three phases of the analysis in a coherent framework. The signals are found by locating the posterior modes, the model parameters are characterized by mapping out the joint posterior distribution, and finally, the model evidence is computed by thermodynamic integration. As a demonstration, we consider the detection problem of selecting between models describing the data as instrument noise, or instrument noise plus the signal from a single compact galactic binary. The evidence ratios, or Bayes factors, computed by the PTMCMC algorithm are found to be in close agreement with those computed using a Reversible Jump Markov Chain Monte Carlo algorithm.
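    Thermodynamic integration estimates the log evidence as ln Z = integral from 0 to 1 of <ln L>_beta d(beta), averaging the log likelihood over tempered posteriors p_beta proportional to L^beta times the prior, which is exactly what the rungs of a parallel-tempered chain ladder provide. A toy sketch on a conjugate Gaussian model, where each tempered posterior can be sampled exactly (the model and all numbers are invented):

```python
# Thermodynamic integration on a toy model: ln Z = integral_0^1 of the
# tempered-posterior average <ln L>_beta.  The conjugate Gaussian setup
# lets each tempered posterior be sampled exactly; a real analysis would
# take these samples from the rungs of a PTMCMC ladder.
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(0.3, 1.0, 50)            # toy data, known unit noise
tau = 2.0                                  # prior: mu ~ N(0, tau^2)

def log_like(mu):
    return -0.5 * np.sum((data - mu)**2) - 0.5 * data.size * np.log(2 * np.pi)

betas = np.linspace(0.0, 1.0, 41)
avg_lnL = []
for b in betas:
    prec = b * data.size + 1.0 / tau**2    # tempered posterior N(mean, 1/prec)
    mean = b * data.sum() / prec
    mus = rng.normal(mean, 1.0 / np.sqrt(prec), 2000)
    avg_lnL.append(np.mean([log_like(m) for m in mus]))

avg = np.array(avg_lnL)
ln_Z = np.sum(0.5 * (avg[1:] + avg[:-1]) * np.diff(betas))  # trapezoid rule
print(f"ln Z ~ {ln_Z:.2f}")
```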

    Count time series prediction using particle filters

    Non-Gaussian dynamic models are proposed to analyse time series of counts. Three models are proposed for responses generated by a Poisson, a negative binomial and a mixture of Poisson distributions. The parameters of these distributions are allowed to vary dynamically according to state space models. Particle filters, or sequential Monte Carlo methods, are used for inference and forecasting purposes. The performance of the proposed methodology is evaluated in two simulation studies for the Poisson and the negative binomial models. The methodology is illustrated using data on medical contacts of schoolchildren suffering from asthma in England.
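    A bootstrap particle filter for the simplest of these setups, a Poisson observation model with a latent log-intensity following a Gaussian random walk, fits in a few lines; the state and observation variances below are illustrative assumptions, not the paper's:

```python
# Bootstrap particle filter for Poisson counts with a latent random-walk
# log-intensity: x_t = x_{t-1} + N(0, 0.1^2), y_t ~ Poisson(exp(x_t)).
# Variances and sizes are assumptions for this sketch.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(4)

T, N = 100, 1000
x = 1.0 + np.cumsum(rng.normal(0.0, 0.1, T))    # simulated latent states
y = rng.poisson(np.exp(x))                      # simulated counts

particles = rng.normal(1.0, 0.5, N)
filtered = np.empty(T)
for t in range(T):
    particles += rng.normal(0.0, 0.1, N)        # propagate the state model
    w = poisson.pmf(y[t], np.exp(particles))    # weight by the observation
    w /= w.sum()
    filtered[t] = np.sum(w * np.exp(particles)) # filtered mean intensity
    particles = particles[rng.choice(N, size=N, p=w)]  # resample
print(filtered[-3:], np.exp(x[-3:]))            # tracked vs true intensity
```

    One-step-ahead forecasts follow by propagating the resampled particles through the state equation once more without reweighting.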

    LISA Data Analysis using MCMC methods

    The Laser Interferometer Space Antenna (LISA) is expected to simultaneously detect many thousands of low frequency gravitational wave signals. This presents a data analysis challenge that is very different from the one encountered in ground based gravitational wave astronomy. LISA data analysis requires the identification of individual signals from a data stream containing an unknown number of overlapping signals. Because of the signal overlaps, a global fit to all the signals has to be performed in order to avoid biasing the solution. However, performing such a global fit requires the exploration of an enormous parameter space with a dimension upwards of 50,000. Markov Chain Monte Carlo (MCMC) methods offer a very promising solution to the LISA data analysis problem. MCMC algorithms are able to efficiently explore large parameter spaces, simultaneously providing parameter estimates, error analyses and even model selection. Here we present the first application of MCMC methods to simulated LISA data and demonstrate the great potential of the MCMC approach. Our implementation uses a generalized F-statistic to evaluate the likelihoods, and simulated annealing to speed convergence of the Markov chains. As a final step we super-cool the chains to extract maximum likelihood estimates, and estimates of the Bayes factors for competing models. We find that the MCMC approach is able to correctly identify the number of signals present, extract the source parameters, and return error estimates consistent with Fisher information matrix predictions.
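    The Fisher-matrix predictions mentioned at the end come from Gamma_ij = (dh/dtheta_i, dh/dtheta_j), with parameter covariances given by the inverse of Gamma. A rough numerical sketch on a toy monochromatic signal in white noise (the model and all numbers are assumptions, not the paper's waveforms):

```python
# Fisher-matrix error estimate for a toy signal h(t; A, f, phi) in white
# noise: Gamma_ij = <dh/di, dh/dj> / sigma^2, errors = sqrt(diag(inv)).
# The signal model and every number here are illustrative assumptions.
import numpy as np

t = np.linspace(0.0, 10.0, 4096)
sigma = 0.1                                   # white-noise standard deviation

def h(theta):
    A, f, phi = theta
    return A * np.sin(2 * np.pi * f * t + phi)

theta0 = np.array([1e-2, 3.0, 0.5])           # true (A, f, phi)
eps = 1e-6 * np.abs(theta0)

# Central finite-difference derivatives of the waveform.
derivs = []
for i in range(3):
    dp, dm = theta0.copy(), theta0.copy()
    dp[i] += eps[i]
    dm[i] -= eps[i]
    derivs.append((h(dp) - h(dm)) / (2 * eps[i]))

fisher = np.array([[np.dot(a, b) / sigma**2 for b in derivs] for a in derivs])
errors = np.sqrt(np.diag(np.linalg.inv(fisher)))
print(dict(zip(["A", "f", "phi"], errors)))   # 1-sigma parameter errors
```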

    Family businesses, innovation and the Colombian graphic industry: a Bayesian estimation of a logistic model

    This study presents a comparative analysis of innovation management between family and non-family companies in the Graphic Communication Industry in Colombia. A questionnaire was administered to identify divergences in the innovation process carried out by these two types of organization. Generalized Linear Models (GLMs) were then fitted, with Bayesian inference applied to the model parameters, to analyse the effect of family ownership and of the products commercialized on the management of innovation in tangible goods. This allowed us to identify several characteristics of innovation management in family companies and their divergences from non-family companies, among them: a tendency toward a preferred type of innovation, different sources of and objectives for innovation, and the factors that hinder the innovation process.
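    The GLM described is most naturally a Bayesian logistic regression. A hypothetical sketch follows (the predictors, priors and simulated data are all invented, since the questionnaire variables are not given here), using a simple random-walk Metropolis sampler:

```python
# Hypothetical Bayesian logistic regression: an innovation indicator
# regressed on a family-business dummy and a tangible-goods dummy,
# with N(0, 10^2) priors and a random-walk Metropolis sampler.
# All data are simulated; the real covariates come from a questionnaire.
import numpy as np

rng = np.random.default_rng(5)

n = 300
X = np.column_stack([np.ones(n),              # intercept
                     rng.integers(0, 2, n),   # family business (0/1)
                     rng.integers(0, 2, n)])  # tangible goods (0/1)
beta_true = np.array([-0.5, 0.8, 0.4])
y = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-X @ beta_true))

def log_post(beta):
    eta = X @ beta
    log_lik = np.sum(y * eta - np.log1p(np.exp(eta)))
    return log_lik - 0.5 * np.sum(beta**2) / 10.0**2

beta, samples = np.zeros(3), []
for _ in range(20000):
    prop = beta + rng.normal(0.0, 0.1, 3)
    if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
        beta = prop
    samples.append(beta)
print(np.mean(samples[5000:], axis=0))   # posterior means after burn-in
```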