102 research outputs found

    Bayesian optimization using sequential Monte Carlo

    We consider the problem of optimizing a real-valued continuous function f using a Bayesian approach, where the evaluations of f are chosen sequentially by combining prior information about f, described by a random process model, with past evaluation results. The main difficulty with this approach is computing the posterior distributions of the quantities of interest that are used to choose evaluation points. In this article, we adopt a Sequential Monte Carlo (SMC) approach.
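
    The sequential evaluation loop sketched above can be illustrated without the article's SMC machinery. The toy below draws Monte Carlo samples from a simple Gaussian-process posterior (a stand-in for the SMC posterior approximation) and picks the candidate point with the largest estimated expected improvement; the kernel, length-scale, and all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf(a, b, ls=0.2):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def posterior(x_train, y_train, x_cand, noise=1e-6):
    # Standard GP posterior mean and covariance at candidate points.
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_cand)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    cov = rbf(x_cand, x_cand) - v.T @ v
    return mu, cov

def next_point(x_train, y_train, x_cand, n_draws=500, seed=0):
    # Monte Carlo estimate of expected improvement over the incumbent,
    # using joint posterior draws; the next evaluation maximizes it.
    rng = np.random.default_rng(seed)
    mu, cov = posterior(x_train, y_train, x_cand)
    draws = rng.multivariate_normal(mu, cov + 1e-6 * np.eye(len(x_cand)),
                                    n_draws)
    ei = np.maximum(draws - y_train.max(), 0).mean(axis=0)
    return x_cand[np.argmax(ei)]

f = lambda x: np.sin(3 * x)            # toy objective (maximization)
x = np.array([0.1, 0.9])
y = f(x)
cand = np.linspace(0.0, 1.0, 101)
x_next = next_point(x, y, cand)        # point chosen for the next evaluation
```

    In the article's setting the posterior over quantities of interest is intractable and approximated by SMC particles rather than by exact GP formulas; the selection step, however, has the same shape.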

    amei: An R Package for the Adaptive Management of Epidemiological Interventions

    The amei package for R provides a flexible statistical framework for generating optimal epidemiological interventions designed to minimize the total expected cost of an emerging epidemic. Uncertainty regarding the underlying disease parameters is propagated through to the decision process via Bayesian posterior inference. The strategies produced through this framework are adaptive: vaccination schedules are iteratively adjusted to reflect the anticipated trajectory of the epidemic given the current population state and updated parameter estimates. This document briefly covers the background and methodology underpinning the implementation provided by the package and contains extensive examples showing the functions and methods in action.

    Efficient Bayesian inference for natural time series using ARFIMA processes

    Many geophysical quantities, such as atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long memory (LM). LM implies that these quantities experience non-trivial temporal memory, which potentially not only enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LM. In this paper we present a modern and systematic approach to the inference of LM. We use the flexible autoregressive fractionally integrated moving average (ARFIMA) model, which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LM, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g., short-memory effects) can be integrated over in order to focus on long-memory parameters and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data and the central England temperature (CET) time series, with favorable comparison to the standard estimators. For CET we also extend our method to seasonal long memory.
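
    The long-memory part of an ARFIMA(p, d, q) model is the fractional differencing operator (1 - B)^d, whose series expansion in the backshift operator B is standard. A minimal sketch of that expansion (not the paper's approximate likelihood) makes the recursion concrete:

```python
import numpy as np

def frac_diff_weights(d, n):
    # Coefficients of (1 - B)^d as a power series in the backshift
    # operator B: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k.
    # For 0 < d < 0.5 the weights decay hyperbolically, which is the
    # source of long memory in ARFIMA processes.
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    # Apply (1 - B)^d to a series, truncating the expansion at len(x).
    w = frac_diff_weights(d, len(x))
    return np.array([w[:t + 1][::-1] @ x[:t + 1] for t in range(len(x))])
```

    Setting d = 1 recovers ordinary first differences, while fractional d interpolates between short-memory (d = 0) and nonstationary (d = 1) behaviour.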

    A Statistical Framework for the Adaptive Management of Epidemiological Interventions

    Background: Epidemiological interventions aim to control the spread of infectious disease through various mechanisms, each carrying a different associated cost. Methodology: We describe a flexible statistical framework for generating optimal epidemiological interventions that are designed to minimize the total expected cost of an emerging epidemic while simultaneously propagating uncertainty regarding the underlying disease model parameters through to the decision process. The strategies produced through this framework are adaptive: vaccination schedules are iteratively adjusted to reflect the anticipated trajectory of the epidemic given the current population state and updated parameter estimates. Conclusions: Using simulation studies based on a classic influenza outbreak, we demonstrate the advantages of adaptive strategies.

    A population Monte Carlo scheme with transformed weights and its application to stochastic kinetic models

    This paper addresses the problem of Monte Carlo approximation of posterior probability distributions. In particular, we consider a recently proposed technique known as population Monte Carlo (PMC), which is based on an iterative importance sampling approach. An important drawback of this methodology is the degeneracy of the importance weights when the dimension of either the observations or the variables of interest is high. To alleviate this difficulty, we propose a novel method that performs a nonlinear transformation on the importance weights. This operation reduces the weight variation, hence avoiding degeneracy and increasing the efficiency of the importance sampling scheme, especially when drawing from proposal functions that are poorly adapted to the true posterior. For the sake of illustration, we apply the proposed algorithm to the estimation of the parameters of a Gaussian mixture model. This is a very simple problem that enables us to clearly show and discuss the main features of the proposed technique. As a practical application, we also consider the popular (and challenging) problem of estimating the rate parameters of stochastic kinetic models (SKMs). SKMs are highly multivariate systems that model molecular interactions in biological and chemical problems. We introduce a particularization of the proposed algorithm to SKMs and present numerical results.
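
    One simple instance of a nonlinear weight transformation is clipping the largest unnormalized importance weights to a common value, which tames degeneracy at the cost of some bias. The sketch below shows that idea on a deliberately mismatched proposal; the clipping rule, target, and all parameter values are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

def clipped_weights(logw, n_clip):
    # Nonlinear transformation of importance weights: the n_clip largest
    # unnormalized weights are flattened to the n_clip-th largest value,
    # reducing weight variation (and hence degeneracy) at the cost of bias.
    logw = logw - logw.max()              # stabilize before exponentiating
    w = np.exp(logw)
    thresh = np.sort(w)[-n_clip]
    w = np.minimum(w, thresh)
    return w / w.sum()                    # self-normalize

# Toy target N(2, 0.5^2) with a poorly adapted proposal N(0, 1).
x = rng.normal(0.0, 1.0, 5000)
logw = (-0.5 * ((x - 2.0) / 0.5) ** 2 - np.log(0.5)) - (-0.5 * x ** 2)
w = clipped_weights(logw, n_clip=100)
est_mean = np.sum(w * x)   # self-normalized estimate of E[X] (target mean 2)
```

    In a full PMC scheme this transformation would be applied at every iteration before resampling and proposal adaptation; here only the single weighting step is shown.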

    Constraint Handling in Efficient Global Optimization

    This is the author accepted manuscript; the final version is available from ACM via the DOI in this record. Real-world optimization problems are often subject to several constraints which are expensive to evaluate in terms of cost or time. Although much effort has been devoted to using surrogate models for expensive optimization tasks, few strong surrogate-assisted algorithms can address challenging constrained problems. Efficient Global Optimization (EGO) is a Kriging-based surrogate-assisted algorithm. It was originally proposed to address unconstrained problems and was later modified to solve constrained problems. However, these types of algorithms still suffer from several issues, mainly: (1) early stagnation, (2) problems with multiple active constraints, and (3) frequent crashes. In this work, we introduce a new EGO-based algorithm which tries to overcome these common issues with Kriging optimization algorithms. We apply the proposed algorithm to problems with dimension d ≤ 4 from the G-function suite [16] and to an airfoil shape example. This research was partly funded by Tekes, the Finnish Funding Agency for Innovation (the DeCoMo project), and by the Engineering and Physical Sciences Research Council [grant numbers EP/N017195/1, EP/N017846/1].
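
    The classic constrained modification of EGO multiplies the expected improvement of the Kriging objective prediction by the probability that the Kriging constraint prediction is feasible. A minimal sketch of that acquisition function, assuming Gaussian predictions and a single constraint g(x) ≤ 0 (illustrative, not this paper's new algorithm):

```python
from math import erf, exp, pi, sqrt

def norm_pdf(z):
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def constrained_ei(mu, sigma, best, mu_c, sigma_c):
    # Expected improvement for minimization, given the Kriging prediction
    # (mean mu, standard deviation sigma) and the incumbent feasible best,
    # multiplied by the probability of feasibility P(g(x) <= 0) from the
    # constraint surrogate (mean mu_c, standard deviation sigma_c).
    if sigma <= 0.0:
        ei = max(best - mu, 0.0)          # no predictive uncertainty left
    else:
        z = (best - mu) / sigma
        ei = (best - mu) * norm_cdf(z) + sigma * norm_pdf(z)
    pof = norm_cdf(-mu_c / sigma_c)       # probability of feasibility
    return ei * pof
```

    The "multiple active constraints" issue mentioned above arises here because the product of several probabilities of feasibility collapses toward zero near the constraint boundaries, which is exactly where the constrained optimum usually lies.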

    Application of Bayesian regression with singular value decomposition method in association studies for sequence data

    Genetic association studies usually involve a large number of single-nucleotide polymorphisms (SNPs) (k) and a relatively small sample size (n), so that k is much greater than n. Because conventional statistical approaches are unable to deal with multiple SNPs simultaneously when k is much greater than n, single-SNP association studies have been used to identify genes involved in a disease's pathophysiology, which causes a multiple testing problem. To evaluate the contribution of multiple SNPs simultaneously to disease traits when k is much greater than n, we developed the Bayesian regression with singular value decomposition (BRSVD) method. The method reduces the dimension of the design matrix from k to n by applying singular value decomposition to the design matrix. We evaluated the model using a Markov chain Monte Carlo simulation with a Gibbs sampler constructed from the posterior densities under conjugate priors. Permutation was incorporated to generate empirical p-values. We applied the BRSVD method to the sequence data provided by Genetic Analysis Workshop 17 and found that the BRSVD method is a practical method for analyzing sequence data in comparison with the single-SNP association test and the penalized regression method.
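
    The deterministic core of the reduction is standard linear algebra: a thin SVD of the n × k design matrix has at most n nonzero singular values, so the regression can be rewritten on an n × n reduced design. A minimal sketch of just that step (the paper's Gibbs sampler over the reduced coefficients is not reproduced; sizes and data are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 30, 200                       # far fewer samples than SNPs (k >> n)
X = rng.standard_normal((n, k))      # stand-in for a genotype design matrix
beta_true = np.zeros(k)
beta_true[:3] = [1.5, -2.0, 1.0]     # three causal SNPs
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Thin SVD: X = U @ diag(s) @ Vt with U (n x n), s (n,), Vt (n x k).
# Then X @ beta = Z @ gamma with Z = U * s and gamma = Vt @ beta, so the
# regression problem shrinks from k coefficients to n.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Z = U * s                                          # n x n reduced design
gamma_hat = np.linalg.lstsq(Z, y, rcond=None)[0]   # n-dimensional fit
beta_hat = Vt.T @ gamma_hat                        # map back to SNP space
```

    Because k > n the mapping back to SNP space is not unique; beta_hat is only the minimum-norm representative, which is why the full method places priors on the reduced coefficients and assesses significance by permutation rather than reading beta_hat directly.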