
    Networks and the epidemiology of infectious disease

    The science of networks has revolutionised research into the dynamics of interacting elements. It could be argued that epidemiology in particular has embraced the potential of network theory more than any other discipline. Here we review the growing body of research concerning the spread of infectious diseases on networks, focusing on the interplay between network theory and epidemiology. The review is split into four main sections, which examine: the types of network relevant to epidemiology; the multitude of ways these networks can be characterised; the statistical methods that can be applied to infer the epidemiological parameters on a realised network; and finally simulation and analytical methods to determine epidemic dynamics on a given network. Given the breadth of areas covered and the ever-expanding number of publications, a comprehensive review of all work is impossible. Instead, we provide a personalised overview into the areas of network epidemiology that have seen the greatest progress in recent years or have the greatest potential to provide novel insights. As such, considerable importance is placed on analytical approaches and statistical methods which are both rapidly expanding fields. Throughout this review we restrict our attention to epidemiological issues
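
    As a concrete illustration of the kind of model the review surveys, here is a minimal sketch (not taken from the review) of a discrete-time SIR epidemic on an Erdős-Rényi contact network. The network size, edge probability, and the per-contact infection and recovery probabilities beta and gamma are arbitrary illustrative choices.

```python
# Hedged sketch: discrete-time SIR epidemic on a random contact network.
# All parameter values are illustrative, not taken from the review.
import numpy as np

rng = np.random.default_rng(0)

n, p = 200, 0.05           # number of nodes and edge probability
beta, gamma = 0.10, 0.05   # per-contact infection prob., recovery prob. per step

# Build a symmetric Erdos-Renyi adjacency matrix
A = np.zeros((n, n), dtype=bool)
iu = np.triu_indices(n, 1)
A[iu] = rng.random(len(iu[0])) < p
A = A | A.T

# States: 0 = susceptible, 1 = infected, 2 = recovered
state = np.zeros(n, dtype=int)
state[rng.choice(n, size=5, replace=False)] = 1   # seed five initial infections

for _ in range(500):
    infected = state == 1
    if not infected.any():
        break
    # Infection pressure: number of infected neighbours of each node
    pressure = A[:, infected].sum(axis=1)
    p_inf = 1.0 - (1.0 - beta) ** pressure
    new_inf = (state == 0) & (rng.random(n) < p_inf)
    new_rec = infected & (rng.random(n) < gamma)
    state[new_inf] = 1
    state[new_rec] = 2

print("final epidemic size:", (state == 2).sum())
```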

    Statistical Inference in Micro Simulation Models: Incorporating external information

    In practical applications of micro simulation models, very little is usually known about the properties of the simulated values. This paper argues that we need to apply the same rigorous standards for inference in micro simulation work as in scientific work generally; if we do not, micro simulation models will lose credibility. The paper first discusses how the structure of the model determines inference, and then follows with sections on estimation and validation. Differences between inference in static and dynamic models are noted, and the paper then focuses on the estimation of behavioral parameters. There are three themes: calibration viewed as estimation subject to external constraints, piecemeal vs. system-wide estimation, and simulation-based estimation. Keywords: Micro simulation; Alignment; Calibration; System-wide estimation; Simulation-based estimation.
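
    To make "calibration viewed as estimation subject to external constraints" concrete, the sketch below (not drawn from the paper) shows the simplest form of alignment: simulated micro-unit weights are rescaled so that a weighted aggregate reproduces an external benchmark total. The variable names and the benchmark value are hypothetical.

```python
# Hedged sketch: one-margin proportional calibration ("alignment") of
# simulated micro-unit weights to an external benchmark total.
# The benchmark value and variable names are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

n = 1_000
weights = np.full(n, 50.0)             # base weights of the simulated micro units
employed = rng.random(n) < 0.55        # simulated employment indicator

external_total = 30_000.0              # external benchmark: total employed persons

# Scale the weights of employed units so the weighted count hits the benchmark,
# i.e. estimation of the weights subject to an external constraint.
weights[employed] *= external_total / weights[employed].sum()

print("calibrated employed total:", weights[employed].sum())
```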

    A practical illustration of the importance of realistic individualized treatment rules in causal inference

    The effect of vigorous physical activity on mortality in the elderly is difficult to estimate using conventional approaches to causal inference, which define this effect by comparing the mortality risks corresponding to hypothetical scenarios in which all subjects in the target population engage in a given level of vigorous physical activity. A causal effect defined on the basis of such a static treatment intervention can only be identified from observed data if all subjects in the target population have a positive probability of selecting each of the candidate treatment options, an assumption that is highly unrealistic here since subjects with serious health problems cannot engage in high levels of vigorous physical activity. This problem can be addressed by focusing instead on causal effects defined on the basis of realistic individualized treatment rules and intention-to-treat rules that explicitly take into account the set of treatment options available to each subject. We present a data analysis to illustrate that estimators of static causal effects in fact tend to overestimate the beneficial impact of high levels of vigorous physical activity, while corresponding estimators based on realistic individualized treatment rules and intention-to-treat rules can yield unbiased estimates. We emphasize that the problems encountered in estimating static causal effects are not restricted to the IPTW estimator, but are also observed with the G-computation estimator, the DR-IPTW estimator, and the targeted MLE. Our analyses based on realistic individualized treatment rules and intention-to-treat rules suggest that high levels of vigorous physical activity may confer reductions in mortality risk on the order of 15-30%, although in most cases the evidence for such an effect does not quite reach the 0.05 level of significance. Comment: Published at http://dx.doi.org/10.1214/07-EJS105 in the Electronic Journal of Statistics (http://www.i-journals.org/ejs/) by the Institute of Mathematical Statistics (http://www.imstat.org).
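
    The sketch below is not the authors' analysis; on synthetic data it only illustrates the mechanics of the IPTW estimator mentioned above and how the inverse-probability weights explode when positivity is close to being violated, which is the practical problem realistic individualized treatment rules are designed to avoid. The confounder, treatment, and outcome names and all coefficients are invented for the example.

```python
# Hedged sketch on synthetic data: the IPTW estimator for a static binary
# "vigorous activity" intervention, and the weight explosion that occurs when
# positivity is nearly violated. All names and coefficients are invented.
import numpy as np

rng = np.random.default_rng(2)
n = 5_000

health = rng.normal(size=n)                                # confounder (frailty <-> fitness)
p_active = 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * health)))     # near zero for frail subjects
active = (rng.random(n) < p_active).astype(float)          # treatment: vigorous activity
p_death = 1.0 / (1.0 + np.exp(-(-1.0 - 1.5 * health - 0.3 * active)))
death = (rng.random(n) < p_death).astype(float)            # outcome: mortality

# IPTW with the true propensity score (it would be estimated in practice);
# the weights blow up for the rare frail-but-active subjects.
w = active / p_active + (1.0 - active) / (1.0 - p_active)
risk_active = np.sum(w * death * active) / np.sum(w * active)
risk_inactive = np.sum(w * death * (1.0 - active)) / np.sum(w * (1.0 - active))
print("IPTW risk difference (active - inactive):", risk_active - risk_inactive)
print("largest weight:", w.max())
```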

    Dynamic filtering of static dipoles in magnetoencephalography

    We consider the problem of estimating neural activity from measurements of the magnetic fields recorded by magnetoencephalography. We exploit the temporal structure of the problem and model the neural current as a collection of evolving current dipoles, which appear and disappear but whose locations remain constant throughout their lifetime, fully reflecting the physiological interpretation of the model. To conduct inference under this proposed model, we developed an algorithm based on state-of-the-art sequential Monte Carlo methods that employs carefully designed importance distributions. Previous work employed a bootstrap filter and an artificial dynamic structure in which dipoles performed a random walk in space, yielding nonphysical artefacts in the reconstructions; such artefacts are not observed under the proposed model. The algorithm is validated on simulated data, where it provides an average localisation error approximately half that of the bootstrap filter. An application to complex real data from a somatosensory experiment is presented. Assessment of model fit via the marginal likelihood shows a clear preference for the proposed model, and the associated reconstructions show better localisation.
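
    For readers unfamiliar with the baseline the paper improves on, here is a generic bootstrap particle filter for a toy one-dimensional state-space model; it is not the dipole model, and it uses the transition density as proposal rather than the carefully designed importance distributions developed in the paper. The model coefficients and noise levels are arbitrary.

```python
# Hedged sketch: a generic bootstrap particle filter for a toy 1-D
# linear-Gaussian state-space model (not the MEG dipole model).
import numpy as np

rng = np.random.default_rng(3)

T, N = 100, 500            # time steps, number of particles
a, sigma_x, sigma_y = 0.9, 1.0, 0.5

# Simulate data: x_t = a x_{t-1} + process noise, y_t = x_t + observation noise
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + sigma_x * rng.normal()
y = x + sigma_y * rng.normal(size=T)

particles = rng.normal(size=N)     # particles for the initial state
estimates = np.zeros(T)
for t in range(T):
    # Bootstrap proposal: propagate particles through the transition density
    particles = a * particles + sigma_x * rng.normal(size=N)
    # Weight by the observation likelihood
    logw = -0.5 * ((y[t] - particles) / sigma_y) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    estimates[t] = np.sum(w * particles)
    # Multinomial resampling
    particles = particles[rng.choice(N, size=N, p=w)]

print("mean absolute filtering error:", np.abs(estimates - x).mean())
```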

    Ecological non-linear state space model selection via adaptive particle Markov chain Monte Carlo (AdPMCMC)

    We develop a novel adaptive particle Markov chain Monte Carlo algorithm that is capable of sampling from the posterior distribution of non-linear state space models for both the unobserved latent states and the unknown model parameters. We apply this methodology to five population growth models, including models with strong and weak Allee effects, and test whether it can efficiently sample from the complex likelihood surface often associated with these models. Using real and synthetically generated data sets, we examine the extent to which observation noise and process error may frustrate efforts to choose between these models. Our algorithm combines an Adaptive Metropolis proposal with an SIR particle MCMC algorithm (AdPMCMC). We show that the AdPMCMC algorithm samples complex, high-dimensional spaces efficiently and is therefore superior to standard Gibbs or Metropolis-Hastings algorithms, which are known to converge very slowly when applied to the non-linear state space ecological models considered in this paper. Additionally, we show how the AdPMCMC algorithm can be used to recursively estimate the Bayesian Cramér-Rao lower bound of Tichavský (1998). We derive expressions for these Cramér-Rao bounds and estimate them for the models considered. Our results demonstrate a number of important features of common population growth models, most notably their multi-modal posterior surfaces and the dependence between the static and dynamic parameters. We conclude by sampling from the posterior distribution of each of the models and using Bayes factors to highlight how observation noise significantly diminishes our ability to select among some of the models, particularly those designed to reproduce an Allee effect.
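
    The sketch below shows a stripped-down particle-marginal Metropolis-Hastings sampler with a simple adaptive random-walk step size, as a rough stand-in for the AdPMCMC idea. The stochastic logistic growth model on the log scale, the flat prior on the growth rate, and all tuning constants are illustrative choices, not those of the paper.

```python
# Hedged sketch of particle-marginal Metropolis-Hastings with a simple
# adaptive random-walk scale; the model and all settings are illustrative.
import numpy as np

rng = np.random.default_rng(4)

# Synthetic data: stochastic logistic growth on the log scale, noisy observations
T, r_true, K, sig_p, sig_o = 50, 0.4, 100.0, 0.1, 0.2
logn = np.zeros(T)
logn[0] = np.log(10.0)
for t in range(1, T):
    logn[t] = logn[t - 1] + r_true * (1 - np.exp(logn[t - 1]) / K) + sig_p * rng.normal()
obs = logn + sig_o * rng.normal(size=T)

def log_marginal_likelihood(r, n_part=200):
    """Bootstrap particle filter estimate of log p(obs | r)."""
    part = np.log(10.0) + 0.5 * rng.normal(size=n_part)   # initial state particles
    ll = 0.0
    for t in range(T):
        if t > 0:
            part = part + r * (1 - np.exp(part) / K) + sig_p * rng.normal(size=n_part)
        logw = -0.5 * ((obs[t] - part) / sig_o) ** 2
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())
        part = part[rng.choice(n_part, size=n_part, p=w / w.sum())]
    return ll

# PMMH over the growth rate r with a flat (improper) prior, so the
# acceptance ratio reduces to the ratio of estimated marginal likelihoods.
r_cur, ll_cur, scale = 0.2, log_marginal_likelihood(0.2), 0.1
samples = []
for _ in range(1000):
    r_prop = r_cur + scale * rng.normal()
    ll_prop = log_marginal_likelihood(r_prop)
    accept = np.log(rng.random()) < ll_prop - ll_cur
    if accept:
        r_cur, ll_cur = r_prop, ll_prop
    # Crude adaptation of the proposal scale towards ~23% acceptance
    scale *= np.exp(0.05 * (float(accept) - 0.234))
    samples.append(r_cur)

print("posterior mean of r:", np.mean(samples[200:]), "(true value:", r_true, ")")
```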

    Efficient Sequential Monte-Carlo Samplers for Bayesian Inference

    In many problems, complex non-Gaussian and/or nonlinear models are required to accurately describe a physical system of interest. In such cases, Monte Carlo algorithms are remarkably flexible and extremely powerful approaches to solving such inference problems. However, in the presence of a high-dimensional and/or multimodal posterior distribution, it is widely documented that standard Monte Carlo techniques can lead to poor performance. This paper focuses on the sequential Monte Carlo (SMC) sampler framework, a more robust and efficient Monte Carlo algorithm. Although this approach presents many advantages over traditional Monte Carlo methods, the potential of this emergent technique remains largely underexploited in signal processing. In this work, we propose novel strategies that improve the efficiency and facilitate the practical implementation of the SMC sampler, specifically for signal processing applications. First, we propose an automatic and adaptive strategy that selects the sequence of distributions within the SMC sampler so as to minimize the asymptotic variance of the estimator of the posterior normalization constant. This is critical for performing model selection in Bayesian signal processing applications. The second contribution improves the global efficiency of the SMC sampler by introducing a novel correction mechanism that allows the particles generated through all iterations of the algorithm to be used, instead of only the particles from the last iteration. This is significant because it removes the need to discard a large portion of the samples, as is typical in standard SMC methods, and it improves estimation performance in practical settings where the computational budget matters. Comment: arXiv admin note: text overlap with arXiv:1303.3123 by other authors.
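
    As background for the contributions described above, here is a minimal SMC sampler on a toy bimodal target with adaptive tempering driven by an effective-sample-size rule; note that this ESS heuristic is a common stand-in, not the variance-minimising schedule proposed in the paper, and no particle-recycling correction is applied. The target, prior, and move kernel are arbitrary choices.

```python
# Hedged sketch: SMC sampler with ESS-driven adaptive tempering on a toy
# bimodal target; a common heuristic, not the schedule proposed in the paper.
import numpy as np

rng = np.random.default_rng(5)

def log_target(x):
    # Bimodal "posterior": mixture of N(-3, 0.5^2) and N(3, 0.5^2), unnormalised
    return np.logaddexp(-0.5 * ((x + 3) / 0.5) ** 2,
                        -0.5 * ((x - 3) / 0.5) ** 2)

def log_prior(x):
    return -0.5 * (x / 5.0) ** 2     # broad N(0, 5^2) prior, unnormalised

def ess(logw):
    w = np.exp(logw - logw.max())
    return w.sum() ** 2 / np.sum(w ** 2)

N = 2000
x = 5.0 * rng.normal(size=N)         # particles drawn from the prior
phi, log_ratio = 0.0, 0.0            # temperature, running normalising-constant ratio

while phi < 1.0:
    incr = log_target(x) - log_prior(x)
    # Adaptive tempering: largest step that keeps the ESS near N/2 (bisection)
    lo, hi = phi, 1.0
    if ess((hi - phi) * incr) < N / 2:
        for _ in range(50):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if ess((mid - phi) * incr) >= N / 2 else (lo, mid)
    new_phi = min(1.0, max(hi, phi + 0.01))
    logw = (new_phi - phi) * incr
    m = logw.max()
    log_ratio += m + np.log(np.mean(np.exp(logw - m)))   # log(Z_target / Z_prior) estimate
    w = np.exp(logw - m)
    x = x[rng.choice(N, size=N, p=w / w.sum())]          # multinomial resampling
    # One random-walk Metropolis move targeting the tempered distribution
    prop = x + 0.5 * rng.normal(size=N)
    logr = (new_phi * (log_target(prop) - log_target(x))
            + (1 - new_phi) * (log_prior(prop) - log_prior(x)))
    accept = np.log(rng.random(N)) < logr
    x = np.where(accept, prop, x)
    phi = new_phi

print("mass near each mode:", np.mean(x < 0), np.mean(x > 0))
print("estimated log normalising-constant ratio:", log_ratio)
```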

    Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation

    We present a novel family of deep neural architectures, partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN, so fully exchangeable data can also be targeted. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive for both time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data. Comment: Forthcoming in the Proceedings of ICML 2019. New comparisons with several different networks. We now use the Wasserstein distance to produce comparisons. Code available on GitHub. 16 pages, 5 figures, 21 tables.
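
    A toy forward pass can make block-switch invariance concrete. The sketch below is not the authors' implementation: for Markov order one it applies a shared inner map to every consecutive pair of a series, sums the results, and feeds the first observation together with that sum to an outer map, so swapping two blocks delimited by the same value leaves the output unchanged. Layer sizes and weights are arbitrary.

```python
# Hedged toy sketch of the PEN idea for Markov order d = 1 (not the authors' code):
# a shared inner map acts on consecutive pairs, the outputs are summed, and an
# outer map combines x_1 with that sum to produce the summary statistic.
import numpy as np

rng = np.random.default_rng(6)

d_hidden, d_out = 16, 2
W_inner = rng.normal(scale=0.5, size=(2, d_hidden))          # phi: pair -> hidden
W_outer = rng.normal(scale=0.5, size=(1 + d_hidden, d_out))  # rho: [x_1, sum] -> summary

def pen_summary(x):
    pairs = np.stack([x[:-1], x[1:]], axis=1)     # all consecutive pairs of the series
    inner = np.tanh(pairs @ W_inner).sum(axis=0)  # summing makes the pair order irrelevant
    return np.tanh(np.concatenate(([x[0]], inner)) @ W_outer)

# Block-switch invariance check: swapping two blocks that are delimited by the
# same value leaves x_1 and the multiset of consecutive pairs unchanged.
x1 = np.array([0., 1., 2., 0., 3., 4., 0., 5.])
x2 = np.array([0., 3., 4., 0., 1., 2., 0., 5.])   # blocks (1, 2) and (3, 4) swapped
print(pen_summary(x1))
print(pen_summary(x2))   # identical to the line above
```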