
    Importance Sampling: Intrinsic Dimension and Computational Cost

    The basic idea of importance sampling is to use independent samples from a proposal measure in order to approximate expectations with respect to a target measure. It is key to understand how many samples are required in order to guarantee accurate approximations. Intuitively, some notion of distance between the target and the proposal should determine the computational cost of the method. A major challenge is to quantify this distance in terms of parameters or statistics that are pertinent for the practitioner. The subject has attracted substantial interest from within a variety of communities. The objective of this paper is to overview and unify the resulting literature by creating an overarching framework. A general theory is presented, with a focus on the use of importance sampling in Bayesian inverse problems and filtering. Comment: Statistical Science
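
    As a concrete illustration of the basic idea described above, here is a minimal self-normalized importance sampling sketch in Python/NumPy; the target, proposal, and test function are toy choices made for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy self-normalized importance sampling: estimate E[X^2] under a standard
# normal target using independent samples from a wider normal proposal.
def log_target(x):            # unnormalized log-density of the target N(0, 1)
    return -0.5 * x**2

def log_proposal(x):          # log-density of the proposal N(0, 2^2)
    return -0.5 * (x / 2.0)**2 - np.log(2.0 * np.sqrt(2.0 * np.pi))

n = 10_000
x = rng.normal(0.0, 2.0, size=n)           # independent proposal samples
log_w = log_target(x) - log_proposal(x)    # unnormalized log-weights
w = np.exp(log_w - log_w.max())
w /= w.sum()                               # self-normalize the weights

estimate = np.sum(w * x**2)                # approximates E[X^2] = 1
ess = 1.0 / np.sum(w**2)                   # effective sample size diagnostic
print(estimate, ess)
```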

    Uniform Time Average Consistency of Monte Carlo Particle Filters

    We prove that bootstrap-type Monte Carlo particle filters approximate the optimal nonlinear filter in a time-average sense, uniformly with respect to the time horizon, when the signal is ergodic and the particle system satisfies a tightness property. The latter is satisfied without further assumptions when the signal state space is compact, as well as in the noncompact setting when the signal is geometrically ergodic and the observations satisfy additional regularity assumptions. Comment: 21 pages, 1 figure
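
    The following is a minimal sketch of a bootstrap particle filter for an illustrative scalar linear-Gaussian state-space model; the model and all parameter values are toy assumptions, not the setting analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (assumed for illustration): x_t = 0.9 x_{t-1} + N(0,1),
# y_t = x_t + N(0, 0.5^2). Simulate a trajectory and observations.
T, n = 50, 500
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal()
    y[t] = x_true[t] + rng.normal(0.0, 0.5)

particles = rng.normal(0.0, 1.0, size=n)   # initial particle cloud
filter_mean = np.zeros(T)
for t in range(1, T):
    # propagate particles through the signal dynamics (bootstrap proposal)
    particles = 0.9 * particles + rng.normal(size=n)
    # weight by the observation likelihood
    logw = -0.5 * ((y[t] - particles) / 0.5)**2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # multinomial resampling
    particles = particles[rng.choice(n, size=n, p=w)]
    filter_mean[t] = particles.mean()
```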

    Stochastic filtering via L2 projection on mixture manifolds with computer algorithms and numerical examples

    We examine some differential geometric approaches to finding approximate solutions to the continuous-time nonlinear filtering problem. Our primary focus is a new projection method for the infinite-dimensional stochastic partial differential equation (SPDE) of the optimal filter, based on the direct L2 metric and on a family of normal mixtures. We compare this method to earlier projection methods based on the Hellinger distance/Fisher metric and exponential families, and we compare the L2 mixture projection filter with a particle method with the same number of parameters, using the Lévy metric. We prove that for a simple choice of the mixture manifold the L2 mixture projection filter coincides with a Galerkin method, whereas for more general mixture manifolds the equivalence does not hold and the L2 mixture filter is more general. We study particular systems that may illustrate the advantages of this new filter over other algorithms when comparing outputs with the optimal filter. We finally consider a specific software design that is suited to a numerically efficient implementation of this filter and provide numerical examples. Comment: Updated and expanded version published in the journal reference below. Preprint updates: January 2016 (v3) added projection of the Zakai equation and its difference from projection of Kushner-Stratonovich (Section 4.1). August 2014 (v2) added the Galerkin equivalence proof (Section 5) to the March 2013 (v1) version.
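
    To illustrate the Galerkin connection mentioned in the abstract, the sketch below performs a static L2 projection of a density onto the linear span of fixed Gaussian components by solving the normal equations on a grid; the grid, basis, and target density are illustrative assumptions, and the full projection filter additionally projects the SPDE dynamics rather than a single fixed density.

```python
import numpy as np

# L2 projection of a density p onto span{b_1, ..., b_m} of fixed Gaussian
# bumps via least squares on a grid (normal equations = Galerkin form).
grid = np.linspace(-6, 6, 2001)
dx = grid[1] - grid[0]

def gauss(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig)**2) / (sig * np.sqrt(2 * np.pi))

# assumed target density: a skewed two-component mixture
p = 0.7 * gauss(grid, -1.0, 0.6) + 0.3 * gauss(grid, 1.5, 1.2)

# fixed mixture components spanning the approximating family
means = np.linspace(-4, 4, 9)
B = np.stack([gauss(grid, m, 1.0) for m in means], axis=1)  # basis matrix

G = B.T @ B * dx          # Gram matrix <b_i, b_j>_{L2}
rhs = B.T @ p * dx        # inner products <b_i, p>_{L2}
c = np.linalg.solve(G, rhs)

p_proj = B @ c            # projected density on the grid
l2_error = np.sqrt(np.sum((p - p_proj)**2) * dx)
```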

    Inverse Problems and Data Assimilation

    These notes are designed with the aim of providing a clear and concise introduction to the subjects of Inverse Problems and Data Assimilation, and their inter-relations, together with citations to some relevant literature in this area. The first half of the notes is dedicated to studying the Bayesian framework for inverse problems. Techniques such as importance sampling and Markov chain Monte Carlo (MCMC) methods are introduced; these methods have the desirable property that in the limit of an infinite number of samples they reproduce the full posterior distribution. Since it is often computationally intensive to implement these methods, especially in high-dimensional problems, approximate techniques such as approximating the posterior by a Dirac or a Gaussian distribution are discussed. The second half of the notes covers data assimilation. This refers to a particular class of inverse problems in which the unknown parameter is the initial condition of a dynamical system (and, in the stochastic dynamics case, the subsequent states of the system), and the data comprise partial and noisy observations of that (possibly stochastic) dynamical system. We will also demonstrate that methods developed in data assimilation may be employed to study generic inverse problems, by introducing an artificial time to generate a sequence of probability measures interpolating from the prior to the posterior.
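
    As a small illustration of the MCMC methods mentioned above, here is a random-walk Metropolis sketch for a toy scalar Bayesian inverse problem; the forward map, prior, noise level, and observed value are assumptions made for illustration only, not the examples used in the notes.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy Bayesian inverse problem (assumed): y = G(u) + eta with G(u) = u^3,
# eta ~ N(0, 0.1^2), and a standard normal prior on u.
y_obs = 0.8

def log_posterior(u):
    misfit = -0.5 * ((y_obs - u**3) / 0.1)**2   # Gaussian likelihood
    prior = -0.5 * u**2                         # standard normal prior
    return misfit + prior

n_steps, step = 20_000, 0.3
u = 0.0
lp = log_posterior(u)
samples = np.empty(n_steps)
for k in range(n_steps):
    u_prop = u + step * rng.normal()            # symmetric random-walk proposal
    lp_prop = log_posterior(u_prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis accept/reject
        u, lp = u_prop, lp_prop
    samples[k] = u

posterior_mean = samples[n_steps // 2:].mean()  # discard burn-in
```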

    Rigorous free fermion entanglement renormalization from wavelet theory

    We construct entanglement renormalization schemes which provably approximate the ground states of non-interacting fermion nearest-neighbor hopping Hamiltonians on the one-dimensional discrete line and the two-dimensional square lattice. These schemes give hierarchical quantum circuits which build up the states from unentangled degrees of freedom. The circuits are based on pairs of discrete wavelet transforms which are approximately related by a "half-shift": translation by half a unit cell. The presence of the Fermi surface in the two-dimensional model requires a special kind of circuit architecture to properly capture the entanglement in the ground state. We show how the error in the approximation can be controlled without ever performing a variational optimization. Comment: 15 pages, 10 figures, one theorem
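
    For context, the sketch below computes the exact ground-state correlation matrix and a block entanglement entropy for a one-dimensional nearest-neighbor hopping chain with open boundaries at half filling; this is the standard free-fermion reference calculation that such circuits aim to approximate, not the paper's wavelet-based renormalization scheme, and the chain length and sign convention are assumptions.

```python
import numpy as np

# Single-particle hopping Hamiltonian on an open 1D chain (assumed hopping -1).
L = 64
H = np.zeros((L, L))
for i in range(L - 1):
    H[i, i + 1] = -1.0
    H[i + 1, i] = -1.0

eigvals, eigvecs = np.linalg.eigh(H)
occupied = eigvecs[:, eigvals < 0]        # fill all negative-energy modes
C = occupied @ occupied.conj().T          # correlation matrix <c_i^dag c_j>

# Block entanglement entropy from the correlation-matrix spectrum.
block = C[:L // 2, :L // 2]
nu = np.linalg.eigvalsh(block)
nu = nu[(nu > 1e-12) & (nu < 1 - 1e-12)]  # drop numerically trivial modes
S = -np.sum(nu * np.log(nu) + (1 - nu) * np.log(1 - nu))
```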

    On Quantum Statistical Inference, I

    Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics. Furthermore, concurrent advances in experimental techniques and in the theory of quantum computation have led to a strong interest in questions of quantum information, in particular in the sense of the amount of information about unknown parameters in given observational data or accessible through various possible types of measurements. This scenery is outlined (with an audience of statisticians and probabilists in mind). Comment: A shorter version containing some different material will appear (2003), with discussion, in J. Roy. Statist. Soc. B, and is archived as quant-ph/030719