    Identification of multi-object dynamical systems : consistency and Fisher information

    Learning the model parameters of a multi-object dynamical system from partial and perturbed observations is a challenging task. Despite recent numerical advances in learning these parameters, theoretical guarantees remain extremely scarce. In this article we aim to help fill this gap: we study the identifiability of the model parameters and the consistency of the corresponding maximum likelihood estimate (MLE) under assumptions on the different components of the underlying multi-object system. To understand the impact of the various sources of observation noise on the ability to learn the model parameters, we study the asymptotic variance of the MLE through the associated Fisher information matrix. For example, we show that specific aspects of the multi-target tracking (MTT) problem, such as detection failures and unknown data association, lead to a loss of information which we quantify in special cases of interest. To the best of the authors' knowledge, these are new theoretically backed insights into the subtleties of MTT parameter learning.
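The information-loss effect from detection failures can be illustrated with a minimal toy sketch (not the paper's MTT model): a scalar mean parameter is observed through Gaussian noise, but each observation is detected only with probability `p_detect`; a missed detection contributes zero score, so the Fisher information scales with the detection probability. The model and parameter names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_info_mc(theta, p_detect, n=200_000):
    """Monte Carlo estimate of the Fisher information for the mean of a
    N(theta, 1) observation that is detected with probability p_detect.
    A missed detection contributes no term to the log-likelihood."""
    detected = rng.random(n) < p_detect
    y = theta + rng.standard_normal(n)
    score = np.where(detected, y - theta, 0.0)  # d/dtheta of log-likelihood
    return np.mean(score**2)                    # E[score^2] = Fisher information

for pd in (1.0, 0.7, 0.3):
    # Estimate is close to pd: information degrades linearly with detection prob
    print(pd, fisher_info_mc(0.5, pd))
```

With full detection the estimate is near 1 (the information of a unit-variance Gaussian); lowering `p_detect` shrinks it proportionally, a simple instance of the information loss the abstract quantifies.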

    On large lag smoothing for hidden Markov Models

    In this article we consider the smoothing problem for hidden Markov models. Given a hidden Markov chain $\{X_n\}_{n\geq 0}$ and observations $\{Y_n\}_{n\geq 0}$, our objective is to compute $\mathbb{E}[\varphi(X_0,\dots,X_k)\mid y_0,\dots,y_n]$ for some real-valued, integrable functional $\varphi$, for fixed $k$ with $k \ll n$, and for some realization $(y_0,\dots,y_n)$ of $(Y_0,\dots,Y_n)$. We introduce a novel application of the multilevel Monte Carlo method with a coupling based on the Knothe--Rosenblatt rearrangement. We prove that this method can approximate the aforementioned quantity with a mean square error (MSE) of $\mathcal{O}(\epsilon^2)$ for arbitrary $\epsilon>0$ at a cost of $\mathcal{O}(\epsilon^{-2})$. By contrast, a direct Monte Carlo method requires a cost of $\mathcal{O}(n\epsilon^{-2})$ for the same MSE. The approach we suggest is, in general, not possible to implement exactly, so the optimal transport methodology of [A. Spantini, D. Bigoni, and Y. Marzouk, J. Mach. Learn. Res., 19 (2018), pp. 2639--2709; M. Parno, T. Moselhy, and Y. Marzouk, SIAM/ASA J. Uncertain. Quantif., 4 (2016), pp. 1160--1190] is used, which directly approximates our strategy. We show that our theoretical improvements are achieved, even under approximation, in several numerical examples.
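The key ingredient of multilevel Monte Carlo is a coupling that makes samples at adjacent levels strongly correlated, so their difference has small variance. The toy sketch below (a bivariate Gaussian, not the paper's smoothing problem) drives a lower-triangular Knothe-Rosenblatt-type map at two nearby parameter values with common noise; the `rho` values and the functional `phi` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def kr_sample(rho, z):
    """Lower-triangular (Knothe-Rosenblatt-type) map for a bivariate
    standard Gaussian with correlation rho, driven by common noise z."""
    x1 = z[:, 0]
    x2 = rho * z[:, 0] + np.sqrt(1.0 - rho**2) * z[:, 1]
    return np.column_stack([x1, x2])

def phi(x):
    """Toy functional of interest: the second coordinate."""
    return x[:, 1]

z = rng.standard_normal((100_000, 2))
fine = kr_sample(0.80, z)    # "fine" level
coarse = kr_sample(0.78, z)  # "coarse" level, same driving noise

# The coupled difference has far smaller variance than either level alone,
# which is what makes the multilevel telescoping sum cheap.
print(np.var(phi(fine) - phi(coarse)), np.var(phi(fine)))
```

Because both levels share the driving noise `z`, the variance of the difference is orders of magnitude below the marginal variance; a multilevel estimator exploits exactly this to spend most of its samples at the cheap coarse level.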

    Inference for a Class of Partially Observed Point Process Models

    This paper presents a simulation-based framework for sequential inference from partially and discretely observed point process (PP) models with static parameters. Taking a Bayesian perspective on the static parameters, we build upon sequential Monte Carlo (SMC) methods, investigating the problems of performing sequential filtering and smoothing in complex examples where current methods often fail. We consider various approaches for approximating posterior distributions using SMC. Our approaches, together with some theoretical discussion, are illustrated on a doubly stochastic point process applied in the context of finance.
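As a hedged illustration of SMC filtering for a doubly stochastic point process, the sketch below runs a bootstrap particle filter on a toy Cox model in which the log-intensity follows an AR(1) and counts per interval are Poisson. The parameters `a` and `sigma` and the model itself are assumptions for illustration, not the paper's finance application.

```python
import numpy as np

rng = np.random.default_rng(2)

def bootstrap_filter(counts, n_part=2000, a=0.95, sigma=0.3):
    """Bootstrap particle filter for a toy Cox model:
    log-intensity x_t = a * x_{t-1} + sigma * noise, counts ~ Poisson(exp(x_t)).
    Returns the filtered posterior mean of the intensity at each step."""
    x = rng.standard_normal(n_part)  # initial log-intensity particles
    means = []
    for y in counts:
        x = a * x + sigma * rng.standard_normal(n_part)  # propagate from prior
        lam = np.exp(x)
        logw = y * np.log(lam) - lam            # Poisson log-lik (y! term cancels)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * lam))           # filtered intensity estimate
        x = x[rng.choice(n_part, n_part, p=w)]  # multinomial resampling
    return np.array(means)

counts = rng.poisson(3.0, size=10)  # synthetic count data for the demo
print(bootstrap_filter(counts).round(2))
```

This is the plain bootstrap variant; the abstract's point is precisely that such baseline SMC schemes can fail in complex PP examples, motivating the alternative approaches the paper develops.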

    Parameter Estimation for Hidden Markov models with Intractable Likelihoods

    In this talk I consider sequential Monte Carlo (SMC) methods for hidden Markov models. In the scenario where the conditional density of the observations given the latent state is intractable, we give a simple approximate Bayesian computation (ABC) approximation of the model, along with some basic SMC algorithms for sampling from the associated filtering distribution. Then we consider the problem of smoothing, given access to a batch data set. We present a simulation technique which combines forward-only smoothing (Del Moral et al., 2011) and particle Markov chain Monte Carlo (Andrieu et al., 2010), yielding an algorithm which scales linearly in the number of particles.
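A minimal sketch of the ABC idea in this setting: when the observation density is intractable but the observation model can be simulated, one ABC filtering step weights each particle by whether a simulated pseudo-observation falls within a tolerance `eps` of the actual observation. The toy state-space model and the tolerance below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def abc_filter_step(x, y_obs, eps, n_part):
    """One ABC-SMC filtering step. The observation density is treated as
    unavailable: we only simulate pseudo-observations from the observation
    model and weight by a uniform ABC kernel of width eps."""
    x = 0.9 * x + 0.5 * rng.standard_normal(n_part)  # propagate the state
    y_sim = x + rng.standard_cauchy(n_part)          # simulate pseudo-observations
    w = (np.abs(y_sim - y_obs) < eps).astype(float)  # ABC kernel weights
    if w.sum() == 0:
        w[:] = 1.0                                   # guard: no particle accepted
    w /= w.sum()
    return x[rng.choice(n_part, n_part, p=w)]        # resample

x = rng.standard_normal(5000)
for y in (0.2, -0.1, 0.4):
    x = abc_filter_step(x, y, eps=0.5, n_part=5000)
print(x.mean())
```

Shrinking `eps` reduces the ABC bias but also the number of accepted particles, the usual trade-off in ABC approximations of the filtering distribution.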