
    The chopthin algorithm for resampling

    Resampling is a standard step in particle filters and, more generally, in sequential Monte Carlo methods. We present an algorithm, called chopthin, for resampling weighted particles. In contrast to standard resampling methods, the algorithm does not produce a set of equally weighted particles; instead it merely enforces an upper bound on the ratio between the weights. Simulation studies show that the chopthin algorithm consistently outperforms standard resampling methods. The algorithm chops up particles with large weight and thins out particles with low weight, hence its name. It implicitly guarantees a lower bound on the effective sample size. The algorithm can be implemented efficiently, making it practically useful. We show that the expected computational effort is linear in the number of particles. Implementations for C++, R (on CRAN), Python and Matlab are available. Comment: 14 pages, 4 figures.
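    To make the chop/thin idea concrete, here is a minimal NumPy sketch of a resampling step that splits heavy particles and probabilistically drops light ones so that surviving weights stay within a bounded ratio of the mean weight. It is only an illustration under our own simplifying assumptions (the bound `ratio_bound`, the splitting rule and the thinning rule are ours), not the paper's chopthin algorithm.

```python
import numpy as np

def chop_thin_sketch(particles, weights, ratio_bound=4.0, rng=None):
    """Illustrative chop/thin step: split ("chop") heavy particles and randomly
    drop ("thin") light ones so that, before renormalisation, every surviving
    weight lies within [mean/ratio_bound, ratio_bound*mean]. Not the paper's algorithm."""
    rng = np.random.default_rng() if rng is None else rng
    particles = np.asarray(particles, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    mean_w = w.mean()                       # equals 1/N for normalised weights

    out_x, out_w = [], []
    for x, wi in zip(particles, w):
        if wi > ratio_bound * mean_w:       # chop: split into k equal-weight copies
            k = int(np.ceil(wi / (ratio_bound * mean_w)))
            out_x.extend([x] * k)
            out_w.extend([wi / k] * k)
        elif wi < mean_w / ratio_bound:     # thin: keep with probability proportional to weight
            keep_prob = wi / (mean_w / ratio_bound)
            if rng.random() < keep_prob:
                out_x.append(x)
                out_w.append(mean_w / ratio_bound)   # keep_prob * new weight == wi (unbiased)
        else:                               # moderate weight: keep unchanged
            out_x.append(x)
            out_w.append(wi)
    out_w = np.asarray(out_w)
    return np.asarray(out_x), out_w / out_w.sum()
```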

    Comparison of Resampling Schemes for Particle Filtering

    This contribution is devoted to the comparison of various resampling approaches that have been proposed in the literature on particle filtering. It is first shown, using simple arguments, that the so-called residual and stratified methods do yield an improvement over the basic multinomial resampling approach. A simple counter-example showing that this property does not hold true for systematic resampling is given. Finally, some results on the large-sample behavior of the simple bootstrap filter algorithm are given. In particular, a central limit theorem is established for the case where resampling is performed using the residual approach.
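    For reference, the four schemes compared here have standard textbook formulations. The NumPy sketch below (weights assumed already normalised to sum to one; names are ours) shows how they differ in the amount of randomness they inject. Each function returns ancestor indices, to be used as `particles = particles[idx]`.

```python
import numpy as np

def multinomial_resample(w, rng):
    # N i.i.d. draws from the categorical distribution given by the weights
    N = len(w)
    return rng.choice(N, size=N, p=w)

def stratified_resample(w, rng):
    # one uniform draw inside each stratum [i/N, (i+1)/N)
    N = len(w)
    cdf = np.cumsum(w); cdf[-1] = 1.0          # guard against round-off
    u = (np.arange(N) + rng.random(N)) / N
    return np.searchsorted(cdf, u)

def systematic_resample(w, rng):
    # a single uniform draw, shifted through all N strata
    N = len(w)
    cdf = np.cumsum(w); cdf[-1] = 1.0
    u = (np.arange(N) + rng.random()) / N
    return np.searchsorted(cdf, u)

def residual_resample(w, rng):
    # floor(N*w_i) copies taken deterministically, remainder drawn multinomially
    N = len(w)
    counts = np.floor(N * np.asarray(w)).astype(int)
    n_left = N - counts.sum()
    if n_left > 0:
        residual = N * np.asarray(w) - counts
        extra = rng.choice(N, size=n_left, p=residual / residual.sum())
        counts += np.bincount(extra, minlength=N)
    return np.repeat(np.arange(N), counts)
```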

    Boosting Image Forgery Detection using Resampling Features and Copy-move analysis

    Realistic image forgeries involve a combination of splicing, resampling, cloning, region removal and other methods. While resampling detection algorithms are effective in detecting splicing and resampling, copy-move detection algorithms excel in detecting cloning and region removal. In this paper, we combine these complementary approaches in a way that boosts the overall accuracy of image manipulation detection. We use the copy-move detection method as a pre-filtering step and pass those images that are classified as untampered to a deep learning based resampling detection framework. Experimental results on various datasets, including the 2017 NIST Nimble Challenge Evaluation dataset comprising nearly 10,000 pristine and tampered images, show a consistent increase of 8%-10% in detection rates when the copy-move algorithm is combined with different resampling detection algorithms.
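    The combination described is essentially a two-stage decision rule. A schematic sketch of that pre-filtering logic is shown below; the detector functions `copy_move_detector` and `resampling_detector` are hypothetical placeholders (assumed to return True on suspected tampering), not the paper's implementations.

```python
def detect_forgery(image, copy_move_detector, resampling_detector):
    """Two-stage pre-filtering sketch: run the copy-move detector first and
    pass only the images it labels untampered to the resampling detector."""
    if copy_move_detector(image):        # cloning / region-removal evidence
        return "tampered (copy-move)"
    if resampling_detector(image):       # splicing / resampling evidence
        return "tampered (resampling)"
    return "untampered"
```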

    Semi-independent resampling for particle filtering

    Among Sequential Monte Carlo (SMC) methods, Sampling Importance Resampling (SIR) algorithms are based on Importance Sampling (IS) and on a resampling-based rejuvenation algorithm which aims at fighting weight degeneracy. However, whichever resampling technique is used, this mechanism tends to be insufficient when applied to informative or high-dimensional models. In this paper we revisit the rejuvenation mechanism and propose a class of parameterized SIR-based solutions which make it possible to adjust the tradeoff between computational cost and statistical performance.
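    For context, the generic SIR mechanism revisited here interleaves importance-weight updates with a resampling ("rejuvenation") step, typically triggered when the effective sample size drops. The NumPy sketch below shows that standard mechanism only, not the paper's parameterized semi-independent scheme; `propagate` and `loglik` are assumed user-supplied model functions.

```python
import numpy as np

def sir_step(particles, weights, propagate, loglik, y, ess_threshold=0.5, rng=None):
    """One generic SIR update: propagate, reweight, and resample ("rejuvenate")
    when the effective sample size falls below a fraction of N."""
    rng = np.random.default_rng() if rng is None else rng
    N = len(weights)

    particles = propagate(particles, rng)            # draw from the proposal / transition
    logw = np.log(weights) + loglik(y, particles)    # importance-weight update
    w = np.exp(logw - logw.max())
    w /= w.sum()

    ess = 1.0 / np.sum(w ** 2)                       # effective sample size
    if ess < ess_threshold * N:                      # rejuvenation against weight degeneracy
        idx = rng.choice(N, size=N, p=w)             # multinomial resampling
        particles, w = particles[idx], np.full(N, 1.0 / N)
    return particles, w
```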

    Negative association, ordering and convergence of resampling methods

    We study convergence and convergence rates for resampling schemes. Our first main result is a general consistency theorem based on the notion of negative association, which is applied to establish the almost-sure weak convergence of measures output from Kitagawa's (1996) stratified resampling method. Carpenter et al.'s (1999) systematic resampling method is similar in structure but can fail to converge depending on the order of the input samples. We introduce a new resampling algorithm based on a stochastic rounding technique of Srinivasan (2001), which shares some attractive properties of systematic resampling, but which exhibits negative association and therefore converges irrespective of the order of the input samples. We confirm a conjecture made by Kitagawa (1996) that ordering input samples by their states in $\mathbb{R}$ yields a faster rate of convergence; we establish that when particles are ordered using the Hilbert curve in $\mathbb{R}^d$, the variance of the resampling error is ${\scriptscriptstyle\mathcal{O}}(N^{-(1+1/d)})$ under mild conditions, where $N$ is the number of particles. We use these results to establish asymptotic properties of particle algorithms based on resampling schemes that differ from multinomial resampling. Comment: 54 pages, including 30 pages of supplementary materials (a typo in Algorithm 1 has been corrected).
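    The stochastic rounding idea can be illustrated compactly: turn the expected offspring counts $N w_i$ into integers that have the correct expectation and sum to $N$ by repeatedly resolving pairs of fractional parts. The sketch below shows a basic pairwise rounding of this type under our own simplifications (the pairing order, in particular, is arbitrary); it is not the paper's resampling algorithm. Offspring can then be formed with `np.repeat(np.arange(N), counts)`.

```python
import numpy as np

def stochastic_round_counts(weights, rng=None):
    """Pairwise stochastic rounding sketch (in the spirit of Srinivasan, 2001):
    produce integer offspring counts with expectation N*w_i and total N."""
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(weights, dtype=float)
    N = len(w)
    expected = N * w / w.sum()
    counts = np.floor(expected).astype(int)
    frac = expected - counts                    # fractional parts sum to an integer

    alive = [i for i in range(N) if frac[i] > 1e-12]
    while len(alive) >= 2:
        i, j = alive[0], alive[1]
        a, b = frac[i], frac[j]
        if a + b <= 1.0:
            # one index absorbs both fractional parts, the other is rounded down to 0
            if rng.random() < a / (a + b):
                frac[i], frac[j] = a + b, 0.0
            else:
                frac[i], frac[j] = 0.0, a + b
        else:
            # one index is rounded up to 1, the other keeps the excess a+b-1
            if rng.random() < (1.0 - b) / (2.0 - a - b):
                frac[i], frac[j] = 1.0, a + b - 1.0
            else:
                frac[i], frac[j] = a + b - 1.0, 1.0
        alive = [k for k in alive if 1e-12 < frac[k] < 1.0 - 1e-12]

    counts += np.rint(frac).astype(int)         # remaining parts are (numerically) 0 or 1
    return counts                               # counts.sum() == N
```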
