
    A Particle Swarm Optimization inspired tracker applied to visual tracking

    Visual tracking is a dynamic optimization problem in which time and the object state simultaneously influence the problem. In this paper, we show how a tracker can be built from an evolutionary optimization approach, the PSO (Particle Swarm Optimization) algorithm. We demonstrate that an extension of the original algorithm in which the system dynamics is explicitly taken into account can perform efficient tracking. This tracker is also shown to outperform the SIR (Sampling Importance Resampling) algorithm with random walk and constant velocity models, as well as a previous PSO-inspired tracker, SPSO (Sequential Particle Swarm Optimization). Experiments were performed both on simulated data and on real visual RGB-D data. Our PSO-inspired tracker can be a very effective and robust alternative for visual tracking.
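
    The basic pattern the abstract describes, PSO re-run per frame with the swarm re-seeded by a dynamics step, can be sketched as follows. This is an illustrative sketch only, not the paper's exact algorithm; the function `fitness`, the state layout, and the `diffusion` parameter are assumptions for the example.

```python
import numpy as np

def pso_track(frames, fitness, x0, n_particles=50, n_iters=10,
              w=0.6, c1=1.5, c2=1.5, diffusion=2.0):
    """Minimal PSO-style tracking loop (illustrative sketch).

    fitness(state, frame) -> appearance score (higher is better), user-supplied
    x0                    -> initial object state, e.g. np.array([cx, cy, w, h])
    diffusion             -> std-dev of the random perturbation that re-seeds
                             the swarm between frames (the "dynamics" step)
    """
    g_best = np.asarray(x0, dtype=float)
    track = []
    for frame in frames:
        # Dynamics step: scatter the swarm around the previous estimate.
        pos = g_best + diffusion * np.random.randn(n_particles, g_best.size)
        vel = np.zeros_like(pos)
        p_best = pos.copy()
        p_score = np.array([fitness(p, frame) for p in pos])
        g_best = p_best[np.argmax(p_score)].copy()

        # Standard PSO iterations on the current frame.
        for _ in range(n_iters):
            r1, r2 = np.random.rand(2, n_particles, pos.shape[1])
            vel = w * vel + c1 * r1 * (p_best - pos) + c2 * r2 * (g_best - pos)
            pos = pos + vel
            score = np.array([fitness(p, frame) for p in pos])
            better = score > p_score
            p_best[better] = pos[better]
            p_score[better] = score[better]
            g_best = p_best[np.argmax(p_score)].copy()

        track.append(g_best.copy())
    return track
```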

    Optimal Parallel Randomized Algorithms for the Voronoi Diagram of Line Segments in the Plane and Related Problems

    In this paper, we present an optimal parallel randomized algorithm for the Voronoi diagram of a set of n non-intersecting (except possibly at endpoints) line segments in the plane. Our algorithm runs in O(log n) time with very high probability and uses O(n) processors on a CRCW PRAM. This algorithm is optimal in terms of the processor-time (P·T) bound, since the sequential time bound for this problem is Ω(n log n). Our algorithm improves by an O(log n) factor on the previously best known deterministic parallel algorithm, which runs in O(log² n) time using O(n) processors [13]. We obtain this result by using random sampling at two stages of the algorithm and by using efficient randomized search techniques. This technique also gives a direct optimal algorithm for the Voronoi diagram of points (all other optimal parallel algorithms for this problem use a reduction from 3-d convex hull construction).
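
    The optimality claim is a one-line check using only the bounds stated above: the processor-time product already matches the sequential lower bound,

```latex
P \cdot T \;=\; O(n) \cdot O(\log n) \;=\; O(n \log n),
\qquad
T_{\mathrm{seq}}(n) \;=\; \Omega(n \log n),
```

    so no parallel algorithm for this problem can have an asymptotically smaller total work.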

    Parallel Weighted Random Sampling

    Data structures for efficient sampling from a set of weighted items are an important building block of many applications. However, few parallel solutions are known. We close many of these gaps for both shared-memory and distributed-memory machines. We give efficient, fast, and practical algorithms for sampling single items, k items with/without replacement, permutations, subsets, and reservoirs. We also give improved sequential algorithms for alias table construction and for sampling with replacement. Experiments on shared-memory parallel machines with up to 158 threads show near-linear speedups both for construction and queries.
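
    For reference, the baseline data structure the abstract builds on is Walker's alias table: O(n) construction, O(1) weighted sampling with replacement. The sketch below is the classic sequential version, not the paper's improved or parallel construction.

```python
import random

def build_alias_table(weights):
    """Classic sequential Walker alias table (baseline; the paper gives
    improved and parallel constructions)."""
    n = len(weights)
    total = float(sum(weights))
    prob = [w * n / total for w in weights]      # scaled so the average is 1
    alias = [0] * n
    small = [i for i, p in enumerate(prob) if p < 1.0]
    large = [i for i, p in enumerate(prob) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l                              # l covers s's missing mass
        prob[l] -= 1.0 - prob[s]
        (small if prob[l] < 1.0 else large).append(l)
    for i in small + large:                       # leftovers are full buckets
        prob[i] = 1.0
    return prob, alias

def alias_sample(prob, alias):
    """O(1) weighted sampling with replacement from the alias table."""
    i = random.randrange(len(prob))
    return i if random.random() < prob[i] else alias[i]
```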

    On Scalable Particle Markov Chain Monte Carlo

    Particle Markov Chain Monte Carlo (PMCMC) is a general approach to carrying out Bayesian inference in non-linear and non-Gaussian state space models. Our article shows how to scale up PMCMC in terms of the number of observations and parameters by expressing the target density of the PMCMC in terms of the basic uniform or standard normal random numbers used in the sequential Monte Carlo algorithm, instead of the particles. Parameters that can be drawn efficiently conditional on the particles are generated by particle Gibbs. All other parameters are drawn by conditioning on the basic uniform or standard normal random variables, e.g. parameters that are highly correlated with the states, or parameters whose generation is expensive when conditioning on the states. The performance of this hybrid sampler is investigated empirically by applying it to univariate and multivariate stochastic volatility models having both a large number of parameters and a large number of latent states; the results show that it is much more efficient than competing PMCMC methods. We also show that the proposed hybrid sampler is ergodic.
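
    The core trick, conditioning on the basic random numbers that drive the particle filter rather than on the particles themselves, can be illustrated with a generic correlated pseudo-marginal style step. This is a sketch under that interpretation, not the authors' exact hybrid particle Gibbs scheme; `pf_loglik` is a hypothetical function returning the log-likelihood estimate computed deterministically from the parameters theta and the standard normals u.

```python
import numpy as np

def hybrid_step(theta, u, log_prior, pf_loglik, rng, step=0.1, beta=0.05):
    """One sketch-level iteration targeting p(theta) N(u; 0, I) L_hat(theta, u)."""
    ll = pf_loglik(theta, u)

    # (1) Update theta with u held fixed: the likelihood estimate is then a
    #     deterministic function of theta, so a plain random-walk MH step works.
    theta_prop = theta + step * rng.standard_normal(theta.shape)
    ll_prop = pf_loglik(theta_prop, u)
    if np.log(rng.random()) < log_prior(theta_prop) + ll_prop - log_prior(theta) - ll:
        theta, ll = theta_prop, ll_prop

    # (2) Refresh u with theta held fixed, using a Crank-Nicolson proposal that
    #     leaves the N(0, I) distribution of u invariant, so the MH ratio
    #     reduces to the ratio of the likelihood estimates.
    u_prop = np.sqrt(1.0 - beta ** 2) * u + beta * rng.standard_normal(u.shape)
    ll_prop = pf_loglik(theta, u_prop)
    if np.log(rng.random()) < ll_prop - ll:
        u = u_prop

    return theta, u
```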

    Particle Efficient Importance Sampling

    The efficient importance sampling (EIS) method is a general principle for the numerical evaluation of high-dimensional integrals that uses the sequential structure of the target integrand to build variance-minimising importance samplers. Despite a number of successful applications in high dimensions, it is well known that importance sampling strategies are subject to an exponential growth in variance as the dimension of the integral increases. We solve this problem by recognising that the EIS framework has an offline sequential Monte Carlo interpretation. The particle EIS method is based on non-standard resampling weights that take into account the look-ahead construction of the importance sampler. We apply the method to a range of univariate and bivariate stochastic volatility specifications. We also develop a new application of the EIS approach to state space models with Student's t state innovations. Our results show that the particle EIS method strongly outperforms both the standard EIS method and particle filters for likelihood evaluation in high dimensions. Moreover, the ratio between the variances of the particle EIS and particle filter methods remains stable as the time series dimension increases. We illustrate the efficiency of the method for Bayesian inference using the particle marginal Metropolis-Hastings and importance sampling squared algorithms.
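
    The weight degeneracy that motivates the resampling step is easy to reproduce numerically. The toy example below (not EIS itself; the target, proposal, and `scale` mismatch are assumptions for illustration) estimates the effective sample size of plain importance sampling for a standard normal target as the dimension grows, and the ESS collapses roughly exponentially.

```python
import numpy as np

def ess_of_plain_is(dim, n=100_000, scale=1.2, rng=None):
    """Effective sample size of plain importance sampling: N(0, I) target,
    N(0, scale^2 I) proposal (toy illustration of weight degeneracy)."""
    rng = rng or np.random.default_rng(0)
    x = scale * rng.standard_normal((n, dim))
    # log w = log N(x; 0, I) - log N(x; 0, scale^2 I)
    sq = (x ** 2).sum(axis=1)
    logw = -0.5 * sq + 0.5 * sq / scale ** 2 + dim * np.log(scale)
    w = np.exp(logw - logw.max())
    return (w.sum() ** 2) / (w ** 2).sum()

for d in (1, 5, 25, 50, 100):
    print(d, int(ess_of_plain_is(d)))   # ESS shrinks rapidly with dimension
```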