
    Performance of Sampling/Resampling-based Particle Filters Applied to Non-Linear Problems

    In this work, we propose a wireless body area sensor network (WBASN) to monitor patient position. Localization and tracking are enhanced by mitigating the effect of received signal strength (RSS) variation. First, we propose a modified particle filter (PF) that adjusts the resampling parameters of the Kullback-Leibler distance (KLD)-resampling algorithm to ameliorate the effect of RSS variation by generating a sample set near the high-likelihood region. The key idea of this method is to use a lower bound on the resampling parameter to reduce both the root mean square error (RMSE) and the mean number of particles used. To determine this lower bound, an optimal algorithm is proposed based on the maximum RMSE between the proposed algorithm and the KLD-resampling algorithm, or on the maximum mean number of particles used by these algorithms. Finally, PFs based on KLD-sampling and KLD-resampling are proposed to minimize the number of particles required and to reduce the estimation error compared with traditional algorithms.
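
    As a rough illustration of the KLD-based adaptation this abstract refers to, the sketch below resamples a scalar particle set until a Wu/Fox-style bound on the required number of particles is met. The function names and the hypothetical n_min argument (standing in for the resampling-parameter lower bound proposed in the paper) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of KLD-based adaptive resampling for a 1-D particle filter.
# All names and constants are illustrative; `n_min` is a hypothetical stand-in
# for the lower bound on the resampling parameter described in the abstract.
import numpy as np

def kld_bound(k, epsilon=0.05, z_delta=1.645):
    """Particles needed so the KL divergence between the sampled and true
    (histogram) posterior stays below epsilon with high probability."""
    if k < 2:
        return 1
    a = 2.0 / (9.0 * (k - 1))
    return int(np.ceil((k - 1) / (2.0 * epsilon) * (1.0 - a + np.sqrt(a) * z_delta) ** 3))

def resample_kld(particles, weights, bin_size=0.5, n_min=50, n_max=5000, rng=None):
    """Draw particles (with replacement, proportional to weight) until the KLD
    bound for the number of occupied bins is met, but never fewer than n_min."""
    rng = np.random.default_rng() if rng is None else rng
    new_particles, occupied = [], set()
    n_required = n_min
    while len(new_particles) < max(n_required, n_min) and len(new_particles) < n_max:
        p = particles[rng.choice(len(particles), p=weights)]
        new_particles.append(p)
        occupied.add(int(np.floor(p / bin_size)))   # histogram bin of this sample
        n_required = kld_bound(len(occupied))
    return np.array(new_particles)

# toy usage: particles tracking a scalar RSS-derived position estimate
rng = np.random.default_rng(0)
particles = rng.normal(3.0, 1.0, size=1000)
weights = np.exp(-0.5 * (particles - 2.5) ** 2)
weights /= weights.sum()
print(resample_kld(particles, weights, rng=rng).shape)
```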

    Data assimilation: the Schrödinger perspective

    Data assimilation addresses the general problem of how to combine model-based predictions with partial and noisy observations of the process in an optimal manner. This survey focuses on sequential data assimilation techniques using probabilistic particle-based algorithms. In addition to surveying recent developments for discrete- and continuous-time data assimilation, both in terms of mathematical foundations and algorithmic implementations, we also provide a unifying framework from the perspective of coupling of measures, and Schrödinger’s boundary value problem for stochastic processes in particular.
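
    To make the sequential setting concrete, here is a minimal bootstrap particle filter assimilating noisy observations of a scalar AR(1) process. It is a toy stand-in for the coupling-of-measures and Schrödinger-bridge constructions discussed in the survey; all model parameters are arbitrary choices.

```python
# Minimal sketch of sequential, particle-based data assimilation for a scalar
# AR(1) hidden process observed with Gaussian noise (bootstrap filter).
import numpy as np

rng = np.random.default_rng(1)
T, N = 50, 500                                              # time steps, ensemble size
truth, obs = np.zeros(T), np.zeros(T)
for t in range(1, T):
    truth[t] = 0.9 * truth[t - 1] + rng.normal(0.0, 0.5)    # hidden dynamics
    obs[t] = truth[t] + rng.normal(0.0, 0.3)                # noisy partial observation

particles = rng.normal(0.0, 1.0, N)
estimates = []
for t in range(1, T):
    particles = 0.9 * particles + rng.normal(0.0, 0.5, N)   # forecast step
    w = np.exp(-0.5 * ((obs[t] - particles) / 0.3) ** 2)    # assimilate observation
    w /= w.sum()
    particles = particles[rng.choice(N, N, p=w)]            # resample (analysis)
    estimates.append(particles.mean())

print(np.mean(np.abs(np.array(estimates) - truth[1:])))     # crude tracking error
```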

    Making Faces - State-Space Models Applied to Multi-Modal Signal Processing


    Kullback-Leibler divergence-based differential evolution Markov chain filter for global localization of mobile robots

    One of the most important skills desired for a mobile robot is the ability to obtain its own location even in challenging environments. The information provided by the sensing system is used here to solve the global localization problem. In our previous work, we designed different algorithms founded on evolutionary strategies in order to solve the aforementioned task. The latest developments are presented in this paper. The engine of the localization module is a combination of the Markov chain Monte Carlo sampling technique and the Differential Evolution method, which results in a particle filter based on the minimization of a fitness function. The robot's pose is estimated from a set of possible locations weighted by a cost value. The measurements of the perceptive sensors are used together with the predicted ones in a known map to define a cost function to optimize. Although most localization methods rely on quadratic fitness functions, the sensed information is processed asymmetrically in this filter. The Kullback-Leibler divergence is the basis of a cost function that makes it possible to deal with different types of occlusions. The algorithm performance has been checked in a real map. The results are excellent in environments with dynamic and unmodeled obstacles, a fact that causes occlusions in the sensing area. The research leading to these results has received funding from the RoboCity2030-III-CM project (Robótica aplicada a la mejora de la calidad de vida de los ciudadanos, fase III; S2013/MIT-2748), funded by Programas de Actividades I+D en la Comunidad de Madrid and co-funded by the Structural Funds of the EU.
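
    A hedged sketch of the kind of KL-based scan cost described here might look like the following: measured and predicted range scans are treated as normalized discrete distributions over beam indices, so the asymmetry of the divergence penalizes occluded beams differently. The names and the normalization scheme are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch of a Kullback-Leibler cost between a sensed range scan and the
# scan predicted from a candidate pose in a known map.
import numpy as np

def kl_scan_cost(measured, predicted, eps=1e-9):
    """Treat both normalized scans as discrete distributions over beam indices
    and return D_KL(measured || predicted)."""
    p = np.asarray(measured, float) + eps
    q = np.asarray(predicted, float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# toy usage: a partially occluded scan scores worse against the map prediction
predicted = np.full(8, 4.0)                                      # ranges expected at a pose
measured = np.array([4.0, 4.1, 1.0, 1.1, 4.0, 3.9, 4.0, 4.0])    # two occluded beams
print(kl_scan_cost(measured, predicted))
```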

    Beyond Gaussian Statistical Modeling in Geophysical Data Assimilation

    This review discusses recent advances in geophysical data assimilation beyond Gaussian statistical modeling, in the fields of meteorology, oceanography, and atmospheric chemistry. The non-Gaussian features are stressed rather than the nonlinearity of the dynamical models, although both aspects are entangled. Ideas recently proposed to deal with these non-Gaussian issues, in order to improve the state or parameter estimation, are emphasized. The general Bayesian solution to the estimation problem and the techniques to solve it are first presented, as well as the obstacles that hinder their use in high-dimensional and complex systems. Approximations to the Bayesian solution relying on Gaussian, or on second-order moment closure, have been widely adopted in geophysical data assimilation (e.g., Kalman filters and quadratic variational solutions). Yet, nonlinear and non-Gaussian effects remain. They essentially originate in the nonlinear models and in the non-Gaussian priors. How these effects are handled within algorithms based on Gaussian assumptions is then described. Statistical tools that can diagnose them and measure deviations from Gaussianity are recalled. The following advanced techniques that seek to handle the estimation problem beyond Gaussianity are reviewed: maximum entropy filter, Gaussian anamorphosis, non-Gaussian priors, particle filter with an ensemble Kalman filter as a proposal distribution, maximum entropy on the mean, and strictly Bayesian inferences for large linear models, among others. Several ideas are illustrated with recent or original examples that possess some features of high-dimensional systems. Many of the new approaches are well understood only in special cases and have difficulties that remain to be circumvented. Some of the suggested approaches are quite promising, and sometimes already successful for moderately large though specific geophysical applications. Hints are given as to where progress might come from.
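
    As a small, generic illustration of diagnosing deviations from Gaussianity in an ensemble (one class of tools the review recalls), the sketch below computes sample skewness and excess kurtosis, both of which vanish for a Gaussian. It is a standard moment-based check, not a scheme taken from the review itself.

```python
# Minimal sketch of moment-based non-Gaussianity diagnostics for an ensemble
# of model states: sample skewness and excess kurtosis (both ~0 for a Gaussian).
import numpy as np

def gaussianity_diagnostics(ensemble):
    x = np.asarray(ensemble, float)
    z = (x - x.mean()) / x.std()
    skewness = np.mean(z ** 3)
    excess_kurtosis = np.mean(z ** 4) - 3.0
    return skewness, excess_kurtosis

rng = np.random.default_rng(2)
print(gaussianity_diagnostics(rng.normal(size=10_000)))      # close to (0, 0)
print(gaussianity_diagnostics(rng.lognormal(size=10_000)))   # markedly non-Gaussian
```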

    On optimality of kernels for approximate Bayesian computation using sequential Monte Carlo

    Approximate Bayesian computation (ABC) has gained popularity over the past few years for the analysis of complex models arising in population genetics, epidemiology and systems biology. Sequential Monte Carlo (SMC) approaches have become workhorses in ABC. Here we discuss how to construct the perturbation kernels that are required in ABC SMC approaches in order to build a sequence of distributions that starts out from a suitably defined prior and converges towards the unknown posterior. We derive optimality criteria for different kernels, which are based on the Kullback-Leibler divergence between a distribution and the distribution of the perturbed particles. We show that for many complicated posterior distributions, locally adapted kernels tend to show the best performance. We find that the moderate added cost of adapting kernel functions is easily regained through the higher acceptance rate. We demonstrate the computational efficiency gains in a range of toy examples which illustrate some of the challenges faced in real-world applications of ABC, before turning to two demanding parameter inference problems in molecular biology, which highlight the huge increases in efficiency that can be gained from the choice of optimal kernels. We conclude with a general discussion of the rational choice of perturbation kernels in ABC SMC settings.
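
    The sketch below runs one generation of a toy ABC SMC scheme with a Gaussian perturbation kernel whose variance is adapted from the previous population (a common heuristic, here twice the weighted sample variance). The model, tolerance, and flat prior are placeholder assumptions; they are not the paper's examples or its KL-based optimality criteria.

```python
# Minimal sketch of one ABC-SMC generation with an adapted Gaussian perturbation
# kernel; everything here (model, prior, tolerance) is a toy placeholder.
import numpy as np

rng = np.random.default_rng(3)
data_mean = 2.0                                    # "observed" summary statistic

def simulate(theta):                               # toy model: mean of noisy draws
    return rng.normal(theta, 1.0, size=20).mean()

def abc_smc_step(prev_particles, prev_weights, tol, n=300):
    var = 2.0 * np.cov(prev_particles, aweights=prev_weights)   # adapted kernel scale
    particles, weights = [], []
    while len(particles) < n:
        theta0 = prev_particles[rng.choice(len(prev_particles), p=prev_weights)]
        theta = theta0 + rng.normal(0.0, np.sqrt(var))          # perturb
        if abs(simulate(theta) - data_mean) < tol:              # accept if close enough
            # importance weight: flat prior over (unnormalized) kernel-mixture density
            dens = np.sum(prev_weights * np.exp(-0.5 * (theta - prev_particles) ** 2 / var))
            particles.append(theta)
            weights.append(1.0 / dens)
    w = np.array(weights)
    return np.array(particles), w / w.sum()

# generation 0: sample from a flat prior on [-5, 5], equal weights
p0 = rng.uniform(-5, 5, 300)
w0 = np.full(300, 1.0 / 300)
p1, w1 = abc_smc_step(p0, w0, tol=0.5)
print(np.sum(p1 * w1))                             # weighted posterior mean, near 2.0
```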

    World Modeling for Intelligent Autonomous Systems

    The functioning of intelligent autonomous systems requires constant situation awareness and cognition analysis. Thus, such a system needs a memory structure that contains a description of the surrounding environment (a world model) and serves as a central information hub. This book presents a range of theoretical and experimental results in the field of world modeling, including the areas of dynamic and prior knowledge modeling, information fusion and management, and qualitative/quantitative information analysis.

    A population Monte Carlo approach to estimating parametric bidirectional reflectance distribution functions through Markov random field parameter estimation

    In this thesis, we propose a method for estimating the parameters of a parametric bidirectional reflectance distribution function (BRDF) for an object surface. The method uses a novel Markov random field (MRF) formulation on triplets of corner vertex nodes to model the probability of sets of reflectance parameters for arbitrary reflectance models, given probabilistic surface geometry, camera, illumination, and reflectance image information. In this way, the BRDF parameter estimation problem is cast as an MRF parameter estimation problem. We also present a novel method for estimating the MRF parameters, which uses Population Monte Carlo (PMC) sampling to yield a posterior distribution over the parameters of the BRDF. This PMC-based method for estimating the posterior distribution on MRF parameters is compared, using synthetic data, to other parameter estimation methods based on Markov chain Monte Carlo (MCMC) and Levenberg-Marquardt (LM) nonlinear minimization; it converges to the known correct synthetic-data parameter sets more reliably than the MCMC-based methods and shows convergence results similar to the LM method. The posterior distributions on the parametric BRDFs for real surfaces, which are represented as evolved sample sets calculated using a Population Monte Carlo algorithm, can be used as features in other high-level vision material or surface classification methods. A variety of probabilistic distances between these features, including the Kullback-Leibler divergence, the Bhattacharyya distance and the Patrick-Fisher distance, are used to test the classifiability of the materials, using the PMC-evolved sample sets as features. In our experiments on real data, which comprise 48 material surfaces belonging to 12 classes of material, classification errors are counted by comparing the 1-nearest-neighbour classification results to the known (manually specified) material classes. Other classification error statistics, such as the worst nearest neighbour (WNN), are also calculated. The symmetric Kullback-Leibler divergence, used as a distance measure between the PMC-evolved sample sets, gives the best classification results on the real data when using the 1-nearest-neighbour classification method. It is also found that the sets of samples representing the posterior distributions over the MRF parameter spaces are better features for material surface classification than the optimal MRF parameters returned by multiple-seed Levenberg-Marquardt minimization algorithms configured to find the same MRF parameters. The classifiability of the materials is also better when using the entire evolved sample sets (calculated by PMC) as classification features than when using only the maximum a posteriori sample from each PMC-evolved sample set as the feature for that material. It is therefore possible to calculate usable parametric BRDF features for surface classification using our method.
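
    As a rough sketch of using a symmetric Kullback-Leibler distance between two evolved sample sets as a classification feature distance, the code below compares histogram density estimates built from two one-dimensional sample sets. The binning, smoothing, and restriction to a single parameter dimension are simplifying assumptions, not the thesis configuration.

```python
# Minimal sketch of the symmetric Kullback-Leibler distance between two sample
# sets (e.g., PMC-evolved posterior samples over a BRDF/MRF parameter), using a
# shared histogram as a crude density estimate.
import numpy as np

def symmetric_kl(samples_a, samples_b, bins=30, eps=1e-9):
    lo = min(samples_a.min(), samples_b.min())
    hi = max(samples_a.max(), samples_b.max())
    p, _ = np.histogram(samples_a, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(samples_b, bins=bins, range=(lo, hi), density=True)
    p, q = p + eps, q + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# toy usage: two "materials" whose posterior parameter samples overlap partially
rng = np.random.default_rng(4)
a = rng.normal(0.20, 0.05, 2000)    # sample set for material A
b = rng.normal(0.35, 0.05, 2000)    # sample set for material B
print(symmetric_kl(a, b))           # larger distance -> easier to separate with 1-NN
```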