Communication Optimizations for a Wireless Distributed Prognostic Framework
Distributed architectures for prognostics are an essential step toward feasible real-time system health management, and communication overhead is a key design concern for such systems. In this paper we focus on communication issues faced in the distributed implementation of an important class of prognostic algorithms: particle filters. Although computation- and memory-intensive, particle filters lend themselves well to distributed implementation, with one significant exception: resampling. We propose a new resampling scheme, called parameterized resampling, that attempts to reduce communication between collaborating nodes in a distributed wireless sensor network. Analysis and comparison with relevant resampling schemes are also presented, using a battery health management system as the target application. Our proposed scheme performs significantly better than existing schemes by reducing both the length and the total number of communication messages exchanged, without compromising prediction accuracy or precision. Future work will explore the effects of the new scheme on the overall computational performance of the whole system, as well as a full implementation of the new schemes on Sun SPOT devices. Exploring different network architectures for efficient communication is an important future research direction as well.
Adaptive memory-based single distribution resampling for particle filter
The restrictions that single-distribution resampling places on the memory of specific computing devices create difficulties for developers, increasing the effort and time needed to develop a particle filter. A sequential resampling algorithm is therefore needed that is flexible enough to be used across various computing devices. This paper formulates a new single-distribution resampling scheme called adaptive memory-size-based single distribution resampling (AMSSDR). This method integrates traditional variation resampling and traditional resampling in one architecture, switching the resampling algorithm according to the memory of the computing device. This lets the developer formulate a particle filter without closely tracking the device's memory utilisation during the development of different particle filters. At the start of the operational process, the AMSSDR selector chooses an appropriate resampling algorithm (for example, rounding-copy resampling or systematic resampling) based on the computing device's current physical memory. If systematic resampling is chosen, every particle is sampled in every cycle; if rounding-copy resampling is chosen, more than one copy of each particle may be sampled per cycle. This illustrates that the proposed method (AMSSDR) can switch resampling algorithms to suit different physical memory requirements. The authors aim to extend this research in the future by applying the proposed method in emerging applications such as real-time locating systems or medical applications.
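The switching idea can be sketched in a few lines. Below, a hypothetical `amssdr_select` helper (the function name and memory threshold are illustrative assumptions, not the paper's API) picks between two standard schemes: systematic resampling, which draws one stratified pointer per output particle, and rounding-copy resampling, which deterministically copies particle i about round(n·w_i) times.

```python
import random

def systematic_resample(weights, n):
    """Systematic resampling: one uniform draw gives n evenly spaced
    pointers into the cumulative weight distribution."""
    total = sum(weights)
    cum, running = [], 0.0
    for w in weights:
        running += w
        cum.append(running)
    u0 = random.random()
    indices, j = [], 0
    for i in range(n):
        p = (u0 + i) / n * total        # i-th stratified pointer
        while cum[j] < p:
            j += 1
        indices.append(j)
    return indices

def rounding_copy_resample(weights, n):
    """Rounding-copy resampling: particle i is copied round(n * w_i)
    times, avoiding per-particle random draws and extra state."""
    total = sum(weights)
    indices = []
    for i, w in enumerate(weights):
        indices.extend([i] * round(n * w / total))
    return indices

def amssdr_select(available_bytes, threshold=64 * 1024):
    """Hypothetical AMSSDR-style selector: choose a resampling routine
    from the device's currently available physical memory."""
    if available_bytes < threshold:
        return rounding_copy_resample
    return systematic_resample
```

The selector is the only device-dependent piece; both resamplers consume the same weight vector, which is what allows the switch to stay invisible to the rest of the particle filter.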
MapReduce particle filtering with exact resampling and deterministic runtime
Particle filtering is a numerical Bayesian technique that has great potential for solving sequential estimation problems involving non-linear and non-Gaussian models. Since the estimation accuracy achieved by particle filters improves as the number of particles increases, it is natural to consider as many particles as possible. MapReduce is a generic programming model that makes it possible to scale a wide variety of algorithms to Big Data. However, despite the application of particle filters across many domains, little attention has been devoted to implementing particle filters using MapReduce. In this paper, we describe an implementation of a particle filter using MapReduce. We focus on the component that would otherwise be a bottleneck to parallel execution: the resampling component. We devise a new implementation of this component, which requires no approximations, has O(N) spatial complexity and deterministic O((log N)^2) time complexity. Results demonstrate the utility of this new component and culminate in consideration of a particle filter with 2^24 particles being distributed across 512 processor cores.
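The bottleneck the paper targets is visible in a sequential sketch of systematic resampling: everything reduces to a cumulative (prefix) sum of the weights plus evenly spaced lookups, and it is the prefix sum that a MapReduce-style parallel scan can distribute across cores. This is an illustrative single-machine version, not the paper's MapReduce implementation:

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling. The cumulative sum is the prefix-sum step
    that parallel/MapReduce variants distribute; the pointer lookups are
    then independent and embarrassingly parallel."""
    n = len(weights)
    cdf = np.cumsum(weights)                    # prefix sum of weights
    cdf /= cdf[-1]                              # normalise to a CDF
    u = (rng.random() + np.arange(n)) / n       # n stratified pointers
    return np.searchsorted(cdf, u)              # ancestor indices
```

Because a single uniform draw determines all n pointers, the lookup step is deterministic given the scan output, which is what makes a deterministic parallel runtime achievable.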
An Introduction to Twisted Particle Filters and Parameter Estimation in Non-linear State-space Models
Twisted particle filters are a class of sequential Monte Carlo methods
recently introduced by Whiteley and Lee to improve the efficiency of marginal
likelihood estimation in state-space models. The purpose of this article is to
extend the twisted particle filtering methodology, establish accessible
theoretical results which convey its rationale, and provide a demonstration of
its practical performance within particle Markov chain Monte Carlo for
estimating static model parameters. We derive twisted particle filters that
incorporate systematic or multinomial resampling and information from
historical particle states, and a transparent proof which identifies the
optimal algorithm for marginal likelihood estimation. We demonstrate how to
approximate the optimal algorithm for nonlinear state-space models with
Gaussian noise and we apply such approximations to two examples: a range and
bearing tracking problem and an indoor positioning problem with Bluetooth
signal strength measurements. We demonstrate improvements over standard
algorithms in terms of variance of marginal likelihood estimates and Markov
chain autocorrelation for given CPU time, and improved tracking performance
using estimated parameters. (This work has been submitted to the IEEE for possible publication.)
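For context, the quantity being improved here, the particle-filter marginal likelihood estimate, is the product over time of the average unnormalised weights. A minimal bootstrap filter with multinomial resampling for a toy linear-Gaussian model (model and parameters are illustrative, not from the paper) makes this concrete:

```python
import numpy as np

def bootstrap_pf_loglik(y, n_particles, rng, phi=0.9, sigma=1.0, tau=1.0):
    """Log marginal likelihood estimate from a bootstrap particle filter
    for the toy model x_t = phi*x_{t-1} + N(0, sigma^2), y_t = x_t + N(0, tau^2)."""
    x = rng.normal(0.0, sigma, n_particles)     # initial particles
    loglik = 0.0
    for yt in y:
        x = phi * x + rng.normal(0.0, sigma, n_particles)   # propagate
        # unnormalised log weights under the Gaussian observation density
        logw = -0.5 * ((yt - x) / tau) ** 2 - 0.5 * np.log(2 * np.pi * tau ** 2)
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())          # log of the mean weight
        # multinomial resampling
        x = x[rng.choice(n_particles, n_particles, p=w / w.sum())]
    return loglik
```

Twisted particle filters modify the proposal and weights so that this estimator's variance shrinks; the plain version above is the baseline against which such gains are measured.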
Particle Efficient Importance Sampling
The efficient importance sampling (EIS) method is a general principle for the
numerical evaluation of high-dimensional integrals that uses the sequential
structure of target integrands to build variance minimising importance
samplers. Despite a number of successful applications in high dimensions, it is
well known that importance sampling strategies are subject to an exponential
growth in variance as the dimension of the integral increases. We solve this
problem by recognising that the EIS framework has an offline sequential Monte
Carlo interpretation. The particle EIS method is based on non-standard
resampling weights that take into account the look-ahead construction of the
importance sampler. We apply the method for a range of univariate and bivariate
stochastic volatility specifications. We also develop a new application of the
EIS approach to state space models with Student's t state innovations. Our
results show that the particle EIS method strongly outperforms both the
standard EIS method and particle filters for likelihood evaluation in high
dimensions. Moreover, the ratio between the variances of the particle EIS and
particle filter methods remains stable as the time series dimension increases.
We illustrate the efficiency of the method for Bayesian inference using the
particle marginal Metropolis-Hastings and importance sampling squared
algorithms.
AUV SLAM and experiments using a mechanical scanning forward-looking sonar
Navigation technology is one of the most important challenges for autonomous underwater vehicles (AUVs), which navigate in complex undersea environments. The ability to localize a robot and accurately map its surroundings simultaneously, namely the simultaneous localization and mapping (SLAM) problem, is a key prerequisite of truly autonomous robots. In this paper, a modified FastSLAM algorithm is proposed and used for the navigation of our C-Ranger research platform, an open-frame AUV, with a mechanical scanning imaging sonar as the active sensor. The modified FastSLAM performs its update relying on the on-board sensors of C-Ranger. The algorithm employs a data association scheme that combines the single-particle maximum likelihood method with a modified negative evidence method, and uses rank-based resampling to overcome the particle depletion problem. To verify the feasibility of the proposed methods, both simulation experiments and sea trials are conducted with C-Ranger. The experimental results show that the modified FastSLAM employed for the navigation of the C-Ranger AUV is considerably more effective and accurate than the traditional methods.
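Rank-based resampling counters depletion by selecting on weight rank rather than raw weight, so a few dominant particles cannot monopolise the next generation. The linear ranking rule below is a common formulation borrowed from genetic-algorithm selection, used here as an illustrative stand-in for the paper's exact scheme:

```python
import numpy as np

def rank_based_resample(weights, rng, s=2.0):
    """Resample by weight rank rather than raw weight. Particles are
    ranked ascending (rank 0 = smallest weight) and selected with
    linear-ranking probabilities (selection pressure s in [1, 2]),
    which flattens extreme weight ratios and mitigates depletion."""
    n = len(weights)
    order = np.argsort(weights)                 # ascending weight order
    ranks = np.empty(n)
    ranks[order] = np.arange(n)                 # rank of each particle
    # GA-style linear ranking probabilities; they sum to 1 by construction
    rank_w = (2 - s) / n + 2 * ranks * (s - 1) / (n * (n - 1))
    return rng.choice(n, n, p=rank_w / rank_w.sum())
```

Even a particle carrying 90% of the raw weight receives only the top-rank selection probability here, so low-weight particles retain a realistic chance of surviving.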