
    Single and multiple target tracking via hybrid mean shift/particle filter algorithms

    This thesis is concerned with single and multiple target visual tracking algorithms and their application in the real world. While they are both powerful and general, one of the main challenges of tracking using particle filter-based algorithms is to manage the particle spread. Too wide a spread leads to dispersal of particles onto clutter, but limited spread may lead to difficulty when fast-moving objects and/or high-speed camera motion throw trackers away from their target(s). This thesis addresses the particle spread management problem. Three novel tracking algorithms are presented, each of which combines particle filtering and Kernel Mean Shift methods to produce more robust and accurate tracking. The first single target tracking algorithm, the Structured Octal Kernel Filter (SOK), combines Mean Shift (Comaniciu et al., 2003) and Condensation (Isard and Blake, 1998a). The spread of the particle set is handled by structurally placing the particles around the object, using eight particles arranged to cover the maximum area. Mean Shift is then applied to each particle to seek the global maximum. In effect, SOK uses intelligent switching between Mean Shift and particle filtering based on a confidence level. Though effective, it requires a threshold to be set and performs a somewhat inflexible search. The second single target tracking algorithm, the Kernel Annealed Mean Shift tracker (KAMS), uses an annealed particle filter (Deutscher et al., 2000), but introduces a Mean Shift step to control particle spread. As a result, higher accuracy and robustness are achieved using fewer particles and annealing levels. Finally, KAMS is extended to create a multi-object tracking algorithm (MKAMS) by introducing an interaction filter to handle object collisions and occlusions. All three algorithms are compared experimentally with existing single/multiple object tracking algorithms. The evaluation procedure compares competing algorithms' robustness, accuracy and computational cost using both numerical measures and a novel application of McNemar's statistic. Results are presented on a wide variety of artificial and real image sequences.
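The SOK strategy of arranging a small number of particles around the predicted location and letting Mean Shift pull each one towards a nearby mode can be illustrated with a toy sketch. Everything below is an assumption for illustration: the weight image, the eight-offset layout and the parameters are not taken from the thesis, and the mean-shift step is a plain centroid iteration over a 2-D weight map rather than the kernel histogram formulation of Comaniciu et al.

```python
import numpy as np

def mean_shift(weights, start, bandwidth=8, n_iters=30, tol=0.5):
    """Plain mean shift over a 2-D weight image: repeatedly move the window
    centre to the weighted centroid of the pixels inside it."""
    height, width = weights.shape
    y, x = float(start[0]), float(start[1])
    for _ in range(n_iters):
        y0, y1 = int(max(y - bandwidth, 0)), int(min(y + bandwidth + 1, height))
        x0, x1 = int(max(x - bandwidth, 0)), int(min(x + bandwidth + 1, width))
        window = weights[y0:y1, x0:x1]
        total = window.sum()
        if total <= 0:
            break
        ys, xs = np.mgrid[y0:y1, x0:x1]
        new_y = float((ys * window).sum() / total)
        new_x = float((xs * window).sum() / total)
        shift = np.hypot(new_y - y, new_x - x)
        y, x = new_y, new_x
        if shift < tol:
            break
    return y, x

# Hypothetical likelihood image with a single mode the tracker should lock onto.
yy, xx = np.mgrid[0:120, 0:160]
weight_img = np.exp(-((yy - 70) ** 2 + (xx - 90) ** 2) / (2 * 12.0 ** 2))

# Eight particles placed around a rough prediction, each refined by mean shift;
# the particle ending on the highest weight wins.
prediction = np.array([60.0, 80.0])
offsets = [(-15, 0), (15, 0), (0, -15), (0, 15), (-11, -11), (-11, 11), (11, -11), (11, 11)]
refined = [mean_shift(weight_img, prediction + np.array(o)) for o in offsets]
best = max(refined, key=lambda p: weight_img[int(round(p[0])), int(round(p[1]))])
print(best)  # converges near (70, 90)
```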

    Fight sample degeneracy and impoverishment in particle filters: A review of intelligent approaches

    During the last two decades there has been a growing interest in Particle Filtering (PF). However, PF suffers from two long-standing problems referred to as sample degeneracy and sample impoverishment. This review examines methods that are particularly effective at Particle Distribution Optimization (PDO) for fighting sample degeneracy and impoverishment, with an emphasis on intelligent approaches. These draw on Markov Chain Monte Carlo methods, Mean Shift algorithms, artificial intelligence algorithms (e.g., Particle Swarm Optimization, Genetic Algorithms and Ant Colony Optimization), machine learning approaches (e.g., clustering, splitting and merging) and their hybrids, forming a coherent standpoint from which to enhance the particle filter. The working mechanisms, interrelationships, pros and cons of these approaches are presented. In addition, approaches that are effective for dealing with high dimensionality are reviewed. While they improve filter performance in terms of accuracy, robustness and convergence, the advanced techniques employed in PF often incur additional computational requirements that can, in turn, sacrifice the improvement obtained in real-life filtering. This fact, hidden in pure simulations, deserves the attention of the users and designers of new filters.
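To make the degeneracy/impoverishment trade-off concrete, the sketch below is a minimal toy example (not any specific method from the review): it monitors the effective sample size of the particle weights and triggers systematic resampling when it drops below a threshold; the resampling step duplicates heavy particles, which is precisely where impoverishment comes from.

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2) for normalised weights, the usual degeneracy measure."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w ** 2)

def systematic_resample(particles, weights, rng):
    """Systematic resampling: low-variance and O(N), but it duplicates heavy
    particles, which is exactly the source of sample impoverishment."""
    n = len(particles)
    w = np.asarray(weights, dtype=float)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(w / w.sum())
    indices = np.minimum(np.searchsorted(cumulative, positions), n - 1)
    return particles[indices], np.full(n, 1.0 / n)

# Toy particle set: 100 two-dimensional states with skewed weights.
rng = np.random.default_rng(0)
particles = rng.normal(size=(100, 2))
weights = rng.exponential(size=100)
if effective_sample_size(weights) < 0.5 * len(weights):
    particles, weights = systematic_resample(particles, weights, rng)
print(effective_sample_size(weights))
```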

    Tracking moving objects in surveillance video

    The thesis looks at approaches to the detection and tracking of potential objects of interest in surveillance video. The aim was to investigate and develop methods that might be suitable for eventual application through embedded software, running on a fixed-point processor, in analytics-capable cameras. The work considers common approaches to object detection and representation, seeking out those that offer the necessary computational economy and the potential to cope with constraints such as low frame rate due to possibly limited processor time, or the weak chromatic content that can occur in some typical surveillance contexts. The aim is for probabilistic tracking of objects rather than simple concatenation of frame-by-frame detections. This involves recursive Bayesian estimation. The particle filter is a technique for implementing such a recursion, and so it is examined in the context of both single-target and combined multi-target tracking. A detailed examination of the operation of the single-target tracking particle filter shows that objects can be tracked successfully using a relatively simple structured grey-scale histogram representation. It is shown that basic components of the particle filter can be simplified without loss of tracking quality. An analysis brings out the relationships between commonly used target representation distance measures and shows that, in the context of the particle filter, there is little to choose between them. With the correct choice of parameters, the simplest and most computationally economical distance measure performs well, and the work shows how to make that correct choice. Similarly, it is shown that a simple measurement likelihood function can be used in place of the more ubiquitous Gaussian. The important step of target state estimation is examined. The standard weighted mean approach is rejected, a recently proposed maximum a posteriori approach is shown to be unsuitable in the context of the work, and a practical alternative is developed. Two methods are presented for tracker initialization. One of them is a simplification of an existing published method, the other is a novel approach. The aim is to detect trackable objects as they enter the scene, extract trackable features, then actively follow those features through subsequent frames. The multi-target tracking problem is then posed as one of management of multiple independent trackers.
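As an illustration of the components the thesis analyses (the histogram representation, the distance measure and the measurement likelihood), the sketch below weights one particle using a grey-scale histogram and the Bhattacharyya distance inside a Gaussian likelihood. This is the commonly used baseline combination, stated with assumed parameters; it is not the simplified measure or likelihood that the thesis itself recommends.

```python
import numpy as np

def grey_histogram(patch, bins=16):
    """Normalised grey-level histogram of an image patch (pixel values 0-255)."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    return hist / max(hist.sum(), 1)

def bhattacharyya_distance(p, q):
    """d = sqrt(1 - sum(sqrt(p * q))); zero for identical normalised histograms."""
    bc = np.sum(np.sqrt(p * q))
    return np.sqrt(max(1.0 - bc, 0.0))

def particle_weight(candidate_patch, reference_hist, sigma=0.1):
    """Gaussian measurement likelihood of a candidate region given the target model."""
    d = bhattacharyya_distance(grey_histogram(candidate_patch), reference_hist)
    return np.exp(-(d ** 2) / (2 * sigma ** 2))

# Hypothetical reference model and a candidate region cut out at one particle's state.
rng = np.random.default_rng(1)
reference = grey_histogram(rng.integers(0, 256, size=(32, 32)))
candidate = rng.integers(0, 256, size=(32, 32))
print(particle_weight(candidate, reference))
```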

    Fast algorithm for real-time rings reconstruction

    The GAP project is dedicated to studying the application of GPUs in several contexts in which real-time response is important for decision making. The definition of real time depends on the application under study, ranging from response times of microseconds up to several hours in the case of very computing-intensive tasks. At this conference we presented our work on low-level triggers [1] [2] and high-level triggers [3] in high-energy physics experiments, and on specific applications in nuclear magnetic resonance (NMR) [4] [5] and cone-beam CT [6]. Apart from the study of dedicated solutions to decrease the latency due to data transport and preparation, the computing algorithms themselves play an essential role in any GPU application. In this contribution, we present an original algorithm, developed for trigger applications, that accelerates ring reconstruction in RICH detectors when seeds for reconstruction from external trackers are not available.
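The contribution's GPU trigger algorithm is not reproduced here, but the underlying task can be illustrated with a standard algebraic (Kasa-style) least-squares circle fit to hit coordinates. This is a generic sketch under the assumption of a single ring and no external seed, not the GAP algorithm itself.

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic least-squares circle fit (Kasa method).

    Solves a*x + b*y + c = -(x^2 + y^2) in the least-squares sense, with
    a = -2*xc, b = -2*yc, c = xc^2 + yc^2 - r^2, then recovers centre and radius.
    """
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x ** 2 + y ** 2)
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    a, b, c = sol
    xc, yc = -a / 2.0, -b / 2.0
    r = np.sqrt(xc ** 2 + yc ** 2 - c)
    return xc, yc, r

# Hypothetical noisy hits on a ring of radius 5 centred at (2, -1).
rng = np.random.default_rng(2)
theta = rng.uniform(0.0, 2.0 * np.pi, 50)
x = 2.0 + 5.0 * np.cos(theta) + rng.normal(scale=0.1, size=50)
y = -1.0 + 5.0 * np.sin(theta) + rng.normal(scale=0.1, size=50)
print(fit_circle(x, y))  # approximately (2, -1, 5)
```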

    Adaptive tracking via multiple appearance models and multiple linear searches

    We introduce a unified tracker (FMCMC-MM) which adapts to changes in target appearance by combining two popular generative models, templates and histograms, maintaining multiple instances of each in an appearance pool, and which enhances prediction by utilising multiple linear searches. These search directions are sparse estimates of motion direction derived from local features stored in a feature pool. Given only an initial template representation of the target, the proposed tracker can learn appearance changes in a supervised manner and generate appropriate target motions without knowing the target movement in advance. During tracking, it automatically switches between models in response to variations in target appearance, exploiting the strengths of each model component. New models are added automatically as necessary. The effectiveness of the approach is demonstrated using a variety of challenging video sequences. Results show that this framework outperforms existing appearance-based tracking frameworks.
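The pool-of-models idea can be sketched as follows. The pool structure, the Bhattacharyya-coefficient score and the acceptance threshold are illustrative assumptions for this sketch, not details of FMCMC-MM.

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Similarity of two normalised histograms: 1 means identical."""
    return float(np.sum(np.sqrt(p * q)))

class AppearancePool:
    """Keeps several appearance models, selects the best-matching one each
    frame, and learns a new model when none matches well enough."""

    def __init__(self, initial_model, new_model_threshold=0.7):
        self.models = [np.asarray(initial_model, dtype=float)]
        self.threshold = new_model_threshold

    def match(self, observation):
        scores = [bhattacharyya_coefficient(m, observation) for m in self.models]
        best = int(np.argmax(scores))
        if scores[best] < self.threshold:
            # Appearance has drifted too far: add the observation as a new model.
            self.models.append(np.asarray(observation, dtype=float))
            best = len(self.models) - 1
        return best

# Toy usage with 16-bin histograms standing in for appearance models.
rng = np.random.default_rng(3)
pool = AppearancePool(rng.dirichlet(np.ones(16)))
observation = rng.dirichlet(np.ones(16))
idx = pool.match(observation)
print(f"selected model {idx} of {len(pool.models)}")
```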

    Multiple-Target Tracking in Complex Scenarios

    In this dissertation, we develop computationally efficient algorithms for multiple-target tracking (MTT) in complex scenarios. For each of these scenarios, we develop measurement and state-space models, and then exploit the structure in these models to propose efficient tracking algorithms. In addition, we address design issues such as sensor selection and resource allocation. First, we consider MTT when the targets themselves are moving in a time-varying multipath environment. We develop a sparse-measurement model that allows us to exploit the inherent joint delay-Doppler diversity offered by the environment. We then reformulate the problem of MTT as a block-support recovery problem using the sparse measurement model. We exploit the structure of the dictionary matrix to develop a computationally efficient block-support recovery algorithm (and thereby a multiple-target tracking algorithm) under the assumption that the channel state describing the time-varying multipath environment is known. Further, we also derive an upper bound on the overall error probability of wrongly identifying the support of the sparse signal. We then relax the assumption that the channel state is known. We develop a new particle filter called the Multiple Rao-Blackwellized Particle Filter (MRBPF) to jointly estimate both the target and the channel states. We also compute the posterior Cramér-Rao bound (PCRB) on the estimates of the target and the channel states and use the PCRB to find a suitable subset of antennas to be used for transmission in each tracking interval, as well as the power transmitted by these antennas. Second, we consider the problem of tracking an unknown number and types of targets using a multi-modal sensor network. In a multi-modal sensor network, different quantities associated with the same state are measured using sensors of different kinds. Hence, an efficient method that can suitably combine the diverse information measured by each sensor is required. We first develop a Hierarchical Particle Filter (HPF) to estimate the unknown state from the multi-modal measurements for a special class of problems which can be modeled hierarchically. We then model our tracking problem hierarchically and use the proposed HPF for joint initiation, termination and tracking of multiple targets. The multi-modal data consist of measurements collected from a radar, an infrared camera and a human scout. We also propose a unified framework for multi-modal sensor management that comprises sensor selection (SS), resource allocation (RA) and data fusion (DF). Our approach is inspired by the trading behavior of economic agents in commercial markets. We model the sensors and the sensor manager as economic agents, and the interaction among them as a double-sided market with both consumers and producers. We propose an iterative double auction mechanism for computing the equilibrium of such a market. We relate the equilibrium point to the solutions of SS, RA and DF. Third, we address the MTT problem in the presence of data association ambiguity that arises due to clutter. Data association corresponds to the problem of assigning a measurement to each target. We treat the data association and state estimation as separate subproblems. We develop a game-theoretic framework to solve the data association, in which we model each tracker as a player and the set of measurements as strategies. We develop utility functions for each player, and then use a regret-based learning algorithm to find the correlated equilibrium of this game. The game-theoretic approach allows us to associate measurements with all the targets simultaneously. We then use particle filtering on the reduced-dimensional state of each target, independently.
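The regret-based learning step can be illustrated with a simplified, unconditional regret-matching update, a close relative of the conditional procedure of Hart and Mas-Colell that converges to correlated equilibria; the unconditional version below converges to the coarse correlated equilibrium set. The toy two-player payoff matrices, standing in for two trackers choosing among two measurements, are assumptions for illustration and not the dissertation's utility functions.

```python
import numpy as np

def regret_matching(payoff_a, payoff_b, n_iters=20000, seed=0):
    """Unconditional regret matching for a two-player matrix game.

    payoff_a[i, j] / payoff_b[i, j]: payoffs to players A and B when A plays
    action i and B plays action j. Returns the empirical joint play frequencies.
    """
    rng = np.random.default_rng(seed)
    n_a, n_b = payoff_a.shape
    regret_a, regret_b = np.zeros(n_a), np.zeros(n_b)
    joint = np.zeros((n_a, n_b))

    for _ in range(n_iters):
        pos_a, pos_b = np.maximum(regret_a, 0.0), np.maximum(regret_b, 0.0)
        prob_a = pos_a / pos_a.sum() if pos_a.sum() > 0 else np.full(n_a, 1.0 / n_a)
        prob_b = pos_b / pos_b.sum() if pos_b.sum() > 0 else np.full(n_b, 1.0 / n_b)
        i = rng.choice(n_a, p=prob_a)
        j = rng.choice(n_b, p=prob_b)
        joint[i, j] += 1
        # Regret: how much better each alternative action would have done this round.
        regret_a += payoff_a[:, j] - payoff_a[i, j]
        regret_b += payoff_b[i, :] - payoff_b[i, j]
    return joint / joint.sum()

# Two "trackers" each pick one of two measurements; both are rewarded only
# when they claim different measurements (a tiny anti-coordination game).
A = np.array([[0.0, 1.0], [1.0, 0.0]])
print(regret_matching(A, A))
```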

    Computational intelligence approaches to robotics, automation, and control [Volume guest editors]

    No abstract available

    Faster inference from state space models via GPU computing

    Funding: C.F.-J. is funded via a doctoral scholarship from the University of St Andrews, School of Mathematics and Statistics. Inexpensive Graphics Processing Units (GPUs) offer the potential to greatly speed up computation by employing their massively parallel architecture to perform arithmetic operations more efficiently. Population dynamics models are important tools in ecology and conservation. Modern Bayesian approaches allow biologically realistic models to be constructed and fitted to multiple data sources in an integrated modelling framework based on a class of statistical models called state space models. However, model fitting is often slow, requiring hours to weeks of computation. We demonstrate the benefits of GPU computing using a model for the population dynamics of British grey seals, fitted with a particle Markov chain Monte Carlo algorithm. Speed-ups of two orders of magnitude were obtained for estimation of the log-likelihood, compared to a traditional ‘CPU-only’ implementation, allowing an accurate method of inference to be used where this was previously too computationally expensive to be viable. GPU computing has enormous potential, but one barrier to further adoption is a steep learning curve, due to GPUs' unique hardware architecture. We provide a detailed description of hardware and software setup, and our case study provides a template for other similar applications. We also provide a detailed tutorial-style description of GPU hardware architectures, and examples of important GPU-specific programming practices.
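The computation that benefits from the GPU here, repeated particle-filter estimates of the log-likelihood inside a particle MCMC sampler, can be sketched with a generic bootstrap filter for a toy linear-Gaussian state space model. The model, parameters and data below are illustrative assumptions and are unrelated to the grey seal model in the paper; the code uses NumPy, and because every per-particle step is a vectorised array operation, an array library with a GPU backend (CuPy, for example) can in principle run the same pattern on a GPU, possibly with small changes to the random-number calls.

```python
import numpy as np

def bootstrap_log_likelihood(y, n_particles=100_000, phi=0.95, sigma=1.0, tau=0.5, seed=0):
    """Particle estimate of log p(y_1:T) for the toy model
        x_t = phi * x_{t-1} + N(0, sigma^2),   y_t = x_t + N(0, tau^2).
    Every per-particle step is a single vectorised array operation."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma, size=n_particles)
    log_lik = 0.0
    for y_t in y:
        # Propagate all particles through the state equation at once.
        x = phi * x + rng.normal(0.0, sigma, size=n_particles)
        # Log observation density for each particle.
        log_w = -0.5 * ((y_t - x) / tau) ** 2 - np.log(tau * np.sqrt(2.0 * np.pi))
        # Log of the mean weight via the log-sum-exp trick.
        m = log_w.max()
        w = np.exp(log_w - m)
        log_lik += float(m + np.log(w.mean()))
        # Multinomial resampling keeps the particle cloud representative.
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        x = x[idx]
    return log_lik

y = np.array([0.3, -0.1, 0.8, 1.2, 0.5])  # hypothetical observations
print(bootstrap_log_likelihood(y))
```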

    Simulating Hydrodynamics in Cosmology with CRK-HACC

    We introduce CRK-HACC, an extension of the Hardware/Hybrid Accelerated Cosmology Code (HACC), to resolve gas hydrodynamics in large-scale structure formation simulations of the universe. The new framework couples the HACC gravitational N-body solver with a modern smoothed particle hydrodynamics (SPH) approach called CRKSPH. Conservative Reproducing Kernel SPH utilizes smoothing functions that exactly interpolate linear fields while manifestly preserving conservation laws (momentum, mass, and energy). The CRKSPH method has been incorporated to accurately model baryonic effects in cosmology simulations - an important addition targeting the generation of precise synthetic sky predictions for upcoming observational surveys. CRK-HACC inherits the codesign strategies of the HACC solver and is built to run on modern GPU-accelerated supercomputers. In this work, we summarize the primary solver components and present a number of standard validation tests to demonstrate code accuracy, including idealized hydrodynamic and cosmological setups, as well as self-similarity measurements.
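The particle interpolation underlying any SPH method can be sketched briefly. The example below is a plain SPH density estimate with the standard cubic spline kernel, not CRKSPH's conservative reproducing-kernel correction; the particle positions and masses are illustrative assumptions.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 3-D cubic spline SPH kernel W(r, h) with compact support 2h."""
    q = r / h
    sigma = 1.0 / (np.pi * h ** 3)
    w = np.where(q < 1.0, 1.0 - 1.5 * q ** 2 + 0.75 * q ** 3,
                 np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
    return sigma * w

def sph_density(positions, masses, h):
    """Density at each particle: rho_i = sum_j m_j * W(|x_i - x_j|, h)."""
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    return np.sum(masses[None, :] * cubic_spline_kernel(r, h), axis=1)

# Hypothetical uniform particle distribution in a unit box with total mass 1.
rng = np.random.default_rng(4)
positions = rng.uniform(0.0, 1.0, size=(200, 3))
masses = np.full(200, 1.0 / 200)
print(sph_density(positions, masses, h=0.2)[:5])  # rough estimates; boundary particles read low
```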