
    Non-overlapping Distributed Tracking System Utilizing Particle Filter

    Tracking people across multiple cameras is a challenging research area in visual computing, especially when the cameras have non-overlapping fields of view. The key task is to associate a current subject with prior appearances of the same subject across time and space in a camera network. Several known techniques rely on Bayesian approaches to perform the matching task; however, these approaches do not scale well as the dimension of the problem increases, e.g. with the number of subjects or possible paths. This paper proposes a unified tracking framework using particle filters that efficiently switches between visual tracking (field-of-view tracking) and track prediction (non-overlapping-region tracking). The particle filter tracking system uses a map of the known environment to assist the tracking process when targets leave the field of view of every camera. We implemented and tested this tracking approach on an in-house multiple-camera system as well as on on-line data. The promising results obtained suggest the feasibility of such an approach.
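
The abstract describes switching between measurement-driven tracking inside a camera's field of view and pure prediction when the target is in a non-overlapping region. A minimal bootstrap particle filter illustrating that switch (not the authors' implementation: the constant-velocity model, noise levels, and the `in_view` schedule below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, dt=1.0):
    # Constant-velocity motion model; state = [x, y, vx, vy].
    particles[:, :2] += particles[:, 2:] * dt
    particles += rng.normal(0.0, [0.3, 0.3, 0.05, 0.05], particles.shape)
    return particles

def update(particles, weights, measurement, sigma=1.0):
    # Reweight particles by the likelihood of a position measurement.
    d = np.linalg.norm(particles[:, :2] - measurement, axis=1)
    weights = weights * np.exp(-0.5 * (d / sigma) ** 2) + 1e-300
    return weights / weights.sum()

def resample(particles, weights):
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Track a target; measurements are available only while it is "in view".
n = 500
particles = rng.normal([0, 0, 1, 0], [1, 1, 0.1, 0.1], (n, 4))
weights = np.full(n, 1.0 / n)
true_pos = np.array([0.0, 0.0])
for t in range(20):
    true_pos += [1.0, 0.0]
    particles = predict(particles)
    in_view = t < 10 or t >= 15   # camera gap for t in [10, 15): predict only
    if in_view:
        weights = update(particles, weights, true_pos)
        particles, weights = resample(particles, weights)
estimate = weights @ particles[:, :2]
```

While the target is out of view, only the motion model advances the particles, so uncertainty grows until a camera reacquires the target and the weight update collapses it again.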

    Diffusion Maps Kalman Filter for a Class of Systems with Gradient Flows

    In this paper, we propose a non-parametric method for state estimation of high-dimensional nonlinear stochastic dynamical systems, which evolve according to gradient flows with isotropic diffusion. We combine diffusion maps, a manifold learning technique, with a linear Kalman filter and with concepts from Koopman operator theory. More concretely, using diffusion maps, we construct data-driven virtual state coordinates, which linearize the system model. Based on these coordinates, we devise a data-driven framework for state estimation using the Kalman filter. We demonstrate the strengths of our method with respect to both parametric and non-parametric algorithms in three tracking problems. In particular, applying the approach to actual recordings of hippocampal neural activity in rodents directly yields a representation of the position of the animals. We show that the proposed method outperforms competing non-parametric algorithms in the examined stochastic problem formulations. Additionally, we obtain results comparable to classical parametric algorithms, which, in contrast to our method, are equipped with model knowledge. Comment: 15 pages, 12 figures, submitted to IEEE TS
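
Once the diffusion-map coordinates linearize the dynamics, estimation reduces to the standard linear Kalman recursion. A generic sketch of that recursion on a toy linear model (the diffusion-map embedding itself is omitted; the matrices F, H, Q, R and the scalar random-walk example below are hand-picked illustrations, not the paper's data-driven quantities):

```python
import numpy as np

def kalman_step(x, P, y, F, H, Q, R):
    # One predict/update cycle for x_{k+1} = F x_k + w,  y_k = H x_k + v.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the measurement y.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy scalar random walk observed in Gaussian noise.
rng = np.random.default_rng(1)
F, H = np.eye(1), np.eye(1)
Q, R = 0.01 * np.eye(1), 0.5 * np.eye(1)
x_true = 0.0
x, P = np.zeros(1), np.eye(1)
for _ in range(200):
    x_true += rng.normal(0.0, 0.1)
    y = np.array([x_true]) + rng.normal(0.0, np.sqrt(0.5), 1)
    x, P = kalman_step(x, P, y, F, H, Q, R)
```

In the paper's setting, the same recursion runs in the learned virtual coordinates, where the (otherwise nonlinear) dynamics are approximately linear.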

    Gestures Everywhere: A Multimodal Sensor Fusion and Analysis Framework for Pervasive Displays

    Gestures Everywhere is a dynamic framework for multimodal sensor fusion, pervasive analytics and gesture recognition. Our framework aggregates the real-time data from approximately 100 sensors, including RFID readers, depth cameras and RGB cameras, distributed across 30 interactive displays located in key public areas of the MIT Media Lab. Gestures Everywhere fuses the multimodal sensor data using radial basis function particle filters and performs real-time analysis on the aggregated data. This includes key spatio-temporal properties such as presence, location and identity, as well as higher-level analysis including social clustering and gesture recognition. We describe the algorithms and architecture of our system and discuss the lessons learned from the system's deployment.

    Multisensor Poisson Multi-Bernoulli Filter for Joint Target-Sensor State Tracking

    In a typical multitarget tracking (MTT) scenario, the sensor state is either assumed known, or tracking is performed in the sensor's (relative) coordinate frame. This assumption does not hold when the sensor, e.g., an automotive radar, is mounted on a vehicle and the target state should be represented in a global (absolute) coordinate frame. Then it is important for MTT to consider the uncertain location of the vehicle on which the sensor is mounted. In this paper, we present a low-complexity multisensor Poisson multi-Bernoulli MTT filter, which jointly tracks the uncertain vehicle state and the target states. Measurements collected by different sensors mounted on multiple vehicles with varying location uncertainty are incorporated sequentially as new sensor measurements arrive. In doing so, targets observed from a sensor mounted on a well-localized vehicle reduce the state uncertainty of other, poorly localized vehicles, provided that a common non-empty subset of targets is observed. A low-complexity filter is obtained by approximations of the joint sensor-feature state density that minimize the Kullback-Leibler divergence (KLD). Results from synthetic as well as experimental measurement data, collected in a vehicle driving scenario, demonstrate the performance benefits of joint vehicle-target state tracking. Comment: 13 pages, 7 figures
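
A standard building block for the KLD-minimizing density approximations the abstract mentions: when a weighted Gaussian mixture is approximated by a single Gaussian, the KLD-optimal choice matches the mixture's first two moments. A small sketch of that moment-matching merge (the `merge_gaussians` helper is hypothetical, not code from the paper):

```python
import numpy as np

def merge_gaussians(weights, means, covs):
    # Single Gaussian N(m, P) matching the mixture's mean and covariance;
    # this minimizes KL(mixture || N(m, P)) over all Gaussians.
    w = np.asarray(weights, float)
    w = w / w.sum()
    means = np.asarray(means, float)
    m = w @ means
    P = np.zeros((means.shape[1], means.shape[1]))
    for wi, mi, Pi in zip(w, means, covs):
        d = (mi - m)[:, None]
        P += wi * (Pi + d @ d.T)   # within- plus between-component spread
    return m, P

# Two equally weighted unit-covariance components at x = 0 and x = 2.
m, P = merge_gaussians([0.5, 0.5],
                       [[0.0, 0.0], [2.0, 0.0]],
                       [np.eye(2), np.eye(2)])
# m = [1, 0]; P = [[2, 0], [0, 1]] (component spread inflates the x-variance)
```

The merged covariance is the average component covariance plus the spread of the component means, so information about disagreement between components is retained.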