
    Online video streaming for human tracking based on weighted resampling particle filter

    © 2018 The Authors. Published by Elsevier Ltd. This paper proposes a weighted resampling method for the particle filter, applied to human tracking with an active camera. The proposed system consists of three major parts: human detection, human tracking, and camera control. A codebook matching algorithm extracts the human region in the detection stage, and the particle filter estimates the position of the human in every input image. During resampling, the proposed system selects the particles with high weights, because they provide more accurate tracking. Moreover, a proportional-integral-derivative (PID) controller drives the active camera by minimizing the difference between the center of the image and the object position obtained from the particle filter. The system converts this position difference into pan-tilt speeds to drive the active camera and keep the human in the camera's field of view (FOV). Because the image intensity changes over time while tracking, the proposed system uses a Gaussian mixture model (GMM) to update the human feature model. Temporary occlusion is handled through feature similarity and the resampled particles. Since the particle filter estimates the human's position in every input frame, the active camera moves smoothly. Experimental results demonstrate the robustness and accuracy of the proposed tracking system.
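    The abstract describes two concrete steps: resampling that favors highly weighted particles, and a PID controller that turns the pixel offset between the image center and the tracked position into pan-tilt speeds. The sketch below illustrates both under stated assumptions; all function names, gains, and image dimensions are illustrative and not taken from the paper.

```python
import numpy as np

def weighted_resample(particles, weights, keep_ratio=0.5):
    """Resample by drawing only from the most highly weighted particles."""
    order = np.argsort(weights)[::-1]                      # best particles first
    top = order[: max(1, int(keep_ratio * len(weights)))]
    p = weights[top] / weights[top].sum()                  # renormalise over the kept set
    idx = np.random.choice(top, size=len(particles), p=p)
    return particles[idx]

class PID:
    """Plain PID controller; the gains used below are illustrative assumptions."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, err, dt):
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy usage: particles are (x, y) position hypotheses for the tracked person.
particles = np.random.randn(200, 2) * 20 + np.array([400.0, 200.0])
weights = np.exp(-np.linalg.norm(particles - np.array([400.0, 200.0]), axis=1) / 10.0)
estimate = np.average(particles, axis=0, weights=weights)  # tracked position
particles = weighted_resample(particles, weights)

# Convert the offset from the image centre into pan/tilt speeds so the
# active camera keeps the person inside its field of view.
image_center = np.array([320.0, 240.0])
pan_pid, tilt_pid = PID(0.5, 0.01, 0.05), PID(0.5, 0.01, 0.05)
err = estimate - image_center
pan_speed = pan_pid.step(err[0], dt=1 / 30)
tilt_speed = tilt_pid.step(err[1], dt=1 / 30)
```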

    Adaptive Feature Selection for Object Tracking with Particle Filter

    Object tracking is an important topic in the field of computer vision. Commonly used color-based trackers rely on a fixed set of color features such as RGB or HSV and, as a result, fail to adapt to changing illumination conditions and background clutter. These drawbacks can be overcome to an extent by using an adaptive framework that selects, for each frame of a sequence, the features that best discriminate the object from the background. In this paper, we embed such an adaptive feature selection method into a particle filter mechanism and show that our tracking method is robust to lighting changes and background distractions. Experiments also show that the proposed method outperforms other approaches.
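    A minimal sketch of the per-frame feature selection idea: score each candidate color feature by how well its histogram separates object pixels from background pixels, then track with the best-scoring one. The variance-of-log-likelihood-ratio score and all names below are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def discriminability(obj_pixels, bg_pixels, bins=32):
    """Score a 1-D feature by the separation of object vs. background histograms."""
    h_obj, _ = np.histogram(obj_pixels, bins=bins, range=(0, 255), density=True)
    h_bg, _ = np.histogram(bg_pixels, bins=bins, range=(0, 255), density=True)
    eps = 1e-6
    log_ratio = np.log((h_obj + eps) / (h_bg + eps))
    # A feature whose log-likelihood ratio varies strongly across bins separates
    # the two classes well (a variance-ratio style criterion).
    return np.var(log_ratio)

def select_feature(obj_patch, bg_patch):
    """obj_patch, bg_patch: (N, 3) RGB pixel arrays; returns the best channel index."""
    scores = [discriminability(obj_patch[:, c], bg_patch[:, c]) for c in range(3)]
    return int(np.argmax(scores))

# Synthetic example: the object differs from the background mainly in the
# red channel, so channel 0 should be selected for this frame.
rng = np.random.default_rng(0)
obj = np.clip(rng.normal([200, 120, 120], 10, size=(500, 3)), 0, 255)
bg = np.clip(rng.normal([80, 120, 120], 10, size=(500, 3)), 0, 255)
print("selected channel:", select_feature(obj, bg))   # expected: 0 (red)
```

    In a particle filter, this selection would be re-run every frame, and particle weights would then be computed from the object histogram in the selected feature.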

    Object Tracking by Adaptive Feature Extraction

    Tracking objects in a high-dimensional feature space is not only computationally expensive but also functionally inefficient. Selecting a low-dimensional discriminative feature set is a critical step in improving tracker performance. A good feature set for tracking can differ from frame to frame due to changes in the background against which the tracked object is viewed, so an on-line algorithm that adaptively determines a distinctive feature set would be advantageous. In this paper, multiple heterogeneous features are assembled, and likelihood images are constructed for various subspaces of the combined feature space. The most discriminative feature is then extracted by Principal Component Analysis (PCA) based on those likelihood images. This idea is applied to the mean-shift tracking algorithm [1], and we demonstrate its effectiveness through various experiments.
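    One step named in the abstract is the construction of likelihood images, the per-feature maps that PCA then operates on and that mean-shift tracks. The sketch below builds such a map for a single feature channel from object and background histograms; the histogram-ratio formulation and all names are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def likelihood_image(frame_channel, obj_pixels, bg_pixels, bins=32):
    """Back-project a log-likelihood-ratio lookup table onto one image channel."""
    h_obj, edges = np.histogram(obj_pixels, bins=bins, range=(0, 255), density=True)
    h_bg, _ = np.histogram(bg_pixels, bins=bins, range=(0, 255), density=True)
    eps = 1e-6
    lut = np.log((h_obj + eps) / (h_bg + eps))            # per-bin likelihood ratio
    idx = np.clip(np.digitize(frame_channel, edges) - 1, 0, bins - 1)
    return lut[idx]                                       # same shape as the channel

# Synthetic 60x60 frame whose central block plays the role of the object.
frame = np.random.randint(0, 100, size=(60, 60)).astype(float)
frame[20:40, 20:40] += 150                                # brighter object region
obj = frame[20:40, 20:40].ravel()
bg = np.concatenate([frame[:20].ravel(), frame[40:].ravel()])
L = likelihood_image(frame, obj, bg)                      # high where the object is
```

    Repeating this for each feature subspace yields a stack of likelihood images, from which PCA can extract the most discriminative combination as described in the abstract.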