52 research outputs found

    Learning Rotation Adaptive Correlation Filters in Robust Visual Object Tracking

    Visual object tracking is one of the major challenges in the field of computer vision. Correlation Filter (CF) trackers are among the most widely used categories in tracking. Though numerous tracking algorithms based on CFs are available today, most of them fail to efficiently detect the object in an unconstrained environment with dynamically changing object appearance. To tackle such challenges, existing strategies often rely on a particular set of algorithms. Here, we propose a robust framework that offers the provision to incorporate illumination and rotation invariance into the standard Discriminative Correlation Filter (DCF) formulation. We also supervise the detection stage of DCF trackers by eliminating false positives in the convolution response map. Further, we demonstrate the impact of displacement consistency on CF trackers. The generality and efficiency of the proposed framework are illustrated by integrating our contributions into two state-of-the-art CF trackers: SRDCF and ECO. As per comprehensive experiments on the VOT2016 dataset, our top trackers show substantial improvements of 14.7% and 6.41% in robustness, and 11.4% and 1.71% in Average Expected Overlap (AEO), over the baseline SRDCF and ECO, respectively. Comment: Published in ACCV 201
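To make the DCF idea concrete: below is a minimal single-channel correlation filter in the style of MOSSE, not the SRDCF/ECO formulations this abstract extends. The patch size, label width, and regularizer are illustrative choices; the filter is trained in closed form in the Fourier domain, and the peak of the response map localizes the target.

```python
import numpy as np

def gaussian_label(h, w, sigma=2.0):
    # desired correlation output: Gaussian peak at (0, 0), circularly wrapped
    ys = np.minimum(np.arange(h), h - np.arange(h))[:, None]
    xs = np.minimum(np.arange(w), w - np.arange(w))[None, :]
    return np.exp(-(ys ** 2 + xs ** 2) / (2 * sigma ** 2))

def train_filter(f, g, lam=1e-2):
    # closed-form ridge solution in the Fourier domain: H* = G F-bar / (F F-bar + lam)
    F, G = np.fft.fft2(f), np.fft.fft2(g)
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def response_map(H, z):
    # correlate a new patch with the learned filter
    return np.real(np.fft.ifft2(np.fft.fft2(z) * H))

rng = np.random.default_rng(0)
f = rng.standard_normal((32, 32))          # training patch (illustrative)
H = train_filter(f, gaussian_label(32, 32))
z = np.roll(f, shift=(5, 7), axis=(0, 1))  # same patch, circularly translated
dy, dx = np.unravel_index(np.argmax(response_map(H, z)), (32, 32))
# the peak of the response map recovers the applied translation
```

The "supervising the detection stage" contribution above amounts to vetting this response map for spurious secondary peaks before accepting `(dy, dx)`.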

    A robust tracking system for low frame rate video

    Tracking in low frame rate (LFR) videos is one of the most important problems in the tracking literature. Most existing approaches treat LFR video tracking as an abrupt motion tracking problem. However, in LFR video tracking applications, LFR not only causes abrupt motions but also large appearance changes of objects, because the objects' poses and the illumination may undergo large changes from one frame to the next. This adds extra difficulties to LFR video tracking. In this paper, we propose a robust and general tracking system for LFR videos. The tracking system consists of four major parts: dominant color-spatial based object representation, bin-ratio based similarity measure, annealed particle swarm optimization (PSO) based searching, and integral image based parameter calculation. The first two parts are combined to provide a good solution to the appearance changes, and the abrupt motion is effectively captured by the annealed PSO based searching. Moreover, an integral image of model parameters is constructed, which provides a look-up table for parameter calculation. This greatly reduces the computational load. Experimental results demonstrate that the proposed tracking system can effectively tackle the difficulties caused by LFR.
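The integral-image look-up table mentioned above is a standard summed-area table: after one O(hw) pass, the sum over any axis-aligned box costs four array reads. A minimal sketch (the array and box below are illustrative stand-ins, not the paper's actual parameter maps):

```python
import numpy as np

def integral_image(a):
    # summed-area table: ii[y, x] = sum of a[:y+1, :x+1]
    return np.cumsum(np.cumsum(a, axis=0), axis=1)

def region_sum(ii, y0, x0, y1, x1):
    # sum over the inclusive box [y0..y1] x [x0..x1] in O(1)
    s = ii[y1, x1]
    if y0 > 0:
        s -= ii[y0 - 1, x1]
    if x0 > 0:
        s -= ii[y1, x0 - 1]
    if y0 > 0 and x0 > 0:
        s += ii[y0 - 1, x0 - 1]
    return s

a = np.arange(48.0).reshape(6, 8)   # stand-in "model parameter" map
ii = integral_image(a)
s = region_sum(ii, 1, 2, 4, 5)      # equals a[1:5, 2:6].sum()
```

This is why evaluating many candidate windows (e.g. one per PSO particle) stays cheap: each evaluation is constant-time regardless of window size.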

    A framework to integrate particle filters for robust tracking in non-stationary environments

    Iberian Conference on Pattern Recognition and Image Analysis (IbPRIA), 2005, Estoril (Portugal). In this paper we propose a new framework to integrate several particle filters, in order to obtain a robust tracking system able to cope with abrupt changes of illumination and position of the target. The proposed method is analytically justified and allows us to build a tracking procedure that adapts online, and simultaneously, the colorspace where the image points are represented, the color distributions of the object and background, and the contour of the object. This work was supported by the projects 'Navegación autónoma de robots guiados por objetivos visuales' (070-720) and 'Integration of robust perception, learning, and navigation systems in mobile robotics' (J-0929). Peer Reviewed
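Frameworks like this build on the basic sequential importance resampling (SIR) particle filter loop: predict, re-weight by the observation likelihood, and resample when the effective sample size collapses. A generic 1-D sketch; the random-walk motion model and Gaussian likelihood here are stand-ins, not the paper's adaptive colorspace/contour models:

```python
import numpy as np

rng = np.random.default_rng(1)

def resample(particles, weights):
    # systematic (low-variance) resampling proportional to weight
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    return particles[idx]

def pf_step(particles, weights, likelihood, motion_std=1.0):
    # predict: diffuse particles under a random-walk motion model
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # update: re-weight each particle by its observation likelihood
    weights = weights * likelihood(particles)
    weights = weights / weights.sum()
    # resample when the effective sample size falls below half the particle count
    if 1.0 / np.sum(weights ** 2) < len(particles) / 2:
        particles = resample(particles, weights)
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

# toy example: the "target" sits at position 5.0, observed through
# a Gaussian likelihood; the filter should concentrate around it
likelihood = lambda p: np.exp(-0.5 * ((p - 5.0) / 1.0) ** 2)
particles = rng.uniform(-10.0, 10.0, 500)
weights = np.full(500, 1.0 / 500)
for _ in range(10):
    particles, weights = pf_step(particles, weights, likelihood)
est = float(np.sum(particles * weights))   # weighted-mean state estimate
```

Integrating several such filters, as the paper proposes, amounts to running this loop per cue or per model and combining their posteriors.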

    Probabilistic Object Tracking Based on Machine Learning and Importance Sampling

    No full text

    Robust Tracking under Complex Environments

    No full text

    Image Cues Fusion for Object Tracking Based on Particle Filter

    No full text
    The particle filter is a powerful algorithm for dealing with non-linear and non-Gaussian tracking problems. However, an algorithm relying upon only one image cue often fails in challenging scenarios. To overcome this, the paper first presents a color likelihood that captures the color distribution of the object based on the Bhattacharyya coefficient, and a structure likelihood representing high-level knowledge about the object. Together with the widely used edge likelihood, the paper further proposes a straightforward image cue fusion for object tracking in the framework of the particle filter, under the assumption that the visual measurement of each image cue is independent of the others. Experiments on real image sequences have shown that the method is effective and robust to illumination changes, pose variations, and complex backgrounds.
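The paper's exact likelihood models are not reproduced here, but the product-of-cues fusion under the independence assumption can be sketched as follows; the toy histograms, the `sigma` bandwidth, and the fixed edge/structure scores are illustrative assumptions:

```python
import numpy as np

def bhattacharyya(p, q):
    # similarity of two normalized histograms, in [0, 1]
    return float(np.sum(np.sqrt(p * q)))

def color_likelihood(p, q, sigma=0.2):
    # map the Bhattacharyya distance d^2 = 1 - rho to a Gaussian likelihood
    return float(np.exp(-(1.0 - bhattacharyya(p, q)) / (2 * sigma ** 2)))

def fused_likelihood(cue_likelihoods):
    # independence assumption: the joint measurement likelihood
    # is the product of the per-cue likelihoods
    return float(np.prod(cue_likelihoods))

p = np.array([0.2, 0.3, 0.5])   # object color model (toy 3-bin histogram)
q = np.array([0.5, 0.3, 0.2])   # candidate-region histogram
l_color = color_likelihood(p, q)
fused = fused_likelihood([l_color, 0.9, 0.8])  # color x edge x structure
```

In the particle-filter framework, `fused` would weight one particle; each cue multiplies in, so any single cue scoring near zero vetoes the candidate.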

    Orientation and Scale Invariant Kernel-Based Object Tracking with Probabilistic Emphasizing

    No full text

    Human Tracking by Multiple Kernel Boosting with Locality Affinity Constraints

    No full text

    Detecting and tracking moving targets on omnidirectional vision

    No full text