Abstract

Object tracking is the complex task of following a given object in a video stream. This paper describes an algorithm that combines an optical-flow-based feature tracker with color segmentation. The aim is to build a feature model and to reconstruct feature points when they are lost due to occlusion or tracking errors. Feature points are tracked from one frame to the next with the Lucas & Kanade optical flow algorithm. In addition, each frame is segmented with the Felzenszwalb-Huttenlocher graph-based segmentation algorithm. Optical flow and segmentation are then combined to track an object through a video scene. This strategy also handles occlusion as well as slight rotation or deformation. The tracker is evaluated on an artificial video sequence with moving balls as well as on real-world sequences of a moving person. Ground truth data is available for all video sequences and is compared to our results.
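The following is a minimal sketch of the general idea described above, not the authors' exact pipeline: feature points are propagated with Lucas & Kanade optical flow (OpenCV's calcOpticalFlowPyrLK), each frame is segmented with the Felzenszwalb-Huttenlocher algorithm (scikit-image's felzenszwalb), and only points lying in segments supported by several tracked points are kept. The input file name, initial bounding box, and all parameter values are hypothetical.

```python
import cv2
import numpy as np
from skimage.segmentation import felzenszwalb

cap = cv2.VideoCapture("sequence.avi")          # hypothetical input video
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Initial feature points inside a (hypothetical) user-supplied bounding box.
x, y, w, h = 100, 80, 60, 120
roi = np.zeros_like(prev_gray)
roi[y:y + h, x:x + w] = 255
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                 qualityLevel=0.01, minDistance=5, mask=roi)

while True:
    ok, frame = cap.read()
    if not ok or points is None:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Lucas & Kanade optical flow: propagate feature points to the new frame.
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    pts = new_points[status.flatten() == 1].reshape(-1, 2)

    # Discard points that drifted outside the image.
    h_img, w_img = gray.shape
    pts = pts[(pts[:, 0] >= 0) & (pts[:, 0] < w_img) &
              (pts[:, 1] >= 0) & (pts[:, 1] < h_img)]

    # Graph-based color segmentation of the current frame.
    segments = felzenszwalb(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB),
                            scale=100, sigma=0.8, min_size=50)

    # Keep points whose segment is supported by several tracked points; the
    # union of those segments approximates the object region and could be
    # used to re-seed feature points lost to occlusion or tracking errors.
    labels = [segments[int(p[1]), int(p[0])] for p in pts]
    good = {l for l in set(labels) if labels.count(l) >= 3}
    keep = np.array([p for p, l in zip(pts, labels) if l in good])

    points = keep.reshape(-1, 1, 2).astype(np.float32) if len(keep) else None
    prev_gray = gray
```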
