
Scale-adaptive spatial appearance feature density approximation for object tracking

Abstract

Object tracking is an essential task in visual traffic surveillance. Ideally, a tracker should accurately capture an object's natural motion, such as translation, rotation, and scaling. However, it is well known that object appearance varies with changes in viewing angle, scale, and illumination. These variations introduce ambiguity into the image cue on which a visual tracker usually relies, degrading tracking performance. A robust image appearance cue is therefore required. This paper proposes a scale-adaptive spatial appearance feature density approximation to represent objects and construct the image cue. The appearance representation is found to improve sensitivity to both the object's rotation and scale. The image cue is then constructed from the appearance representations of both the object and its surrounding background, so that distinguishable parts of an object can be tracked under poor imaging conditions. Moreover, tracking dynamics are integrated with the image cue so that objects are localized efficiently in a gradient-based process. Comparative experiments show that the proposed method is effective in capturing the natural motion of objects and achieves better tracking accuracy under varying imaging conditions. © 2010 IEEE.
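The abstract gives no implementation details, but the overall pipeline it describes (a spatial-appearance density cue that contrasts the object against its surrounding background, localized by gradient ascent under a scale-adaptive spatial kernel) can be sketched. The Python sketch below is illustrative only: the Gaussian appearance kernel, the Epanechnikov spatial kernel, the object-to-background likelihood-ratio cue, the three-scale probe, and the names gaussian_kde and track_step are all assumptions, not the authors' actual formulation.

```python
import numpy as np

def gaussian_kde(samples, h):
    """Gaussian kernel density estimate over appearance features.

    samples: (M, D) feature samples; h: feature bandwidth.
    Returns a function evaluating the density at (K, D) query points.
    (Assumed form; the paper's density approximation may differ.)
    """
    def density(x):
        d2 = ((x[:, None, :] - samples[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-0.5 * d2 / h ** 2).mean(axis=1)
    return density

def track_step(pixels, feats, center, scale, p_obj, p_bg,
               n_iters=10, eps=1e-8):
    """One frame of gradient-based localization (mean-shift style).

    pixels: (N, 2) pixel coordinates in the search region
    feats : (N, D) appearance features (e.g. color) at those pixels
    p_obj, p_bg: appearance densities of object and nearby background
    Returns the updated center and scale.
    """
    # Cue: object-vs-background likelihood ratio, so only the
    # distinguishable parts of the object pull on the gradient.
    cue = p_obj(feats) / (p_bg(feats) + eps)

    for _ in range(n_iters):
        # Epanechnikov spatial kernel with bandwidth tied to object scale.
        d2 = (((pixels - center) / scale) ** 2).sum(axis=1)
        w = np.maximum(1.0 - d2, 0.0) * cue
        if w.sum() < eps:
            break
        new_center = (pixels * w[:, None]).sum(axis=0) / w.sum()
        moved = np.linalg.norm(new_center - center)
        center = new_center
        if moved < 0.5:          # converged to a local density mode
            break

    # Crude scale adaptation: probe neighboring scales and keep the
    # one with the highest mean cue density under the kernel.
    best_score, best_scale = -np.inf, scale
    for s in (0.95 * scale, scale, 1.05 * scale):
        d2 = (((pixels - center) / s) ** 2).sum(axis=1)
        k = np.maximum(1.0 - d2, 0.0)
        score = (k * cue).sum() / (k.sum() + eps)
        if score > best_score:
            best_score, best_scale = score, s
    return center, best_scale
```

In this sketch, p_obj and p_bg would be fitted with gaussian_kde on features sampled inside the initial bounding box and from a surrounding ring, respectively; contrasting the two densities is what lets the tracker lock onto the distinguishable parts of the object under poor imaging conditions.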
