Research article review

Abrupt motion tracking using a visual saliency embedded particle filter

Abstract

Abrupt motion is a significant challenge that commonly causes traditional tracking methods to fail. This paper presents an improved visual saliency model and integrates it into a particle filter tracker to address this problem. Once the target is lost, our algorithm recovers tracking by detecting the target region among the salient regions obtained from the saliency map of the current frame. In addition, to strengthen the saliency of the target region, the target model is used as prior knowledge to calculate a weight set, which is then used to adaptively construct our improved saliency map. Furthermore, we adopt the covariance descriptor as the appearance model to describe the object more accurately. Experimental comparisons with several other tracking algorithms demonstrate that our method is more robust in dealing with various types of abrupt motion scenarios. © 2013 Elsevier Ltd. All rights reserved.
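
The abstract gives no implementation details, but the sketch below illustrates the kind of appearance model it names: a standard region covariance descriptor (pixel coordinates, intensity, and gradient magnitudes as per-pixel features) together with a log-generalized-eigenvalue dissimilarity that a particle filter could turn into an observation weight. The feature set, the metric, and the Gaussian bandwidth `sigma` are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def covariance_descriptor(patch):
    """Region covariance descriptor of a grayscale patch.

    Per-pixel features are [x, y, intensity, |Ix|, |Iy|]; the descriptor is
    the 5x5 covariance of these features over the patch (assumed feature set).
    """
    patch = patch.astype(np.float64)
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]          # pixel coordinates
    gy, gx = np.gradient(patch)          # intensity gradients
    feats = np.stack([xs.ravel(), ys.ravel(), patch.ravel(),
                      np.abs(gx).ravel(), np.abs(gy).ravel()])
    return np.cov(feats)                 # rows are variables -> 5x5 matrix

def covariance_distance(c1, c2, eps=1e-9):
    """Dissimilarity between two covariance descriptors, computed from the
    generalized eigenvalues of the pair (a common metric for this descriptor)."""
    c1 = c1 + eps * np.eye(c1.shape[0])
    c2 = c2 + eps * np.eye(c2.shape[0])
    lam = np.linalg.eigvals(np.linalg.solve(c1, c2)).real
    lam = np.clip(lam, eps, None)
    return np.sqrt(np.sum(np.log(lam) ** 2))

# Usage: convert the distance into a particle weight via a Gaussian likelihood
# (bandwidth sigma is an arbitrary choice here, not from the paper).
rng = np.random.default_rng(0)
target_patch = rng.random((32, 32))
candidate_patch = target_patch + 0.05 * rng.random((32, 32))
d = covariance_distance(covariance_descriptor(target_patch),
                        covariance_descriptor(candidate_patch))
sigma = 0.5
weight = np.exp(-d ** 2 / (2 * sigma ** 2))
print(f"distance={d:.3f}, particle weight={weight:.3f}")
```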

This paper was published in the University of Essex Research Repository.
