Scale-adaptive spatial appearance feature density approximation for object tracking
Authors
CY Liu
NHC Yung
Publication date
1 January 2011
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Abstract
Object tracking is an essential task in visual traffic surveillance. Ideally, a tracker should accurately capture an object's natural motion, such as translation, rotation, and scaling. However, it is well known that object appearance varies with changes in viewing angle, scale, and illumination. These variations introduce ambiguity into the image cue on which a visual tracker usually relies, degrading tracking performance. A robust image appearance cue is therefore required. This paper proposes a scale-adaptive spatial appearance feature density approximation to represent objects and construct the image cue. The appearance representation is found to improve sensitivity to both the object's rotation and its scale. The image cue is then constructed from the appearance representation of both the object and its surrounding background, so that distinguishable parts of an object can be tracked under poor imaging conditions. Moreover, tracking dynamics is integrated with the image cue so that objects are efficiently localized in a gradient-based process. Comparative experiments show that the proposed method is effective in capturing the natural motion of objects and yields better tracking accuracy under different imaging conditions. © 2010 IEEE.
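The core idea of the abstract — modeling object appearance as a density over joint spatial and feature coordinates, with a spatial bandwidth that adapts to the object's current scale — can be sketched as a kernel density estimate. This is an illustrative approximation only, not the authors' algorithm; the function name, bandwidth parameters, and sample layout are assumptions made for the example.

```python
import numpy as np

def appearance_density(samples, query, scale=1.0, h_spatial=2.0, h_feature=10.0):
    """Evaluate a Gaussian kernel density estimate over joint
    spatial-appearance samples (hypothetical sketch of the paper's
    'scale-adaptive spatial appearance feature density' idea).

    samples : (N, d) array, each row [x, y, feature_1, ..., feature_k]
    query   : (d,) point at which to evaluate the density
    scale   : current object scale; the spatial bandwidth grows with it,
              which is the 'scale-adaptive' part
    """
    spatial_h = h_spatial * scale                 # bandwidth adapts to scale
    diff = samples - query
    # squared Mahalanobis-style distances, spatial and appearance parts
    d_sp = np.sum((diff[:, :2] / spatial_h) ** 2, axis=1)
    d_ft = np.sum((diff[:, 2:] / h_feature) ** 2, axis=1)
    # product of Gaussian kernels, averaged over all samples
    return float(np.mean(np.exp(-0.5 * (d_sp + d_ft))))
```

A tracker in this style would evaluate such a density for candidate object positions and follow its gradient (e.g. mean-shift-like iterations) to localize the object, as the abstract's "gradient-based process" suggests.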
Available Versions
HKU Scholars Hub
oai:hub.hku.hk:10722/137288
Last time updated on 01/06/2016