2 research outputs found

    Visual tracking: detecting and mapping occlusion and camouflage using process-behaviour charts

    Visual tracking aims to identify a target object in each frame of an image sequence. It presents an important scientific problem since the human visual system is capable of tracking moving objects in a wide variety of situations. Artificial visual tracking systems also find practical application in areas such as visual surveillance, robotics, biomedical image analysis, medicine and the media. However, automatic visual tracking algorithms suffer from two common problems: occlusion and camouflage. Occlusion arises when another object, usually with different features, comes between the camera and the target. Camouflage occurs when an object with similar features lies behind the target and makes the target invisible from the camera’s point of view. Either of these disruptive events can cause a tracker to lose its target and fail. This thesis focuses on the detection of occlusion and camouflage in a particle-filter-based tracking algorithm. Particle filters are commonly used in tracking. Each particle represents a single hypothesis as to the target’s state, with some probability of being correct. The collection of particles tracking a target in each frame of an image sequence is called a particle set, and the configuration of that particle set provides vital information about the state of the tracker. The work detailed in this thesis presents three innovative approaches to detecting occlusion and/or camouflage during tracking by evaluating the fluctuating behaviours of the particle set and detecting anomalies using a graphical statistical tool called a process-behaviour chart. The information produced by the process-behaviour chart is then used to map out the boundary of the interfering object, providing valuable information about the viewed environment. A method based on the medial axis of a novel representation of particle distribution, termed the Particle History Image, was found to perform best over a set of real and artificial test sequences, detecting 90% of occlusion and 100% of camouflage events. Key advantages of the method over previous work in the area are: (1) it is less sensitive to false data and less likely to fire prematurely; (2) it provides a better representation of particle-set behaviour by aggregating particles over a longer time period; and (3) the use of a training set to parameterise the process-behaviour charts means that comparisons are made between measurements that are both taken over extended time periods, improving reliability.
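    The abstract describes the approach at a conceptual level only. As a rough illustration of how a process-behaviour (Shewhart-style control) chart can flag anomalous particle-set behaviour, the following Python sketch parameterises control limits from a training sequence of a simple per-frame statistic (particle spread) and flags frames that fall outside those limits. The statistic, function names and synthetic data are assumptions for illustration; the thesis's best-performing method, based on the medial axis of the Particle History Image, is not reproduced here.

```python
import numpy as np

def particle_spread(particles):
    """A simple per-frame particle-set statistic: mean distance of the
    particles from their centroid (a proxy for how dispersed the set is)."""
    centroid = particles.mean(axis=0)
    return float(np.mean(np.linalg.norm(particles - centroid, axis=1)))

def control_limits(training_stats, k=3.0):
    """Parameterise the process-behaviour chart from a training sequence:
    limits are placed k standard deviations either side of the mean."""
    mu, sigma = float(np.mean(training_stats)), float(np.std(training_stats))
    return mu - k * sigma, mu + k * sigma

def flag_anomalies(stats, lower, upper):
    """Return frame indices whose statistic leaves the control limits;
    such excursions are treated as evidence of occlusion or camouflage."""
    return [i for i, s in enumerate(stats) if s < lower or s > upper]

# Synthetic usage: the particle spread grows suddenly, as it might when the
# target disappears behind an occluder and the particles diffuse.
rng = np.random.default_rng(0)
training = [particle_spread(rng.normal(0.0, 2.0, size=(100, 2))) for _ in range(50)]
lower, upper = control_limits(training)
test = training[-10:] + [particle_spread(rng.normal(0.0, 8.0, size=(100, 2))) for _ in range(5)]
print(flag_anomalies(test, lower, upper))  # flags the final, high-spread frames
```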

    Kernel-based robust tracking for objects undergoing occlusion

    Visual tracking has been a challenging problem in computer vision for decades. Its applications are far-reaching, ranging from surveillance and monitoring to smart rooms. Occlusion is one of the major challenges that needs to be handled in tracking. In this work, we propose a new method to track objects undergoing occlusion using both sum-of-squared-differences (SSD) and color-based mean-shift (MS) trackers, which complement each other by overcoming their respective disadvantages. The rapid model change in the SSD tracker is overcome by the MS tracker module, while the inability of the MS tracker to handle large displacements is circumvented by the SSD module. The mean-shift tracker, which has gained attention recently, is known for tracking objects in cluttered environments. Since the MS tracker relies on global object parameters such as color, its performance degrades when the object undergoes partial occlusion. To avoid the adverse effect of this global model, we use the MS tracker to track local object properties instead of global ones. Further, a likelihood-ratio weighting is used in the SSD tracker to avoid drift during partial occlusion and to update the MS tracking modules. The proposed tracker outperforms the traditional MS tracker, as illustrated in the instances to which it was applied.
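    No implementation accompanies the abstract; the sketch below is a hypothetical, simplified pairing of SSD template matching (to recover large displacements) with color-based mean-shift refinement (to tolerate appearance change), written with OpenCV. The fixed occlusion threshold is a crude stand-in for the paper's likelihood-ratio weighting, the local-kernel formulation is not reproduced, and all function names are illustrative.

```python
import cv2
import numpy as np

def build_color_model(frame_bgr, window):
    """Hue histogram of the initial target region, used by mean-shift."""
    x, y, w, h = window
    hsv = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [180], [0, 180])
    return cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

def track_step(frame_bgr, template_gray, roi_hist, window):
    """One tracking step: coarse SSD localisation, then mean-shift refinement."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    th, tw = template_gray.shape

    # SSD (sum-of-squared-differences) matching handles large displacements.
    scores = cv2.matchTemplate(gray, template_gray, cv2.TM_SQDIFF_NORMED)
    min_val, _, min_loc, _ = cv2.minMaxLoc(scores)
    window = (min_loc[0], min_loc[1], tw, th)

    # Color-based mean-shift refines the window around the SSD estimate.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    _, window = cv2.meanShift(back_proj, window, criteria)

    # Crude stand-in for the paper's likelihood-ratio weighting: freeze the
    # SSD template when the residual is high (suspected partial occlusion),
    # so the appearance model does not drift onto the occluder.
    occluded = min_val > 0.2  # threshold chosen arbitrarily for illustration
    if not occluded:
        x, y, w, h = window
        template_gray = gray[y:y + h, x:x + w].copy()

    return window, template_gray, occluded
```

    In use, `window` would be initialised from a bounding box on the first frame, `template_gray` cropped from that frame's grayscale image, and `roi_hist` built with `build_color_model`; later frames would call `track_step` in a loop.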