Placement, visibility and coverage analysis of dynamic pan/tilt/zoom camera sensor networks

Abstract

Multi-camera vision systems have important applications in a number of fields, including robotics and security. One interesting problem related to multi-camera vision systems is determining the effect of camera placement on the quality of service provided by a network of Pan/Tilt/Zoom (PTZ) cameras with respect to a specific image processing application. The goal of this work is to investigate how to place a team of PTZ cameras, potentially used for collaborative tasks such as surveillance, and to analyze the dynamic coverage they can provide. Computational Geometry approaches to various formulations of sensor placement problems have been shown to offer very elegant solutions; however, they often involve unrealistic assumptions about real-world sensors, such as infinite sensing range and infinite rotational speed. Other solutions to camera placement have attempted to account for the constraints of real-world computer vision applications, but offer only approximate solutions over a discrete problem space. A contribution of this work is an algorithm for camera placement that leverages Computational Geometry principles over a continuous problem space, utilizing a model for dynamic camera coverage that is simple yet representative. This offers a balance between accounting for real-world application constraints and keeping the problem tractable.
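The abstract contrasts idealized sensors (infinite range, infinite rotational speed) with a dynamic coverage model that respects real PTZ limits. The paper's actual model is not reproduced here; the following is only a minimal sketch, under assumed parameter names (`fov`, `speed`, `range`), of what a dynamic coverage predicate with finite sensing range and finite pan speed might look like: a point is coverable within time t if it lies inside the sensing range and the camera can rotate it into the field of view in at most t seconds.

```python
import math

def can_cover(camera, point, t):
    """Hypothetical dynamic-coverage predicate for a single PTZ camera.

    camera: dict with keys x, y (position), pan (current pan angle, deg),
            fov (horizontal field of view, deg), speed (pan speed, deg/s),
            range (maximum sensing range). All names are illustrative
    point:  (px, py) target location.
    t:      time budget in seconds.

    Unlike classic art-gallery formulations, both the sensing range and
    the rotational speed are finite here.
    """
    dx, dy = point[0] - camera["x"], point[1] - camera["y"]
    if math.hypot(dx, dy) > camera["range"]:
        return False  # beyond the finite sensing range
    bearing = math.degrees(math.atan2(dy, dx))
    # smallest unsigned angular difference between current pan and bearing
    diff = abs((bearing - camera["pan"] + 180.0) % 360.0 - 180.0)
    # rotation needed before the point enters the field of view
    rotation = max(0.0, diff - camera["fov"] / 2.0)
    return rotation / camera["speed"] <= t

cam = {"x": 0.0, "y": 0.0, "pan": 0.0,
       "fov": 60.0, "speed": 90.0, "range": 10.0}
can_cover(cam, (5.0, 0.0), 0.0)  # already in view: True
can_cover(cam, (0.0, 5.0), 0.5)  # needs 60 deg of panning at 90 deg/s: False
```

The dynamic coverage of a camera network would then be the union of such per-camera reachable sectors, which grows with the time budget t until it saturates at each camera's full pan range.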
