
    leave a trace - A People Tracking System Meets Anomaly Detection

    Video surveillance has always had a negative connotation, among other reasons because of the loss of privacy and because it may not automatically increase public safety. If it were able to detect atypical (i.e. dangerous) situations in real time, autonomously and anonymously, this could change. A prerequisite for this is the reliable automatic detection of possibly dangerous situations from video data. Classically, this is done by object extraction and tracking. From the derived trajectories, we then want to determine dangerous situations by detecting atypical trajectories. However, for ethical reasons it is better to develop such a system on data in which nobody is threatened or harmed, and in which people know that such a tracking system is installed. Another important point is that these situations do not occur very often in real, public CCTV areas and may be captured properly even less often. In the artistic project leave a trace, the tracked objects, people in an atrium of an institutional building, become actors and thus part of the installation. Real-time visualisation allows these actors to interact, which in turn creates many atypical interaction situations on which we can develop our situation detection. The data set has evolved over three years and is therefore large. In this article we describe the tracking system and several approaches for the detection of atypical trajectories.
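
    The abstract does not spell out the detectors themselves, but the basic idea of scoring trajectories as atypical can be illustrated with a minimal sketch: resample each trajectory to a fixed number of points and score a candidate by its distance to the nearest trajectory in a reference set of typical ones. The helper names, the resampling length and the threshold below are illustrative assumptions, not the methods described in the article.

        import numpy as np

        def resample(traj, n=32):
            """Resample a trajectory (sequence of (x, y) points) to n points spaced
            equally along its arc length, so paths of different lengths compare."""
            traj = np.asarray(traj, dtype=float)
            seg = np.linalg.norm(np.diff(traj, axis=0), axis=1)
            s = np.concatenate([[0.0], np.cumsum(seg)])
            if s[-1] == 0.0:                       # stationary object
                return np.repeat(traj[:1], n, axis=0)
            t = np.linspace(0.0, s[-1], n)
            return np.column_stack([np.interp(t, s, traj[:, 0]),
                                    np.interp(t, s, traj[:, 1])])

        def anomaly_score(traj, typical_trajs, n=32):
            """Mean pointwise distance to the nearest trajectory in a set of
            typical ones; large values indicate an atypical path."""
            q = resample(traj, n)
            return min(np.mean(np.linalg.norm(q - resample(r, n), axis=1))
                       for r in typical_trajs)

        # Hypothetical usage with made-up coordinates and threshold.
        typical = [[(0, 0), (1, 0), (2, 0), (3, 0)],
                   [(0, 1), (1, 1), (2, 1), (3, 1)]]
        candidate = [(0, 0), (1, 2), (2, 4), (3, 6)]      # cuts across the area
        print(anomaly_score(candidate, typical) > 1.0)    # True -> flag as atypical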

    Traffic Observation and Situation Assessment

    The use of camera systems for surveillance tasks (e.g. traffic monitoring) has become a standard procedure and has been practised for over 20 years. However, most cameras are operated locally and the data analyzed manually. Locally means here a limited field of view and image sequences that are processed independently of other cameras. To enlarge the observation area and to avoid occlusions and non-accessible areas, multi-camera systems with overlapping and non-overlapping views are used. The joint processing of image sequences from a multi-camera system is a scientific and technical challenge. The processing is traditionally divided into camera calibration, object detection, tracking and interpretation. The fusion of information from different cameras is carried out in the world coordinate system. To reduce the network load, a distributed processing concept can be implemented. Object detection and tracking are fundamental image processing tasks for scene evaluation. Situation assessments are based mainly on characteristic local movement patterns (e.g. directions and speeds), from which trajectories are derived. Atypical movement patterns of each detected object can be recognized by comparing local properties of the trajectories, and interactions of different objects can also be predicted with an additional classification algorithm. This presentation discusses trajectory-based recognition algorithms for atypical event detection in multi-object scenes to obtain area-based information (e.g. maps of speed patterns, trajectory curvatures or erratic movements) and shows that two-dimensional areal analysis of moving objects with multiple cameras offers new possibilities for situational analysis.
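
    As a rough illustration of the processing chain described above (projection of detections into a common world coordinate system, local movement properties, area-based maps), the sketch below projects image points through a ground-plane homography, computes per-step speed and heading change, and accumulates a grid map of mean speeds. The homographies, grid resolution and sampling interval are assumptions made for the example, not the algorithms used in the presentation.

        import numpy as np

        def to_world(points_px, H):
            """Project image points into the shared world frame via a 3x3
            ground-plane homography H (assumed known from camera calibration)."""
            p = np.column_stack([np.asarray(points_px, dtype=float),
                                 np.ones(len(points_px))])
            w = p @ H.T
            return w[:, :2] / w[:, 2:3]

        def local_properties(traj, dt=1.0):
            """Per-step speed and heading change of an (N, 2) world trajectory
            sampled every dt seconds; large heading changes mark erratic movement."""
            traj = np.asarray(traj, dtype=float)
            v = np.diff(traj, axis=0) / dt
            speed = np.linalg.norm(v, axis=1)
            turn = np.abs(np.diff(np.unwrap(np.arctan2(v[:, 1], v[:, 0]))))
            return speed, turn

        def speed_map(trajs, cell=1.0, shape=(50, 50), dt=1.0):
            """Mean speed per grid cell over many trajectories: an area-based
            map of typical speeds (cell size and grid shape are assumptions)."""
            acc, cnt = np.zeros(shape), np.zeros(shape)
            for traj in trajs:
                traj = np.asarray(traj, dtype=float)
                speed, _ = local_properties(traj, dt)
                for (x, y), s in zip(traj[1:], speed):
                    i, j = int(y // cell), int(x // cell)
                    if 0 <= i < shape[0] and 0 <= j < shape[1]:
                        acc[i, j] += s
                        cnt[i, j] += 1
            return np.divide(acc, cnt, out=np.zeros(shape), where=cnt > 0)

        # Hypothetical usage: one camera with a placeholder calibration.
        H_cam1 = np.eye(3)
        traj = to_world([(10, 5), (12, 5), (15, 6), (15, 9)], H_cam1)
        speed, turn = local_properties(traj, dt=0.5)
        print(speed, turn)                 # per-step speed and heading change
        print(speed_map([traj], cell=2.0, shape=(10, 10), dt=0.5))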