
    Autonomous real-time surveillance system with distributed IP cameras

    An autonomous Internet Protocol (IP) camera-based object tracking and behaviour identification system, capable of running in real time on an embedded system with limited memory and processing power, is presented in this paper. The main contribution of this work is the integration of processor-intensive image-processing algorithms on an embedded platform capable of running in real time for monitoring the behaviour of pedestrians. The Algorithm Based Object Recognition and Tracking (ABORAT) system architecture presented here was developed on an Intel PXA270-based development board clocked at 520 MHz. The platform was connected to a commercial stationary IP-based camera in a remote monitoring station for intelligent image processing. The system is capable of detecting moving objects and their shadows in a complex environment with varying lighting intensity and moving foliage. Objects moving close to each other are also detected, and their trajectories are extracted and fed into an unsupervised neural network for autonomous classification. The novel intelligent video system presented is also capable of performing simple analytic functions such as tracking and generating alerts when objects enter/leave regions or cross tripwires superimposed on the live video by the operator.
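The tripwire alert described above reduces to a segment-intersection test between an object's displacement across two frames and the operator-drawn line. A minimal sketch, not the paper's implementation (function names and coordinates are illustrative):

```python
def _side(a, b, p):
    """Signed area: which side of segment a->b the point p lies on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def crosses_tripwire(prev_pos, curr_pos, wire_a, wire_b):
    """True if the object's motion from prev_pos to curr_pos properly
    crosses the tripwire segment wire_a -> wire_b."""
    return (_side(wire_a, wire_b, prev_pos) * _side(wire_a, wire_b, curr_pos) < 0
            and _side(prev_pos, curr_pos, wire_a) * _side(prev_pos, curr_pos, wire_b) < 0)

# An object moving upward across a horizontal wire triggers an alert:
print(crosses_tripwire((0, -1), (0, 1), (-1, 0), (1, 0)))  # True
```

Running this per tracked object per frame pair is cheap enough for the embedded setting the abstract describes.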

    Design and Implementation of Moving Object Visual Tracking System using μ-Synthesis Controller

    Considering the increasing use of security and surveillance systems, moving object tracking systems are an interesting research topic in the field of computer vision. In general, a moving object tracking system consists of two integrated parts, namely the video tracking part, which predicts the position of the target in the image plane, and the visual servo part, which controls the movement of the camera to follow the movement of objects in the image plane. For tracking purposes, the camera is used as a visual sensor and mounted on a 2-DOF (yaw-pitch) manipulator platform with an eye-in-hand camera configuration. Although its operation is relatively simple, the yaw-pitch camera platform still needs a good control method to improve its performance. In this study, we propose a moving object tracking system on a prototype yaw-pitch platform. A μ-synthesis controller was used to control the movement of the visual servo part and keep the target in the center of the image plane. The experiments showed that the proposed system works well in real-time conditions, with high tracking accuracy in both indoor and outdoor environments.
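The visual-servo objective, keeping the target at the image center, starts from a normalized pixel error that the controller drives to zero. The μ-synthesis design itself is beyond a short sketch, so a toy proportional law stands in for it here; all names and the gain value are assumptions, not from the paper:

```python
def centering_error(target_px, frame_w, frame_h):
    """Normalized image-plane error in [-1, 1] per axis:
    (0, 0) means the target sits at the image center."""
    ex = (target_px[0] - frame_w / 2) / (frame_w / 2)
    ey = (target_px[1] - frame_h / 2) / (frame_h / 2)
    return ex, ey

def proportional_command(err, gain=10.0):
    """Toy proportional stand-in for the mu-synthesis controller:
    maps normalized error to yaw/pitch rate commands (deg/s)."""
    return -gain * err[0], -gain * err[1]

ex, ey = centering_error((480, 180), 640, 360)
print(ex, ey)                         # 0.5 0.0
print(proportional_command((ex, ey))) # (-5.0, -0.0)
```

The real controller replaces the fixed gain with one synthesized for robustness against plant uncertainty, but it consumes the same error signal.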

    3GSS Intelligent Tracking System using CAMSHIFT Algorithm

    The aim of this system is to track a person of interest in real time based on scenes obtained from camera modules. Because monitoring is automatic, the system avoids potential human errors. It is a histogram-based real-time tracking system in which moving objects are detected using background subtraction; moving object detection, recognition, and tracking are the basic steps in processing the video frames.
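CAMSHIFT iterates mean shift over a histogram back-projection: each step re-centers the search window on the centroid of the probability mass inside it, then (in the full algorithm) adapts the window size. A minimal pure-Python mean-shift core over a toy back-projection grid (grid and window values are illustrative, not from the paper):

```python
def mean_shift(prob, win, iters=10):
    """prob: 2D grid of back-projection weights, indexed prob[y][x].
    win: (x, y, w, h) search window. Returns the converged window."""
    x, y, w, h = win
    for _ in range(iters):
        m = mx = my = 0.0
        for yy in range(y, y + h):
            for xx in range(x, x + w):
                p = prob[yy][xx]
                m += p; mx += p * xx; my += p * yy
        if m == 0:
            break  # no probability mass under the window
        nx = int(mx / m - w / 2 + 0.5)  # re-center on the centroid
        ny = int(my / m - h / 2 + 0.5)
        if (nx, ny) == (x, y):
            break  # converged
        x, y = nx, ny
    return x, y, w, h

# A 3x3 blob of weight centered at (7, 7) pulls the window onto itself:
grid = [[1.0 if 6 <= gx <= 8 and 6 <= gy <= 8 else 0.0 for gx in range(10)]
        for gy in range(10)]
print(mean_shift(grid, (4, 4, 4, 4)))  # (5, 5, 4, 4)
```

In practice the grid comes from back-projecting the target's hue histogram onto each frame, which is what makes the tracker "histogram based".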

    Real Time Object Detection & Tracking System (locally and remotely) with Rotating Camera

    The task of real-time detection and tracking of a moving object in a video stream is quite challenging if the camera itself is moving. This paper presents an implementation of real-time detection and tracking of an unknown object in a video stream with a 360° (azimuth) rotating camera. It also presents the adaptation of different object-tracking algorithms and their effect on the implementation. The system described in this paper contains a camera connected to an embedded system (standalone board) or a PC/laptop. The board/PC runs an image-processing algorithm that first detects an object and then tracks it as long as it is in the line of sight of the camera. As the object moves, the PC/laptop/embedded board signals a stepper motor, on which the camera is mounted, to rotate the camera. The user has multiple options for monitoring the object in the video. With a laptop/PC this is straightforward because a screen is already available; with the embedded board, the user can monitor the activity of the object of interest via HDMI output or by streaming video to a web server. The object can be defined directly by the end user by selecting a portion of the frame in the video stream. The embedded board/PC also saves the video stream to a storage device for playback purposes. DOI: 10.17762/ijritcc2321-8169.150512
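The "signal to motor" step amounts to converting a pixel offset into a step count, given the camera's horizontal field of view and the motor's step angle. A hedged sketch; the 60° FOV and 1.8°/step figures are assumed example values, not from the paper:

```python
def pixels_to_steps(err_px, frame_w, fov_deg, deg_per_step=1.8):
    """Convert a horizontal pixel error into a stepper-motor step count.
    Positive result -> rotate toward the object (small-angle approximation:
    degrees per pixel taken as uniform across the frame)."""
    deg_per_px = fov_deg / frame_w
    return round(err_px * deg_per_px / deg_per_step)

# Object 160 px right of center, 640-px-wide frame, 60-degree FOV:
print(pixels_to_steps(160, 640, 60.0))  # 8
```

Issuing the steps, then re-detecting before commanding again, avoids over-rotating while the object keeps moving.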

    Robust object tracking algorithms using C++ and MATLAB

    Object tracking is, in general, a challenging problem. Difficulties in tracking objects arise from abrupt object motion, changes in scene appearance, changes in object appearance, and non-rigid object structures. In addition, full and partial occlusions and camera motion pose further challenges. Commonly, assumptions are made to constrain the tracking problem in the context of a particular application. It is often necessary to track all the moving objects in a real-time video. Tracking using colour performs well when the colour of the target is distinct from its background. Tracking using contours as a feature is very effective even for non-rigid targets. Tracking using spatial histograms gives satisfactory results even when the target object changes size or has a similarly coloured background. In this project, robust algorithms based on colour, contours, and spatiograms for tracking moving objects have been studied, proposed, and implemented.
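Colour-based trackers of the kind surveyed above typically score candidate windows by comparing their colour histogram against the target's, for example with the Bhattacharyya coefficient (1 for identical distributions, 0 for disjoint ones). A minimal sketch under that assumption:

```python
import math

def bhattacharyya(h1, h2):
    """Bhattacharyya coefficient between two histograms of equal length.
    Normalizes both internally; higher means more similar."""
    s1, s2 = sum(h1), sum(h2)
    return sum(math.sqrt((a / s1) * (b / s2)) for a, b in zip(h1, h2))

print(bhattacharyya([4, 0, 0], [8, 0, 0]))  # 1.0  (same colour distribution)
print(bhattacharyya([1, 0], [0, 1]))        # 0.0  (disjoint colours)
```

Spatiograms extend this idea by weighting each bin with the spatial mean and covariance of its pixels, which is what lets them survive size changes and similarly coloured backgrounds.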

    Realtime object extraction and tracking with an active camera using image mosaics

    Moving object extraction plays a key role in applications such as object-based videoconferencing and surveillance. The difficulty of moving object segmentation lies in the fact that physical objects are normally not homogeneous with respect to low-level features, so it is usually hard to segment them accurately and efficiently. Object segmentation based on pre-stored background information has proved to be effective and efficient in several applications, such as videophone, video conferencing, and surveillance. Previous works, however, mainly concentrated on object segmentation with a static camera and a stationary background. In this paper, we propose a robust and fast segmentation algorithm and a reliable tracking strategy that do not require knowing the shape of the object in advance. The proposed system can extract the foreground from the background in real time and track the moving object with an active (pan-tilt) camera such that the moving object always stays near the center of the image.
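Segmentation against pre-stored background information reduces, in its simplest form, to thresholding per-pixel differences between the current frame and the (mosaic-compensated) background. A toy grayscale sketch; the threshold value is an assumed example, not from the paper:

```python
def foreground_mask(frame, background, thresh=30):
    """1 where the frame differs from the stored background, else 0.
    frame/background: 2D grids of grayscale intensities."""
    return [[1 if abs(f - b) > thresh else 0 for f, b in zip(fr, br)]
            for fr, br in zip(frame, background)]

bg    = [[10, 10, 10], [10, 10, 10]]
frame = [[10, 200, 10], [10, 10, 10]]
print(foreground_mask(frame, bg))  # [[0, 1, 0], [0, 0, 0]]
```

The mosaic in the paper's approach supplies a background estimate for every pan-tilt pose, so this comparison stays valid while the camera moves.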

    Optical Flow Background Estimation for Real-time Pan/tilt Camera Object Tracking

    As computer vision (CV) techniques develop, pan/tilt camera systems can enhance data-capture capabilities beyond those of static camera systems. For these systems to be effective for metrology purposes, they need to respond to the test article in real time with a minimum of additional uncertainty. A methodology is presented here for obtaining high-resolution, high frame-rate images of objects traveling at speeds ⩾1.2 m/s at 1 m from the camera by tracking the moving texture of an object. Strong corners are determined and used as flow points, with implementations on a graphics processing unit (GPU) resulting in significant speed-up over central processing units (CPUs). Based on the directed pan/tilt motion, a pixel-to-pixel relationship is used to estimate whether optical-flow points fit background motion, dynamic motion, or noise. To smooth variation, a two-dimensional position and velocity vector is used with a Kalman filter to predict the next required position of the camera so the object stays centered in the image. High-resolution images can be stored by a parallel process, resulting in a high frame-rate sequence of images for post-processing. The results provide real-time tracking on a portable system using a pan/tilt unit for generic moving targets, where no training is required and camera motion is observed from high-accuracy encoders as opposed to image correlation.
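The Kalman prediction step above can be illustrated with a simplified constant-velocity tracker: an alpha-beta filter, which is a fixed-gain stand-in for the full Kalman filter (the gains here are assumed values, not the paper's):

```python
def track(measurements, alpha=0.85, beta=0.5, dt=1.0):
    """Alpha-beta filter: fixed-gain constant-velocity tracking along one
    axis. Returns the predicted position one step past the last measurement,
    i.e. where the camera should point next."""
    x = v = 0.0
    for z in measurements:
        x += v * dt            # predict with current velocity estimate
        r = z - x              # innovation (measurement residual)
        x += alpha * r         # correct position estimate
        v += beta * r / dt     # correct velocity estimate
    return x + v * dt          # one-step-ahead prediction

# Target moving one unit per frame; the prediction locks onto the true path:
pred = track([float(k) for k in range(1, 11)])
print(round(pred, 2))  # ~11.0
```

A full Kalman filter recomputes these gains each step from the process and measurement noise covariances, which is what lets it also separate flow points into background, dynamic motion, and noise as the abstract describes.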

    e-TLD: Event-based Framework for Dynamic Object Tracking

    This paper presents a long-term object tracking framework with a moving event camera under general tracking conditions. A first of its kind for these revolutionary cameras, the tracking framework uses a discriminative representation for the object with online learning, and detects and re-tracks the object when it comes back into the field of view. One of the key novelties is the use of an event-based local sliding-window technique that tracks reliably in scenes with cluttered and textured backgrounds. In addition, Bayesian bootstrapping is used to assist real-time processing and boost the discriminative power of the object representation. When the object re-enters the field of view of the camera, a data-driven, global sliding-window detector locates the object for subsequent tracking. Extensive experiments demonstrate the ability of the proposed framework to track and detect arbitrary objects of various shapes and sizes, including dynamic objects such as a human. This is a significant improvement over earlier works that simply track objects as long as they are visible under simpler background settings. Using the ground-truth locations of five different objects under three motion settings, namely translation, rotation, and 6-DOF, quantitative measurements are reported for the event-based tracking framework, with critical insights on various performance issues. Finally, a real-time implementation in C++ highlights tracking ability under scale, rotation, viewpoint, and occlusion scenarios in a lab setting.
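Event cameras emit asynchronous (timestamp, x, y, polarity) tuples rather than frames, so a sliding-window technique of the kind described operates on the set of pixels that fired within a short time slice. A minimal sketch of slicing an event stream (names and timestamps are illustrative, not from the paper):

```python
def event_slice(events, t0, t1):
    """Pixels that fired at least one event in the half-open window [t0, t1).
    events: iterable of (t, x, y, polarity) tuples."""
    return {(x, y) for (t, x, y, pol) in events if t0 <= t < t1}

stream = [(0.001, 5, 5, 1), (0.002, 6, 5, -1), (0.030, 9, 9, 1)]
print(sorted(event_slice(stream, 0.0, 0.010)))  # [(5, 5), (6, 5)]
```

Sliding the window through time yields a sequence of such pixel sets on which a discriminative appearance model can be scored locally around the last known object position.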

    Real-time moving objects tracking for distributed smart video surveillances

    Tracking the object of interest within a camera's view is essential for crime prevention. This study focuses on analyzing video surveillance in public places. It presents a novel approach to tracking moving objects across non-overlapping camera views that gives each object a consistent label throughout the whole multi-camera system in real time. The proposed algorithm is also expected to handle common problems in multi-camera object tracking, including variation in poses, changes in object appearance, and occlusion. The proposed algorithm was formulated based on visual and temporal cues for multiple cameras, using entering/exiting and merging/splitting cases to deal with appearance changes and occlusion problems. Spatial cues are adopted in single-camera object tracking for real-time performance. A novel object-segmentation technique based on the observed binary mask values is presented to deal with pose variation across different cameras. In the results section, a comparison between past works and the proposed tracking algorithm is presented, along with the experimental results.
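Label hand-off across non-overlapping views, as described above, can be sketched as gating candidates by plausible transit time between an exit and an entry zone, then picking the closest appearance match. The transit window and distance function below are illustrative assumptions, not the paper's method:

```python
def match_across_cameras(exit_obj, candidates, t_min=2.0, t_max=10.0):
    """exit_obj / candidates: dicts with 't' (timestamp, seconds) and 'feat'
    (appearance vector). Returns the best temporally plausible candidate
    by Euclidean appearance distance, or None if nothing is plausible."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    plausible = [c for c in candidates
                 if t_min <= c["t"] - exit_obj["t"] <= t_max]
    return min(plausible, key=lambda c: dist(c["feat"], exit_obj["feat"]),
               default=None)

left = {"t": 0.0, "feat": [0.9, 0.1]}
seen = [{"t": 4.0,  "feat": [0.88, 0.12]},  # right time, similar appearance
        {"t": 4.5,  "feat": [0.1, 0.9]},    # right time, wrong appearance
        {"t": 30.0, "feat": [0.9, 0.1]}]    # too late to be the same object
print(match_across_cameras(left, seen)["t"])  # 4.0
```

The entering/exiting and merging/splitting cases in the abstract refine exactly this matching step, which is where consistent labels across cameras come from.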