    Coaxial optical structure for iris recognition from a distance

    Supporting an unconstrained user interface is an important issue in iris recognition. Various methods try to remove the constraint of the iris being placed close to the camera, including portal-based and pan-tilt-zoom (PTZ)-based solutions. Generally speaking, a PTZ-based system has two cameras: a scene camera and an iris camera. The scene camera detects the eye's location and passes this information to the iris camera, which captures a high-resolution image of the person's iris. Existing PTZ-based systems are divided into separate types and parallel types, according to how the scene camera and iris camera are combined. This paper proposes a novel PTZ-based iris recognition system in which the iris camera and the scene camera are combined in a coaxial optical structure. The two cameras are placed together orthogonally and a cold mirror is inserted between them, such that the optical axes of the two cameras become coincident. Owing to the coaxial optical structure, the proposed system does not need the optical-axis displacement compensation required in parallel-type systems. Experimental results show that the coaxial type can acquire an iris image more quickly and accurately than a parallel type when the stand-off distance is between 1.0 and 1.5 m. © 2011 Society of Photo-Optical Instrumentation Engineers (SPIE). This work was supported by the Korea Science and Engineering Foundation (KOSEF) through the Biometrics Engineering Research Center (BERC) at Yonsei University [R112002105070020 (2010)].
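    To illustrate why parallel-type systems need the displacement compensation that the coaxial design avoids, the following minimal sketch (hypothetical baseline and stand-off values, not taken from the paper) computes the angular offset between the scene-camera and iris-camera axes in a parallel arrangement; a coaxial arrangement corresponds to a zero baseline and therefore a zero correction.

        import math

        def parallax_correction_deg(baseline_m: float, standoff_m: float) -> float:
            """Angle by which the iris camera's axis must be corrected in a
            parallel-type system; a coaxial system has baseline_m == 0."""
            return math.degrees(math.atan2(baseline_m, standoff_m))

        # Hypothetical 10 cm baseline over the paper's 1.0-1.5 m working range.
        for d in (1.0, 1.25, 1.5):
            print(f"stand-off {d:.2f} m -> correction {parallax_correction_deg(0.10, d):.2f} deg")

    The correction shrinks with distance but never vanishes for a nonzero baseline, which is why a parallel-type system must estimate the stand-off distance before re-aiming the iris camera.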

    A design framework for ISFAR: (an intelligent surveillance system with face recognition).

    Chan, Fai. Thesis (M.Phil.), Chinese University of Hong Kong, 2008. Includes bibliographical references (leaves 104-108). Abstracts in English and Chinese.
    Table of contents:
    Chapter 1. Introduction
        1.1. Background
            1.1.1. Introduction to Intelligent Surveillance System (ISS)
            1.1.2. Typical architecture of Surveillance System
            1.1.3. Single-camera vs Multi-camera Surveillance System
            1.1.4. Intelligent Surveillance System with Face Recognition (ISFAR)
            1.1.5. Minimal requirements for automatic Face Recognition
        1.2. Motivation
        1.3. Major Contributions
            1.3.1. A unified design framework for ISFAR
            1.3.2. Prototyping of ISFAR (ISFARO)
            1.3.3. Evaluation of ISFARO
        1.4. Thesis Organization
    Chapter 2. Related Works
        2.1. Distant Human Identification (DHID)
        2.2. Distant Targets Identification System
        2.3. Virtual Vision System with Camera Scheduling
    Chapter 3. A unified design framework for ISFAR
        3.1. Camera system modeling
            3.1.1. Stereo Triangulation (Human face location estimation)
            3.1.2. Camera system calibration
        3.2. Human face detection
        3.3. Human face tracking
        3.4. Human face correspondence
            3.4.1. Information consistency in stereo triangulation
            3.4.2. Proposed object correspondence algorithm
        3.5. Human face location and velocity estimation
        3.6. Human-Camera Synchronization
            3.6.1. Controlling a PTZ camera for capturing human facial images
            3.6.2. Mathematical formulation of the human face capturing problem
    Chapter 4. Prototyping of ISFAR (ISFARO)
        4.1. Experiment Setup
        4.2. Speed of the PTZ camera - AXIS 213 PTZ
        4.3. Performance of human face detection and tracking
        4.4. Performance of human face correspondence
        4.5. Performance of human face location estimation
        4.6. Stability test of the Human-Camera Synchronization model
        4.7. Performance of ISFARO in capturing human facial images
        4.8. System Profiling of ISFARO
        4.9. Summary
    Chapter 5. Improvements to ISFARO
        5.1. System Dynamics of ISFAR
        5.2. Proposed improvements to ISFARO
            5.2.1. Semi-automatic camera system calibration
            5.2.2. Velocity estimation using Kalman filter
            5.2.3. Reduction in PTZ camera delay
            5.2.4. Compensation of image blurriness due to motion from the human
        5.3. Experiment Setup
        5.4. Performance of human face location estimation
        5.5. Speed of the PTZ camera - SONY SNC RX-570
        5.6. Performance of human face velocity estimation
        5.7. Performance of improved ISFARO in capturing human facial images
    Chapter 6. Conclusions
    Chapter 7. Bibliography
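    The framework's face-location step (Section 3.1.1 above) rests on stereo triangulation. As a minimal illustration, assuming a rectified stereo pair with focal length f (pixels), baseline B (metres), and disparity d (pixels), with all values below hypothetical rather than taken from the thesis, the depth of a matched face point is Z = f * B / d:

        def triangulate_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
            """Depth Z = f * B / d of a point seen in a rectified stereo pair."""
            if disparity_px <= 0:
                raise ValueError("disparity must be positive for a point in front of the rig")
            return focal_px * baseline_m / disparity_px

        # Hypothetical rig: 800 px focal length, 0.3 m baseline, 24 px disparity.
        print(triangulate_depth(800.0, 0.3, 24.0))  # -> 10.0 (metres)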

    CCTV Technology Handbook

    This CCTV Technology Handbook provides emergency responders, law enforcement security managers, and other security specialists with a reference to aid in planning, designing, and purchasing a CCTV system. This handbook includes a description of the capabilities and limitations of CCTV components used in security applications.

    A robust feature tracker for active surveillance of outdoor scenes

    In this paper, we propose a robust real-time object detection system for outdoor image sequences acquired by an active camera. The system is able to compensate for background changes due to camera motion and to detect mobile objects in the scene. Background compensation is performed by assuming a simple translation (displacement vector) of the background from the previous frame to the current one and by applying the well-known tracker proposed by Lucas and Kanade. A reference map containing all well-trackable features is maintained and updated by the system at each frame by introducing new good features related to new regions that appear in the current image. A new method is applied to reject badly tracked features. The current frame and the compensated background are processed by a change detection method in order to locate mobile objects. Results are presented in the context of a visual-based surveillance system for monitoring outdoor environments.
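    As a rough sketch of this pipeline (not the authors' implementation; it uses OpenCV's standard Lucas-Kanade tracker and a plain frame difference, with the thresholds and feature counts chosen purely for illustration):

        import cv2
        import numpy as np

        def detect_moving_objects(prev_gray, curr_gray, diff_thresh=30):
            # Well-trackable features in the previous frame (the "reference map").
            pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                          qualityLevel=0.01, minDistance=7)
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
            good = status.ravel() == 1          # discard badly tracked features
            flow = (nxt - pts).reshape(-1, 2)[good]
            dx, dy = np.median(flow, axis=0)    # dominant background displacement vector

            # Compensate the background by translating the previous frame.
            h, w = prev_gray.shape
            M = np.float32([[1, 0, dx], [0, 1, dy]])
            compensated = cv2.warpAffine(prev_gray, M, (w, h))

            # Change detection between the compensated background and the current frame.
            diff = cv2.absdiff(curr_gray, compensated)
            _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
            return mask

    Here the median over feature displacements stands in for the paper's rejection of badly tracked features; pixels above the difference threshold are candidate mobile objects.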

    System management algorithms for distributed vision networks

    [no abstract]

    A Photogrammetry-Based Hybrid System for Dynamic Tracking and Measurement

    Noncontact measurements of lightweight flexible aerospace structures present several challenges. Objects are usually mounted on a test stand because current noncontact measurement techniques require that the net motion of the object be zero. However, it is often desirable to take measurements of the object under operational conditions, and in the case of miniature aerial vehicles (MAVs) and deploying space structures, the test article will undergo significant translational motion. This thesis describes a hybrid noncontact measurement system which enables measurement of the structural kinematics of an object moving freely about a volume. By using a real-time videogrammetry system, a set of pan-tilt-zoom (PTZ) cameras is coordinated to track large-scale net motion and produce high-speed, high-quality images for photogrammetric surface reconstruction. The design of the system is presented in detail. A method of generating the calibration parameters for the PTZ cameras is presented, evaluated, and shown to produce good results. The results of camera synchronization tests and a tracking accuracy evaluation are presented as well. Finally, a demonstration of the hybrid system is presented in which all four PTZ cameras track an MAV in flight.
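    A minimal sketch of the pointing step such a coordinated PTZ tracker needs, converting a target position expressed in a camera-centred frame into pan and tilt commands (the frame convention and the example coordinates are assumptions for illustration, not values from the thesis):

        import math

        def pan_tilt_to_target(x: float, y: float, z: float) -> tuple:
            """Pan and tilt angles (degrees) that aim a camera at a target at (x, y, z)
            in a camera-centred frame: x to the right, y up, z along the optical axis."""
            pan = math.degrees(math.atan2(x, z))
            tilt = math.degrees(math.atan2(y, math.hypot(x, z)))
            return pan, tilt

        # Hypothetical target 2 m right of, 1 m above, and 10 m ahead of the camera.
        print(pan_tilt_to_target(2.0, 1.0, 10.0))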