
    CMOS Vision Sensors: Embedding Computer Vision at Imaging Front-Ends

    CMOS Image Sensors (CIS) are key for imaging technologies. These chips are conceived for capturing optical scenes focused on their surface and for delivering electrical images, commonly in digital format. CISs may incorporate intelligence; however, their smartness basically concerns calibration, error correction and other similar tasks. The term CVIS (CMOS VIsion Sensor) defines another class of sensor front-ends which are aimed at performing vision tasks right at the focal plane. They have been running under names such as computational image sensors, vision sensors and silicon retinas, among others. CVISs and CISs are similar regarding physical implementation. However, while the inputs of both CISs and CVISs are images captured by photo-sensors placed at the focal plane, the primary outputs of CVISs may not be images but either image features or even decisions based on the spatial-temporal analysis of the scenes. We may hence state that CVISs are more "intelligent" than CISs, as they focus on information instead of on raw data. Actually, CVIS architectures capable of extracting and interpreting the information contained in images, and prompting reaction commands thereof, have been explored for years in academia, and industrial applications are recently ramping up. One of the challenges for CVIS architects is incorporating computer vision concepts into the design flow. The endeavor is ambitious because the imaging and computer vision communities are rather disjoint groups talking different languages. The Cellular Nonlinear Network Universal Machine (CNNUM) paradigm, proposed by Profs. Chua and Roska, defined an adequate framework for such conciliation, as it is particularly well suited for hardware-software co-design [1]-[4]. This paper overviews CVIS chips that were conceived and prototyped at the IMSE Vision Lab over the past twenty years. Some of them fit the CNNUM paradigm while others are tangential to it. All of them employ per-pixel mixed-signal processing circuitry to achieve sensor-processing concurrency in the quest for fast operation with a reduced energy budget.
    Funding: Junta de Andalucía TIC 2012-2338; Ministerio de Economía y Competitividad TEC 2015-66878-C3-1-R and TEC 2015-66878-C3-3-
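    The per-pixel dynamics underlying the CNNUM paradigm can be illustrated with a short simulation. The sketch below is a generic forward-Euler discretisation of the standard CNN state equation with 3x3 feedback (A) and feedforward (B) templates; the grid size, template values and step size are illustrative assumptions, not parameters of the chips surveyed in the paper.

```python
def f(x):
    # Standard CNN output nonlinearity: piecewise-linear saturation to [-1, 1]
    return 0.5 * (abs(x + 1.0) - abs(x - 1.0))

def cnn_step(x, u, A, B, z, dt=0.1):
    """One forward-Euler step of the CNN cell dynamics
    dx/dt = -x + sum(A * y) + sum(B * u) + z over a 3x3 neighborhood.
    Boundary cells reuse their nearest in-grid neighbor (clamped indexing);
    real chips may use a different boundary condition."""
    rows, cols = len(x), len(x[0])
    y = [[f(v) for v in row] for row in x]          # cell outputs
    nxt = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            acc = -x[i][j] + z
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    ni = min(max(i + di, 0), rows - 1)
                    nj = min(max(j + dj, 0), cols - 1)
                    acc += A[di + 1][dj + 1] * y[ni][nj]   # feedback template
                    acc += B[di + 1][dj + 1] * u[ni][nj]   # input template
            nxt[i][j] = x[i][j] + dt * acc
    return nxt
```

    With a self-feedback-only template (center of A greater than 1), each cell settles into a saturated binary output, which is the bistable behavior CNN templates exploit.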

    Development of an image converter of radical design

    A long-term investigation of thin-film sensors, monolithic photo-field-effect transistors, and epitaxially diffused phototransistors and photodiodes, aimed at producing an acceptable all-solid-state, electronically scanned imaging system, led to the production of an advanced engineering model camera which employs a 200,000-element phototransistor array (organized in a matrix of 400 rows by 500 columns) to achieve resolution comparable to commercial television. The full investigation is described for the period July 1962 through July 1972 and covers the following broad topics in detail: (1) sensor monoliths; (2) fabrication technology; (3) functional theory; (4) system methodology; and (5) deployment profile. A summary of the work and conclusions is given, along with extensive schematic diagrams of the final solid-state imaging system product

    Hardware Implementation of Image Acquisition System using FPGA & ARM

    In this paper, an image data acquisition system is introduced. A high-speed image data acquisition system based on ARM and FPGA is designed according to the needs of the actual system in image data transmission, and it can be used in data monitoring and surveillance systems. The chosen ARM is a 32-bit embedded RISC microprocessor, which has a rich instruction set and programming flexibility. The FPGA has a great advantage in speed and parallel computing, making it suitable for the real-time requirements of image processing. The interface between the camera module and the FPGA, as well as the interface between the FPGA and the ARM, is implemented using UART. The image from the ARM is transmitted to a PC using Ethernet
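    Moving image data over a byte-oriented link such as the UART interfaces described above requires some framing. The sketch below wraps one image row with a sync byte, a length field and an XOR checksum; this frame format is an assumption made for illustration, as the paper does not specify its protocol.

```python
SYNC = 0xAA  # hypothetical frame delimiter, not taken from the paper

def frame_row(row_bytes):
    """Wrap one image row for transfer over a byte-oriented UART link:
    [SYNC][length][payload...][XOR checksum]. One-byte length limits the
    payload to 255 bytes, so a full row would be split into chunks."""
    if len(row_bytes) > 255:
        raise ValueError("payload too long for 1-byte length field")
    chk = 0
    for b in row_bytes:
        chk ^= b
    return bytes([SYNC, len(row_bytes)]) + bytes(row_bytes) + bytes([chk])

def unframe(frame):
    """Validate the framing on the receiving side and return the payload."""
    if frame[0] != SYNC:
        raise ValueError("bad sync byte")
    n = frame[1]
    payload, chk = frame[2:2 + n], frame[2 + n]
    got = 0
    for b in payload:
        got ^= b
    if got != chk:
        raise ValueError("checksum mismatch")
    return payload
```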

    Efficient Embedded Hardware Architecture for Stabilised Tracking Sighting System of Armoured Fighting Vehicles

    A line-of-sight stabilised sighting system, capable of target tracking and video stabilisation, is a prime requirement of any armoured fighting vehicle for military surveillance and weapon firing. Typically, such sighting systems have three prime electro-optical sensors: a day camera for viewing in day conditions, a thermal camera for night viewing and an eye-safe laser range finder for obtaining the target range. For laser-guided missile firing, an additional laser target designator may be part of the sighting system. This sighting system provides the necessary parameters for the fire control computer to compute ballistic offsets for firing conventional ammunition or missiles. The system demands simultaneous interactions with electro-optical sensors, servo sensors, actuators, a multi-function display for the man-machine interface, the fire control computer, a logic controller and other sub-systems of the tank. Therefore, complex embedded electronics hardware is needed to respond in real time for such a system. An efficient embedded electronics hardware architecture is presented here for the development of this type of sighting system. The hardware has been developed around a SHARC 21369 processor and an FPGA. A performance evaluation scheme is also presented for this sighting system based on the developed hardware

    BLUETOOTH-BASED REMOTE MOBILE SURVEILLANCE ROBOT

    The "Bluetooth-Based Remote Mobile Surveillance Robot" is a project in which a robot is controlled by a computer using Bluetooth as the communication medium. The robot, constructed mainly of a Bluetooth module, a microcontroller, servo motors and wheels, moves according to instructions from the computer, whilst a camera mounted on the robot sends a live video feed via radio frequency for surveillance purposes. This project explores the possibility of combining the use of Bluetooth with microcontrollers whilst serving the purpose of the project. The project offers a cheaper, lower-power-consumption alternative to the mobile surveillance options on the market. Besides, it can easily be expanded with other features such as robotic arms, sensors and others

    Self-Contained Avionics Sensing and Flight Control System for Small Unmanned Aerial Vehicle

    A self-contained avionics sensing and flight control system is provided for an unmanned aerial vehicle (UAV). The system includes sensors for sensing flight control parameters and surveillance parameters, and a Global Positioning System (GPS) receiver. Flight control parameters and location signals are processed to generate flight control signals. A Field Programmable Gate Array (FPGA) is configured to provide a look-up table storing sets of values with each set being associated with a servo mechanism mounted on the UAV and with each value in each set indicating a unique duty cycle for the servo mechanism associated therewith. Each value in each set is further indexed to a bit position indicative of a unique percentage of a maximum duty cycle for the servo mechanism associated therewith. The FPGA is further configured to provide a plurality of pulse width modulation (PWM) generators coupled to the look-up table. Each PWM generator is associated with and adapted to be coupled to one of the servo mechanisms
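    The look-up-table scheme described above can be sketched in software. The snippet below precomputes high-time tick counts for a range of duty-cycle indices and derives one PWM frame from an index, loosely mirroring the patent's idea of indexing duty cycles within a stored set; the clock rate, frame period and pulse range are typical hobby-servo assumptions, not values from the patent.

```python
CLOCK_HZ = 1_000_000                     # assumed PWM tick rate (1 MHz)
PERIOD_S = 0.020                         # 50 Hz servo PWM frame
MIN_PULSE_S, MAX_PULSE_S = 0.001, 0.002  # typical hobby-servo pulse range

def build_lut(steps=256):
    """Precompute a look-up table of high-time tick counts: index i maps
    to the fraction i/(steps-1) of the pulse-width span. On an FPGA this
    table would live in block RAM and feed the PWM generators directly."""
    span = MAX_PULSE_S - MIN_PULSE_S
    return [round((MIN_PULSE_S + span * i / (steps - 1)) * CLOCK_HZ)
            for i in range(steps)]

def pwm_frame(index, lut):
    """Return (high_ticks, low_ticks) for one PWM period at a LUT index."""
    high = lut[index]
    return high, round(PERIOD_S * CLOCK_HZ) - high
```

    A per-servo PWM generator then simply counts ticks, holding the output high for `high_ticks` and low for `low_ticks` each frame.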

    A new high-speed IR camera system

    A multi-organizational team at the Goddard Space Flight Center is developing a new far infrared (FIR) camera system which furthers the state of the art for this type of instrument by incorporating recent advances in several technological disciplines. All aspects of the camera system are optimized for operation at the high data rates required for astronomical observations in the far infrared. The instrument is built around a Blocked Impurity Band (BIB) detector array which exhibits responsivity over a broad wavelength band and which is capable of operating at 1000 frames/sec, and consists of a focal plane dewar, a compact camera head electronics package, and a Digital Signal Processor (DSP)-based data system residing in a standard 486 personal computer. In this paper we discuss the overall system architecture, the focal plane dewar, and advanced features and design considerations for the electronics. This system, or one derived from it, may prove useful for many commercial and/or industrial infrared imaging or spectroscopic applications, including thermal machine vision for robotic manufacturing, photographic observation of short-duration thermal events such as combustion or chemical reactions, and high-resolution surveillance imaging

    Hand Gesture Based Surveillance Robot

    In this work, an integrated hardware and software system is developed for a hand-gesture-based surveillance robot. The proposed system is a non-invasive technique, and the software part of the system uses gesture-based image processing. The hardware part is developed on an AVR microcontroller platform. The captured image of the hand is segmented and its contour is determined. The convexity defects are computed to detect the number of fingers shown by the subject. The number of fingers determines the path the robot is to follow. The camera placed on the robot captures images of its surroundings wherever it travels and sends them back to the PC for monitoring. In this way, it can be used as a surveillance system. Experimental results show an overall accuracy above 90% for gesture recognition, by which the robot is directed to follow the path. The system can be directly applied in defence for enemy detection and for spying purposes where human access is avoided or not recommended. This unit can also help in overcoming physical handicaps by supporting the development of gesture-based wheelchairs and the control of home devices and appliances for persons with physical handicaps and/or elderly users with impaired mobility
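    The defect-based finger counting described above can be sketched as a filtering step over precomputed convexity defects (e.g. as produced upstream by a contour/convex-hull stage). A defect counts as a between-finger valley when it is deep enough and the angle at its farthest point is acute. The depth and angle thresholds and the command mapping below are illustrative assumptions, not the values used in this work.

```python
import math

def count_fingers(defects, depth_thresh=20.0, angle_thresh=math.pi / 2):
    """Count extended fingers from convexity defects. Each defect is a
    (start, end, far, depth) tuple: start/end/far are 2-D points on the
    contour and depth is the defect's distance from the convex hull."""
    valleys = 0
    for start, end, far, depth in defects:
        if depth < depth_thresh:       # ignore shallow contour noise
            continue
        a = math.dist(start, far)
        b = math.dist(end, far)
        c = math.dist(start, end)
        # Angle at the farthest point via the law of cosines
        cos_a = max(-1.0, min(1.0, (a * a + b * b - c * c) / (2 * a * b)))
        if math.acos(cos_a) < angle_thresh:
            valleys += 1
    # n valleys between fingers imply n + 1 extended fingers (if any)
    return valleys + 1 if valleys else 0

# Hypothetical mapping from finger count to robot motion command
COMMANDS = {1: "forward", 2: "backward", 3: "left", 4: "right", 5: "stop"}
```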

    WIRELESS PAN AND TILT SURVEILLANCE PLATFORM

    The main objective of this project is to build a wireless pan and tilt platform. One platform tracks an object or person of interest until it leaves the camera's field of view or the platform limit; the camera on the second platform then continues tracking. This project involves a study of the available wireless systems and how they can be implemented in the project. Examples of available wireless systems are Bluetooth, Wi-Fi, radio frequency (RF) and infrared. Research has been carried out on the availability of each wireless system, and the task focuses on transferring the input data wirelessly from the computer to the servo motor using an RF module and also a Bluetooth module. A testing method is developed by carrying out experiments in transferring the data, and performance measurements are taken into consideration in completing the project. For the RF module, C code for the PIC microcontrollers at the transmitter and receiver sides is developed. For the Bluetooth module, the device is set up to transfer data wirelessly once the link is established