
    Real-time architecture for robust motion estimation under varying illumination conditions

    Motion estimation from image sequences is a complex problem that requires substantial computing resources and, in most existing approaches, is highly sensitive to changes in illumination. In this contribution we present a high-performance system that addresses this limitation. Robustness to varying illumination is achieved by a novel technique that combines a gradient-based optical flow method with a non-parametric image transformation based on the Rank transform. The paper describes this method and quantitatively evaluates its robustness to different patterns of illumination change. The technique has been successfully implemented in a real-time system using reconfigurable hardware. Our contribution presents the computing architecture, including resource consumption and the performance obtained. The final system is a real-time device capable of computing motion from image sequences even under significant illumination changes. The robustness of the proposed system facilitates its use in multiple potential application fields. This work has been supported by the grants DEPROVI (DPI2004-07032), DRIVSCO (IST-016276-2) and TIC2007: ”Plataforma Sw-Hw para sistemas de visión 3D en tiempo real”.
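The Rank transform mentioned in this abstract replaces each pixel by the number of neighbors darker than it, so the result depends only on local intensity ordering and survives monotonic illumination changes. A minimal sketch (window radius and border handling are illustrative choices, not taken from the paper):

```python
import numpy as np

def rank_transform(img, radius=2):
    """Non-parametric Rank transform: each pixel becomes the count
    of pixels in its (2*radius+1)^2 neighborhood whose intensity is
    below the center pixel's. Invariant to monotonic illumination
    changes because only local intensity ordering matters.
    Borders wrap around (np.roll); a real implementation would pad."""
    out = np.zeros(img.shape, dtype=np.uint8)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            out += (shifted < img).astype(np.uint8)
    return out
```

Applying any monotonic brightness change (e.g. gain and offset) to the input leaves the transform unchanged, which is exactly the property the gradient-based flow stage benefits from.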

    FPGA-Based Multimodal Embedded Sensor System Integrating Low- and Mid-Level Vision

    Motion estimation is a low-level vision task that is especially relevant due to its wide range of applications in the real world. Many of the best motion estimation algorithms include features found in mammalian vision, which demand huge computational resources and are therefore not usually available in real time. In this paper we present a novel bioinspired sensor based on the synergy between optical flow and orthogonal variant moments. The bioinspired sensor has been designed for Very Large Scale Integration (VLSI) using properties of the mammalian cortical motion pathway. This sensor combines low-level primitives (optical flow and image moments) in order to produce a mid-level vision abstraction layer. The results are described through experiments showing the validity of the proposed system, together with an analysis of the computational resources and performance of the applied algorithms.
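The "image moments" primitive paired with optical flow here can be illustrated with standard raw moments; the paper's orthogonal variant moments are a specific family, so this generic sketch only shows the underlying idea of summarizing a region by weighted intensity sums:

```python
import numpy as np

def raw_moment(img, p, q):
    """Raw image moment m_pq = sum over pixels of x^p * y^q * I(y, x)."""
    h, w = img.shape
    y, x = np.mgrid[0:h, 0:w]
    return float(np.sum((x ** p) * (y ** q) * img))

def centroid(img):
    """Intensity centroid (m10/m00, m01/m00): a cheap mid-level cue
    whose frame-to-frame drift complements dense optical flow."""
    m00 = raw_moment(img, 0, 0)
    return raw_moment(img, 1, 0) / m00, raw_moment(img, 0, 1) / m00
```

Tracking the centroid across frames gives a global motion estimate at negligible cost, which is one way such moment primitives can support a flow-based sensor.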

    Optical Flow in a Smart Sensor Based on Hybrid Analog-Digital Architecture

    The purpose of this study is to develop a motion sensor (delivering optical flow estimations) using a platform that includes the sensor itself, focal-plane processing resources, and co-processing resources on a general-purpose embedded processor, all implemented on a single device as a SoC (System-on-a-Chip). Optical flow is the 2-D projection onto the camera plane of the 3-D motion present in the observed scene. This motion representation is widely known and applied in the scientific community to solve a wide variety of problems. Most applications based on motion estimation must run in real time, so this restriction has to be taken into account. In this paper, we show an efficient approach to estimating the motion velocity vectors with an architecture based on a focal-plane processor combined on-chip with a 32-bit NIOS II processor. Our approach relies on the simplification of the original optical flow model and its efficient implementation on a platform that combines an analog (focal-plane) and a digital (NIOS II) processor. The system is fully functional and is organized in stages, where the early (focal-plane) stage mainly pre-processes the input image stream to reduce the computational cost of the post-processing (NIOS II) stage. We present the employed co-design techniques and analyze this novel architecture. We evaluate the system's performance and accuracy with respect to the different approaches described in the literature, and discuss the advantages of the proposed approach as well as the degree of efficiency that can be obtained from the focal-plane processing capabilities of the system. The final outcome is a low-cost smart sensor for optical flow computation with real-time performance and reduced power consumption that can be used in very diverse application domains.
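The abstract does not spell out the simplified optical flow model. As a point of reference, a standard gradient-based per-pixel estimate in the Lucas-Kanade style, the kind of model such architectures typically start from and then simplify, can be sketched as follows (window size and least-squares solver are illustrative choices):

```python
import numpy as np

def lk_flow_at(prev, curr, y, x, r=2):
    """Gradient-based (Lucas-Kanade-style) flow at pixel (y, x):
    solve the overdetermined system A v = -b in least squares over a
    (2r+1)^2 window, where A holds the spatial gradients (Ix, Iy)
    and b the temporal derivative It."""
    win_p = prev[y - r:y + r + 1, x - r:x + r + 1].astype(float)
    win_c = curr[y - r:y + r + 1, x - r:x + r + 1].astype(float)
    Iy, Ix = np.gradient(win_p)          # spatial gradients
    It = win_c - win_p                   # temporal derivative
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    v, *_ = np.linalg.lstsq(A, -It.ravel(), rcond=None)
    return v                             # (vx, vy) in pixels/frame
```

On a focal-plane/SoC split of the kind described, the gradient and windowing work is the natural candidate for the analog pre-processing stage, leaving the small linear solve to the digital processor.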

    Three Realizations and Comparison of Hardware for Piezoresistive Tactile Sensors

    Tactile sensors are basically arrays of force sensors that are intended to emulate the skin in applications such as assistive robotics. Local electronics are usually implemented to reduce errors and interference caused by long wires. Realizations based on standard microcontrollers, Programmable Systems on Chip (PSoCs) and Field Programmable Gate Arrays (FPGAs) have been proposed by the authors for the case of piezoresistive tactile sensors. The solution employing FPGAs is especially relevant since their performance is closer to that of Application Specific Integrated Circuits (ASICs) than that of the other devices. This paper presents an implementation of such an idea for a specific sensor. For the purpose of comparison, the circuitry based on the other devices is also built for the same sensor. This paper discusses the implementation issues, provides details regarding the design of the hardware based on the three devices and compares them. This work has been partially funded by the Spanish Government under contracts TEC2006-12376 and TEC2009-14446.
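Reading a piezoresistive array typically means driving one row at a time and sampling every column, since each taxel's conductance grows roughly with the applied force. A hypothetical scan loop (the `read_adc` interface, reference voltage, and feedback resistor are assumptions for illustration, not details from the paper):

```python
def scan_array(read_adc, rows, cols, v_ref=3.3, r_fb=10_000.0):
    """Scan a piezoresistive tactile array row by row.

    read_adc(r, c) -> voltage at column c while row r is driven,
    assumed to come from a transimpedance stage with feedback
    resistor r_fb and drive voltage v_ref. Returns a rows x cols
    map of taxel conductances (siemens), which grow with force."""
    frame = []
    for r in range(rows):
        row_vals = []
        for c in range(cols):
            v = read_adc(r, c)
            g = v / (v_ref * r_fb)  # taxel conductance
            row_vals.append(g)
        frame.append(row_vals)
    return frame
```

The sequential drive-and-sample pattern is what the microcontroller, PSoC and FPGA realizations all implement; the FPGA version can pipeline the column reads, which is where its ASIC-like performance advantage comes from.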

    Locating moving objects in car-driving sequences


    Insect inspired visual motion sensing and flying robots

    Flying insects are masters of visual motion sensing. They use dedicated motion-processing circuits at low energy and computational cost. Building on observations of insect visual guidance, we developed visual motion sensors and bio-inspired autopilots dedicated to flying robots. Optic-flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro-aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed by a small piezo-bender actuator, was mounted onto a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes, one driving the eye's angular position in the robot and the other driving the robot's body angular position with respect to a visual target, without any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on this lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
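The two interdependent loops of "steering-by-gazing" can be caricatured as a fast loop that rotates the eye to cancel the target's retinal error and a slower loop that steers the body to re-center the eye. The gains and structure below are purely illustrative assumptions, not the controllers from the chapter:

```python
def steering_by_gazing_step(eye_angle, err_retina, dt,
                            k_eye=4.0, k_body=1.5):
    """One step of a hypothetical steering-by-gazing scheme.

    eye_angle:  eye orientation relative to the body (rad)
    err_retina: target's angular error on the retina (rad)
    Fast loop: rotate the eye to null the retinal error.
    Slow loop: steer the body toward where the eye points, so the
    body ends up heading at the target with no global-frame data."""
    eye_rate = k_eye * err_retina        # fast gaze stabilization
    body_rate = k_body * eye_angle       # slow heading alignment
    eye_angle += (eye_rate - body_rate) * dt  # body rotation carries the eye
    return eye_angle, body_rate
```

At equilibrium both errors vanish: the eye is centered in the head and the body points at the target, which is the essence of controlling heading through gaze alone.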

    Multi-Frame-Rate Optical Flow with Reduced Memory Bandwidth

    Thesis (Ph.D.) -- Seoul National University Graduate School: Department of Electrical and Information Engineering, February 2015. Advisor: Suhwan Kim. With the recent rapid progress of high-frame-rate cameras, 4K 1000 FPS cameras are already on the market and mobile phones support 1080p at 240 FPS. The increase in camera frame rate has important implications for optical flow, because the inter-frame motion shrinks as the frame rate rises. Various algorithms have been used to resolve the inaccuracy of optical flow under large motion, but the resulting growth in computation, or the longer run times caused by algorithmic dependencies, constrains real-time operation. As the camera frame rate increases, however, all motions shrink in inverse proportion, so high-frame-rate cameras open the way to accurate optical flow with simple algorithms. This dissertation proposes a multi-frame-rate and multi-scale optical flow algorithm for accurate real-time optical flow. Built around a high-frame-rate camera, the algorithm requires no iterative calculation and is therefore well suited to real-time hardware implementation. The multi-frame-rate algorithm computes optical flow at several frame rates and exploits the relations between them to obtain results for fast as well as slow motion, extending the range of measurable motion. It reduces the growth of system computation with frame rate from the O(n) of previous work to O(log n), greatly relaxing the performance constraint. The multi-scale algorithm provides full-density support for high-frame-rate systems. This dissertation also proposes spatial and temporal bandwidth-reduction algorithms to address the growth of external memory access bandwidth with frame rate: the computation order of the conventional LK optical flow algorithm is rearranged, and an iterative sub-sampling scheme, a temporal Gaussian tail cut, and frame reuse are proposed to reduce the external memory access bandwidth of high-frame-rate systems. Finally, for the multi-scale hardware of the proposed algorithm, whereas an m×m window has conventionally been processed as a convolution using m multipliers, a method is proposed that implements the m×m multiplication with only two multipliers regardless of window size. Based on this method, hardware architectures for the multi-frame-rate and multi-scale stages are proposed, and the operation of the proposed architecture is verified through an FPGA implementation of single-level LK optical flow.
    Through these steps, this work proceeds from the proposal of the multi-frame-rate and multi-scale optical flow algorithm for an accurate real-time optical flow system to the verification of its architecture.
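The multi-frame-rate selection described in this abstract can be sketched as follows: flow is measured against frames 1, 2, 4, ... steps back (hence O(log n) candidate rates), and for each pixel the longest baseline whose displacement still lies in the estimator's accurate range is kept, normalized to a per-frame velocity. The thresholds and power-of-two baselines are illustrative assumptions, not the dissertation's exact selection rule:

```python
def multi_frame_rate_select(flows, lo=0.1, hi=1.0):
    """flows[k] = (dx, dy) measured between the current frame and the
    frame 2**k steps back. Prefer the longest baseline (best for slow
    motion) whose displacement magnitude lies in the accurate range
    [lo, hi] pixels, then scale back to a per-frame velocity."""
    for k in reversed(range(len(flows))):
        dx, dy = flows[k]
        mag = (dx * dx + dy * dy) ** 0.5
        if lo <= mag <= hi:
            gap = 2 ** k
            return dx / gap, dy / gap
    # fall back to the shortest baseline (fastest measurable motion)
    return flows[0]
```

Fast motion saturates the long baselines and is caught by the short ones, while slow motion accumulates into the measurable range only over long baselines; selecting among log-spaced gaps is what extends the measurable motion range without iterative warping.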