
    Design and management of image processing pipelines within CPS: Acquired experience towards the end of the FitOptiVis ECSEL Project

    Cyber-Physical Systems (CPSs) are dynamic, reactive systems that interact with processes, the environment and, sometimes, humans. They are often distributed, with sensors and actuators, and are expected to be smart, adaptive, predictive and able to react in real time. Image- and video-processing pipelines are a prime source of environmental information for such systems, allowing them to make better decisions according to what they see. In FitOptiVis we are therefore developing novel methods and tools to integrate complex image- and video-processing pipelines. FitOptiVis aims to deliver a reference architecture for describing and optimizing quality and resource management for imaging and video pipelines in CPSs, both at design time and at run time. The architecture is concretized in low-power, high-performance, smart components, and in methods and tools for combined design-time and run-time multi-objective optimization and adaptation within system and environment constraints.
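
    Below is a minimal, hypothetical sketch of the kind of run-time quality/resource trade-off such an architecture manages: selecting the highest-quality pipeline configuration that fits the current power and latency budgets. All names and numbers are illustrative assumptions, not FitOptiVis artefacts.

```python
# Hypothetical run-time quality/resource manager: pick the best pipeline
# configuration under power and latency constraints. Illustrative only.
from dataclasses import dataclass

@dataclass
class PipelineConfig:
    name: str
    quality: float    # e.g. detection accuracy, higher is better
    power_w: float    # estimated power draw in watts
    latency_ms: float

CONFIGS = [
    PipelineConfig("low_res_cpu",   quality=0.72, power_w=2.0,  latency_ms=45.0),
    PipelineConfig("full_res_fpga", quality=0.91, power_w=4.5,  latency_ms=18.0),
    PipelineConfig("full_res_gpu",  quality=0.93, power_w=11.0, latency_ms=12.0),
]

def select_config(power_budget_w: float, latency_budget_ms: float) -> PipelineConfig:
    """Return the highest-quality configuration that respects both budgets."""
    feasible = [c for c in CONFIGS
                if c.power_w <= power_budget_w and c.latency_ms <= latency_budget_ms]
    if not feasible:
        raise RuntimeError("no configuration satisfies the current constraints")
    return max(feasible, key=lambda c: c.quality)

if __name__ == "__main__":
    # A run-time manager would re-evaluate this whenever constraints change.
    print(select_config(power_budget_w=5.0, latency_budget_ms=30.0).name)
```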

    A high speed Tri-Vision system for automotive applications

    Purpose: Cameras are excellent ways of non-invasively monitoring the interior and exterior of vehicles. In particular, high-speed stereovision and multivision systems are important for transport applications such as driver eye tracking or collision avoidance. This paper addresses the synchronisation problem which arises when multivision camera systems are used to capture the high-speed motion common in such applications. Methods: An experimental, high-speed tri-vision camera system intended for real-time driver eye-blink and saccade measurement was designed, developed, implemented and tested using prototype, ultra-high dynamic range, automotive-grade image sensors specifically developed by E2V (formerly Atmel) Grenoble SA as part of the European FP6 project SENSATION (advanced sensor development for attention, stress, vigilance and sleep/wakefulness monitoring). Results: The developed system can sustain frame rates of 59.8 Hz at the full stereovision resolution of 1280 × 480, rising to 750 Hz when a 10 kpixel Region of Interest (ROI) is used, with a maximum global shutter speed of 1/48000 s and a shutter efficiency of 99.7%. The data can be reliably transmitted uncompressed over 5 metres of standard copper Camera-Link® cable. The synchronisation error between the left and right stereo images is less than 100 ps, and this has been verified both electrically and optically. Synchronisation is automatically established at boot-up and maintained during resolution changes. A third camera in the set can be configured independently. The dynamic range of the 10-bit sensors exceeds 123 dB, with a spectral sensitivity extending well into the infra-red range. Conclusion: The system was subjected to a comprehensive testing protocol, which confirms that the salient requirements for the driver-monitoring application are adequately met and, in some respects, exceeded. The synchronisation technique presented may also benefit several other automotive stereovision applications, including near- and far-field obstacle detection, collision avoidance and road condition monitoring. Partially funded by the EU FP6 through the IST-507231 SENSATION project.
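
    As a rough illustration of why a smaller ROI permits much higher frame rates, the sketch below models frame time as the number of ROI rows times a per-row readout time plus a fixed overhead. The constants are assumptions chosen purely for illustration, not parameters of the E2V sensor described above.

```python
# Back-of-the-envelope readout model: frame rate vs. Region of Interest height.
# Row time and overhead below are illustrative assumptions, not sensor specs.

def max_frame_rate_hz(rows: int, row_time_us: float = 33.0,
                      frame_overhead_us: float = 50.0) -> float:
    """Image sensors are typically read out row by row, so frame time scales
    roughly with the number of ROI rows plus a fixed per-frame overhead."""
    frame_time_us = rows * row_time_us + frame_overhead_us
    return 1e6 / frame_time_us

if __name__ == "__main__":
    # Full 1280 x 480 frame vs. a small ~10 kpixel ROI (e.g. ~80 rows high).
    print(f"full frame : {max_frame_rate_hz(480):6.1f} Hz")
    print(f"small ROI  : {max_frame_rate_hz(80):6.1f} Hz")
```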

    Advanced photonic and electronic systems - WILGA 2017

    The WILGA annual symposium on advanced photonic and electronic systems has been organized by young scientists for young scientists for two decades. It traditionally gathers more than 350 young researchers and their tutors. Ph.D. students and graduates present their recent achievements during well-attended oral sessions. Wilga is a very good digest of Ph.D. work carried out at technical universities in electronics and photonics, as well as information sciences, throughout Poland and some neighbouring countries. Publishing patronage over Wilga is held by the Elektronika technical journal (SEP), IJET (PAN) and the Proceedings of SPIE; the latter world editorial series publishes more than 200 papers from Wilga annually. Wilga 2017 was the XL edition of this meeting. The following topical tracks were distinguished: photonics, electronics, information technologies and system research. This article is a digest of selected works presented during the Wilga 2017 symposium. WILGA 2017 works were published in Proc. SPIE vol. 10445.

    DESIGN OF A MACHINE VISION CAMERA FOR SPATIAL AUGMENTED REALITY

    Structured Light Imaging (SLI) is a means of digital reconstruction, or Three-Dimensional (3D) scanning, with uses that span many disciplines. A projector, a camera and a Personal Computer (PC) are required to perform such 3D scans. Slight variances in synchronization between these three devices can cause malfunctions in the process, owing to the limitations of PC graphics processors as real-time systems. Previous work used a Field Programmable Gate Array (FPGA) to both drive the projector and trigger the camera, eliminating these timing issues but still requiring an external camera. This thesis proposes integrating the camera with the FPGA SLI controller by means of a custom printed circuit board (PCB) design. Featuring a high-speed image sensor as well as High Definition Multimedia Interface (HDMI) input and output, this PCB enables the FPGA to perform SLI scans as well as pass HDMI video through to the projector for Spatial Augmented Reality (SAR) purposes. Minimizing ripple noise on the power supply through effective circuit design and PCB layout yields a compact and cost-effective machine-vision sensing solution.
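
    For readers unfamiliar with SLI, the sketch below shows only the decoding step of a binary-pattern scan: a stack of captured bit-plane frames is thresholded and folded into per-pixel projector column codes. It illustrates the general principle, not the FPGA capture hardware described in the thesis.

```python
# Minimal decoding step for binary structured-light patterns.
import numpy as np

def decode_binary_patterns(frames: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """frames: (n_bits, H, W) array in [0, 1], most significant bit first.
    Returns an (H, W) integer map of projector column codes."""
    bits = (frames > threshold).astype(np.uint32)
    codes = np.zeros(frames.shape[1:], dtype=np.uint32)
    for plane in bits:                     # fold bit-planes, MSB first
        codes = (codes << 1) | plane
    return codes

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = (rng.random((8, 4, 4)) > 0.5).astype(float)   # 8 synthetic bit-planes
    print(decode_binary_patterns(demo))
```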

    XR-RF Imaging Enabled by Software-Defined Metasurfaces and Machine Learning: Foundational Vision, Technologies and Challenges

    We present a new approach to Extended Reality (XR), denoted iCOPYWAVES, which seeks to offer naturally low-latency operation and cost-effectiveness, overcoming the critical scalability issues faced by existing solutions. iCOPYWAVES is enabled by emerging Programmable Wireless Environments (PWEs), a recently proposed technology in wireless communications. Empowered by intelligent (meta)surfaces, PWEs transform the wave propagation phenomenon into a software-defined process. We leverage PWEs to i) create, and then ii) selectively copy the scattered RF wavefront of an object from one location in space to another, where a machine learning module, accelerated by FPGAs, translates it to visual input for an XR headset using PWE-driven RF imaging principles (XR-RF). This makes for an XR system whose operation is bounded in the physical layer and hence has the prospect of minimal end-to-end latency. Over large distances, RF-to-fiber/fiber-to-RF conversion is employed to provide intermediate connectivity. The paper provides a tutorial on the iCOPYWAVES system architecture and workflow. A proof-of-concept implementation via simulations is provided, demonstrating the reconstruction of challenging objects in iCOPYWAVES-produced computer graphics.
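
    To make the role of the ML module concrete, the toy sketch below learns a linear mapping from simulated RF measurements back to an image using ridge regression; the sensing operator, dimensions and data are synthetic assumptions standing in for the paper's actual model and are not taken from it.

```python
# Toy "RF measurements -> image" reconstruction via ridge regression.
# Everything here is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(1)
N_PIX, N_MEAS, N_TRAIN = 16 * 16, 64, 500

A = rng.standard_normal((N_MEAS, N_PIX))            # unknown RF forward operator
X = rng.random((N_TRAIN, N_PIX))                     # training images (flattened)
Y = X @ A.T + 0.01 * rng.standard_normal((N_TRAIN, N_MEAS))  # simulated RF readings

# Learn a linear reconstruction W such that Y @ W ~ X (ridge-regularized).
lam = 1e-2
W = np.linalg.solve(Y.T @ Y + lam * np.eye(N_MEAS), Y.T @ X)

x_true = rng.random(N_PIX)
x_hat = (x_true @ A.T) @ W                           # reconstruct a held-out sample
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```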

    Performance and energy-efficient implementation of a smart city application on FPGAs

    The continuous growth of modern cities and the demand for a better quality of life, coupled with the increased availability of computing resources, have led to increased attention to smart city services. Smart cities promise to deliver a better life to their inhabitants while simultaneously reducing resource requirements and pollution. They are thus perceived as a key enabler of sustainable growth. Among many other issues, one of the major concerns for most cities in the world is traffic, which leads to a huge waste of time and energy, and to increased pollution. To optimize traffic in cities, one of the first steps is to obtain accurate, real-time information about the traffic flows in the city. This can be achieved by applying automated video analytics to the video streams provided by a set of cameras distributed throughout the city. Image sequence processing can be performed both peripherally and centrally. In this paper, we argue that, since centralized processing has several advantages in terms of availability, maintainability and cost, it is a very promising strategy to enable effective traffic management even in large cities. However, the computational costs are enormous and thus require an energy-efficient High-Performance Computing approach. Field Programmable Gate Arrays (FPGAs) provide computational resources comparable to CPUs and GPUs, yet require much lower amounts of energy per operation (around 6× and 10× for the application considered in this case study). They are thus the preferred resource for reducing both energy supply and cooling costs in the huge datacenters that will be needed by smart cities. In this paper, we describe efficient implementations of high-performance algorithms that can process traffic camera image sequences to provide traffic flow information in real time at a low energy and power cost.
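
    As a hedged illustration of the first stage such a traffic-analytics pipeline might use, the sketch below computes a crude motion fraction by background subtraction on a grayscale frame; the paper's actual algorithms and FPGA kernels are not reproduced here, and all thresholds and sizes are illustrative.

```python
# Crude per-frame motion measurement via background subtraction. Illustrative only.
import numpy as np

def motion_fraction(frame: np.ndarray, background: np.ndarray,
                    threshold: float = 0.1) -> float:
    """Fraction of pixels that differ from the background model; a rough
    proxy for instantaneous traffic density seen by a fixed camera."""
    diff = np.abs(frame.astype(float) - background.astype(float)) / 255.0
    return float((diff > threshold).mean())

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    bg = rng.integers(0, 40, size=(480, 640), dtype=np.uint8)     # "empty road"
    frame = bg.copy()
    frame[200:280, 100:220] = 200                                  # a bright "vehicle"
    print(f"moving-pixel fraction: {motion_fraction(frame, bg):.4f}")
```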