14 research outputs found

    Laser Bandwidth-Induced Fluctuations in the Intensity Transmitted by a Fabry-Perot Interferometer

    We have measured the power spectrum of the intensity fluctuations of light transmitted by a Fabry-Perot interferometer when the input field is the real Gaussian field, a field characterized by real, random (Gaussian) amplitude fluctuations. The bandwidth of the real Gaussian field was varied, taking on values both less than and greater than that of the interferometer. Comparisons of the measured spectra with calculated spectra are quite satisfactory. Of special interest is a feature in the spectra centered at the laser-interferometer detuning frequency.
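    A feature at the detuning frequency can be reproduced qualitatively with a small simulation. The following Python sketch is illustrative only (it assumes a single-mode Lorentzian cavity transmission and arbitrary units; it is not the calculation used in the paper): a real Gaussian field is synthesized, filtered by a detuned Lorentzian transmission, and the power spectrum of the transmitted intensity is examined near the detuning.

```python
import numpy as np

fs, N = 1e3, 2**18                    # sample rate and number of samples (arbitrary units)
b, gamma, delta = 10.0, 5.0, 30.0     # field bandwidth, cavity half-width, laser-cavity detuning
rng = np.random.default_rng(0)

f = np.fft.fftfreq(N, d=1/fs)
# Real Gaussian field in the frame rotating at the laser carrier:
# real white Gaussian noise shaped to a Lorentzian power spectrum of half-width b.
A = np.fft.ifft(np.fft.fft(rng.normal(size=N)) / np.sqrt(1 + (f / b) ** 2)).real

# Single-mode (Lorentzian) cavity amplitude transmission, centred at the detuning delta.
H = gamma / (gamma + 1j * (f - delta))
I = np.abs(np.fft.ifft(np.fft.fft(A) * H)) ** 2       # transmitted intensity

S = np.abs(np.fft.rfft(I - I.mean())) ** 2 / N        # power spectrum of intensity fluctuations
f_pos = np.fft.rfftfreq(N, d=1/fs)
near = (f_pos > delta - gamma) & (f_pos < delta + gamma)
print(f"local spectral peak near the detuning at {f_pos[near][np.argmax(S[near])]:.1f} "
      f"(detuning = {delta})")
```

    Qualitatively, the feature arises from beating between field components transmitted near the cavity resonance and those remaining near the laser carrier.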

    HDL and Software Sources for Bio-Inspired Visual Collision Avoidance on the Hexapod Robot HECTOR

    Meyer HG, Klimeck D. HDL and Software Sources for Bio-Inspired Visual Collision Avoidance on the Hexapod Robot HECTOR. Bielefeld University; 2020.

# HDL and software sources for bio-inspired visual collision avoidance on the hexapod walking robot HECTOR

CITEC - Center of Excellence Cognitive Interaction Technology, Bielefeld University, 2020

__Developers:__
* Daniel Klimeck - [email protected]
* Hanno Gerd Meyer - [email protected]

__Description:__
The repository contains the VHDL-based cores realizing bio-inspired visual processing on a Xilinx Zynq-7000 SoC, as well as the complementary software sources that enable the hexapod walking robot HECTOR to perform bio-inspired visual collision avoidance. The vision-based direction controller is based upon:

[1] Bertrand et al. (2015) A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes. PLoS Comput Biol. 2015;11(11):e1004339. doi: 10.1371/journal.pcbi.1004339
[2] Meyer et al. (2016) A Bio-Inspired Model for Visual Collision Avoidance on a Hexapod Walking Robot. In: Biomimetic and Biohybrid Systems: 5th International Conference, Living Machines 2016, Edinburgh, UK, July 19-22, 2016. Proceedings; 2016. p. 167-178. doi: 10.1007/978-3-319-42417-0_16
[3] Klimeck et al. (2018) Resource-efficient Reconfigurable Computer-on-Module for Embedded Vision Applications. In: 2018 IEEE 29th International Conference on Application-specific Systems, Architectures and Processors (ASAP); 2018. p. 1-4. doi: 10.1109/ASAP.2018.8445091

The interfaces for the image data transmission between the VHDL-based cores are based on the AXI4-Stream protocol specification. Xilinx-based cores that are used for realizing the processing within the Zynq device are marked in the VHDL code; their sources are not included in this repository.

The processing pipeline for the resource-efficient, insect-inspired visual processing within the FPGA is:

ReMap - SA - HPF - LPF - EMD - ME - ANV

After processing of the camera images by the Zynq hardware, the Average Nearness Vector (ANV) is used to control the walking direction of the hexapod walking robot HECTOR. In the experimental setup, HECTOR obtains its absolute position and orientation within the arena from a system for tracking visual markers. The walking direction is then computed from the direction to a goal location and the ANV. See [1-3] for further details.
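The direction-control step described above can be pictured with a minimal Python sketch. This is not the repository's Control.py/CollisionAvoidance.py code; the function name, threshold, and weighting below are illustrative assumptions about how a goal direction and the ANV might be blended (see [1,2] for the actual model).

```python
import numpy as np

def walking_direction(anv, goal_dir, nearness_threshold=0.3, gain=2.0):
    """Blend the direction to the goal with a turn away from the Average
    Nearness Vector (ANV). Simplified sketch: threshold and weighting are
    illustrative, not the published controller parameters."""
    goal_dir = np.asarray(goal_dir, dtype=float)
    goal_dir = goal_dir / np.linalg.norm(goal_dir)
    nearness = np.linalg.norm(anv)
    if nearness < nearness_threshold:
        return goal_dir                                   # nothing nearby: head for the goal
    avoid_dir = -np.asarray(anv, dtype=float) / nearness  # steer away from nearby objects
    w = min(1.0, gain * (nearness - nearness_threshold))  # avoidance weight grows with nearness
    heading = (1.0 - w) * goal_dir + w * avoid_dir
    return heading / np.linalg.norm(heading)

# Example: goal straight ahead (x forward, y left), obstacle indicated to the front-left.
print(walking_direction(anv=[0.4, 0.3], goal_dir=[1.0, 0.0]))
```

In the actual setup, the goal direction would come from the marker-tracking system and the ANV from the FPGA pipeline, both expressed in the robot's coordinate frame before a blending step of this kind.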
The content of this repository is structured as follows:

```
- VHDL
-- ANV (Average Nearness Vector)
--- anv.vhd
-- EMD (Elementary Motion Detector)
--- emd.vhd
-- HPF (High Pass Filter)
--- AXI4-Lite.vhd
--- hpf.vhd
-- LPF (Low Pass Filter)
--- AXI4-Lite.vhd
--- lpf.vhd
-- ME (Motion Energy)
--- me.vhd
-- ReMap (Remapping and Scaling)
--- mem_init_files
---- ORDERout_bin_ROM.coe
---- ORDERx_bin_ROM.coe
---- ORDERydiff_bin_ROM.coe
--- AXI4-Lite.vhd
--- remap.vhd
-- SA (Sensitivity Adaption)
--- sa.vhd
- python
-- __init__.py
-- auto_visionmodule_twb.ini (Configuration file)
-- auto_visionmodule_twb.py (Main script)
-- behavior (Computation of heading direction)
--- __init__.py
--- CollisionAvoidance.py
-- camera (Communication with Zynq hardware)
--- __init__.py
--- vision_module
---- __init__.py
---- VisionModuleClient.py
-- control (Control of HECTOR's walking direction)
--- __init__.py
--- Control.py
-- joystick (Manual control of HECTOR's walking direction)
--- __init__.py
--- client
---- __init__.py
---- JoystickClient.py
--- server
---- JoystickServer.py
--- standalone
---- __init__.py
---- JoystickStandalone.py
-- logging (Logging of runtime data)
--- __init__.py
--- logclient_demo.py
--- client
---- __init__.py
---- LogClient.py
--- server
---- __init__.py
---- LogServer.py
-- twb (Interface to the marker tracking of the teleworkbench)
--- __init__.py
--- bridge_client
---- __init__.py
---- TWBBridgeClient.py
--- twb_bridge
---- (...)
-- visualization (Visualization of the processed camera images and walking directions)
--- __init__.py
--- client
---- __init__.py
---- VisualizationClient.py
--- server
---- __init__.py
---- VisualizationServer.py
```
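The EMD stage of the pipeline above is an elementary motion detector. As a rough illustration only (assuming a correlation-type, Hassenstein-Reichardt-style detector; see [1,2] for the exact variant and emd.vhd for the actual hardware implementation), a short Python sketch:

```python
import numpy as np

def lowpass(x, alpha=0.1):
    """First-order temporal low-pass filter (discrete leaky integrator),
    acting as the delay element of the correlation detector."""
    y = np.zeros_like(x)
    for t in range(1, len(x)):
        y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
    return y

def emd(left, right, alpha=0.1):
    """Correlation-type EMD: each photoreceptor signal is delayed by the
    low-pass filter and multiplied with the undelayed neighbour; subtracting
    the two mirror-symmetric half-detectors gives a signed, direction-selective
    motion response."""
    return lowpass(left, alpha) * right - left * lowpass(right, alpha)

# Toy stimulus: a sinusoidal brightness pattern drifting from 'left' to 'right',
# so the right photoreceptor sees the same signal a few time steps later.
t = np.arange(400)
left = np.sin(2 * np.pi * 0.05 * t)
right = np.sin(2 * np.pi * 0.05 * (t - 3))
print(f"mean EMD response (positive = preferred direction): {emd(left, right).mean():.3f}")
```

In the pipeline, the ME (motion energy) and ANV stages then combine such responses across the visual field into the Average Nearness Vector that drives the steering described above.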

    Generation and Intensity Correlation Measurements of the Real Gaussian Field

    The generation of a broadband laser field with well-defined and controllable statistical properties, known as the real Gaussian laser field, has been achieved through the random modulation of the amplitude of a stabilized laser beam. The verification of the field-generation technique is provided by measurements of the laser power spectrum (Lorentzian) and by measurements of the intensity autocorrelation function. The latter is shown to decrease exponentially from an initial value of nearly 3 to a final value of 1, with a decay time related to the inverse bandwidth of the field. The techniques for generating this field and for characterizing it are discussed in this paper.
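    The limiting values can be checked directly: for a zero-mean real Gaussian amplitude A, Gaussian moment factorization gives <A^4> = 3<A^2>^2, so the normalized intensity autocorrelation satisfies g2(0) = <I^2>/<I>^2 = 3 and, more generally, g2(tau) = 1 + 2 r(tau)^2, which decays to 1 as the amplitude correlation r(tau) vanishes. A minimal numerical sketch (illustrative parameters, not the experimental procedure):

```python
import numpy as np

fs, N, b = 1e3, 2**20, 5.0      # sample rate, samples, Lorentzian HWHM bandwidth (arbitrary units)
rng = np.random.default_rng(1)

f = np.fft.fftfreq(N, d=1/fs)
# Real Gaussian field: real white noise shaped so its power spectrum is a
# Lorentzian of half-width b (random real amplitude fluctuations on the carrier).
A = np.fft.ifft(np.fft.fft(rng.normal(size=N)) / np.sqrt(1 + (f / b) ** 2)).real

I = A ** 2                                                  # detected intensity
acf = np.fft.ifft(np.abs(np.fft.fft(I)) ** 2).real / N      # circular <I(t) I(t+tau)>
g2 = acf / I.mean() ** 2                                    # normalized intensity autocorrelation

for lag in (0, 10, 50, 200):                                # lags in samples
    print(f"g2(tau = {lag / fs:.3f}) = {g2[lag]:.2f}")      # ~3 at tau = 0, decaying toward 1
```

    With a Lorentzian field spectrum, r(tau) is exponential, so g2 decays exponentially with a time constant set by the inverse bandwidth, as reported above.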

    FPGA-based Generic Architecture for Rapid Prototyping of Video Hardware Accelerators using NoC AXI4-Stream Interconnect and GigE Vision Camera Interfaces

    Irwansyah A, Ibraheem OW, Klimeck D, Porrmann M, Rückert U. FPGA-based Generic Architecture for Rapid Prototyping of Video Hardware Accelerators using NoC AXI4-Stream Interconnect and GigE Vision Camera Interfaces. Presented at the Bildverarbeitung in der Automation (BVAu) 2014, Lemgo, Germany.

    Resource-efficient Reconfigurable Computer-on-Module for Embedded Vision Applications

    Klimeck D, Meyer HG, Hagemeyer J, Porrmann M, Rückert U. Resource-efficient Reconfigurable Computer-on-Module for Embedded Vision Applications. In: 2018 IEEE 29th International Conference on Application-specific Systems, Architectures and Processors (ASAP). Piscataway, NJ: IEEE; 2018.

    Resource-efficient bio-inspired visual processing on the hexapod walking robot HECTOR.

    Meyer HG, Klimeck D, Paskarbeit J, et al. Resource-efficient bio-inspired visual processing on the hexapod walking robot HECTOR. PLoS ONE. 2020;15(4).

    Emulating the highly resource-efficient processing of visual motion information in the brain of flying insects, a bio-inspired controller for collision avoidance and navigation was implemented on a novel, integrated System-on-Chip-based hardware module. The hardware module is used to control the visually guided navigation behavior of the stick insect-like hexapod robot HECTOR. By leveraging highly parallelized bio-inspired algorithms that extract nearness information from visual motion in dynamically reconfigurable logic, HECTOR is able to navigate to predefined goal positions without colliding with obstacles. The system drastically outperforms CPU- and graphics-card-based implementations in terms of speed and resource efficiency, making it suitable also for fast-moving robots such as flying drones.
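    The nearness pooling summarized above can be pictured with a small, hypothetical Python sketch (not the published implementation): during translation, nearby objects generate larger local motion responses, so motion-response magnitudes are taken as relative nearness and averaged, weighted by viewing direction, into an Average Nearness Vector that points toward nearby objects and is then used to steer away from them. The axis convention and the simulated "nearby wall" stimulus are illustrative assumptions.

```python
import numpy as np

def average_nearness_vector(motion_magnitude, azimuth_deg):
    """Pool per-direction relative nearness (approximated here by local motion
    response magnitudes) into the Average Nearness Vector: the mean of the
    viewing-direction unit vectors weighted by nearness."""
    az = np.deg2rad(np.asarray(azimuth_deg, dtype=float))
    directions = np.stack([np.cos(az), np.sin(az)], axis=-1)   # unit vectors, x forward / y left
    return (np.asarray(motion_magnitude)[:, None] * directions).mean(axis=0)

# Toy nearness profile: a nearby wall in the front-left of the frontal visual field.
azimuth = np.linspace(-90, 90, 19)                     # viewing directions in degrees
motion = 0.1 + 0.9 * np.exp(-((azimuth - 45.0) ** 2) / (2 * 20.0 ** 2))
anv = average_nearness_vector(motion, azimuth)
print("ANV (x forward, y left):", np.round(anv, 3))    # points toward the front-left wall
```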