5 research outputs found

    A parallel implementation of a multisensor feature-based range-estimation method

    Many vision-based methods have been proposed to perform obstacle detection and avoidance for autonomous or semi-autonomous vehicles. All of these methods, however, require very high processing rates to achieve real-time performance. A system capable of supporting autonomous helicopter navigation will need to extract obstacle information from imagery at rates varying from ten frames per second to thirty or more frames per second, depending on the vehicle speed. Such a system will need to sustain billions of operations per second. To reach such high processing rates with current technology, a parallel implementation of the obstacle detection/ranging method is required. This paper describes an efficient and flexible parallel implementation of a multisensor feature-based range-estimation algorithm, targeted for helicopter flight, realized on both a distributed-memory and a shared-memory parallel computer.
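    The throughput figure in this abstract follows from simple arithmetic. The back-of-envelope sketch below (Python, with an assumed 512x512 image and an assumed per-pixel operation count; neither number is taken from the paper) shows how frame rates of ten to thirty frames per second translate into billions of operations per second.

```python
# Back-of-envelope throughput estimate. The image size and the per-pixel
# operation count are illustrative assumptions, not values from the paper.
width, height = 512, 512        # assumed image resolution
ops_per_pixel = 500             # assumed cost of detection/tracking per pixel
for fps in (10, 30):
    total_ops = width * height * ops_per_pixel * fps
    print(f"{fps:2d} fps -> {total_ops / 1e9:.1f} billion operations/s")
```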

    Multiple-camera/motion stereoscopy for range estimation in helicopter flight

    Aiding the pilot during low-altitude helicopter flight by detecting obstacles and planning obstacle-free flight paths is desirable because it improves safety and reduces pilot workload. Computer vision techniques provide an attractive method of obstacle detection and range estimation for objects within a large field of view ahead of the helicopter. Previous research has had considerable success using an image sequence from a single moving camera to solve this problem. The major limitations of single-camera approaches are that no range information can be obtained near the instantaneous direction of motion or in the absence of motion. These limitations can be overcome through the use of multiple cameras. This paper presents a hybrid motion/stereo algorithm which allows range refinement through recursive range estimation while avoiding loss of range information in the direction of travel. A feature-based approach is used to track objects between image frames. An extended Kalman filter combines knowledge of the camera motion with measurements of a feature's image location to recursively estimate the feature's range and to predict its location in future images. Performance of the algorithm is illustrated using an image sequence, motion information, and independent range measurements from a low-altitude helicopter flight experiment.
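    The recursive estimation step described above can be sketched concretely. The following Python snippet is a minimal, hypothetical illustration (not the paper's implementation) of an extended Kalman filter that fuses known camera motion with a feature's measured image location to refine its range: the state is the feature's lateral offset and depth [X, Z], the camera is assumed to translate along its optical axis by a known amount each frame, and the measurement is the pinhole projection u = f*X/Z. All parameter values are made up for illustration.

```python
# Minimal EKF sketch for motion-based range estimation of one tracked
# feature. Assumptions (hypothetical): pinhole camera with focal length f,
# known forward motion dz per frame, static feature, measurement u = f*X/Z.
import numpy as np

f = 500.0          # assumed focal length in pixels
dz = 1.0           # assumed known forward camera motion per frame (m)

def predict(x, P, Q):
    """Propagate the state [X, Z] through the known camera motion."""
    F = np.eye(2)                            # X constant, Z shifted by -dz
    x_pred = np.array([x[0], x[1] - dz])
    return x_pred, F @ P @ F.T + Q

def update(x, P, u_meas, R):
    """Fuse one image measurement u = f * X / Z."""
    X, Z = x
    u_pred = f * X / Z
    H = np.array([[f / Z, -f * X / Z**2]])   # Jacobian of the projection
    S = H @ P @ H.T + R                      # innovation covariance (1x1)
    K = P @ H.T / S                          # Kalman gain (2x1)
    x_new = x + (K * (u_meas - u_pred)).ravel()
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new

# Simulated approach: true feature 2 m off-axis, starting 50 m ahead.
rng = np.random.default_rng(0)
x_true = np.array([2.0, 50.0])
x_est = np.array([0.0, 30.0])                # deliberately poor initial range
P = np.diag([25.0, 400.0])
Q = np.diag([1e-4, 1e-2])
R = np.array([[1.0]])                        # ~1 px^2 measurement noise

for k in range(30):
    x_true[1] -= dz                          # camera closes on the feature
    u = f * x_true[0] / x_true[1] + rng.normal(0.0, 1.0)
    x_est, P = predict(x_est, P, Q)
    x_est, P = update(x_est, P, u, R)

print(f"estimated range: {x_est[1]:.1f} m (true {x_true[1]:.1f} m)")
```

    Note that this single-camera sketch loses range observability for a feature at X = 0, i.e. near the direction of motion, which is exactly the limitation the multiple-camera approach in the paper is intended to remove.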

    Real-Time Computational Needs of a Multisensor Feature-Based Range-Estimation Method

    No full text
    The computer vision literature describes many methods to perform obstacle detection and avoidance for autonomous or semi-autonomous vehicles. Methods may be broadly categorized into field-based techniques and feature-based techniques. Field-based techniques have the advantage of regular computational structure at every pixel throughout the image plane. Feature-based techniques are much more data driven in that computational complexity increases dramatically in regions of the image populated by features. It is widely believed that to run computer vision algorithms in real time a parallel architecture is necessary. Field-based techniques lend themselves to easy parallelization due to their regular computational needs. However, we have found that field-based methods are sensitive to noise and have traditionally been difficult to generalize to arbitrary vehicle motion. Therefore, we have sought techniques to parallelize feature-based methods. This paper describes the computational needs of a parallel feature-based range-estimation method developed by NASA Ames. Issues of processing-element performance, load balancing, and data-flow bandwidth are addressed along with a performance review of two architectures on which the feature-based method has been implemented.
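    The load-balancing issue mentioned above arises because feature-based work is concentrated wherever features happen to cluster in the image. The Python sketch below (not the NASA Ames scheme; the tiling and per-tile feature counts are invented) illustrates one common remedy, a greedy assignment of image tiles to processing elements so that each processor handles a roughly equal number of features.

```python
# Greedy load-balancing sketch: assign the next-heaviest image tile (by
# feature count) to the currently least-loaded processing element.
# The per-tile feature counts below are made-up illustrative values.
import heapq

def greedy_balance(feature_counts, num_pes):
    """Distribute tiles across processing elements by feature count."""
    heap = [(0, pe, []) for pe in range(num_pes)]    # (load, PE id, tiles)
    heapq.heapify(heap)
    for tile, count in sorted(enumerate(feature_counts),
                              key=lambda t: t[1], reverse=True):
        load, pe, tiles = heapq.heappop(heap)        # least-loaded PE
        tiles.append(tile)
        heapq.heappush(heap, (load + count, pe, tiles))
    return sorted(heap, key=lambda entry: entry[1])

# Hypothetical feature counts for a 4x4 tiling of one image frame.
counts = [120, 5, 8, 90, 3, 60, 75, 2, 10, 40, 55, 7, 30, 85, 6, 95]
for load, pe, tiles in greedy_balance(counts, 4):
    print(f"PE {pe}: tiles {tiles}, {load} features")
```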

    Hydrogen and Biofuel Production in the Chloroplast

    No full text
    International audience