    Video guidance, landing, and imaging systems

    The adaptive potential of video guidance technology for earth orbital and interplanetary missions was explored. The application of video acquisition, pointing, tracking, and navigation technology was considered for three primary missions: planetary landing, earth resources satellites, and spacecraft rendezvous and docking. It was found that an imaging system can be mechanized to give a spacecraft or satellite considerable adaptability with respect to its environment. It also provides a level of autonomy essential to many future missions and enhances their data-gathering ability. The feasibility of an autonomous video guidance system capable of observing a planetary surface during terminal descent and selecting the most acceptable landing site was successfully demonstrated in the laboratory. The techniques developed for acquisition, pointing, and tracking show promise for recognizing and tracking coastlines, rivers, and other features of interest. Routines were written and checked for rendezvous, docking, and station-keeping functions.
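    The abstract does not give the site-selection algorithm itself, but the basic idea of scoring candidate landing sites from a descent image can be illustrated with a minimal sketch: treat local intensity variance as a roughness proxy and pick the smoothest patch. The function and parameter names below are illustrative, not taken from the report.

```python
# Minimal sketch (not the report's actual algorithm): score candidate landing
# sites in a descent image by local intensity variance, treating low-variance
# patches as smoother, more acceptable terrain. All names are illustrative.
import numpy as np

def landing_site_scores(image: np.ndarray, patch: int = 32) -> np.ndarray:
    """Return a grid of roughness scores; lower means smoother terrain."""
    h, w = image.shape
    rows, cols = h // patch, w // patch
    scores = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            tile = image[r * patch:(r + 1) * patch, c * patch:(c + 1) * patch]
            scores[r, c] = tile.astype(float).var()  # intensity variance as a roughness proxy
    return scores

def best_site(scores: np.ndarray) -> tuple[int, int]:
    """Pick the (row, col) of the patch with the lowest roughness score."""
    return np.unravel_index(np.argmin(scores), scores.shape)

if __name__ == "__main__":
    # Stand-in for a grayscale descent image.
    frame = np.random.randint(0, 255, (512, 512), dtype=np.uint8)
    grid = landing_site_scores(frame)
    print("selected patch (row, col):", best_site(grid))
```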

    Optical guidance vidicon test program

    A laboratory and field test program was conducted to quantify the optical navigation parameters of the Mariner vidicons. A scene simulator and a camera were designed and built to test vidicons under a wide variety of conditions. Laboratory tests characterized error sources important to the optical navigation process, and field tests verified star sensitivity and characterized comet optical guidance parameters. The equipment, tests, and data reduction techniques used are described, and key test results are listed. A substantial increase in the understanding of selenium vidicons as detectors for spacecraft optical guidance was achieved, indicating that residual offset errors can be reduced by a factor of two to four, to the single-pixel level.
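    The abstract does not spell out the data reduction techniques, but a standard step in vidicon-based optical navigation is estimating a star's sub-pixel image location by intensity-weighted centroiding. The sketch below illustrates that general idea only; the names and values are hypothetical, not from the test program.

```python
# Illustrative only: intensity-weighted (center-of-mass) centroiding of a
# background-subtracted star window, a common sub-pixel location estimate
# in optical navigation. Not the test program's actual reduction code.
import numpy as np

def star_centroid(window: np.ndarray, background: float = 0.0) -> tuple[float, float]:
    """Intensity-weighted centroid (row, col) of a background-subtracted star window."""
    signal = np.clip(window.astype(float) - background, 0.0, None)
    total = signal.sum()
    if total == 0:
        raise ValueError("no signal above background")
    rows, cols = np.indices(window.shape)
    return (rows * signal).sum() / total, (cols * signal).sum() / total

if __name__ == "__main__":
    # Synthetic 9x9 star image: Gaussian spot centered near (4.3, 3.7) plus background.
    r, c = np.indices((9, 9))
    spot = 200.0 * np.exp(-((r - 4.3) ** 2 + (c - 3.7) ** 2) / 2.0) + 10.0
    print(star_centroid(spot, background=10.0))  # approximately (4.3, 3.7)
```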

    Optics and lasers: A compilation

    A number of innovative devices and techniques in optics and related fields were presented. The areas covered include advances in laser and holography technology, articles on spectroscopy and general optics, and new information in the area of photography.

    Video Guidance, Landing, and Imaging system (VGLIS) for space missions

    The feasibility of an autonomous video guidance system capable of observing a planetary surface during terminal descent and selecting the most acceptable landing site was demonstrated. The system was breadboarded and "flown" on a physical simulator consisting of a control panel and monitor, a dynamic simulator, and a PDP-9 computer. The breadboard VGLIS consisted of an image dissector camera and the appropriate processing logic. Results are reported.

    Photoheliograph study for the Apollo telescope mount

    Photoheliograph study for the Apollo telescope mount.

    Explorations in Monocular Distance And Ranging (MODAR) Techniques for Real-World Applications

    In this work, an initial prototype of a monocular camera system capable of retrieving depth-from-focus using a liquid focus-tunable lens is constructed out of hobby-grade photography equipment. This concept has been explored previously in laboratory settings using specialized equipment; this work seeks to determine the feasibility of retrieving depth-from-focus using commercially available components. To achieve this, an iterative exploration of existing techniques was performed to verify their utility in the final ensemble of processes used to retrieve depth from 2D images. Initially, blurry images were simulated by applying Gaussian blur to test images to verify the functionality of a Laplacian of Gaussian-based algorithm for determining image clarity. A sliding gantry was then constructed to move a camera through the environment, test the image clarity algorithm on real-world data, and test methods for creating a composite image of the most in-focus pixels from a focal stack of images collected while the camera was in motion. Following this, the depth retrieval algorithm was tested on a geared lens setup in which a gear-driven fixed focal length lens was attached to a camera and driven so that the distance between the lens and the camera's imaging sensor varied, shifting the plane of focus. This setup suffered from several limitations but provided significant insight into the fundamental principles governing depth-from-focus retrieval. Finally, a 12 mm, f/6 Liquid Lens Cx Series Fixed Focal Length Lens from Edmund Optics was attached to a Raspberry Pi Global Shutter camera to retrieve depth from an environment. This lens varies its optical power when a voltage is applied to the liquid element, which can be done automatically from a microcontroller at high speed. This setup operated with limited success and produced a very noisy depth map and point cloud of the environment. The work concludes with suggestions for future work to significantly improve the depth retrieval functionality of the liquid lens setup.
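    The clarity measure and focal-stack compositing described above lend themselves to a short sketch. Assuming OpenCV and NumPy are available (the function and parameter names are illustrative, not the thesis code), a basic depth-from-focus pipeline computes a Laplacian-of-Gaussian sharpness score per pixel for every frame in the focal stack and takes the per-pixel argmax over frames as a depth index, which also yields an all-in-focus composite.

```python
# Minimal depth-from-focus sketch in the spirit of the approach described above:
# a Laplacian-of-Gaussian focus measure per pixel for each frame of a focal
# stack, followed by a per-pixel argmax over frames as the depth index.
# Function and parameter names are illustrative, not from the thesis.
import cv2
import numpy as np

def focus_measure(gray: np.ndarray, sigma: float = 1.0, ksize: int = 5) -> np.ndarray:
    """Laplacian-of-Gaussian response magnitude as a per-pixel sharpness score."""
    blurred = cv2.GaussianBlur(gray, (0, 0), sigma)
    lap = cv2.Laplacian(blurred, cv2.CV_64F, ksize=ksize)
    return np.abs(lap)

def depth_from_focus(stack: list[np.ndarray]) -> tuple[np.ndarray, np.ndarray]:
    """Return (depth_index, all_in_focus) from grayscale frames captured at
    increasing lens optical power (e.g. increasing liquid-lens drive voltage)."""
    measures = np.stack([focus_measure(f) for f in stack])   # (N, H, W) sharpness
    depth_index = np.argmax(measures, axis=0)                # per-pixel sharpest frame
    frames = np.stack(stack)                                 # (N, H, W) intensities
    all_in_focus = np.take_along_axis(frames, depth_index[None], axis=0)[0]
    return depth_index, all_in_focus

if __name__ == "__main__":
    # Stand-in focal stack: random frames; with real data each frame would be a
    # grayscale capture at one liquid-lens drive voltage.
    stack = [np.random.randint(0, 255, (120, 160), dtype=np.uint8) for _ in range(8)]
    depth, sharp = depth_from_focus(stack)
    print(depth.shape, sharp.dtype)
```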