
    Global Swath and Gridded Data Tiling

    This software generates cylindrically projected tiles of swath-based or gridded satellite data for the purpose of dynamically generating high-resolution global images covering various time periods, scaling ranges, and color tables. It reconstructs a global image from a set of tiles covering a particular time range, scaling values, and a color table. The program is configurable in terms of tile size, spatial resolution, input data format, input data location (local or distributed), number of parallel processes, and data conditioning.
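    As a rough illustration of the tiling idea described above, the following Python sketch assembles a global cylindrical mosaic from a regular grid of tiles and maps the scaled values through a color table. The tile size, grid layout, loader, and color-table shape are assumptions for illustration, not the actual tool's interfaces.

```python
# Minimal sketch (not the actual tool): assemble a global, cylindrically
# projected image from a regular grid of pre-generated tiles.  The tile
# layout, loader, and color-table step are illustrative assumptions.
import numpy as np

TILE_SIZE = 512          # pixels per tile edge (configurable in the real tool)
N_COLS, N_ROWS = 8, 4    # hypothetical 8 x 4 tile grid covering 360 x 180 degrees

def load_tile(row, col):
    """Placeholder: read one tile of scaled data for the requested time range."""
    return np.zeros((TILE_SIZE, TILE_SIZE), dtype=np.float32)

def assemble_global_image():
    """Paste tiles into a single global mosaic in cylindrical projection."""
    mosaic = np.empty((N_ROWS * TILE_SIZE, N_COLS * TILE_SIZE), dtype=np.float32)
    for r in range(N_ROWS):
        for c in range(N_COLS):
            mosaic[r * TILE_SIZE:(r + 1) * TILE_SIZE,
                   c * TILE_SIZE:(c + 1) * TILE_SIZE] = load_tile(r, c)
    return mosaic

def apply_color_table(mosaic, vmin, vmax, lut):
    """Scale data values into [0, 255] and map through a 256-entry RGB table."""
    idx = np.clip((mosaic - vmin) / (vmax - vmin) * 255, 0, 255).astype(np.uint8)
    return lut[idx]          # lut: (256, 3) uint8 array -> (H, W, 3) image
```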

    GOES-R: Satellite Insight

    GOES-R: Satellite Insight seeks to bring awareness of the GOES-R (Geostationary Operational Environmental Satellite -- R Series) satellite, currently in development, to an audience of all ages on the emerging medium of mobile games. The iPhone app (Satellite Insight) was created for the GOES-R Program. The app describes in simple terms the types of data products that can be produced from GOES-R measurements. The game is easy to learn, yet challenging for all audiences. It includes educational content and a path to further information about GOES-R, its technology, and the benefits of the data it collects. The game features action-puzzle game play in which the player must prevent an overflow of data by matching falling blocks that represent different types of GOES-R data. The game adds more types of data blocks over time, as long as the player can prevent a data overflow condition. Points are awarded for matches, and players can compete with themselves to beat their highest score.
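    A minimal sketch of the match-the-falling-blocks mechanic described above; the grid representation, minimum run length, and point value are illustrative assumptions rather than the shipped game's code.

```python
# Minimal sketch, not the shipped game code: find horizontal runs of three or
# more identical data-type blocks in the playfield grid and score them.
def find_matches(grid, min_run=3):
    """grid: list of rows; each cell is a data-type label (str) or None (empty)."""
    matches = set()
    for r, row in enumerate(grid):
        run_start = 0
        for c in range(1, len(row) + 1):
            if c == len(row) or row[c] != row[run_start] or row[c] is None:
                if row[run_start] is not None and c - run_start >= min_run:
                    matches.update((r, k) for k in range(run_start, c))
                run_start = c
    return matches

def score(matches, points_per_block=10):
    """Award points for every block cleared in a match."""
    return len(matches) * points_per_block
```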

    Stereo and IMU-Assisted Visual Odometry for Small Robots

    This software performs two functions: (1) taking stereo image pairs as input, it computes stereo disparity maps from them by cross-correlation to achieve 3D (three-dimensional) perception; (2) taking a sequence of stereo image pairs as input, it tracks features in the image sequence to estimate the motion of the cameras between successive image pairs. A real-time stereo vision system with IMU (inertial measurement unit)-assisted visual odometry was implemented on a single 750 MHz/520 MHz OMAP3530 SoC (system on chip) from TI (Texas Instruments). Frame rates of 46 fps (frames per second) were achieved at QVGA (Quarter Video Graphics Array, i.e., 320 × 240) resolution, or 8 fps at VGA (Video Graphics Array, 640 × 480) resolution, while simultaneously tracking up to 200 features, taking full advantage of the OMAP3530's integer DSP (digital signal processor) and floating point ARM processors. This is a substantial advancement over previous work, as the stereo implementation produces 146 Mde/s (millions of disparities evaluated per second) in 2.5 W, yielding a stereo energy efficiency of 58.8 Mde/J, which is 3.75× better than prior DSP stereo while providing more functionality.
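    A minimal desktop sketch of the two functions described above (dense disparity by block matching and feature tracking between successive frames), written here with OpenCV rather than the OMAP3530 DSP/ARM implementation; the matcher and tracker parameters are assumptions for illustration only.

```python
# Minimal sketch of the two described functions, using OpenCV on a desktop.
import cv2
import numpy as np

def disparity_map(left_gray, right_gray):
    """Dense disparity by block matching (cross-correlation style search)."""
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # OpenCV returns fixed-point disparities scaled by 16.
    return stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0

def track_features(prev_gray, next_gray, max_features=200):
    """Track up to ~200 corner features between successive frames (KLT)."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_features,
                                  qualityLevel=0.01, minDistance=7)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None)
    good = status.ravel() == 1
    return pts[good].reshape(-1, 2), nxt[good].reshape(-1, 2)
```

    The tracked point pairs would feed a motion-estimation step (e.g., a least-squares pose solve aided by the IMU), which is outside the scope of this sketch.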

    Aquarius iPhone Application

    The Office of the CIO at JPL has developed an iPhone application for the Aquarius/SAC-D mission. The application includes specific information about the science and purpose of the Aquarius satellite and also features daily mission news updates pulled from sources at Goddard Space Flight Center as well as Twitter. The application includes a media and data tab section. The media section displays images from the observatory, viewing construction up to the launch, and also includes various videos and recorded diaries from the Aquarius Project Manager. The data tab highlights many of the factors that affect the Earth's ocean and the water cycle. The application leverages the iPhone's accelerometer to move the Aquarius satellite over the Earth, revealing these factors. Lastly, this application features a countdown timer to the satellite's launch, which is currently counting the days since launch. This application was highly successful in promoting the Aquarius Mission and educating the public about how ocean salinity is paramount to understanding the Earth.

    Stereo vision-based obstacle avoidance for micro air vehicles using an egocylindrical image space representation

    Micro air vehicles that operate autonomously at low altitude in cluttered environments require a method for onboard obstacle avoidance for safe operation. Previous methods deploy either purely reactive approaches, mapping low-level visual features directly to actuator inputs to maneuver the vehicle around the obstacle, or deliberative methods that use on-board 3-D sensors to create a 3-D, voxel-based world model, which is then used to generate collision-free 3-D trajectories. In this paper, we use forward-looking stereo vision with a large horizontal and vertical field of view and project range from stereo into a novel robot-centered, cylindrical, inverse range map we call an egocylinder. With this implementation we reduce the complexity of our world representation from a 3-D map to a 2.5-D image-space representation, which supports very efficient motion planning and collision checking, and allows configuration space expansion to be implemented as an image processing function directly on the egocylinder. Deploying a fast reactive motion planner directly on the configuration-space-expanded egocylinder image, we demonstrate the effectiveness of this new approach experimentally in an indoor environment.
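    A minimal sketch of the egocylinder idea under simplifying assumptions: stereo-derived 3-D points are binned into a robot-centered cylindrical image storing inverse range (keeping the nearest return per pixel), and configuration space expansion is approximated by a fixed-size image-space max filter. The resolutions, vertical field of view, and fixed expansion window are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch: build a cylindrical inverse-range image ("egocylinder") from
# 3-D points in the robot frame, then grow obstacles with an image-space max
# filter as a stand-in for configuration space expansion.
import numpy as np
import cv2

AZ_BINS, EL_BINS = 720, 240            # assumed angular resolution of the egocylinder

def build_egocylinder(points_xyz):
    """points_xyz: (N, 3) array of 3-D points (x forward, y left, z up)."""
    x, y, z = points_xyz.T
    rng = np.linalg.norm(points_xyz, axis=1)
    az = np.arctan2(y, x)                                            # azimuth in [-pi, pi)
    el = np.arcsin(np.clip(z / np.maximum(rng, 1e-6), -1.0, 1.0))    # elevation
    col = ((az + np.pi) / (2 * np.pi) * AZ_BINS).astype(int) % AZ_BINS
    row = ((el + np.pi / 3) / (2 * np.pi / 3) * EL_BINS).astype(int)  # assumed +/-60 deg
    valid = (row >= 0) & (row < EL_BINS) & (rng > 0.1)
    ego = np.zeros((EL_BINS, AZ_BINS), dtype=np.float32)              # inverse range
    np.maximum.at(ego, (row[valid], col[valid]), 1.0 / rng[valid])    # keep nearest hit
    return ego

def expand_cspace(ego, kernel=(9, 9)):
    """Grow obstacles by (roughly) the vehicle radius: max filter on inverse range."""
    return cv2.dilate(ego, np.ones(kernel, np.uint8))
```

    In the actual method the expansion window should scale with range (a fixed physical vehicle radius subtends a larger angle for nearby obstacles); the fixed kernel above is a deliberate simplification.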

    Stereo and IMU assisted visual odometry on an OMAP3530 for small robots


    Actas de las XXXIV Jornadas de Automática

    Postprint (published version)