    Multiple-Agent Air/Ground Autonomous Exploration Systems

    Autonomous systems of multiple-agent air/ground robotic units for exploration of the surfaces of remote planets are undergoing development. Modified versions of these systems could be used on Earth to perform tasks in environments dangerous or inaccessible to humans; examples include scientific exploration of remote regions of Antarctica, removal of land mines, cleanup of hazardous chemicals, and military reconnaissance. A basic system according to this concept (see figure) would include a unit, suspended by a balloon or a blimp, in radio communication with multiple robotic ground vehicles (rovers) equipped with video cameras and possibly other sensors for scientific exploration. The airborne unit would be free-floating, controlled by thrusters, or tethered either to one of the rovers or to a stationary object in or on the ground. Each rover would contain a semi-autonomous control system for maneuvering and would function under the supervision of a control system in the airborne unit. The rover maneuvering control system would utilize imagery from the onboard camera to navigate around obstacles; obstacle avoidance would also be aided by readout from an onboard (e.g., ultrasonic) sensor. Together, the rover and airborne control systems would constitute an overarching closed-loop control system to coordinate scientific exploration by the rovers.
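    The rover control loop described above, in which camera imagery proposes a steering direction and an ultrasonic range reading vetoes motion when an obstacle is too close, could be sketched roughly as follows. All function names, thresholds, and the column-based obstacle representation are illustrative assumptions, not details of the actual system.

    ```python
    SAFE_RANGE_M = 0.5  # assumed minimum ultrasonic range before the rover halts

    def choose_heading(obstacle_columns, num_columns):
        """Pick a heading from image columns the vision system left unflagged.

        obstacle_columns: set of image-column indices flagged as obstructed.
        Returns a steering command in [-1.0, 1.0] (left..right), or None
        if no free path exists and the airborne supervisor must intervene.
        """
        free = [c for c in range(num_columns) if c not in obstacle_columns]
        if not free:
            return None
        # Steer toward the centroid of the free columns (a simple heuristic).
        target = sum(free) / len(free)
        return 2.0 * target / (num_columns - 1) - 1.0

    def step(obstacle_columns, ultrasonic_range_m, num_columns=9):
        """One control tick: returns (drive, steer) commands."""
        if ultrasonic_range_m < SAFE_RANGE_M:
            return (0.0, 0.0)  # ultrasonic veto: halt immediately
        steer = choose_heading(obstacle_columns, num_columns)
        if steer is None:
            return (0.0, 0.0)  # blocked: stop and await supervision
        return (1.0, steer)
    ```

    The two-sensor structure mirrors the abstract: vision plans the path, while the short-range sensor provides a fast, unconditional stop.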

    Human-Machine Interface for Remote Training of Robot Tasks

    Regardless of their industrial or research application, the streamlining of robot operations is limited by the proximity of experienced users to the actual hardware. Be it massive open online robotics courses, crowd-sourcing of robot task training, or remote research on massive robot farms for machine learning, the need for an apt remote Human-Machine Interface is quite prevalent. This paper proposes a novel solution to the programming/training of remote robots, employing an intuitive and accurate user interface that offers the benefits of working with real robots without imposing delays and inefficiency. The system includes: a vision-based 3D hand-detection and gesture-recognition subsystem, a simulated digital twin of a robot as visual feedback, and the "remote" robot learning/executing trajectories using dynamic motion primitives. Our results indicate that the system is a promising solution to the problem of remote training of robot tasks.
    Comment: Accepted in IEEE International Conference on Imaging Systems and Techniques - IST201
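    Dynamic motion primitives, the trajectory-learning component named above, encode a demonstrated trajectory as a goal-directed second-order system plus a learned forcing term. A minimal one-dimensional sketch follows; the gains, basis count, and class names are common textbook defaults, not the paper's implementation.

    ```python
    import numpy as np

    class DMP1D:
        """Minimal discrete dynamic motion primitive for one dimension."""

        def __init__(self, n_basis=20, alpha=25.0, alpha_s=4.0):
            self.alpha, self.beta, self.alpha_s = alpha, alpha / 4.0, alpha_s
            self.c = np.exp(-alpha_s * np.linspace(0, 1, n_basis))  # centers in phase s
            self.h = n_basis ** 1.5 / self.c                        # heuristic widths
            self.w = np.zeros(n_basis)

        def fit(self, y, dt):
            """Learn forcing-term weights from one demonstrated trajectory y."""
            y = np.asarray(y, dtype=float)
            self.tau = len(y) * dt
            self.y0, self.g = y[0], y[-1]
            yd, = np.gradient(y, dt),
            ydd = np.gradient(yd, dt)
            s = np.exp(-self.alpha_s * np.arange(len(y)) * dt / self.tau)
            # Invert the transformation system to get the target forcing term.
            f_target = (self.tau ** 2 * ydd
                        - self.alpha * (self.beta * (self.g - y) - self.tau * yd))
            psi = np.exp(-self.h * (s[:, None] - self.c) ** 2)  # (T, n_basis)
            xi = s * (self.g - self.y0)
            # Locally weighted regression: one weight per basis function.
            self.w = ((psi * (xi * f_target)[:, None]).sum(0)
                      / ((psi * (xi ** 2)[:, None]).sum(0) + 1e-10))

        def rollout(self, dt):
            """Integrate the learned primitive with explicit Euler steps."""
            y, v, s, out = self.y0, 0.0, 1.0, []
            while s > 0.01:
                psi = np.exp(-self.h * (s - self.c) ** 2)
                f = (psi @ self.w) / (psi.sum() + 1e-10) * s * (self.g - self.y0)
                vd = (self.alpha * (self.beta * (self.g - y) - v) + f) / self.tau
                y += v * dt / self.tau
                v += vd * dt
                s += -self.alpha_s * s * dt / self.tau
                out.append(y)
            return np.array(out)
    ```

    Because the forcing term is gated by the decaying phase variable, the reproduced trajectory always converges to the goal, which is what makes such primitives safe to replay on a remote robot.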

    Automated Global Feature Analyzer - A Driver for Tier-Scalable Reconnaissance

    For the purposes of space flight reconnaissance, field geologists have trained to become astronauts. However, the initial forays to Mars and other planetary bodies have been made by purely robotic craft. Therefore, training and equipping a robotic craft with the sensory and cognitive capabilities of a field geologist, to form a science craft, is a necessary prerequisite. Numerous steps are necessary in order for a science craft to be able to map, analyze, and characterize a geologic field site, as well as effectively formulate working hypotheses. We report on the continued development of the integrated software system AGFA: automated global feature analyzer®, originated by Fink at Caltech and his collaborators in 2001. AGFA is an automatic and feature-driven target characterization system that operates in an imaged operational area, such as a geologic field site on a remote planetary surface. AGFA performs automated target identification and detection through segmentation, providing for feature extraction, classification, and prioritization within mapped or imaged operational areas at different length scales and resolutions, depending on the vantage point (e.g., spaceborne, airborne, or ground). AGFA extracts features such as target size, color, albedo, vesicularity, and angularity. Based on the extracted features, AGFA summarizes the mapped operational area numerically and flags targets of "interest", i.e., targets that exhibit sufficient anomaly within the feature space. AGFA enables automated science analysis aboard robotic spacecraft, and, embedded in tier-scalable reconnaissance mission architectures, is a driver of future intelligent and autonomous robotic planetary exploration.
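    The anomaly-flagging step described above, in which each target is a feature vector and "interesting" targets are those that stand out in feature space, could be sketched as an outlier test. The use of Mahalanobis distance, the threshold, and all names here are illustrative assumptions, not AGFA's actual method.

    ```python
    import numpy as np

    def flag_interesting(features, threshold=2.5):
        """Flag targets whose feature vector is anomalous for the population.

        features: (n_targets, n_features) array, e.g. columns for size,
        albedo, and angularity. Returns a boolean array of flags.
        """
        X = np.asarray(features, dtype=float)
        mu = X.mean(axis=0)
        # Regularize the covariance so it is invertible for small samples.
        cov = np.cov(X, rowvar=False) + 1e-9 * np.eye(X.shape[1])
        inv = np.linalg.inv(cov)
        diff = X - mu
        # Mahalanobis distance of each target from the population mean.
        d = np.sqrt(np.einsum('ij,jk,ik->i', diff, inv, diff))
        return d > threshold
    ```

    Summarizing the mapped area numerically and ranking targets by such a distance gives the prioritization the abstract describes, independent of which vantage point produced the imagery.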

    JSC flight experiment recommendation in support of Space Station robotic operations

    The man-tended configuration (MTC) of Space Station Freedom (SSF) provides a unique opportunity to move robotic systems from the laboratory into the mainstream space program. Restricted crew access due to the Shuttle's flight rate, as well as constrained on-orbit stay time, reduces the productivity of a facility dependent on astronauts to perform useful work. A natural tendency toward robotics to perform maintenance and routine tasks will be seen in efforts to increase SSF usefulness, and this tendency will provide the foothold for deploying space robots. This paper outlines a flight experiment that will capitalize on the investment in robotic technology made by NASA over the past ten years. The flight experiment described herein provides the technology demonstration necessary for taking advantage of the expected opportunity at MTC. As a context for this flight experiment, a broader view of the strategy developed at the JSC is required. The JSC is building toward MTC by developing a ground-based SSF emulation funded jointly by internal funds, NASA/Code R, and NASA/Code M. The purpose of this ground-based Station is to provide a platform whereby technology originally developed at JPL, LaRC, and GSFC can be integrated into a near flight-like condition. For instance, the Automated Robotic Maintenance of Space Station (ARMSS) project integrates flat targets, surface inspection, and other JPL technologies into a Station analog for evaluation. Also, ARMSS provides the experimental platform for the Capaciflector from GSFC to be evaluated for its usefulness in performing ORU change-out or other tasks where proximity detection is required. The use and enhancement of these ground-based SSF models are planned through FY-93. The experimental data gathered from tests in these facilities will provide the basis for the technology content of the proposed flight experiment.