1,170 research outputs found

    Integration of advanced teleoperation technologies for control of space robots

    Teleoperated robots require one or more humans to control actuators, mechanisms, and other robot equipment given feedback from onboard sensors. To accomplish this task, the human or humans require some form of control station. Desirable features of such a control station include operation by a single human, comfort, and natural human interfaces (visual, audio, motion, tactile, etc.). These interfaces should work to maximize performance of the human/robot system by streamlining the link between human brain and robot equipment. This paper describes development of a control station testbed with the characteristics described above. Initially, this testbed will be used to control two teleoperated robots. Features of the robots include anthropomorphic mechanisms, slaving to the testbed, and delivery of sensory feedback to the testbed. The testbed will make use of technologies such as helmet-mounted displays, voice recognition, and exoskeleton masters. It will allow for integration and testing of emerging telepresence technologies along with techniques for coping with control link time delays. Systems developed from this testbed could be applied to ground control of space-based robots. During man-tended operations, Space Station Freedom may benefit from ground control of IVA or EVA robots performing science or maintenance tasks. Planetary exploration may also find advanced teleoperation systems to be very useful.
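
    The abstract mentions techniques for coping with control link time delays. As a minimal, hypothetical sketch (the 2-second delay and the simple buffered-command policy are illustrative assumptions, not the testbed's design), a ground-to-robot command link can be modeled as a queue that releases operator commands only after the link delay has elapsed:

        # Hypothetical sketch of a delayed teleoperation command link.
        from collections import deque

        LINK_DELAY_S = 2.0  # assumed one-way ground-to-robot delay

        class DelayedLink:
            """Buffers operator commands and releases them after a fixed delay."""
            def __init__(self, delay_s):
                self.delay_s = delay_s
                self.queue = deque()  # entries: (release_time, command)

            def send(self, t_now, command):
                self.queue.append((t_now + self.delay_s, command))

            def receive(self, t_now):
                # Return every command whose delay has elapsed by t_now.
                ready = []
                while self.queue and self.queue[0][0] <= t_now:
                    ready.append(self.queue.popleft()[1])
                return ready

        # Example: a joint setpoint sent at t = 0 s is not visible to the
        # robot until the assumed 2 s link delay has passed.
        link = DelayedLink(LINK_DELAY_S)
        link.send(0.0, {"joint_setpoints": [0.1, -0.2, 0.3]})
        print(link.receive(1.0))   # [] -- still in transit
        print(link.receive(2.5))   # [{'joint_setpoints': [0.1, -0.2, 0.3]}]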

    User-centered design of a dynamic-autonomy remote interaction concept for manipulation-capable robots to assist elderly people in the home

    In this article, we describe the development of a human-robot interaction concept for service robots to assist elderly people in the home with physical tasks. Our approach is based on the insight that robots are not yet able to handle all tasks autonomously with sufficient reliability in the complex and heterogeneous environments of private homes. We therefore employ remote human operators to assist on tasks a robot cannot handle completely autonomously. Our development methodology was user-centric and iterative, with six user studies carried out at various stages involving a total of 241 participants. The concept is under implementation on the Care-O-bot 3 robotic platform. The main contributions of this article are (1) the results of a survey in the form of a ranking of the demands of elderly people and informal caregivers for a range of 25 robot services, (2) the results of an ethnography investigating the suitability of emergency teleassistance and telemedical centers for incorporating robotic teleassistance, and (3) a user-validated human-robot interaction concept with three user roles and three corresponding user interfaces designed as a solution to the problem of engineering reliable service robots for home environments.
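
    The core of the concept is dynamic autonomy: the robot performs what it can handle reliably on its own and hands the remainder over to a remote human operator. A minimal sketch of such a handover policy follows (the confidence threshold and task names are illustrative assumptions; the article's concept involves three user roles and purpose-built interfaces rather than a single threshold):

        # Hypothetical dynamic-autonomy handover policy.
        from enum import Enum, auto

        class Mode(Enum):
            AUTONOMOUS = auto()
            TELEASSISTED = auto()

        CONFIDENCE_THRESHOLD = 0.8  # assumed cut-off for autonomous execution

        def execute_task(task_name, autonomy_confidence):
            # Run the task autonomously if confidence is high enough,
            # otherwise escalate it to the remote operator.
            if autonomy_confidence >= CONFIDENCE_THRESHOLD:
                return Mode.AUTONOMOUS, f"robot performs '{task_name}' autonomously"
            return Mode.TELEASSISTED, f"remote operator assists with '{task_name}'"

        print(execute_task("fetch a drink", 0.93))
        print(execute_task("pick up dropped keys", 0.41))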

    Graphical programming and the use of simulation for space-based manipulators

    Robotic manipulators are difficult to program even without the special requirements of a zero-gravity environment. While attention should be paid to investigating the usefulness of industrial application programming methods for space manipulators, new methods with potential application to both environments need to be invented. These methods should allow various levels of autonomy and human-in-the-loop interaction, with simple, rapid switching among them. For all methods, simulation must be integrated to provide reliability and safety. Graphical programming of manipulators is a candidate for an effective robot programming method despite current limitations in input devices and displays. A research project in task-level robot programming has built an innovative interface to a state-of-the-art commercial simulation and robot programming platform. The prototype demonstrates simple augmented methods for graphical programming and simulation which may be of particular interest to those concerned with Space Station applications; its development has also raised important issues for the development of more sophisticated robot programming tools. Both aspects of the project are discussed.
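
    Because the paper stresses that simulation must be integrated into every programming method for reliability and safety, a task-level program can be pictured as a list of waypoints that must pass a simulated check before being dispatched. The sketch below is hypothetical (the joint limits and the limit check standing in for a full simulator are assumptions, not the project's implementation):

        # Hypothetical simulation gate for a task-level manipulator program.
        JOINT_LIMITS = [(-2.0, 2.0)] * 6  # assumed symmetric limits, in radians

        def simulate(program):
            # Reject any waypoint that violates a joint limit before execution.
            for i, waypoint in enumerate(program):
                for q, (lo, hi) in zip(waypoint, JOINT_LIMITS):
                    if not lo <= q <= hi:
                        return False, f"waypoint {i} violates a joint limit"
            return True, "program verified in simulation"

        def execute(program):
            ok, message = simulate(program)
            if not ok:
                return f"aborted: {message}"
            return f"dispatched {len(program)} waypoints to the manipulator"

        program = [[0.0] * 6, [0.5, -0.3, 1.0, 0.0, 0.2, -0.1]]
        print(execute(program))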

    Advancing automation and robotics technology for the space station and for the US economy: Submitted to the United States Congress October 1, 1987

    In April 1985, as required by Public Law 98-371, the NASA Advanced Technology Advisory Committee (ATAC) reported to Congress the results of its studies on advanced automation and robotics technology for use on the space station. This material was documented in the initial report (NASA Technical Memorandum 87566). A further requirement of the Law was that ATAC follow NASA's progress in this area and report to Congress semiannually. This report is the fifth in a series of progress updates and covers the period between 16 May 1987 and 30 September 1987. NASA has accepted the basic recommendations of ATAC for its space station efforts. ATAC and NASA agree that the mandate of Congress is that advanced automation and robotics technology be built to support an evolutionary space station program and serve as a highly visible stimulator affecting the long-term U.S. economy.

    From 2D to 3D Mixed Reality Human-Robot Interface in Hazardous Robotic Interventions with the Use of Redundant Mobile Manipulator

    Part of the conference: ICINCO 2021: 18th International Conference on Informatics in Control, Automation and Robotics (July 2021). 3D Mixed Reality (MR) Human-Robot Interfaces (HRI) show promise for letting robotic operators complete tasks more quickly, safely, and with less training. The objective of this study is to assess the use of a 3D MR HRI environment in comparison with a standard 2D Graphical User Interface (GUI) for controlling a redundant mobile manipulator. The experimental data were taken during operation of a 9-DOF manipulator mounted on a robotized train, the CERN Train Inspection Monorail (TIM), used for the Beam Loss Monitor robotic measurement task in a complex hazardous intervention scenario at CERN. The operator's efficiency and workload with both types of interfaces were compared using the NASA-TLX method. The use of heart rate and Galvanic Skin Response parameters for monitoring operator condition and stress was also tested. The results show that teleoperation with the 3D MR HRI mitigates cognitive fatigue and stress by improving the operator's understanding of both the robot's pose and the surrounding environment or scene.
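
    The workload comparison relies on the NASA-TLX method, whose weighted score is a weighted mean of six subscale ratings, with weights derived from 15 pairwise comparisons. A minimal sketch of that computation follows (all ratings and weights are invented illustrative values, not data from the study):

        # NASA-TLX weighted workload score (illustrative values only).
        SUBSCALES = ["mental", "physical", "temporal",
                     "performance", "effort", "frustration"]

        def nasa_tlx(ratings, weights):
            # Ratings are 0-100; weights come from the 15 pairwise
            # comparisons and must sum to 15.
            assert sum(weights[s] for s in SUBSCALES) == 15
            return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

        ratings = {"mental": 70, "physical": 30, "temporal": 55,
                   "performance": 40, "effort": 65, "frustration": 45}
        weights = {"mental": 5, "physical": 1, "temporal": 3,
                   "performance": 2, "effort": 3, "frustration": 1}
        print(f"weighted workload: {nasa_tlx(ratings, weights):.1f}")  # 57.7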

    NASA space station automation: AI-based technology review

    Research and Development projects in automation for the Space Station are discussed. Artificial Intelligence (AI) based automation technologies are planned to enhance crew safety through reduced need for EVA, increase crew productivity through the reduction of routine operations, increase space station autonomy, and augment space station capability through the use of teleoperation and robotics. AI technology will also be developed for the servicing of satellites at the Space Station, system monitoring and diagnosis, space manufacturing, and the assembly of large space structures.

    SPATIAL PERCEPTION AND ROBOT OPERATION: THE RELATIONSHIP BETWEEN VISUAL SPATIAL ABILITY AND PERFORMANCE UNDER DIRECT LINE OF SIGHT AND TELEOPERATION

    This dissertation investigated the relationship between the spatial perception abilities of operators and robot operation under direct-line-of-sight and teleoperation viewing conditions. This study was an effort to determine whether spatial ability testing may be a useful tool in the selection of human-robot interaction (HRI) operators. Participants completed eight cognitive ability measures and operated one of four types of robots under tasks of low and high difficulty. Performance for each participant was tested during both direct-line-of-sight operation and teleoperation. These results provide additional evidence that spatial perception abilities are reliable predictors of direct-line-of-sight and teleoperation performance. Participants in this study with higher spatial abilities performed faster, with fewer errors, and with less variability. In addition, participants with higher spatial abilities were more successful in the accumulation of points. Applications of these findings are discussed in terms of teleoperator selection tools and HRI training and design recommendations with a human-centered design approach.
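
    The finding that higher spatial ability predicts faster, less error-prone operation is the kind of result a simple correlation between test scores and task performance would show. A minimal sketch under invented data follows (the numbers are illustrative, not the dissertation's results):

        # Pearson correlation between spatial ability and completion time
        # (all data points are invented for illustration).
        import math

        def pearson_r(x, y):
            n = len(x)
            mx, my = sum(x) / n, sum(y) / n
            cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
            sx = math.sqrt(sum((a - mx) ** 2 for a in x))
            sy = math.sqrt(sum((b - my) ** 2 for b in y))
            return cov / (sx * sy)

        spatial_score = [42, 55, 61, 48, 70, 66, 53, 75]              # higher = better
        completion_time_s = [310, 275, 240, 295, 205, 220, 280, 190]  # lower = better

        # A clearly negative r is what "higher ability -> faster task
        # completion" looks like with this encoding.
        print(f"r = {pearson_r(spatial_score, completion_time_s):.2f}")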

    Increasing Transparency and Presence of Teleoperation Systems Through Human-Centered Design

    Teleoperation allows a human to control a robot to perform dexterous tasks in remote, dangerous, or unreachable environments. A perfect teleoperation system would enable the operator to complete such tasks at least as easily as if he or she were completing them by hand. This ideal teleoperator must be perceptually transparent, meaning that the interface appears to be nearly nonexistent to the operator, allowing him or her to focus solely on the task environment, rather than on the teleoperation system itself. Furthermore, the ideal teleoperation system must give the operator a high sense of presence, meaning that the operator feels as though he or she is physically immersed in the remote task environment. This dissertation seeks to improve the transparency and presence of robot-arm-based teleoperation systems through a human-centered design approach, specifically by leveraging scientific knowledge about the human motor and sensory systems. First, this dissertation aims to improve the forward (efferent) teleoperation control channel, which carries information from the human operator to the robot. The traditional method of calculating the desired position of the robot's hand simply scales the measured position of the human's hand. This commonly used motion mapping erroneously assumes that the human's produced motion identically matches his or her intended movement. Given that humans make systematic directional errors when moving the hand under conditions similar to those imposed by teleoperation, I propose a new paradigm of data-driven human-robot motion mappings for teleoperation. The mappings are determined by having the human operator mimic the target robot as it autonomously moves its arm through a variety of trajectories in the horizontal plane. Three data-driven motion mapping models are described and evaluated for their ability to correct for the systematic motion errors made in the mimicking task. Individually-fit and population-fit versions of the most promising motion mapping model are then tested in a teleoperation system that allows the operator to control a virtual robot. Results of a user study involving nine subjects indicate that the newly developed motion mapping model significantly increases the transparency of the teleoperation system. Second, this dissertation seeks to improve the feedback (afferent) teleoperation control channel, which carries information from the robot to the human operator. We aim to improve the teleoperation system by providing the operator with multiple novel modalities of haptic (touch-based) feedback. We describe the design and control of a wearable haptic device that provides kinesthetic grip-force feedback through a geared DC motor, and tactile fingertip-contact-and-pressure and high-frequency acceleration feedback through a pair of voice-coil actuators mounted at the tips of the thumb and index finger. Each included haptic feedback modality is known to be fundamental to direct task completion and can be implemented without great cost or complexity. A user study involving thirty subjects investigated how these three modalities of haptic feedback affect an operator's ability to control a real remote robot in a teleoperated pick-and-place task. This study's results strongly support the utility of grip-force and high-frequency acceleration feedback in teleoperation systems and show more mixed effects of fingertip-contact-and-pressure feedback.
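
    The forward-channel contribution replaces plain position scaling with a mapping fit to data from the mimicking task. As a hedged sketch (the synthetic data and the affine least-squares model are illustrative assumptions; the dissertation evaluates several mapping models), such a mapping can be fit from the human's produced hand positions to the robot's intended positions in the horizontal plane:

        # Hypothetical data-driven motion mapping: fit an affine map from
        # produced hand positions to intended robot positions, then use it
        # during teleoperation in place of pure scaling.
        import numpy as np

        # Mimicking-task data (synthetic): intended robot positions and the
        # positions the human actually produced, with a small rotational bias.
        intended = np.array([[0.0, 0.0], [0.2, 0.0], [0.0, 0.2],
                             [0.2, 0.2], [0.1, 0.3], [0.3, 0.1]])
        theta = 0.1
        bias = np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])
        rng = np.random.default_rng(0)
        produced = intended @ bias.T + 0.01 * rng.standard_normal(intended.shape)

        # Least-squares fit of produced -> intended: [x y 1] @ A ~= intended.
        X = np.hstack([produced, np.ones((len(produced), 1))])
        A, *_ = np.linalg.lstsq(X, intended, rcond=None)

        def map_hand_to_robot(hand_xy):
            # Corrected mapping applied to the operator's measured hand position.
            return np.append(hand_xy, 1.0) @ A

        print(map_hand_to_robot(np.array([0.15, 0.05])))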