13 research outputs found

    Model Based Teleoperation to Eliminate Feedback Delay NSF Grant BCS89-01352 - 3rd Report

    We are conducting research in the area of teleoperation with feedback delay. Significant delays occur when performing space teleoperation from the earth, as well as in subsea teleoperation, where the operator is typically on a surface vessel and communication is via acoustic links. These delays make teleoperation extremely difficult and lead to very low operator productivity. We have combined computer graphics with manipulator programming to provide a solution to the delay problem. A teleoperator master arm is interfaced to a graphical simulation of the remote environment. Synthetic fixtures are used to guide the operator's motions and to provide kinesthetic feedback. The operator's actions are monitored and used to generate symbolic motion commands for transmission to, and execution by, the remote slave robot. While much of a task proceeds error-free, when an error does occur, the slave system transmits data back to the master environment, where the operator can then experience the motion of the slave manipulator in actual task execution. We have also provided for the use of tools, such as an impact wrench and a winch, at the slave site. In all cases the tools are unencumbered by sensors; the slave uses a compliant instrumented wrist to monitor tool operation in terms of resulting motions and reaction forces.
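
    As a rough illustration (not the authors' actual implementation), a synthetic fixture can be thought of as a virtual spring-damper that pulls the master arm back toward a guide path. The sketch below, with hypothetical names and gains, shows that idea for a straight-line guide segment:

```python
import numpy as np

def fixture_force(tool_pos, tool_vel, path_a, path_b, k=200.0, b=5.0):
    """Hypothetical spring-damper fixture: pulls the master tool toward
    the nearest point on the straight guide segment path_a -> path_b.
    k and b are illustrative stiffness/damping gains, not measured values."""
    ab = path_b - path_a
    # Parameter of the closest point on the segment, clamped to [0, 1]
    t = np.clip(np.dot(tool_pos - path_a, ab) / np.dot(ab, ab), 0.0, 1.0)
    nearest = path_a + t * ab
    error = nearest - tool_pos          # displacement back onto the path
    return k * error - b * tool_vel     # stiffness pull plus velocity damping
```

    The returned force would be rendered on the master arm, so the operator feels a gentle pull toward the guide while remaining free to deviate from it.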

    Haptic Interface for Center of Workspace Interaction

    We build upon a new interaction style for 3D interfaces, called center of workspace interaction. This style of interaction is defined with respect to a central fixed point in 3D space, conceptually within arm's length of the user. For demonstration, we show a haptically enabled fish tank VR that utilizes a set of interaction widgets to support rapid navigation within a large virtual space. Fish tank VR refers to the creation of a small but high-quality virtual reality that combines a number of technologies, such as head-tracking and stereo glasses, to their mutual advantage.

    Haptic-GeoZui3D: Exploring the Use of Haptics in AUV Path Planning

    We have developed a desktop virtual reality system that we call Haptic-GeoZui3D, which brings together 3D user interaction and visualization to provide a compelling environment for AUV path planning. A key component in our system is the PHANTOM haptic device (SensAble Technologies, Inc.), which affords a sense of touch and force feedback (haptics) to provide cues and constraints to guide the user's interaction. This paper describes our system and how we use haptics to significantly augment our ability to lay out a vehicle path. We show how our system works well for quickly defining simple waypoint-to-waypoint (e.g. transit) path segments, and illustrate how it could be used in specifying more complex, highly segmented (e.g. lawnmower survey) paths.
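
    The "lawnmower survey" paths mentioned above can also be generated procedurally. The helper below is a hypothetical sketch (not part of Haptic-GeoZui3D) that produces the back-and-forth waypoint sequence for a rectangular survey region:

```python
def lawnmower_path(x0, y0, width, height, spacing):
    """Hypothetical helper: generate waypoints for a boustrophedon
    ('lawnmower') survey over an axis-aligned rectangle, alternating
    sweep direction on each row; spacing is the row separation."""
    waypoints = []
    y = y0
    left_to_right = True
    while y <= y0 + height:
        if left_to_right:
            waypoints.append((x0, y))
            waypoints.append((x0 + width, y))
        else:
            waypoints.append((x0 + width, y))
            waypoints.append((x0, y))
        left_to_right = not left_to_right
        y += spacing
    return waypoints
```

    In a haptic planner, segments like these could serve as guide geometry that the force-feedback device snaps the cursor onto while the user refines individual waypoints.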

    Virtual Reality Based Environment for Orthopedic Surgery (Veos)

    The traditional way of teaching surgery involves students observing a "live" surgery and then gradually assisting experienced surgeons. The creation of a Virtual Reality environment for orthopedic surgery (VEOS) can be beneficial in improving the quality of training while decreasing the time needed for training. Developing such virtual environments for educational and training purposes can supplement existing approaches. In this research, the design and development of a virtual reality based environment for orthopedic surgery is described. The scope of the simulation environment is restricted to an orthopedic surgery process known as Less Invasive Stabilization System (LISS) surgery. The primary knowledge source for the LISS surgical process was Miguel A. Pirela-Cruz (Head of Orthopedic Surgery and Rehabilitation, Texas Tech University Health Sciences Center (TTHSC)). The VEOS was designed and developed on a PC based platform. The developed VEOS was validated through interactions with surgical residents at TTHSC. Feedback from residents and our collaborator Miguel A. Pirela-Cruz was used to make necessary modifications to the surgical environment.

    GRASP News, Volume 8, Number 1

    A report of the General Robotics and Active Sensory Perception (GRASP) Laboratory. Edited by Thomas Lindsay.

    Evaluation of Haptic and Visual Cues for Repulsive or Attractive Guidance in Nonholonomic Steering Tasks

    Remote control of vehicles is a difficult task for operators. Support systems that present additional task information may assist operators, but their usefulness is expected to depend on several factors, such as 1) the nature of the conveyed information, 2) the modality it is conveyed through, and 3) the task difficulty. In an exploratory experiment, these three factors were manipulated to quantify their effects on operator behavior. Subjects (n = 15) used a haptic manipulator to steer a virtual nonholonomic vehicle through abstract environments in which obstacles needed to be avoided. Both a simple support conveying near-future predictions of the trajectory of the vehicle and a more elaborate support that continuously suggests the path to be taken were designed (factor 1). These types of information were offered either with visual or haptic cues (factor 2). These four support systems were tested in four different abstracted environments with a decreasing amount of allowed variability in realized trajectories (factor 3). The results show improvements for the simple support only when this information was presented visually, but not when offered haptically. For the elaborate support, equally large improvements for both modalities were found. This suggests that the elaborate support is better: additional information is key in improving performance in nonholonomic steering tasks.
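
    For readers unfamiliar with the term, a nonholonomic vehicle like the one simulated here can drive forward and turn but cannot slip sideways. A minimal unicycle-model integration step (an illustrative sketch, not the experiment's actual simulator) looks like:

```python
import math

def unicycle_step(x, y, theta, v, omega, dt):
    """One Euler step of the unicycle model commonly used for
    nonholonomic vehicles: forward speed v and turn rate omega,
    with no lateral (sideways) velocity component."""
    x += v * math.cos(theta) * dt   # advance along the heading only
    y += v * math.sin(theta) * dt
    theta += omega * dt             # heading changes via the turn rate
    return x, y, theta
```

    The nonholonomic constraint is what makes steering support useful: the operator cannot correct a lateral error directly, but must plan a feasible forward-and-turn trajectory.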

    Human Management of the Hierarchical System for the Control of Multiple Mobile Robots

    In order to take advantage of autonomous robotic systems, and yet ensure successful completion of all feasible tasks, we propose a mediation hierarchy in which an operator can interact at all system levels. Robotic systems are not robust in handling un-modeled events. Reactive behaviors may be able to guide the robot back into a modeled state so it can continue; reasoning systems may simply fail. Once a system has failed, it is difficult to restart the task from the failed state. Instead, the rule base is revised, programs are altered, and the task is re-tried from the beginning.

    A Posture Sequence Learning System for an Anthropomorphic Robotic Hand

    The paper presents a cognitive architecture for posture learning of an anthropomorphic robotic hand. Our approach aims to allow the robotic system to perform complex perceptual operations, to interact with a human user, and to integrate its perceptions into a cognitive representation of the scene and the observed actions. The anthropomorphic robotic hand imitates the gestures acquired by the vision system in order to learn meaningful movements, to build its knowledge using different conceptual spaces, and to perform complex interaction with the human operator.