3,161 research outputs found

    Haptic Interface for Center of Workspace Interaction

    We build upon a new interaction style for 3D interfaces, called the center of workspace interaction. This style of interaction is defined with respect to a central fixed point in 3D space, conceptually within arm's length of the user. For demonstration, we show a haptically enabled fish tank VR that utilizes a set of interaction widgets to support rapid navigation within a large virtual space. Fish tank VR refers to the creation of a small but high-quality virtual reality that combines a number of technologies, such as head-tracking and stereo glasses, to their mutual advantage.
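
    As a rough illustration of the idea, the sketch below re-centers a virtual world around a fixed workspace point and rescales it, which is one simple way such center-of-workspace navigation could be realized; the transform, class names, and numbers are illustrative assumptions, not the paper's widget implementation.

        # Minimal sketch of center-of-workspace navigation, assuming a simple
        # scale-and-translate viewing transform (not the paper's actual widgets).
        # The workspace center is a fixed point within arm's reach; navigating to a
        # point of interest re-maps the world so that point lands on the center.
        import numpy as np

        WORKSPACE_CENTER = np.array([0.0, 0.0, 0.0])  # fixed point in device space

        class Viewpoint:
            def __init__(self, focus=(0.0, 0.0, 0.0), scale=1.0):
                self.focus = np.asarray(focus, dtype=float)  # world point mapped to the center
                self.scale = scale                           # world-to-workspace magnification

            def world_to_workspace(self, p_world):
                """Map a world-space point into the arm's-length workspace."""
                p = np.asarray(p_world, dtype=float)
                return (p - self.focus) * self.scale + WORKSPACE_CENTER

            def recenter(self, p_world, new_scale=None):
                """Rapid navigation: bring a selected world point to the workspace center."""
                self.focus = np.asarray(p_world, dtype=float)
                if new_scale is not None:
                    self.scale = new_scale

        view = Viewpoint()
        view.recenter([120.0, -40.0, 8.0], new_scale=0.01)   # jump to a distant feature
        print(view.world_to_workspace([121.0, -40.0, 8.0]))  # nearby points now sit near the center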

    Haptic-GeoZui3D: Exploring the Use of Haptics in AUV Path Planning

    We have developed a desktop virtual reality system that we call Haptic-GeoZui3D, which brings together 3D user interaction and visualization to provide a compelling environment for AUV path planning. A key component in our system is the PHANTOM haptic device (SensAble Technologies, Inc.), which affords a sense of touch and force feedback – haptics – to provide cues and constraints that guide the user’s interaction. This paper describes our system and how we use haptics to significantly augment our ability to lay out a vehicle path. We show how our system works well for quickly defining simple waypoint-to-waypoint (e.g. transit) path segments, and illustrate how it could be used to specify more complex, highly segmented (e.g. lawnmower survey) paths.
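
    One way haptic constraints can guide waypoint layout is a spring-damper that holds the stylus near a chosen transit depth while segments are placed. The sketch below shows only that idea; the constraint, gains, and waypoint format are assumptions for illustration, not Haptic-GeoZui3D's actual implementation.

        # Minimal sketch of a depth-constraint force for waypoint placement,
        # assuming a spring-damper along the vertical axis. Gains are illustrative.
        import numpy as np

        K_DEPTH = 300.0   # N/m, stiffness pulling the stylus toward the transit depth
        B_DEPTH = 2.0     # N*s/m, damping along the constrained axis

        def constraint_force(stylus_pos, stylus_vel, transit_depth):
            """Force (N) nudging the stylus back toward z = transit_depth."""
            force = np.zeros(3)
            force[2] = -K_DEPTH * (stylus_pos[2] - transit_depth) - B_DEPTH * stylus_vel[2]
            return force

        def add_waypoint(path, stylus_pos, transit_depth):
            """Snap a new waypoint onto the constrained depth before appending it."""
            path.append(np.array([stylus_pos[0], stylus_pos[1], transit_depth]))
            return path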

    A Framework to Illustrate Kinematic Behavior of Mechanisms by Haptic Feedback

    The kinematic properties of mechanisms are well known to researchers and teachers. The theory based on the study of Jacobian matrices allows us to explain, for example, singular configurations. However, in many cases, the physical sense of such properties is difficult to explain to students. The aim of this article is to use haptic feedback to convey to the user the significance of different kinematic indices. The framework uses a Phantom Omni together with a serial and a parallel mechanism, each with two degrees of freedom. The end-effector of both mechanisms can be moved either with a classical mouse or with the Phantom Omni, with or without force feedback.
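
    A concrete instance of rendering a kinematic index haptically: for a planar 2R serial arm, the manipulability index w = sqrt(det(J J^T)) collapses at singularities, and one can map low w to higher viscous damping so the user literally feels the mechanism stiffen. The sketch below follows that idea; the link lengths and the gain law are assumptions, not the article's exact rendering.

        # Minimal sketch: damping grows as the planar 2R arm approaches a singularity.
        import numpy as np

        L1, L2 = 0.13, 0.13   # link lengths in metres (assumed)

        def jacobian(q1, q2):
            """2x2 Jacobian of the planar 2R arm's end-effector position."""
            return np.array([
                [-L1*np.sin(q1) - L2*np.sin(q1+q2), -L2*np.sin(q1+q2)],
                [ L1*np.cos(q1) + L2*np.cos(q1+q2),  L2*np.cos(q1+q2)],
            ])

        def manipulability(q1, q2):
            J = jacobian(q1, q2)
            return np.sqrt(max(np.linalg.det(J @ J.T), 0.0))

        def render_force(q1, q2, ee_velocity, b_max=5.0, w_ref=0.01):
            """Viscous force opposing motion, stronger as manipulability w shrinks."""
            w = manipulability(q1, q2)
            damping = b_max * w_ref / (w + w_ref)   # tends to b_max as w tends to 0
            return -damping * np.asarray(ee_velocity)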

    A haptic-enabled multimodal interface for the planning of hip arthroplasty

    Multimodal environments help fuse a diverse range of sensory modalities, which is particularly important when integrating the complex data involved in surgical preoperative planning. The authors present a multimodal interface for the preoperative planning of hip arthroplasty that integrates immersive stereo displays and haptic modalities. This article gives an overview of this multimodal application framework and discusses the benefits of incorporating the haptic modality in this area.

    Design and Development of an Affordable Haptic Robot with Force-Feedback and Compliant Actuation to Improve Therapy for Patients with Severe Hemiparesis

    The study describes the design and development of a single degree-of-freedom haptic robot, Haptic Theradrive, for post-stroke arm rehabilitation for in-home and clinical use. The robot overcomes many of the weaknesses of its predecessor, the TheraDrive system, that used a Logitech steering wheel as the haptic interface for rehabilitation. Although the original TheraDrive system showed success in a pilot study, its wheel was not able to withstand the rigors of use. A new haptic robot was developed that functions as a drop-in replacement for the Logitech wheel. The new robot can apply larger forces in interacting with the patient, thereby extending the functionality of the system to accommodate low-functioning patients. A new software suite offers appreciably more options for tailored and tuned rehabilitation therapies. In addition to describing the design of the hardware and software, the paper presents the results of simulation and experimental case studies examining the system's performance and usability.
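
    The control idea behind such a one-degree-of-freedom force-feedback wheel can be illustrated with an impedance-style loop: a virtual spring-damper pulls the handle toward a therapy target and the output torque is capped for safety. The gains and limits below are assumptions, not the Theradrive's actual controller.

        # Minimal sketch of a 1-DOF impedance-style therapy controller (assumed gains).
        def therapy_torque(theta, theta_dot, theta_target,
                           k=2.0,        # N*m/rad, virtual spring stiffness
                           b=0.1,        # N*m*s/rad, damping
                           tau_max=5.0): # N*m, saturation as a stand-in for compliance limits
            tau = -k * (theta - theta_target) - b * theta_dot
            return max(-tau_max, min(tau_max, tau))

        # Called every servo cycle, e.g. at 1 kHz:
        # tau_cmd = therapy_torque(encoder_angle, encoder_velocity, game_target_angle)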

    Bi-manual haptic interaction in virtual worlds

    In the Virtual Reality field, force-feedback interfaces called haptic interfaces can simulate tactile and kinesthetic interactions. Bi-manual haptic interaction can immerse users in virtual worlds better than one-handed interaction, and it enables more tasks, such as parallel or precision tasks. Only a few studies deal specifically with bi-manual haptic interaction, and previous work mainly extends uni-manual techniques directly to two hands. The document reports a possible lack of bi-manual-specific management of the real and virtual workspaces, as well as a lack of genericity in existing solutions using haptic interfaces. The study of bi-manual haptic interaction led to the realization of a framework that allows several haptic devices to be used simultaneously. This framework simulates a 3D virtual world coupled with a physical simulation. We realized new, specifically bi-manual haptic interaction techniques that allow the user to control the camera, to extend the virtual workspace through hybrid position/rate control, and to assist bi-manual pick-and-place tasks. The document also points out issues such as collisions between haptic devices and the unification of two different haptic interfaces.
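
    As an example of the kind of bi-manual camera technique described, a common "grab the world with both hands" mapping pans the view with the midpoint of the two proxies and zooms with the change in their separation. The sketch below is a generic illustration under that assumption, not necessarily the technique implemented in the framework.

        # Minimal sketch of two-handed camera control: pan with the midpoint,
        # zoom with the spread between the two haptic proxies.
        import numpy as np

        def bimanual_camera_update(left_prev, right_prev, left_now, right_now,
                                   cam_offset, cam_scale):
            """Return updated (cam_offset, cam_scale) from the two proxies' motion."""
            mid_prev = (np.asarray(left_prev) + np.asarray(right_prev)) / 2.0
            mid_now  = (np.asarray(left_now)  + np.asarray(right_now))  / 2.0
            sep_prev = np.linalg.norm(np.asarray(right_prev) - np.asarray(left_prev))
            sep_now  = np.linalg.norm(np.asarray(right_now)  - np.asarray(left_now))

            cam_offset = np.asarray(cam_offset) + (mid_now - mid_prev)  # pan with the midpoint
            if sep_prev > 1e-6:
                cam_scale = cam_scale * (sep_now / sep_prev)            # zoom with the spread
            return cam_offset, cam_scale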

    Expanding Haptic Workspace for Coupled-Object Manipulation

    Haptic force-feedback offers a valuable cue in the exploration and manipulation of virtual environments. However, the grounding of many commercial kinesthetic haptic devices limits the workspace accessible under a purely position-control scheme. The bubble technique has recently been presented as a method for expanding the user’s haptic workspace. The bubble technique is a hybrid position-rate control system in which a volume, or “bubble,” is defined entirely within the physical workspace of the haptic device. When the device’s end effector is within this bubble, interaction is through position control. When it exits this volume, an elastic restoring force is rendered and a rate is applied that moves the virtual accessible workspace. Existing work on the bubble technique focuses on point-based touching tasks. When the bubble technique is applied to simulations where the user is grasping virtual objects with part-part collision detection, unforeseen interaction problems surface. This paper discusses three details of the user experience of coupled-object manipulation with the bubble technique. A few preliminary methods of addressing these interaction challenges are introduced.
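
    The hybrid position/rate behaviour described above can be summarised in a few lines: inside the bubble the virtual tool follows the device directly, and outside it an elastic restoring force is rendered while the accessible workspace drifts in the direction of penetration. The radius and gains below are assumptions for illustration.

        # Minimal sketch of the bubble technique's position/rate switching.
        import numpy as np

        BUBBLE_RADIUS = 0.08   # m, bubble defined inside the device workspace (assumed)
        K_BUBBLE      = 200.0  # N/m, elastic restoring stiffness (assumed)
        RATE_GAIN     = 2.0    # (m/s of workspace drift) per metre of penetration (assumed)

        def bubble_step(device_pos, workspace_offset, dt):
            """One servo tick: returns (proxy_pos, force, new_workspace_offset)."""
            r = np.linalg.norm(device_pos)
            if r <= BUBBLE_RADIUS:                        # inside: pure position control
                force = np.zeros(3)
                new_offset = workspace_offset
            else:                                         # outside: restoring force + rate
                direction = device_pos / r
                penetration = r - BUBBLE_RADIUS
                force = -K_BUBBLE * penetration * direction
                new_offset = workspace_offset + RATE_GAIN * penetration * direction * dt
            proxy_pos = new_offset + device_pos           # virtual tool position
            return proxy_pos, force, new_offset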

    Tactile-STAR: A Novel Tactile STimulator And Recorder System for Evaluating and Improving Tactile Perception

    Many neurological diseases impair the motor and somatosensory systems. While several different technologies are used in clinical practice to assess and improve motor functions, somatosensation is evaluated subjectively with qualitative clinical scales. Treatment of somatosensory deficits has received limited attention. To bridge the gap between the assessment and training of motor vs. somatosensory abilities, we designed, developed, and tested a novel, low-cost, two-component (bimanual) mechatronic system targeting tactile somatosensation: the Tactile-STAR, a tactile stimulator and recorder. The stimulator is an actuated pantograph structure driven by two servomotors, with an end-effector covered by a rubber material that can apply two different types of skin stimulation: brush and stretch. The stimulator has a modular design and can be used to test tactile perception in different parts of the body, such as the hand, arm, leg, and big toe. The recorder is a passive pantograph that can measure hand motion using two potentiometers. The recorder can serve multiple purposes: participants can move its handle to match the direction and amplitude of the tactile stimulator, or they can use it as a master manipulator to control the tactile stimulator as a slave. Our ultimate goal is to assess and affect tactile acuity and somatosensory deficits. To demonstrate the feasibility of our novel system, we tested the Tactile-STAR with 16 healthy individuals and with three stroke survivors using the skin-brush stimulation. We verified that the system enables the mapping of tactile perception on the hand in both populations. We also tested the extent to which 30 min of training in healthy individuals led to an improvement of tactile perception. The results provide a first demonstration of the ability of this new system to characterize tactile perception in healthy individuals, as well as a quantification of the magnitude and pattern of tactile impairment in a small cohort of stroke survivors. The finding that short-term training with Tactile-STAR can improve the acuity of tactile perception in healthy individuals suggests that Tactile-STAR may have utility as a therapeutic intervention for somatosensory deficits.
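
    The master-slave mode mentioned above amounts to a simple loop: read the recorder's two potentiometer angles, filter them lightly, and command the stimulator's two servomotors to matching targets. The sketch below assumes hypothetical I/O helpers (read_potentiometer, command_servo) and an arbitrary filter weight; it is not the Tactile-STAR firmware.

        # Minimal sketch of the recorder-as-master, stimulator-as-slave loop.
        ALPHA = 0.2  # low-pass filter weight to keep the servos from chattering (assumed)

        def master_slave_step(filtered, read_potentiometer, command_servo):
            """One control tick: recorder (master) joint angles drive the stimulator (slave)."""
            for joint in (0, 1):
                raw = read_potentiometer(joint)                  # recorder angle, rad (hypothetical I/O)
                filtered[joint] += ALPHA * (raw - filtered[joint])
                command_servo(joint, filtered[joint])            # stimulator target, rad (hypothetical I/O)
            return filtered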

    Exodex Adam—A Reconfigurable Dexterous Haptic User Interface for the Whole Hand

    Applications for dexterous robot teleoperation and immersive virtual reality are growing. Haptic user input devices need to allow the user to intuitively command and seamlessly “feel” the environment they work in, whether virtual or a remote site through an avatar. We introduce the DLR Exodex Adam, a reconfigurable, dexterous, whole-hand haptic input device. The device comprises multiple modular, three-degree-of-freedom (3-DOF) robotic fingers, whose placement on the device can be adjusted to optimize manipulability for different user hand sizes. Additionally, the device is mounted on a 7-DOF robot arm to increase the user’s workspace. Exodex Adam uses a front-facing interface, with robotic fingers coupled to two of the user’s fingertips, the thumb, and two points on the palm. Including the palm, as opposed to only the fingertips as is common in existing devices, enables accurate tracking of the whole hand without additional sensors such as a data glove or motion capture. By providing “whole-hand” interaction with omnidirectional force-feedback at the attachment points, we enable the user to experience the environment with the complete hand instead of only the fingertips, thus achieving deeper immersion. Interaction using Exodex Adam can range from palpation of objects and surfaces to manipulation using both power and precision grasps, all while receiving haptic feedback. This article details the concept and design of the Exodex Adam, as well as use cases where it is deployed with different command modalities. These include mixed-media interaction in a virtual environment, gesture-based telemanipulation, and robotic hand–arm teleoperation using adaptive model-mediated teleoperation. Finally, we share the insights gained during our development process and use case deployments.
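
    The glove-free hand tracking mentioned above follows from the device's own kinematics: each modular 3-DOF finger's joint encoders yield its attachment point via forward kinematics, expressed in the device base frame through the finger's mounting transform. The yaw-pitch-pitch structure, link lengths, and mounts in the sketch below are assumptions, not the Exodex Adam's actual kinematics.

        # Minimal sketch: attachment points from finger joint angles, no data glove needed.
        import numpy as np

        def finger_attachment_point(q, a1=0.05, a2=0.04):
            """q = (yaw, pitch1, pitch2) in rad -> attachment point in the finger's own frame."""
            yaw, p1, p2 = q
            reach  = a1 * np.cos(p1) + a2 * np.cos(p1 + p2)
            height = a1 * np.sin(p1) + a2 * np.sin(p1 + p2)
            return np.array([reach * np.cos(yaw), reach * np.sin(yaw), height])

        def tracked_hand_points(joint_angles, mounts):
            """Attachment points (fingertips, thumb, palm contacts) in the device base frame.
            mounts[i] = (R, t): rotation and translation of finger i's base on the device."""
            return [R @ finger_attachment_point(q) + t
                    for q, (R, t) in zip(joint_angles, mounts)]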