Haptic feedback in teleoperation in Micro- and Nano-Worlds
Robotic systems have been developed to handle very small objects, but their use remains complex and necessitates long-duration training. Simulators, such as molecular simulators, can provide access to large amounts of raw data, but only highly trained users can interpret the results of such systems. Haptic feedback in teleoperation, which provides force feedback to an operator, appears to be a promising solution for interaction with such systems, as it allows intuitiveness and flexibility. However, several issues arise when implementing teleoperation schemes at the micro- and nanoscale, owing to the complex force fields that must be transmitted to users and the scaling differences between the haptic device and the manipulated objects. Major advances in such technology have been made in recent years. This chapter reviews the main systems in this area and highlights how some fundamental issues in teleoperation for micro- and nano-scale applications have been addressed. The chapter considers three types of teleoperation: (1) direct (manipulation of real objects); (2) virtual (use of simulators); and (3) augmented (combining real robotic systems and simulators). Remaining issues that must be addressed for further advances in teleoperation for micro-nanoworlds are also discussed, including: (1) comprehension of the phenomena that dictate the behavior of very small objects (< 500 micrometers); and (2) design of intuitive 3-D manipulation systems. Design guidelines to realize an intuitive haptic feedback teleoperation system at the micro-nanoscale are proposed.
Haptics in Robot-Assisted Surgery: Challenges and Benefits
Robotic surgery is transforming current surgical practice, not only by improving conventional surgical methods but also by introducing innovative robot-enhanced approaches that broaden the capabilities of clinicians. Being mainly of a man-machine collaborative type, surgical robots are seen as media that transfer pre- and intra-operative information to the operator and reproduce his/her motion, with appropriate filtering, scaling, or limitation, to physically interact with the patient. The field, however, is far from maturity and, more critically, is still a subject of controversy in medical communities. Limited or absent haptic feedback is reputed to be among the reasons that impede further spread of surgical robots. In this paper, the objectives and challenges of deploying haptic technologies in surgical robotics are discussed, and a systematic review is performed on works that have studied the effects of providing haptic information to users in major branches of robotic surgery. We have tried to encompass both classical works and state-of-the-art approaches, aiming to deliver a comprehensive and balanced survey both for researchers starting their work in this field and for experts.
A Review of Pneumatic Actuators Used for the Design of Medical Simulators and Medical Tools
Standardized evaluation of haptic rendering systems
The development and evaluation of haptic rendering algorithms present two unique challenges. First, the haptic information channel is fundamentally bidirectional, so the output of a haptic environment depends on user input, which is difficult to reproduce reliably. Second, it is difficult to compare haptic results to real-world, "gold standard" results, since such a comparison requires applying identical inputs to real and virtual objects and measuring the resulting forces, which requires hardware that is not widely available. We have addressed these challenges by building and releasing several data sets of position and force information, collected by physically scanning a set of real-world objects, along with virtual models of those objects. We demonstrate novel applications of this data set for the development, debugging, optimization, evaluation, and comparison of haptic rendering algorithms.
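The comparison this abstract describes can be sketched as a replay loop: feed the recorded probe positions to a candidate rendering algorithm and score its output against the forces measured on the real object. The following is a minimal illustration; `render_force` and the RMS metric are assumptions for the sketch, not details taken from the paper.

```python
import numpy as np

def rms_force_error(positions, measured_forces, render_force):
    """RMS error between an algorithm's rendered forces and the forces
    physically measured while scanning the real object.

    positions       -- sequence of 3-D probe positions from the data set
    measured_forces -- corresponding 3-D forces recorded on the real object
    render_force    -- candidate rendering algorithm: position -> 3-D force
    """
    rendered = np.array([render_force(p) for p in positions])
    err = rendered - np.asarray(measured_forces)
    # Euclidean error per sample, then root mean square over the trajectory.
    return float(np.sqrt(np.mean(np.sum(err**2, axis=1))))

# Toy usage: a frictionless plane at z = 0 rendered as a 300 N/m penalty spring.
plane_model = lambda p: np.array([0.0, 0.0, -300.0 * min(p[2], 0.0)])
```

Lower scores mean the virtual model reproduces the recorded interaction more faithfully, which makes the metric usable for regression testing across algorithm revisions.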
Constraint-based technique for haptic volume exploration
We present a haptic rendering technique that uses directional constraints to facilitate enhanced exploration modes for volumetric datasets. The algorithm restricts user motion in certain directions by incrementally moving a proxy point along the axes of a local reference frame. Reaction forces are generated by a spring coupler between the proxy and the data probe, which can be tuned to the capabilities of the haptic interface. Secondary haptic effects, including field forces, friction, and texture, can be easily incorporated to convey information about additional characteristics of the data. We illustrate the technique with two examples: displaying fiber orientation in heart muscle layers and exploring diffusion tensor fiber tracts in brain white matter tissue. Initial evaluation of the approach indicates that haptic constraints provide an intuitive means for displaying directional information in volume data.
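The proxy-and-coupler scheme described above can be sketched in a few lines: the proxy is only allowed to advance along a preferred local axis (e.g. the fiber direction at the current voxel), and a virtual spring between proxy and probe produces the reaction force. All names, the step size, and the spring gain are illustrative assumptions, not values from the paper.

```python
import numpy as np

def update_proxy(proxy, probe, axis, step=0.001):
    """Advance the proxy toward the probe, but only along `axis`.

    The per-frame motion is capped at `step`, giving the incremental
    proxy movement described in the abstract.
    """
    axis = axis / np.linalg.norm(axis)
    # Project the proxy-to-probe offset onto the permitted direction.
    along = np.dot(np.asarray(probe) - np.asarray(proxy), axis)
    return np.asarray(proxy) + axis * np.clip(along, -step, step)

def coupler_force(proxy, probe, k=300.0):
    """Spring coupler between proxy and probe; k (N/m) is tuned to the
    force capabilities of the haptic interface."""
    return k * (np.asarray(proxy) - np.asarray(probe))
```

Because lateral probe motion leaves the proxy behind, the spring produces a restoring force perpendicular to the constraint axis, which is what lets the user "feel" the local fiber direction.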
Active haptic perception in robots: a review
In the past few years a new scenario for robot-based applications has emerged. Service and mobile robots have opened new market niches, and new frameworks for shop-floor robot applications have been developed. In all these contexts, robots are requested to perform tasks in open-ended, possibly dynamically varying conditions. These new requirements also call for a change of paradigm in the design of robots: on-line, safe feedback motion control becomes the core of modern robot systems. Future robots will learn autonomously, interact safely, and possess qualities like self-maintenance. Attaining these features would be relatively easy if a complete model of the environment were available and the robot actuators could execute motion commands perfectly relative to this model. Unfortunately, a complete world model is not available, and robots have to plan and execute tasks in the presence of environmental uncertainties, which makes sensing an important component of new-generation robots. For this reason, today's new-generation robots are equipped with more and more sensing components, and are consequently ready to deal actively with the high complexity of the real world. Complex sensorimotor tasks such as exploration require coordination between the motor system and the sensory feedback. For robot control purposes, sensory feedback should be adequately organized in terms of relevant features and the associated data representation. In this paper, we propose an overall functional picture linking sensing to action in closed-loop sensorimotor control of robots for touch (hands, fingers). Basic qualities of haptic perception in humans inspire the models and categories comprising the proposed classification. The objective is to provide a reasoned, principled perspective on the connections between the different taxonomies used in the robotics and human-haptics literature. The specific case of active exploration is chosen to ground interesting use cases, for two reasons. First, in the haptics literature, exploration has been treated only to a limited extent compared to grasping and manipulation. Second, exploration involves specific robot behaviors that exploit distributed and heterogeneous sensory data.
Haptics Rendering and Applications
There has been significant progress in haptic technologies, but the incorporation of haptics into virtual environments is still in its infancy. A wide range of human activities, including communication, education, art, entertainment, commerce, and science, would change forever if we learned how to capture, manipulate, and reproduce haptic sensory stimuli that are nearly indistinguishable from reality. For the field to move forward, many commercial and technological barriers need to be overcome. By rendering how objects feel through haptic technology, we communicate information in a physically based language that has never been explored before. Thanks to constant improvement in haptic technology and increasing research into and development of haptics-related algorithms, protocols, and devices, haptics technology is believed to have a promising future.
W-FYD: a Wearable Fabric-based Display for Haptic Multi-Cue Delivery and Tactile Augmented Reality
Despite the importance of softness, there is no evidence of wearable haptic systems able to deliver controllable softness cues. Here we present the Wearable Fabric Yielding Display (W-FYD), a fabric-based display for multi-cue delivery that can be worn on the user's finger and enables, for the first time, both active and passive softness exploration. It can also induce a sliding effect under the finger pad. A given stiffness profile is obtained by modulating the stretching state of the fabric through two motors. Furthermore, a lifting mechanism places the fabric in contact with the user's finger pad to enable passive softness rendering. In this paper, we describe the architecture of W-FYD and thoroughly characterize its stiffness workspace, frequency response, and softness rendering capabilities. We also computed the device's just noticeable difference in both active and passive exploratory conditions, for linear and non-linear stiffness rendering as well as for sliding direction perception, and considered the effect of device weight. Furthermore, we report the performance of participants and their subjective quantitative evaluation in sliding direction detection and softness discrimination tasks. Finally, we discuss applications of W-FYD in tactile augmented reality for open palpation, opening interesting perspectives in many fields of human-machine interaction.
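The stiffness-modulation idea above can be illustrated with a small control sketch: if the fabric's apparent stiffness grows monotonically with motor stretch, a target stiffness profile can be tracked by inverting a calibrated stretch-to-stiffness curve. The calibration samples, names, and linear force law below are illustrative assumptions, not measurements from the W-FYD paper.

```python
import numpy as np

# Hypothetical calibration: motor stretch (mm) vs. apparent stiffness (N/m).
# A real device would obtain these points from a characterization procedure.
STRETCH_MM = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
STIFFNESS_N_PER_M = np.array([50.0, 120.0, 260.0, 480.0, 800.0])

def stretch_for_stiffness(k_target):
    """Invert the monotonic calibration curve by linear interpolation,
    clamped to the calibrated range; returns the motor stretch in mm."""
    return float(np.interp(k_target, STIFFNESS_N_PER_M, STRETCH_MM))

def contact_force(k_target, indentation_m):
    """Linear stiffness rendering: restoring force (N) for a finger-pad
    indentation (m) at the commanded stiffness."""
    return k_target * indentation_m
```

A non-linear stiffness profile would simply re-evaluate `stretch_for_stiffness` as a function of the current indentation, updating the motor command each haptic frame.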