24 research outputs found

    W-FYD: a Wearable Fabric-based Display for Haptic Multi-Cue Delivery and Tactile Augmented Reality

    Despite the importance of softness, no wearable haptic system has yet been shown to deliver controllable softness cues. Here, we present the Wearable Fabric Yielding Display (W-FYD), a fabric-based display for multi-cue delivery that is worn on the user's finger and enables, for the first time, both active and passive softness exploration. It can also induce a sliding effect under the finger pad. A given stiffness profile is obtained by modulating the stretching state of the fabric through two motors, and a lifting mechanism brings the fabric into contact with the user's finger pad to enable passive softness rendering. In this paper, we describe the architecture of W-FYD and thoroughly characterize its stiffness workspace, frequency response and softness rendering capabilities. We also compute the device's just noticeable difference (JND) in both active and passive exploratory conditions, for linear and non-linear stiffness rendering as well as for sliding-direction perception, and consider the effect of device weight. We report participants' performance and subjective evaluations in sliding-direction detection and softness discrimination tasks. Finally, we discuss applications of W-FYD in tactile augmented reality for open palpation, opening interesting perspectives in many fields of human-machine interaction.
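
    The abstract describes the rendering principle (a target stiffness profile obtained by modulating fabric stretch with two motors) but no implementation. The sketch below only illustrates that idea under stated assumptions: a linear stretch-to-stiffness calibration and hypothetical hardware hooks (`read_indentation`, `set_motor_positions`), none of which come from the paper.

```python
# Minimal sketch of stiffness rendering on a stretchable-fabric display.
# Assumption: a monotonic calibration maps fabric stretch to perceived
# stiffness; the device accepts two motor position commands. All hardware
# hooks are hypothetical stand-ins, not the authors' API.

import time


def stretch_for_stiffness(k_target: float) -> float:
    """Invert a hypothetical linear calibration k = a + b * stretch."""
    a, b = 0.2, 3.0          # offset (N/mm) and gain, assumed from calibration
    return max(0.0, (k_target - a) / b)


def render_stiffness_profile(profile, read_indentation, set_motor_positions,
                             duration_s=5.0, period_s=0.01):
    """Track a (possibly non-linear) stiffness profile during active exploration.

    profile: callable mapping indentation depth d (mm) to target stiffness (N/mm).
    read_indentation / set_motor_positions: hypothetical hardware hooks.
    """
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        d = read_indentation()                # finger-pad indentation, mm
        k_target = profile(d)                 # desired local stiffness
        s = stretch_for_stiffness(k_target)   # fabric stretch that yields it
        # A symmetric command keeps the fabric centred under the finger pad;
        # an asymmetric offset between the two motors could instead slide the
        # fabric to render a directional cue.
        set_motor_positions(s, s)
        time.sleep(period_s)
```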

    A Soft touch: wearable dielectric elastomer actuated multi-finger soft tactile displays

    PhD thesis. The haptic modality in human-computer interfaces is significantly underutilised compared to vision and sound. A potential reason for this is the difficulty of turning computer-generated signals into realistic sensations of touch. Moreover, wearable solutions that can be mounted onto multiple fingertips whilst still allowing free, dexterous movement of the user's hand bring an even higher level of complexity. To be wearable, such devices should not only be compact, lightweight and energy efficient, but also able to render compelling tactile sensations. Current solutions are unable to meet these criteria, typically because of the actuation mechanisms employed. Aimed at addressing these needs, this work presents research into non-vibratory multi-finger wearable tactile displays based on an improved configuration of a dielectric elastomer actuator. The described displays render forces through a soft, bubble-like interface worn on the fingertip. Thanks to the improved design, forces of up to 1 N can be generated in a form factor of 20 x 12 x 23 mm and a weight of only 6 g, a significant increase in force output and wearability over existing tactile rendering systems. Furthermore, it is shown how these compact wearable devices can be used with low-cost commercial optical hand-tracking sensors to enable simple yet accurate tactile interactions within virtual environments using affordable instrumentation. The complete system allows users to interact with virtually generated soft-body objects with programmable tactile properties. Through a 15-participant study, the system was validated for three distinct types of touch interaction, including palpation and pinching of virtual deformable objects. Through this investigation, it is believed that this approach could have a significant impact within virtual and augmented reality interaction for medical simulation, professional training and improved tactile feedback in telerobotic control systems. Funded by the Engineering and Physical Sciences Research Council (EPSRC) Doctoral Training Centre EP/G03723X/
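
    The abstract couples optical hand tracking with a fingertip force display but gives no software detail. The sketch below shows one plausible coupling under stated assumptions: a penalty-based contact force against a virtual soft sphere, normalised to the roughly 1 N output mentioned in the abstract. `get_fingertip_position` and `set_drive_level` are hypothetical stand-ins, not the thesis' interface.

```python
# Sketch of coupling hand tracking to a fingertip tactile display.
# Assumption: the tracker reports fingertip positions in metres and the
# display accepts a normalised drive level in [0, 1] that maps roughly
# linearly to output force (up to ~1 N, per the abstract).

import math


def contact_force(fingertip_pos, centre, radius, stiffness=200.0, max_force=1.0):
    """Penalty-based force (N) for contact with a virtual soft sphere."""
    dx = [p - c for p, c in zip(fingertip_pos, centre)]
    dist = math.sqrt(sum(d * d for d in dx))
    penetration = radius - dist
    if penetration <= 0.0:
        return 0.0                       # fingertip is outside the object
    return min(stiffness * penetration, max_force)


def update_display(get_fingertip_position, set_drive_level,
                   centre=(0.0, 0.0, 0.3), radius=0.04, max_force=1.0):
    """One rendering tick: read tracking, compute force, drive the actuator."""
    force = contact_force(get_fingertip_position(), centre, radius,
                          max_force=max_force)
    set_drive_level(force / max_force)   # normalise to the actuator's range
```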

    A Novel Untethered Hand Wearable with Fine-Grained Cutaneous Haptic Feedback

    During open surgery, a surgeon relies not only on a detailed view of the organ being operated on and on being able to feel its fine details, but also, and heavily, on the combination of these two senses. In laparoscopic surgery, haptic feedback gives surgeons information on the interaction forces between instrument and tissue. Many studies to date have attempted to mimic haptic feedback in laparoscopy-related telerobotics, yet cutaneous feedback is mostly restricted or absent in haptic-feedback-based minimally invasive studies. We argue that fine-grained information about the instrument's tip is needed in laparoscopic surgery and can be conveyed via cutaneous feedback. We propose an exoskeleton haptic hand wearable consisting of five 4 × 4 arrays of miniaturized fingertip actuators, 80 in total, to convey cutaneous feedback. The wearable is modular, lightweight, Bluetooth- and WiFi-enabled, and has a maximum power consumption of 830 mW. Software was developed to demonstrate rapid tactile actuation of edges, allowing the user to feel contours through cutaneous feedback. Initial tests were carried out in 2D, with the object displayed on a flat monitor. In a second phase, the wearable exoskeleton glove was further developed to let users feel 3D virtual objects in a virtual reality (VR) environment presented through a VR headset. Both 2D and 3D objects were tested with our novel untethered haptic hand wearable. Our results show that users recognize cutaneous actuation from a single tap with 92.22% accuracy. The wearable has an average latency of 46.5 ms, well below the 600 ms delay considered tolerable by a surgeon in teleoperation. We therefore propose our untethered hand wearable as a way to enhance multimodal perception in minimally invasive surgery and to let surgeons naturally feel the immediate environment of their instruments.
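
    The abstract mentions software that actuates edges on the 4 × 4 fingertip arrays so the user can feel contours, but not how the frames are built. The sketch below shows one simple way to do it under stated assumptions: a binary edge map of the scene and a hypothetical `send_frame` hook standing in for the glove's Bluetooth/WiFi interface.

```python
# Sketch of rendering an edge under the fingertip on a 4 x 4 taxel array.
# Assumption: a binary edge map of the virtual scene is available and the
# glove accepts one on/off frame per fingertip. `send_frame` is a
# hypothetical stand-in for the device's wireless interface.


def taxel_frame(edge_map, row, col, size=4):
    """Return the size x size patch of the edge map under the fingertip."""
    return [r[col:col + size] for r in edge_map[row:row + size]]


def render_edge(edge_map, finger_row, finger_col, send_frame):
    frame = taxel_frame(edge_map, finger_row, finger_col)
    send_frame(frame)   # re-sent rapidly as the finger moves, so the user
                        # traces the contour cutaneously


if __name__ == "__main__":
    # Toy 8 x 8 scene with a vertical edge in column 3.
    scene = [[1 if c == 3 else 0 for c in range(8)] for r in range(8)]
    render_edge(scene, finger_row=2, finger_col=1, send_frame=print)
```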

    Determining the Contribution of Visual and Haptic Cues during Compliance Discrimination in the Context of Minimally Invasive Surgery

    While minimally invasive surgery is replacing open surgery in an increasing number of procedures, it still poses risks such as unintended tissue damage due to reduced visual and haptic feedback. Surgeons assess tissue health by analysing mechanical properties such as compliance. The literature shows that while both types of feedback contribute to the final percept, visual information is dominant during compliance discrimination tasks; the magnitude of that contribution, however, had never been quantitatively determined. To determine the effect of the type of visual feedback on compliance discrimination, a psychophysical experiment was set up using different combinations of direct and indirect visual and haptic cues. Results reiterated the significance of visual information and suggested visuo-haptic cross-modal integration. Consequently, to determine which cues contribute most to visual feedback, the impact of force and position on the ability to discriminate compliance using visual information alone was assessed. Results showed that isolating force and position cues during indentation enhanced performance. Furthermore, under force and position constraints, visual information was shown to be sufficient to determine the compliance of deformable objects. A pseudo-haptic feedback system was developed to quantitatively determine the contribution of visual feedback during compliance discrimination. A psychophysical experiment showed that the system realistically simulated the viscoelastic behaviour of compliant objects. Through a magnitude estimation experiment, the pseudo-haptic system was shown to generate haptic sensations of compliance during stimulus indentation purely by modifying the visual feedback presented to participants. This can be applied in research and educational facilities where advanced force-feedback devices are inaccessible, and it can be incorporated into virtual reality simulators to extend the range of forces they can render. Future work will assess the value of visual cue augmentation in more complicated surgical tasks.
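
    The abstract does not specify how its pseudo-haptic system works, so the sketch below only illustrates the common pseudo-haptic approach: scaling the displayed indentation relative to the user's real input so that softer simulated objects appear to "give" more, with a Kelvin-Voigt term for the viscoelastic behaviour the abstract mentions. All parameter values are illustrative assumptions, not the thesis' method.

```python
# Generic pseudo-haptic compliance sketch (not the thesis' exact method):
# the visually displayed indentation is scaled relative to the real probe
# displacement; a low simulated stiffness yields a larger control/display
# ratio, which observers tend to interpret as a more compliant object.


def displayed_indentation(input_disp, input_vel, stiffness, damping,
                          reference_stiffness=1.0):
    """Map real probe displacement to on-screen indentation depth.

    Returns (visual_depth, simulated_force) for one frame.
    """
    # Kelvin-Voigt force the simulated object would exert for this input.
    force = stiffness * input_disp + damping * input_vel
    # Scale the visual deformation inversely with stiffness relative to a
    # reference object, clamped to keep the illusion plausible.
    ratio = max(0.25, min(4.0, reference_stiffness / stiffness))
    return ratio * input_disp, force


if __name__ == "__main__":
    soft = displayed_indentation(0.01, 0.05, stiffness=0.5, damping=0.1)
    stiff = displayed_indentation(0.01, 0.05, stiffness=2.0, damping=0.1)
    print("soft object:", soft, "stiff object:", stiff)
```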

    Doctor of Philosophy

    Dissertation. Virtual reality is becoming a common technology with applications in fields such as medical training, product development, and entertainment. Providing haptic (sense of touch) information along with visual and audio information can create an immersive virtual…

    Evaluation of Haptic Feedback on Bimanually Teleoperated Laparoscopy for Endometriosis Surgery

    Robotic minimally invasive surgery is gaining acceptance in surgical care, but in contrast with its appreciated 3D vision and enhanced dexterity, haptic feedback is not offered. For this reason robotics is not considered beneficial for delicate interventions such as endometriosis treatment. Overall, the benefit of haptic feedback remains debated and unproven except in some simple scenarios such as Fundamentals of Laparoscopic Surgery exercises. Objective: The present work investigates the benefits of haptic feedback for more complex surgical gestures in which delicate tissue is manipulated through coordination between multiple instruments. Methods: A new training exercise, the "Endometriosis Surgery Exercise" (ESE), was devised to approximate the setting for monocular robotic endometriosis treatment. A bimanual bilateral teleoperation setup was designed for laparoscopic laser surgery, offering haptic guidance and haptic feedback to the operator. User experiments were conducted to (i) assess the validity of ESE and (ii) examine possible advantages of haptic technology during execution of bimanual surgery. Results: (i) Content and face validity of ESE were established by the participating surgeons, who also suggested ESE as a means to train lasering skills. (ii) Interaction forces on endometriotic tissue were significantly lower when the bilateral controller was used; collisions between instruments and the environment were less frequent, as were situations marked as potentially dangerous. Conclusion: This study provides promising results suggesting that haptics may offer a distinct advantage in complex robotic interventions where fragile tissue is manipulated. Significance: Patients need to know whether haptic feedback should be incorporated, and an improved understanding of its value is important because current commercial surgical robots are widely used but do not offer haptics.
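
    The abstract reports that a bilateral controller reduced tool-tissue interaction forces, but the controller itself is not described. The sketch below is a generic position-forward, force-reflecting teleoperation step, included only to illustrate why reflected forces help the operator back off from contact; the gains and structure are assumptions, not the authors' design.

```python
# Generic bilateral teleoperation sketch (position forward, force back).
# Not the paper's actual controller: it only illustrates how reflecting a
# scaled tool-tissue force to the operator opposes pushing further into
# delicate tissue. All gains are assumed values.


def bilateral_step(master_pos, slave_pos, tissue_force,
                   kp=50.0, force_scale=0.5):
    """One control tick for a single instrument.

    Returns (slave_command_force, master_feedback_force).
    """
    # The slave instrument tracks the master with a proportional controller.
    slave_cmd = kp * (master_pos - slave_pos)
    # The measured tool-tissue force is scaled and reflected to the operator.
    master_fb = -force_scale * tissue_force
    return slave_cmd, master_fb


if __name__ == "__main__":
    print(bilateral_step(master_pos=0.02, slave_pos=0.015, tissue_force=1.2))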

    Haptic Enhancement of Sensorimotor Learning for Clinical Training Applications

    Modern surgical training requires radical change with the advent of increasingly complex procedures, restricted working hours, and reduced ‘hands-on’ training in the operating theatre. Moreover, an increased focus on patient safety means there is a greater need to objectively measure proficiency in trainee surgeons. Indeed, the existing evidence suggests that surgical sensorimotor skill training is not adequate for modern surgery. This calls for new training methodologies that can increase the rate at which sensorimotor skill is acquired. Haptic interventions offer one exciting avenue for enhancing surgical skills in a safe environment. Nevertheless, the best approach for implementing novel training methodologies involving haptic intervention within existing clinical training curricula has yet to be determined. This thesis set out to address this issue. Chapter 2 describes the development of two novel tools that enable the implementation of bespoke visuohaptic environments within robust experimental protocols. Chapters 3 and 4 report the effects of intensive, long-term training on the acquisition of a compliance discrimination skill. The results indicate that active behaviour is intrinsically linked to compliance perception, and that long-term training can improve the ability to detect compliance differences. Chapter 5 explores the effects of error augmentation and parameter-space exploration on the learning of a complex novel task. The results indicate that error augmentation can improve learning rate, and that physical workspace exploration may be a driver of motor learning. This research is a first step towards the design of objective haptic intervention strategies to support the rapid acquisition of sensorimotor skill. The work has applications in clinical settings such as surgical training, dentistry and physical rehabilitation, as well as other areas such as sport.
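
    The abstract names error augmentation as a driver of faster learning but does not define it. The sketch below shows the standard formulation only: the deviation from a target trajectory is amplified in the feedback rendered to the trainee so small errors become easier to sense and correct. The gains are illustrative assumptions, not the thesis' controller.

```python
# Generic error-augmentation sketch for a visuohaptic training task:
# the positional error relative to the target is amplified by a gain > 1
# in the corrective force fed back to the trainee. Values are assumed.


def augmented_error_force(actual_pos, target_pos, gain=1.5, stiffness=100.0):
    """Return a corrective force with the positional error amplified."""
    error = [a - t for a, t in zip(actual_pos, target_pos)]
    return [-stiffness * gain * e for e in error]


if __name__ == "__main__":
    f = augmented_error_force(actual_pos=(0.101, 0.052, 0.0),
                              target_pos=(0.100, 0.050, 0.0))
    print(f)   # a small deviation produces a clearly perceivable push back
```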

    Delivering Expressive And Personalized Fingertip Tactile Cues

    Wearable haptic devices have seen growing interest in recent years, but providing realistic tactile feedback is a challenge that will not be solved soon. Daily interactions with physical objects elicit complex sensations at the fingertips. Furthermore, human fingertips exhibit a broad range of physical dimensions and perceptive abilities, adding further complexity to the task of simulating haptic interactions in a compelling manner. However, as the applications of wearable haptic feedback grow, concerns about wearability and generalizability often persuade tactile device designers to simplify the complexities associated with rendering realistic haptic sensations. As such, wearable devices tend to be optimized for particular uses and average users, rendering only the most salient dimensions of tactile feedback for a given task and assuming all users interpret the feedback in a similar fashion. We propose that providing more realistic haptic feedback will require in-depth examinations of higher-dimensional tactile cues and personalization of these cues for individual users. In this thesis, we aim to provide hardware- and software-based solutions for rendering more expressive and personalized tactile cues to the fingertip. We first explore the idea of rendering six-degree-of-freedom (6-DOF) tactile fingertip feedback via a wearable device, such that any possible fingertip interaction with a flat surface can be simulated. We highlight the potential of parallel continuum manipulators (PCMs) to meet the requirements of such a device, and we refine the design of a PCM for providing fingertip tactile cues. We construct a manually actuated prototype to validate the concept, and then develop a motorized version, named the Fingertip Puppeteer, or Fuppeteer for short. Various error reduction techniques are presented, and the resulting device is evaluated by analyzing system responses to step inputs, measuring forces rendered to a biomimetic finger sensor, and comparing intended sensations to those perceived by twenty-four participants in a human-subject study. Once the functionality of the Fuppeteer is validated, we explore how the device can be used to broaden our understanding of higher-dimensional tactile feedback. One such application is using the 6-DOF device to simulate different lower-dimensional devices. We evaluate 1-, 3-, and 6-DOF tactile feedback during shape discrimination and mass discrimination in a virtual environment, also comparing them to interactions with real objects. Results from 20 naive study participants show that higher-dimensional tactile feedback may indeed allow completion of a wider range of virtual tasks, but that feedback dimensionality surprisingly does not greatly affect the exploratory techniques employed by the user. To address alternative approaches to improving tactile rendering in scenarios where low-dimensional tactile feedback is appropriate, we then explore the idea of personalizing feedback for a particular user. We present two generalizable software-based approaches to personalize an existing data-driven haptic rendering algorithm for fingertips of different sizes. We evaluate our algorithms in the rendering of pre-recorded tactile sensations onto rubber casts of six different fingertips as well as onto the real fingertips of 13 human participants, all via a 3-DOF wearable device. Results show that both personalization approaches significantly reduced force error magnitudes and improved realism ratings.
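
    The abstract mentions two software-based approaches for personalizing a data-driven rendering algorithm to fingertip size, without detailing them. The sketch below shows one plausible scheme only: rescaling recorded actuator commands by a per-user factor derived from measured fingertip width. It is not the thesis' actual algorithm, and all constants are illustrative assumptions.

```python
# Hypothetical fingertip-size personalization sketch (not the thesis'
# algorithms): actuator commands recorded for a reference fingertip are
# rescaled by a per-user factor, clamped to a safe device range.


def personal_scale(finger_width_mm, reference_width_mm=16.0, lo=0.7, hi=1.3):
    """Per-user scale factor derived from measured fingertip width."""
    return max(lo, min(hi, finger_width_mm / reference_width_mm))


def personalize_commands(reference_commands, finger_width_mm):
    """Rescale a recorded sequence of actuator commands for this user."""
    s = personal_scale(finger_width_mm)
    return [s * c for c in reference_commands]


if __name__ == "__main__":
    recorded = [0.0, 0.4, 0.9, 1.2, 0.8]      # mm, from the reference model
    print(personalize_commands(recorded, finger_width_mm=13.5))
```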