33 research outputs found

    FingerSight: A Vibrotactile Wearable Ring to Help the Blind Locate and Reach Objects in Peripersonal Space

    Visually impaired people need a way to compensate for the lack of visual information. Although assistive technologies exist to help them navigate through the environment, blind people still rely on groping to locate and reach objects in peripersonal space. FingerSight addresses this problem using visual-to-tactile substitution. Our prototype consists of four haptic tactors embedded in a ring worn on the index finger, with a tiny camera mounted on top. The camera image is processed with computer vision to control haptic feedback to the user. The four tactors are evenly spaced around the finger, and the device guides the user's hand toward the target by vibrating the tactor on the corresponding side; once the target is reached, all tactors vibrate simultaneously. Two experiments were conducted on normally sighted participants to test the functionality of our prototype. The first revealed that participants could discriminate between the five different haptic stimulations with a mean accuracy of 89.4%, which improved with additional training. In the second experiment, participants were blindfolded and instructed to move the hand wearing the device to reach one of four light-emitting diodes (LEDs) mounted on a cardboard sheet within arm's reach. Infrared markers mounted on the device enabled its location to be recorded by an optical tracker. A computer vision algorithm located the LED in the camera image and controlled the tactors using two different strategies: (1) Worst Axis First and (2) Adjacent Tactor Pair. Results revealed that participants could follow the haptic instructions to reach the target with similar accuracy under both strategies, but that the time to reach the target differed significantly. Using control-systems analysis, a closed-loop proportional-integral-derivative (PID) controller and plant were simulated. A model for the plant was fitted to the experimental data using an autoregressive moving-average model with exogenous terms (ARMAX), with the human subject acting as the plant. The control system was then optimized to find the best strategy for tactor activation, laying the groundwork for a future generation of FingerSight.
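The closed-loop idea described in this abstract can be sketched in a few lines: a discrete PID controller drives a first-order ARMAX-style plant standing in for the human subject. All gains, plant coefficients, and the normalized target below are illustrative placeholders, not values from the study.

```python
# Illustrative sketch (not the authors' code): a discrete PID controller
# steering a first-order ARMAX-style plant that stands in for the human
# subject. Gains, plant coefficients, and the target are invented.

def simulate(kp=1.5, ki=1.0, kd=0.05, a=0.7, b=0.3, steps=200, dt=0.1):
    """Drive the plant output (normalized hand position) toward the target."""
    y = 0.0                  # plant output: hand position
    target = 1.0             # target position seen by the camera
    integral = 0.0
    prev_err = target - y
    for _ in range(steps):
        err = target - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv   # tactor drive signal
        y = a * y + b * u    # ARMAX-like plant: AR term plus exogenous input
        prev_err = err
    return y
```

With integral action the steady-state error is driven to zero, which is the property one would tune for when comparing tactor-activation strategies in simulation.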

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They were organized in topical sections as follows: haptic science, haptic technology, and haptic applications.

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.

    Human-Machine Interfaces using Distributed Sensing and Stimulation Systems

    As technology moves towards more natural human-machine interfaces (e.g. bionic limbs, teleoperation, virtual reality), it is necessary to develop a sensory feedback system in order to foster embodiment and achieve better immersion in the control system. Contemporary feedback interfaces presented in research use few sensors and stimulation units to feed back at most two discrete variables (e.g. grasping force and aperture), whereas the human sense of touch relies on a distributed network of mechanoreceptors providing a wide bandwidth of information. To provide this type of feedback, it is necessary to develop a distributed sensing system that can extract a wide range of information during the interaction between the robot and the environment. In addition, a distributed feedback interface is needed to deliver such information to the user. This thesis proposes the development of a distributed sensing system (e-skin) to acquire tactile sensation, a first integration of the distributed sensing system on a robotic hand, the development of a sensory feedback system that comprises the distributed sensing system and a distributed stimulation system, and finally the implementation of deep learning methods for the classification of tactile data. Its core focus is the development and testing of a sensory feedback system based on the latest distributed sensing and stimulation techniques. To this end, the thesis comprises two introductory chapters that describe the state of the art in the field, the objectives, the methodology used, and the contributions, as well as six studies that tackled the development of human-machine interfaces.

    Tactile displays, design and evaluation

    Fritschi M. Tactile displays, design and evaluation. Bielefeld: Universität Bielefeld; 2016. This thesis presents the design and development of several tactile displays, as well as their eventual integration into a framework of tactile and kinesthetic stimulation. As a basis for the design of novel devices, an extensive survey of existing actuator principles and existing realizations of tactile displays is complemented by neurobiological and psychophysical findings. The work is structured along three main goals: First, novel actuator concepts are explored whose performance can match the challenging capabilities of human tactile perception. Second, novel kinematic concepts for experimental platforms are investigated that target an almost unknown sub-modality of tactile perception: the perception of shear force. Third, a setup for integrated tactile-kinesthetic displays is realized, and a first study on the psychophysical correlation between the tactile and the kinesthetic portion of haptic information is conducted. The developed devices proved to exceed human tactile capabilities and have already been used to learn more about the human tactile sense.

    Fine-grained Haptics: Sensing and Actuating Haptic Primary Colours (force, vibration, and temperature)

    This thesis discusses the development of a multimodal, fine-grained visual-haptic system for teleoperation and robotic applications. This system is primarily composed of two complementary components: an input device known as the HaptiTemp sensor (combining “Haptics” and “Temperature”), a novel thermosensitive GelSight-like sensor, and an output device, an untethered multimodal fine-grained haptic glove. The HaptiTemp sensor is a visuotactile sensor that can sense the haptic primary colours: force, vibration, and temperature. It has novel switchable UV markers that can be made visible using UV LEDs. The switchable markers are a key novelty of the HaptiTemp because they allow tactile information to be analysed from gel deformation without impairing the ability to classify or recognise images. The use of switchable markers in the HaptiTemp sensor resolves the trade-off between marker density and capturing high-resolution images with a single sensor. The HaptiTemp sensor can measure vibrations by counting the number of blobs or pulses detected per unit time using a blob detection algorithm. For the first time, temperature detection was incorporated into a GelSight-like sensor, making the HaptiTemp a haptic primary colours sensor. The HaptiTemp sensor can also perform rapid temperature sensing, with a 643 ms response time over the 31°C to 50°C temperature range. This fast temperature response is comparable to the withdrawal reflex response in humans, and this is the first time in the robotics community that a sensor can trigger a sensory impulse mimicking a human reflex. The HaptiTemp sensor can also perform simultaneous temperature sensing and image classification using a machine vision camera, the OpenMV Cam H7 Plus. This capability of simultaneous sensing and image classification has not been reported or demonstrated by any other tactile sensor.
The HaptiTemp sensor can be used in teleoperation because it can communicate or transmit tactile analysis and image classification results using wireless communication. The HaptiTemp sensor is the closest thing to the human skin in tactile sensing, tactile pattern recognition, and rapid temperature response. In order to feel what the HaptiTemp sensor is touching from a distance, a corresponding output device, an untethered multimodal haptic hand wearable, is developed to actuate the haptic primary colours sensed by the HaptiTemp sensor. This wearable can communicate wirelessly and has fine-grained cutaneous feedback to feel the edges or surfaces of the tactile images captured by the HaptiTemp sensor. This untethered multimodal haptic hand wearable has gradient kinesthetic force feedback that can restrict finger movements based on the force estimated by the HaptiTemp sensor. A retractable string from an ID badge holder equipped with miniservos that control the stiffness of the wire is attached to each fingertip to restrict finger movements. Vibrations detected by the HaptiTemp sensor can be actuated by the tapping motion of the tactile pins or by a buzzing minivibration motor. There is also a tiny annular Peltier device, or ThermoElectric Generator (TEG), with a mini-vibration motor, forming thermo-vibro feedback in the palm area that can be activated by a ‘hot’ or ‘cold’ signal from the HaptiTemp sensor. The haptic primary colours can also be embedded in a VR environment that can be actuated by the multimodal hand wearable. A VR application was developed to demonstrate rapid tactile actuation of edges, allowing the user to feel the contours of virtual objects. Collision detection scripts were embedded to activate the corresponding actuator in the multimodal haptic hand wearable whenever the tactile matrix simulator or hand avatar in VR collides with a virtual object. The TEG also gets warm or cold depending on the virtual object the participant has touched. 
    Tests were conducted to explore virtual objects in 2D and 3D environments using Leap Motion control and a VR headset (Oculus Quest 2). Moreover, fine-grained cutaneous feedback was developed to feel the edges or surfaces of a tactile image, such as the tactile images captured by the HaptiTemp sensor, or to actuate tactile patterns on 2D or 3D virtual objects. The prototype resembles an exoskeleton glove with 16 tactile actuators (tactors) on each fingertip, 80 tactile pins in total, made from commercially available P20 Braille cells. Each tactor can be controlled individually to enable the user to feel the edges or surfaces of images, such as the high-resolution tactile images captured by the HaptiTemp sensor. This hand wearable can be used to enhance the immersive experience in a virtual reality environment. The tactors can be actuated in a tapping manner, creating a form of vibration feedback distinct from the buzzing vibration produced by a mini-vibration motor. The tactile pin height can also be varied, creating a gradient of pressure on the fingertip. Finally, the integration of the high-resolution HaptiTemp sensor and the untethered multimodal, fine-grained haptic hand wearable is presented, forming a visuotactile system for sensing and actuating haptic primary colours. Force, vibration, and temperature sensing tests, with corresponding force, vibration, and temperature actuation tests, demonstrated a unified visual-haptic system. Aside from sensing and actuating haptic primary colours, touching the edges or surfaces of the tactile images captured by the HaptiTemp sensor was carried out using the fine-grained cutaneous feedback of the haptic hand wearable.
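The pulses-per-unit-time vibration measurement lends itself to a compact sketch. Here, instead of camera blobs, rising-edge threshold crossings in a sampled signal stand in for blob detections; the 40 Hz test signal, sample rate, and threshold are illustrative, not values from the thesis.

```python
import math

# Illustrative sketch of the counting idea: estimate vibration frequency by
# counting rising-edge threshold crossings per unit time in a sampled signal.
# (The thesis counts blobs in camera frames; this signal is a stand-in.)

def count_pulses(samples, threshold=0.0):
    """Count rising edges: transitions from below to at/above the threshold."""
    count = 0
    below = samples[0] < threshold
    for s in samples[1:]:
        if below and s >= threshold:
            count += 1
        below = s < threshold
    return count

def vibration_hz(samples, sample_rate):
    """Pulses per second, i.e. the estimated vibration frequency."""
    return count_pulses(samples) * sample_rate / len(samples)

# 1 s of a 40 Hz sinusoid sampled at 1 kHz
fs = 1000
sig = [math.sin(2 * math.pi * 40 * n / fs) for n in range(fs)]
```

One rising crossing occurs per vibration cycle, so the count over a one-second window approximates the frequency directly.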

    Requirements for a tactile display of softness

    Developing tactile displays is an important aspect of improving the realism of feeling softness in laparoscopic surgery. One of the major challenges of designing a tactile display is to understand how differences in material properties affect the perception of touch. This project addresses that challenge by investigating how the interaction of material properties affects the perception of softness and by reproducing softness through a tactile display. The first aim explores how the interaction of material properties affects the perception of softness using two psychophysical experiments. The experiments used a set of nine stimuli representing three materials of different compliance, with three different patterns of surface roughness or with three different coatings of stickiness. The results indicated that compliance affected the perception of softness when pressing with the finger, but not when sliding, and that compliance, friction, and thermal conductivity all influenced the perception of softness. To achieve the second aim of reproducing various levels of softness, a tactile display was built at the University of Leeds. The displayed softness was controlled by changing the contact area and tension of a flexible sheet. Psychophysical experiments were conducted to evaluate how well humans perceive softness through the display. The data were analysed using MATLAB to plot psychometric functions. The results indicated that the tactile display may be adequate for applications that compare simulated softnesses with one another, but insufficient for applications that compare simulated softness against real samples.
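The psychometric-function analysis mentioned above can be illustrated outside MATLAB as well. The sketch below fits a logistic psychometric function by grid search to recover the point of subjective equality (the 50% point) and the slope; the stimulus levels, response proportions, and parameter grids are invented for illustration, not taken from the thesis.

```python
import math

# Illustrative sketch (not the thesis code): fit a logistic psychometric
# function by least-squares grid search. `data` maps comparison stimulus
# level to the proportion of trials judged "softer than the reference".

def logistic(x, pse, slope):
    """P('softer') as a function of stimulus level x."""
    return 1.0 / (1.0 + math.exp(-(x - pse) / slope))

def fit(data):
    """Grid search for the PSE (50% point) and slope minimizing squared error."""
    best = None
    for pse10 in range(0, 101):        # PSE candidates 0.0 .. 10.0
        for slope10 in range(1, 51):   # slope candidates 0.1 .. 5.0
            pse, slope = pse10 / 10, slope10 / 10
            err = sum((logistic(x, pse, slope) - p) ** 2
                      for x, p in data.items())
            if best is None or err < best[0]:
                best = (err, pse, slope)
    return best[1], best[2]

# invented proportions, roughly logistic around level 5
data = {2: 0.05, 3: 0.12, 4: 0.30, 5: 0.50, 6: 0.71, 7: 0.88, 8: 0.95}
pse, slope = fit(data)
```

The fitted PSE is the simulated softness level perceived as equal to the reference, which is exactly the quantity one would compare between display and real samples.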

    Identification of Mechanical Properties of Nonlinear Materials and Development of Tactile Displays for Robotic Assisted Surgery Applications

    This PhD work presents novel methods of mechanical property identification for soft nonlinear materials, and methods of recreating and modeling the deformation behavior of these nonlinear materials for tactile feedback systems. For the material property identification, an inverse modeling method is employed for the identification of hyperelastic and hyper-viscoelastic (HV) materials using the spherical indentation test. Identification experiments are performed on soft foam materials and freshly harvested bovine liver tissue. It is shown that the reliability and accuracy of the identified material parameters are directly related to the size of the indenter and the depth of the indentation. Results show that inverse FE modeling based on a MultiStart optimization algorithm and spherical indentation is a reliable and scalable method of identification for biological tissues based on HV constitutive models. The inverse modeling method based on spherical indentation is adapted for real-time applications using variance and Kalman filter methods. Both methods are evaluated on hyperelastic foams and biological tissues in experiments analogous to robot-assisted surgery, and the results are compared and discussed. It is shown that increasing the indentation rate eliminates time dependency in the material behavior and thus increases the successful recognition rate: the deviation of an identified parameter at indentation rates of V = 1, 2, and 4 mm/s was found to be 28%, 21.3%, and 7.3%, respectively. Although the Kalman filter method yields less dispersion in the identified parameters than the variance method, it requires almost 900 times more computation, which is a limiting factor for increasing the indentation rate. Three bounding methods are proposed and implemented for the Kalman filter estimation. The Projection and Penalty bounding methods were found to yield relatively accurate results without failure, whereas the Nearest Neighbor method showed a high chance of non-convergence. The second part of the thesis focuses on the development of tactile displays for modeling the mechanical behavior of nonlinear materials for human tactile perception. An accurate finite element (FE) model of the human finger pad is constructed and validated in experiments of finger pad contact with soft and relatively rigid materials. Hyperfoam material parameters of the elastomers identified in the previous section are used for validation of the finger pad model. A magneto-rheological fluid (MRF) based tactile display is proposed, and its magnetic FE model is constructed and validated against Gauss meter measurements. The FE models of the human finger pad and the proposed tactile display are used in a model-based control algorithm for the display, and the FE models of the identified elastomers are used to calculate control curves for these elastomers. An experiment is set up for evaluation of the proposed display, with tests performed on biological tissue and soft nonlinear foams. Comparison between the curves of desired and recreated reaction force from the subject's finger pad contact with the display showed accuracy above 84%. As complementary work, new modeling and control approaches are proposed and tested for tactile displays based on linear actuators. A Hertzian model of contact between the human finger pad and the actuator cap is derived, and curves of material deformation are obtained and improved based on this model. A PID controller is designed for controlling the linear actuators; an optimization-based controller-tuning approach is explained in detail, and the robust stability of the system is also investigated. Results showed a maximum tracking error of 16.6% for the actuator controlled by the PID controller. Human subject tests of recreated softness perception showed a 100% recognition rate for a group of materials with large differences in softness.
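Of the bounding methods named in this abstract, projection bounding is the simplest to sketch: after each Kalman update, the estimate is clamped back into the physically admissible range. The scalar filter, noise levels, bounds, and measurement values below are illustrative stand-ins, not the thesis implementation.

```python
# Illustrative sketch of projection bounding on a scalar Kalman estimate of
# a material parameter. Measurement model, noise levels, bounds, and data
# are invented for illustration.

def kalman_projected(measurements, lo, hi, r=0.25, q=1e-4):
    """Scalar Kalman filter for a slowly varying parameter; after each
    update the estimate is projected into the admissible box [lo, hi]."""
    x, p = (lo + hi) / 2, 1.0       # initial estimate and variance
    for z in measurements:
        p += q                       # predict (random-walk parameter model)
        k = p / (p + r)              # Kalman gain
        x += k * (z - x)             # update with measurement z
        p *= 1 - k
        x = min(max(x, lo), hi)      # projection: clamp to admissible range
    return x

# noisy measurements of a true parameter 0.8, bounded to [0, 1];
# the 1.6 outlier pushes the raw update out of bounds, triggering projection
zs = [0.9, 1.6, 0.7, 0.85, 0.5, 0.75, 0.9, 0.8, 0.85, 0.8]
est = kalman_projected(zs, 0.0, 1.0)
```

Clamping keeps every intermediate estimate physically meaningful, which is what distinguishes projection bounding from running the unconstrained filter and bounding only the final result.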

    C-9 and Other Microgravity Simulations

    This document represents a summary of medical and scientific evaluations conducted aboard the C-9 and other NASA-sponsored aircraft from June 2008 to June 2009. Included is a general overview of investigations manifested and coordinated by the Human Adaptation and Countermeasures Division. A collection of brief reports that describe tests conducted aboard the NASA-sponsored aircraft follows the overview. Principal investigators and test engineers contributed significantly to the content of the report, describing their particular experiment or hardware evaluation. Although this document follows general guidelines, each report format may vary to accommodate differences in experiment design and procedures. This document concludes with an appendix that provides background information concerning the Reduced Gravity Program.