49 research outputs found

    Cortical Regions Encoding Hardness Perception Modulated by Visual Information Identified by Functional Magnetic Resonance Imaging With Multivoxel Pattern Analysis

    Get PDF
    Recent studies have revealed that hardness perception is determined by visual information along with the haptic input. This study investigated the cortical regions involved in hardness perception modulated by visual information using functional magnetic resonance imaging (fMRI) and multivoxel pattern analysis (MVPA). Twenty-two healthy participants were enrolled. They were required to place their left and right hands at the front and back, respectively, of a mirror attached to a platform placed above them while lying in a magnetic resonance scanner. In conditions SFT, MED, and HRD, one of three polyurethane foam pads of varying hardness (soft, medium, and hard, respectively) was presented to the left hand in a given trial, while only the medium pad was presented to the right hand in all trials. MED was defined as the control condition because the visual and haptic information was congruent. During the scan, the participants were required to push the pads with both hands while observing the reflection of the left hand, and to estimate the hardness of the pad perceived by the right (hidden) hand by magnitude estimation. Behavioral results showed that the perceived hardness was significantly biased toward softer or harder in >73% of the trials in conditions SFT and HRD; we designated these trials as visually modulated (SFTvm and HRDvm, respectively). An accuracy map was calculated individually for each of the pair-wise comparisons (SFTvm vs. MED), (HRDvm vs. MED), and (SFTvm vs. HRDvm) by a searchlight MVPA, and the cortical regions encoding the perceived hardness with visual modulation were identified by conjunction of the three accuracy maps in a group analysis. Clusters were observed in the right sensorimotor cortex, left anterior intraparietal sulcus (aIPS), bilateral parietal operculum (PO), and occipito-temporal cortex (OTC). Together with previous findings on these cortical regions, we conclude that the visual information of finger movements processed in the OTC may be integrated with haptic input in the left aIPS, and that the subjective hardness perceived by the right hand with visual modulation may be processed in the cortical network between the left PO and aIPS.
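    The searchlight decoding step described above can be illustrated with a small sketch. The following Python example (not the authors' code; the synthetic data, neighborhood size, and classifier choice are assumptions for illustration) computes a per-voxel classification accuracy map for one pairwise comparison; repeating it for each contrast and intersecting the thresholded maps would mimic the conjunction analysis.

```python
# Minimal searchlight-style MVPA sketch on synthetic data (illustrative only):
# a linear SVM is cross-validated on the 3x3x3 voxel neighborhood around each
# voxel, yielding an accuracy map for one pairwise comparison (e.g., SFTvm vs. MED).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, shape = 40, (8, 8, 8)                 # trials per condition, toy volume
X = rng.normal(size=(2 * n_trials, *shape))     # trial-wise response patterns
y = np.repeat([0, 1], n_trials)                 # condition labels

accuracy_map = np.zeros(shape)
for i in range(1, shape[0] - 1):
    for j in range(1, shape[1] - 1):
        for k in range(1, shape[2] - 1):
            # flatten the neighborhood around the center voxel into features
            feats = X[:, i-1:i+2, j-1:j+2, k-1:k+2].reshape(len(y), -1)
            accuracy_map[i, j, k] = cross_val_score(LinearSVC(), feats, y, cv=5).mean()

print(accuracy_map.max())                       # chance level is ~0.5 for random data
```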

    Pseudo-haptics survey: Human-computer interaction in extended reality & teleoperation

    Get PDF
    Pseudo-haptic techniques are becoming increasingly popular in human-computer interaction. They replicate haptic sensations by leveraging primarily visual feedback rather than mechanical actuators. These techniques bridge the gap between the real and virtual worlds by exploiting the brain's ability to integrate visual and haptic information. Among their advantages, pseudo-haptic techniques are cost-effective, portable, and flexible. They eliminate the need to attach haptic devices to the body, which can be heavy and bulky and require substantial power and maintenance. Recent research has focused on applying these techniques to extended reality and mid-air interactions. To better understand the potential of pseudo-haptic techniques, the authors developed a novel taxonomy encompassing tactile feedback, kinesthetic feedback, and combined multimodal approaches, ground not covered by previous surveys. This survey highlights multimodal strategies and potential avenues for future studies, particularly regarding the integration of these techniques into extended reality and collaborative virtual environments.

    Enriching passive touch sensation on flat surfaces using visual feedback

    Get PDF
    While human-computer interaction has evolved considerably around touch interaction in recent years, this form of interaction has lacked haptic feedback from the very beginning. Nowadays, touch interaction takes place on flat surfaces, using either a projection of digital content or a touch screen. Since haptic feedback is an important factor in human surface perception, various ways have been explored to simulate haptic feedback even on completely flat surfaces. One of these is electrotactile feedback, which has mostly been used to simulate surface properties during active touch, where the user has to move their finger over the surface in order to feel the haptic sensation. Previous research shows that vision is also a very important factor in surface perception and in proprioception in general. We conducted a user study to investigate the influence of visual feedback on passive touch using electrotactile feedback. We concentrated on simulating depth rather than roughness, which does not work particularly well for passive touch. We found that even though electrotactile and visual feedback each work well for conveying depth or softness when applied individually, as soon as we presented participants with a condition combining both feedback types, they no longer responded to the manipulation.

    Requirements for a tactile display of softness

    Get PDF
    Developing tactile displays is an important aspect of improving the realism of feeling softness in laparoscopic surgery. One of the major challenges in designing a tactile display is to understand how differences in material properties affect the perception of touch. This project addresses that challenge by investigating how the interaction of material properties affects the perception of softness and by presenting softness through a tactile display. The first aim explores how the interaction of material properties affects the perception of softness through two psychophysical experiments. The experiments used a set of nine stimuli representing three materials of different compliance, combined with three patterns of surface roughness or three coatings of different stickiness. The results indicated that compliance affected the perception of softness when pressing with the finger, but not when sliding, and that compliance, friction, and thermal conductivity all influenced the perception of softness. To achieve the second aim of reproducing various levels of softness, a tactile display was built at the University of Leeds. The displayed softness was controlled by changing the contact area and tension of a flexible sheet. Psychophysical experiments were conducted to evaluate how well humans perceive softness through the display, and the data were analysed using MATLAB to plot psychometric functions. The results indicated that the tactile display may be adequate for applications that require comparing simulated softnesses, but insufficient for applications that require comparing simulated softness with real samples.
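    As a concrete illustration of the psychometric-function analysis mentioned above, the sketch below (Python rather than the MATLAB used in the project; the data, the cumulative-Gaussian form, and the JND convention are assumptions for illustration) fits a psychometric function to hypothetical softness-discrimination data.

```python
# Illustrative psychometric-function fit (not the thesis code): estimate the
# point of subjective equality (PSE) and just-noticeable difference (JND) from
# hypothetical proportions of "comparison felt softer" responses.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

levels = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])       # comparison softness levels (hypothetical)
p_softer = np.array([0.05, 0.15, 0.40, 0.65, 0.90, 0.97])  # proportion "softer" responses (hypothetical)

def psychometric(x, pse, sigma):
    """Cumulative Gaussian: probability of judging the comparison softer."""
    return norm.cdf(x, loc=pse, scale=sigma)

(pse, sigma), _ = curve_fit(psychometric, levels, p_softer, p0=[0.6, 0.2])
jnd = sigma * norm.ppf(0.75)    # one common JND convention (75% point)
print(f"PSE = {pse:.2f}, JND = {jnd:.2f}")
```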

    Electrotactile feedback applications for hand and arm interactions: A systematic review, meta-analysis, and future directions

    Get PDF
    Haptic feedback is critical in a broad range of human-machine and human-computer interaction applications. However, the high cost and low portability/wearability of haptic devices remain unresolved issues, severely limiting the adoption of this otherwise promising technology. Electrotactile interfaces have the advantage of being more portable and wearable due to their smaller actuators, as well as their lower power consumption and manufacturing cost. Applications of electrotactile feedback have been explored in human-computer interaction and human-machine interaction for facilitating hand-based interactions in areas such as prosthetics, virtual reality, robotic teleoperation, surface haptics, portable devices, and rehabilitation. This paper presents a technological overview of electrotactile feedback, as well as a systematic review and meta-analysis of its applications for hand-based interactions. We discuss the different electrotactile systems according to the type of application, and provide a quantitative aggregation of the findings to offer a high-level overview of the state of the art and suggest future directions. Electrotactile feedback systems showed increased portability/wearability, and they were successful in rendering and/or augmenting most tactile sensations, eliciting perceptual processes, and improving performance in many scenarios. However, knowledge gaps (e.g., embodiment), technical drawbacks (e.g., recurrent calibration, electrode durability), and methodological drawbacks (e.g., sample size) were detected, which should be addressed in future studies.

    Modeling of frictional forces during bare-finger interactions with solid surfaces

    Get PDF
    Touching an object with our fingers yields frictional forces that allow us to perceive and explore its texture, shape, and other features, facilitating grasping and manipulation. While the relevance of dynamic frictional forces to sensory and motor function in the hand is well established, the way that they reflect the shape, features, and composition of touched objects is poorly understood. Haptic displays (electronic interfaces for stimulating the sense of touch) often aim to elicit the perceptual experience of touching real surfaces by delivering forces to the fingers that mimic those felt during real touch. However, the design and applications of such displays have been limited by the lack of knowledge about what forces are felt during real touch interactions. This represents a major gap in current knowledge about tactile function and haptic engineering, and this dissertation addresses several aspects of it. The goal of this research was to measure, characterize, and model the frictional forces produced by a bare finger sliding over surfaces of multiple shapes. The major contributions of this work are (1) the design and development of a sensing system for capturing fingertip motion and forces during tactile exploration of real surfaces; (2) the measurement and characterization of contact forces and the deformation of finger tissues during sliding over relief surfaces; (3) the development of a low-order model of frictional force production based on surface specifications; and (4) the analysis and modeling of contact geometry, interfacial mechanics, and their effects on frictional force production during tactile exploration of relief surfaces. This research aims to guide the design of algorithms for the haptic rendering of surface textures and shape. Such algorithms can be used to enhance human-machine interfaces, such as touch-screen displays, by (1) enabling users to feel surface characteristics also presented visually, (2) facilitating interaction with these devices, and (3) reducing the need for visual input to interact with them. Ph.D., Electrical Engineering, Drexel University, 201
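    To make the idea of a low-order friction model concrete, the following sketch (not the dissertation's model; the adhesion-dominated power-law form and all constants are assumptions chosen for illustration) estimates sliding friction for a bare fingertip from normal force alone.

```python
# Illustrative low-order friction sketch: friction = interfacial shear strength
# times real contact area, with contact area assumed to grow as a Hertz-like
# power of normal force. tau, k, and n are hypothetical values.
import numpy as np

def friction_force(normal_force, tau=5.0e4, k=2.0e-5, n=2/3):
    """Estimate sliding friction (N) of a fingertip for a given normal force (N)."""
    contact_area = k * np.power(normal_force, n)   # m^2, toy scaling with load
    return tau * contact_area                      # N

loads = np.linspace(0.1, 2.0, 5)                   # N, typical exploration forces
print(np.round(friction_force(loads), 2))
```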

    Move or Push? Studying Pseudo-Haptic Perceptions Obtained with Motion or Force Input

    Full text link
    Pseudo-haptic techniques are an interesting alternative for generating haptic perceptions: they manipulate haptic perception through the appropriate alteration of primarily visual feedback in response to body movements. However, the use of pseudo-haptic techniques with a motion-input system can sometimes be limited. This paper investigates a novel approach for extending the potential of pseudo-haptic techniques in virtual reality (VR). The proposed approach utilizes the reaction force from force input as a substitute haptic cue for pseudo-haptic perception. The paper introduces a manipulation method in which the vertical acceleration of the virtual hand is controlled by the extent of push-in of a force sensor. Such force-input manipulation of a virtual body not only allows pseudo-haptics to be presented in less physical space and used by a wider range of users, including people with physical disabilities, but also presents a reaction force proportional to the user's input. We hypothesized that this haptic force cue would contribute to the pseudo-haptic perception. The paper therefore investigates force-input pseudo-haptic perception in comparison with motion-input pseudo-haptics, comparing the two manipulations in terms of the achievable range and resolution of pseudo-haptic weight. The experimental results suggest that the force-input manipulation extends the range of perceptible pseudo-weight by 80% compared with the motion-input manipulation. On the other hand, the motion-input manipulation yields one more distinguishable weight level and is easier to operate than the force-input manipulation.
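    A minimal sketch of the force-input manipulation described above (not the authors' implementation; the gain, frame rate, and force normalization are assumptions for illustration): each frame, the virtual hand's vertical acceleration is set in proportion to how far the force sensor is pushed in, and lowering the gain makes the same motion require more force, which is what supports a heavier pseudo-weight.

```python
# Illustrative force-input control loop: the sensed push-in force drives the
# virtual hand's vertical acceleration. A lower `gain` renders a "heavier"
# virtual object, since more force is needed to produce the same lift.
def update_virtual_hand(height, velocity, sensor_force, dt, gain=0.05):
    """Advance the virtual hand by one frame; returns (height, velocity)."""
    accel = gain * sensor_force          # m/s^2 per newton of push-in (hypothetical gain)
    velocity += accel * dt
    return height + velocity * dt, velocity

# Example: hold ~5 N on the sensor for one second at 60 frames per second
h, v = 0.0, 0.0
for _ in range(60):
    h, v = update_virtual_hand(h, v, sensor_force=5.0, dt=1 / 60)
print(round(h, 3), round(v, 3))          # the hand rises while force is applied
```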

    Human haptic perception in virtual environments: An investigation of the interrelationship between physical stiffness and perceived roughness.

    Get PDF
    Research into haptics and how we perceive the sensations that come from haptic interaction started almost a century ago, yet there is little fundamental knowledge as to how and whether a change in the physical values of one characteristic can alter the perception of another. The increasing availability of haptic interaction through the development of force-feedback devices opens new possibilities, allowing accurate real-time changes of the physical attributes of virtual objects in order to test how the haptic perception of a human user changes. An experiment was carried out to ascertain whether a change in stiffness would have a noticeable effect on the perceived roughness of a virtual object. Participants were presented with a textured surface and were asked to estimate how rough it felt compared with a standard. What the participants did not know was that the simulated texture on both surfaces remained constant, and the only physical attribute changing in every trial was the comparison object's surface stiffness. The results showed that there is a strong relationship between physical stiffness and perceived roughness that can be accurately described by a power function: the magnitude estimates of roughness increased with increasing stiffness. The conclusion is that changes in the physical stiffness of a virtual object can change how rough it is perceived to be in a clear and predictable way. Extending this study could lead to an investigation of how other physical attributes affect one or more perceived haptic dimensions, and the resulting insights could be used to construct something like a haptic palette for the haptic display designer, where altering one physical attribute changes a whole array of perceived haptic dimensions in a clear and predictable way.
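    The power-function relationship reported above can be illustrated with a short fit. The sketch below (hypothetical data, not the study's measurements) estimates the exponent and scale of R = k * S^n by linear regression in log-log coordinates.

```python
# Fit perceived roughness R = k * S**n to hypothetical stiffness data
# (illustrative only; the study's actual stimuli and estimates differ).
import numpy as np

stiffness = np.array([0.2, 0.4, 0.6, 0.8, 1.0])   # N/mm, hypothetical comparison values
roughness = np.array([3.1, 4.8, 6.0, 7.2, 8.1])   # mean magnitude estimates, hypothetical

n, log_k = np.polyfit(np.log(stiffness), np.log(roughness), 1)   # slope = exponent n
k = np.exp(log_k)
print(f"perceived roughness ~ {k:.2f} * stiffness**{n:.2f}")
```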

    The interaction between motion and texture in the sense of touch

    Get PDF
    Besides providing information on elementary properties of objects, such as texture, roughness, and softness, the sense of touch is also important in building a representation of object movement and of the movement of our hands. Neural and behavioral studies shed light on the mechanisms and limits of our sense of touch in the perception of texture and motion, and on its role in the control of hand movement. This article discusses the interplay between the geometrical and mechanical properties of touched objects, such as shape and texture, the movement of the hand exploring the object, and the motion felt by touch. Interestingly, the interaction between motion and texture can generate perceptual illusions in touch. For example, the orientation and spacing of the texture elements on a static surface can induce the illusion of surface motion when we move our hand over it, or can elicit the perception of a curved trajectory during sliding, straight hand movements. In this work we present a multiperspective view that encompasses both the perceptual and the motor aspects, as well as the responses of peripheral and central nerve structures, to analyze and better understand the complex mechanisms underpinning the tactile representation of texture and motion. A better understanding of the spatiotemporal features of the tactile stimulus can reveal novel transdisciplinary applications in neuroscience and haptics.