
    Spatial-Temporal Characteristics of Multisensory Integration

    We experience spatial separation and temporal asynchrony between visual and haptic information in many virtual-reality, augmented-reality, and teleoperation systems. Three studies were conducted to examine the spatial and temporal characteristics of multisensory integration. Participants interacted with virtual springs using both visual and haptic senses, and their perception of stiffness and their ability to differentiate stiffness were measured. The results revealed that a constant visual delay increased perceived stiffness, while a variable visual delay made participants rely more on haptic sensations when judging stiffness. We also found that participants judged springs to be stiffer when they interacted with them at faster speeds, and that interaction speed was positively correlated with stiffness overestimation. In addition, participants could learn an association between visual and haptic inputs even when the two were spatially separated, which improved typing performance. These results expose limitations of the maximum-likelihood estimation (MLE) model and suggest that a Bayesian inference model should be used instead.
    Doctoral Dissertation, Human Systems Engineering, 201
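
    The abstract contrasts the maximum-likelihood estimation (MLE) model with a Bayesian inference model. As background, the sketch below shows the standard MLE (inverse-variance-weighted) cue-combination rule that the thesis tests; the function name and the example numbers are illustrative and not taken from the dissertation.

        # Minimal sketch of maximum-likelihood (reliability-weighted) visual-haptic
        # cue combination; all values are hypothetical.
        def mle_combine(est_visual, var_visual, est_haptic, var_haptic):
            """Combine visual and haptic stiffness estimates by inverse-variance weighting."""
            w_visual = (1.0 / var_visual) / (1.0 / var_visual + 1.0 / var_haptic)
            w_haptic = 1.0 - w_visual
            combined_est = w_visual * est_visual + w_haptic * est_haptic
            combined_var = (var_visual * var_haptic) / (var_visual + var_haptic)
            return combined_est, combined_var

        # A noisier (e.g., delayed) visual estimate shifts the weight toward haptics.
        print(mle_combine(est_visual=120.0, var_visual=400.0,   # stiffness in N/m
                          est_haptic=100.0, var_haptic=100.0))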

    Do synaesthesia and mental imagery tap into similar cross-modal processes?

    Synaesthesia has previously been linked with imagery abilities, although an understanding of a causal role for mental imagery in broader synaesthetic experiences remains elusive. This can be partly attributed to our relatively poor understanding of imagery in sensory domains beyond vision. Investigations into the neural and behavioural underpinnings of mental imagery have nevertheless identified an important role for imagery in perception, particularly in mediating cross-modal interactions. However, the phenomenology of synaesthesia gives rise to the assumption that associated cross-modal interactions may be encapsulated and specific to synaesthesia. As such, evidence for a link between imagery and perception may not generalize to synaesthesia. Here, we present results that challenge this idea: first, we found enhanced somatosensory imagery evoked by visual stimuli of body parts in mirror-touch synaesthetes, relative to other synaesthetes or controls. Moreover, this enhanced imagery generalized to tactile object properties not directly linked to their synaesthetic associations. Second, we report evidence that the concurrent experience evoked in grapheme-colour synaesthesia was sufficient to trigger visual-to-tactile correspondences that are common to all. Together, these findings show that enhanced mental imagery is a consistent hallmark of synaesthesia, and suggest the intriguing possibility that imagery may facilitate the cross-modal interactions that underpin synaesthetic experiences. This article is part of a discussion meeting issue 'Bridging senses: novel insights from synaesthesia'.

    The interaction between motion and texture in the sense of touch

    Besides providing information on elementary properties of objects, like texture, roughness, and softness, the sense of touch is also important in building a representation of object movement and the movement of our hands. Neural and behavioral studies shed light on the mechanisms and limits of our sense of touch in the perception of texture and motion, and on its role in the control of hand movement. The interplay between the geometrical and mechanical properties of the touched objects, such as shape and texture, the movement of the hand exploring the object, and the motion felt by touch will be discussed in this article. Interestingly, the interaction between motion and texture can generate perceptual illusions in touch. For example, the orientation and spacing of the texture elements on a static surface induce the illusion of surface motion when we move our hand over it, or can elicit the perception of a curved trajectory during sliding, straight hand movements. In this work, we present a multiperspective view that encompasses both the perceptual and the motor aspects, as well as the response of peripheral and central nerve structures, to analyze and better understand the complex mechanisms underpinning the tactile representation of texture and motion. A better understanding of the spatiotemporal features of the tactile stimulus can reveal novel transdisciplinary applications in neuroscience and haptics.

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They are organized into topical sections as follows: haptic science; haptic technology; and haptic applications.

    Electrotactile feedback applications for hand and arm interactions: A systematic review, meta-analysis, and future directions

    Haptic feedback is critical in a broad range of human-machine/computer-interaction applications. However, the high cost and low portability/wearability of haptic devices remain unresolved issues, severely limiting the adoption of this otherwise promising technology. Electrotactile interfaces have the advantage of being more portable and wearable due to the reduced size of their actuators, as well as their lower power consumption and manufacturing cost. The applications of electrotactile feedback have been explored in human-computer interaction and human-machine interaction for facilitating hand-based interactions in areas such as prosthetics, virtual reality, robotic teleoperation, surface haptics, portable devices, and rehabilitation. This paper presents a technological overview of electrotactile feedback, as well as a systematic review and meta-analysis of its applications for hand-based interactions. We discuss the different electrotactile systems according to the type of application. We also provide a quantitative aggregation of the findings to offer a high-level overview of the state of the art and to suggest future directions. Electrotactile feedback systems showed increased portability/wearability, and they were successful in rendering and/or augmenting most tactile sensations, eliciting perceptual processes, and improving performance in many scenarios. However, knowledge gaps (e.g., embodiment) as well as technical (e.g., recurrent calibration, electrode durability) and methodological (e.g., sample size) drawbacks were detected, which should be addressed in future studies.
    Comment: 18 pages, 1 table, 8 figures, under review in Transactions on Haptics. This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible. Upon acceptance of the article by the IEEE, the preprint will be replaced with the accepted version.
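
    The review aggregates findings quantitatively across studies. Purely as an illustration of how such meta-analytic pooling can be computed, the sketch below implements a generic DerSimonian-Laird random-effects model; it is not the authors' actual analysis, and the effect sizes and variances are made up.

        # Generic random-effects pooling of per-study effect sizes (DerSimonian-Laird).
        import math

        def random_effects_pool(effects, variances):
            """Pool effect sizes with inverse-variance weights and a DL estimate of tau^2."""
            k = len(effects)
            w = [1.0 / v for v in variances]                        # fixed-effect weights
            fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
            q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
            c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
            tau2 = max(0.0, (q - (k - 1)) / c) if c > 0 else 0.0    # between-study variance
            w_star = [1.0 / (v + tau2) for v in variances]          # random-effects weights
            pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
            se = math.sqrt(1.0 / sum(w_star))
            return pooled, se, tau2

        # Hypothetical standardized mean differences from three studies:
        print(random_effects_pool([0.4, 0.7, 0.2], [0.04, 0.09, 0.05]))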

    MetaReality: enhancing tactile experiences using actuated 3D-printed metamaterials in Virtual Reality

    During interaction with objects in Virtual Reality, haptic feedback plays a crucial role in creating convincing immersive experiences. Recent work building upon passive haptic feedback has looked towards fabrication processes for designing and creating proxy objects able to communicate objects' properties and characteristics. However, such approaches remain limited in terms of scalability, as a corresponding object needs to be fabricated for each material. To create more flexible 3D-printed proxies, we explore the potential of metamaterials. To this aim, we designed metamaterial structures able to alter their tactile surface properties, e.g., their hardness and roughness, upon lateral compression. In this work, we designed five different metamaterial patterns based on features that are known to affect tactile properties. We evaluated whether our samples were able to successfully convey different levels of roughness and hardness sensations at varying levels of compression. While we found that roughness was significantly affected by compression state, hardness did not seem to follow the same pattern. In a second study, we focused on two metamaterial patterns showing promise for roughness perception and investigated their visuo-haptic perception in Virtual Reality. Here, eight different compression states of our two selected metamaterials were overlaid with six visual material textures. Our results suggest that, especially at low compression states, our metamaterials were the most promising for matching the textures displayed to the participants. Additionally, when participants were asked which material they perceived, adjectives such as "broken" and "damaged" were used. This indicates that metamaterial surface textures could be able to simulate different object states. Our results underline that metamaterial design is able to extend the gamut of tactile experiences of 3D-printed surface structures, as a single sample is able to reconfigure its haptic sensation through compression.

    Advancing proxy-based haptic feedback in virtual reality

    This thesis advances haptic feedback for Virtual Reality (VR). Our work is guided by Sutherland's 1965 vision of the ultimate display, which calls for VR systems to control the existence of matter. To push towards this vision, we build upon proxy-based haptic feedback, a technique characterized by the use of passive tangible props. The goal of this thesis is to tackle the central drawback of this approach, namely its inflexibility, which still prevents it from fulfilling the vision of the ultimate display. Guided by four research questions, we first showcase the applicability of proxy-based VR haptics by employing the technique for data exploration. We then extend the VR system's control over users' haptic impressions in three steps. First, we contribute the class of Dynamic Passive Haptic Feedback (DPHF) alongside two novel concepts for conveying kinesthetic properties, like virtual weight and shape, through weight-shifting and drag-changing proxies. Conceptually orthogonal to this, we study how visual-haptic illusions can be leveraged to unnoticeably redirect the user's hand when reaching towards props. Here, we contribute a novel perception-inspired algorithm for Body Warping-based Hand Redirection (HR), an open-source framework for HR, and psychophysical insights. The thesis concludes by demonstrating that the combination of DPHF and HR can outperform the individual techniques in terms of the achievable flexibility of proxy-based haptic feedback.
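
    Body Warping-based Hand Redirection offsets the rendered hand during a reach so that the virtual hand arrives at the virtual target while the real hand arrives at the physical proxy. The sketch below is a simplified, generic formulation of this idea for illustration only; it is not the perception-inspired algorithm contributed in the thesis, and all names and coordinates are hypothetical.

        # Generic body-warping hand redirection: the virtual hand is offset in
        # proportion to reach progress toward the physical proxy.
        import numpy as np

        def warp_hand(real_hand, reach_start, physical_target, virtual_target):
            """Return the virtual hand position for the current real hand position."""
            total = np.linalg.norm(physical_target - reach_start)
            remaining = np.linalg.norm(physical_target - real_hand)
            # Warping weight ramps from 0 (start of reach) to 1 (proxy reached).
            w = 1.0 if total == 0 else np.clip(1.0 - remaining / total, 0.0, 1.0)
            return real_hand + w * (virtual_target - physical_target)

        # Example: physical proxy at (0.3, 0, 0.4) m, virtual target 5 cm to its right.
        print(warp_hand(np.array([0.15, 0.0, 0.2]), np.array([0.0, 0.0, 0.0]),
                        np.array([0.3, 0.0, 0.4]), np.array([0.35, 0.0, 0.4])))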

    Haptic perception in virtual reality in sighted and blind individuals

    The incorporation of the sense of touch into virtual reality is an exciting development. However, research into this topic is in its infancy. This experimental programme investigated both the perception of virtual object attributes by touch and the parameters that influence touch perception in virtual reality with a force feedback device called the PHANTOM (TM) (www.sensable.com). The thesis had three main foci. Firstly, it aimed to provide an experimental account of the perception of the attributes of roughness, size and angular extent by touch via the PHANTOM (TM) device. Secondly, it aimed to contribute to the resolution of a number of other issues important in developing an understanding of the parameters that exert an influence on touch in virtual reality. Finally, it aimed to compare touch in virtual reality between sighted and blind individuals. This thesis comprises six experiments. Experiment one examined the perception of the roughness of virtual textures with the PHANTOM (TM) device. The effect of the following factors was addressed: the groove width of the textured stimuli; the endpoint used (stylus or thimble) with the PHANTOM (TM); the specific device used (PHANTOM (TM) vs. IE3000); and the visual status (sighted or blind) of the participants. Experiment two extended the findings of experiment one by addressing the impact of an exploration-related factor on perceived roughness, that of the contact force an individual applies to a virtual texture. The interaction between this variable and the factors of groove width, endpoint, and visual status was also addressed. Experiment three examined the perception of the size and angular extent of virtual 3-D objects via the PHANTOM (TM). With respect to the perception of virtual object size, the effect of the following factors was addressed: the size of the object (2.7, 3.6, and 4.5 cm); the type of virtual object (cube vs. sphere); the mode in which the virtual objects were presented; the endpoint used with the PHANTOM (TM); and the visual status of the participants. With respect to the perception of virtual object angular extent, the effect of the following factors was addressed: the angular extent of the object (18, 41, and 64°); the endpoint used with the PHANTOM (TM); and the visual status of the participants. Experiment four examined the perception of the size and angular extent of real counterparts to the virtual 3-D objects used in experiment three. Experiment four manipulated the conditions under which participants examined the real objects. Participants were asked to give judgements of object size and angular extent via the deactivated PHANTOM (TM), a stylus probe, a bare index finger and without any constraints on their exploration. In addition to the above exploration type factor, experiment four examined the impact of the same factors on perceived size and angular extent in the real world as had been examined in virtual reality. Experiments five and six examined the consistency of the perception of linear extent across the 3-D axes in virtual space. Both experiments manipulated the following factors: line extent (2.7, 3.6, and 4.5 cm); line dimension (x, y and z axis); movement type (active vs. passive movement); and visual status. Experiment six additionally manipulated the direction of movement within the 3-D axes. Perceived roughness was assessed by the method of magnitude estimation. The perceived size and angular extent of the various virtual stimuli and their real counterparts was assessed by the method of magnitude reproduction. This technique was also used to assess perceived extent across the 3-D axes. Touch perception via the PHANTOM (TM) was found to be broadly similar for sighted and blind participants. Touch perception in virtual reality was also found to be broadly similar between two different 3-D force feedback devices (the PHANTOM (TM) and the IE3000). However, the endpoint used with the PHANTOM (TM) device was found to exert significant, but inconsistent effects on the perception of virtual object attributes. Touch perception with the PHANTOM (TM) across the 3-D axes was found to be anisotropic in a similar way to the real world, with the illusion that radial extents were perceived as longer than equivalent tangential extents. The perception of 3-D object size and angular extent was found to be comparable between virtual reality and the real world, particularly under conditions where the participants' exploration of the real objects was constrained to a single point of contact. An intriguing touch illusion, whereby virtual objects explored from the inside were perceived to be larger than the same objects perceived from the outside, was found to occur widely in virtual reality, in addition to the real world. This thesis contributes to knowledge of touch perception in virtual reality. The findings have interesting implications for theories of touch perception, both virtual and real.
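
    Magnitude-estimation data such as the roughness judgements described above are conventionally summarized with Stevens' power law, psi = k * phi^n, fitted as a linear regression in log-log coordinates. The sketch below shows this standard fit with made-up groove widths and estimates; it is not the specific analysis reported in the thesis.

        # Fit Stevens' power law psi = k * phi**n to magnitude-estimation data.
        import numpy as np

        def fit_power_law(stimulus, magnitude_estimates):
            """Return (k, n) via least squares on log-transformed data."""
            log_phi = np.log(stimulus)
            log_psi = np.log(magnitude_estimates)
            n, log_k = np.polyfit(log_phi, log_psi, 1)   # slope = exponent n
            return np.exp(log_k), n

        # Hypothetical groove widths (mm) and mean roughness estimates:
        groove_width = np.array([0.25, 0.5, 1.0, 2.0])
        roughness = np.array([12.0, 21.0, 35.0, 60.0])
        print(fit_power_law(groove_width, roughness))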

    Multisensory Perception and Learning: Linking Pedagogy, Psychophysics, and Human–Computer Interaction

    In this review, we discuss how specific sensory channels can mediate the learning of properties of the environment. In recent years, schools have increasingly been using multisensory technology for teaching. However, this use of technology is not yet sufficiently grounded in neuroscientific and pedagogical evidence. Recent research has renewed our understanding of the role of communication between sensory modalities during development. In the current review, we outline four principles that will aid technological development based on theoretical models of multisensory development and embodiment to foster in-depth, perceptual, and conceptual learning of mathematics. We also discuss how a multidisciplinary approach offers a unique contribution to the development of new practical solutions for learning in school. Scientists, engineers, and pedagogical experts offer their interdisciplinary points of view on this topic. At the end of the review, we present our results, showing that multiple sensory inputs and sensorimotor associations in multisensory technology can be used to improve the discrimination of angles, and possibly serve broader educational purposes. Finally, we present an application, 'RobotAngle', developed for primary (i.e., elementary) school children, which uses sounds and body movements to teach angles.