307 research outputs found

    Perceiving Mass in Mixed Reality through Pseudo-Haptic Rendering of Newton's Third Law

    In mixed reality, real objects can be used to interact with virtual objects. However, unlike in the real world, real objects encounter no opposite reaction force when pushing against virtual objects. The lack of reaction force during manipulation prevents users from perceiving the mass of virtual objects. Although this could be addressed by equipping real objects with force-feedback devices, such a solution remains complex and impractical. In this work, we present a technique that produces an illusion of mass without any active force-feedback mechanism. This is achieved by simulating the effects of the reaction force in a purely visual way. A first study demonstrates that our technique indeed allows users to differentiate light virtual objects from heavy virtual objects. In addition, it shows that the illusion is immediately effective, with no prior training. In a second study, we measure the lowest mass difference, i.e. the just-noticeable difference (JND), that can be perceived with this technique. The effectiveness and ease of implementation of our solution provide an opportunity to enhance mixed reality interaction at no additional cost.
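    The mechanism described above amounts to letting an assigned virtual mass attenuate the visually rendered motion of the pushed virtual object. A minimal sketch of that idea follows; the update rule, function names, and gains are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): a pseudo-haptic mass
# illusion obtained by scaling the visual motion of a pushed virtual object
# inversely with its assigned virtual mass. All names and gains are assumptions.

def update_virtual_object(position, velocity, push_force, virtual_mass, dt, damping=2.0):
    """Advance the virtual object one frame.

    push_force   -- force estimated from the tracked real object's motion (N)
    virtual_mass -- the mass we want the user to *perceive* (kg)
    """
    # Newton's second law: a heavier virtual object accelerates less under
    # the same push, which is what visually conveys its mass.
    acceleration = (push_force - damping * velocity) / virtual_mass
    velocity += acceleration * dt
    position += velocity * dt
    return position, velocity

# Example: the same push applied to a "light" and a "heavy" virtual cube.
pos_light, vel_light = 0.0, 0.0
pos_heavy, vel_heavy = 0.0, 0.0
for _ in range(90):  # ~1.5 s at 60 fps
    pos_light, vel_light = update_virtual_object(pos_light, vel_light, 1.0, 0.5, 1 / 60)
    pos_heavy, vel_heavy = update_virtual_object(pos_heavy, vel_heavy, 1.0, 5.0, 1 / 60)
print(pos_light, pos_heavy)  # the heavy cube has moved visibly less
```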

    Modulating the Perceived Softness of Real Objects Through Wearable Feel-Through Haptics

    In vision, Augmented Reality (AR) allows the superposition of digital content on real-world visual information, relying on the well-established See-through paradigm. In the haptic domain, a putative Feel-through wearable device should make it possible to modify the tactile sensation without masking the actual cutaneous perception of physical objects. To the best of our knowledge, such a technology is still far from being effectively implemented. In this work, we present an approach that, for the first time, modulates the perceived softness of real objects using a Feel-through wearable with a thin fabric as the interaction surface. During interaction with real objects, the device can modulate the growth of the contact area over the fingerpad without affecting the force experienced by the user, thus modulating the perceived softness. To this aim, the lifting mechanism of our system warps the fabric around the fingerpad to a degree proportional to the force exerted on the specimen under exploration. At the same time, the stretching state of the fabric is controlled to keep a loose contact with the fingerpad. We demonstrated that different softness perceptions can be elicited for the same specimen by suitably controlling the lifting mechanism of the system.
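    The abstract describes the lifting mechanism as warping the fabric in proportion to the force the finger exerts. A minimal sketch of such a proportional control law is given below; the function name, gains, and travel limit are assumptions, not the authors' firmware.

```python
# Minimal sketch of the kind of control law described in the abstract
# (an assumption, not the authors' implementation): the fabric lift is
# commanded proportionally to the force the finger exerts on the real
# specimen, and the proportional gain sets how "soft" the object feels.

def feel_through_lift(measured_force_N, softness_gain_mm_per_N, max_lift_mm=6.0):
    """Return the lift (mm) to command to the fabric-warping mechanism."""
    lift = softness_gain_mm_per_N * measured_force_N
    return min(lift, max_lift_mm)  # saturate at the mechanism's travel limit

# Same specimen, two perceived softness levels: a larger gain grows the
# fingerpad contact area faster for the same force, i.e. it feels softer.
for force in (0.5, 1.0, 2.0):
    print(force, feel_through_lift(force, 1.5), feel_through_lift(force, 4.0))
```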

    W-FYD: a Wearable Fabric-based Display for Haptic Multi-Cue Delivery and Tactile Augmented Reality

    Despite the importance of softness, there is no evidence of wearable haptic systems able to deliver controllable softness cues. Here, we present the Wearable Fabric Yielding Display (W-FYD), a fabric-based display for multi-cue delivery that can be worn on the user's finger and enables, for the first time, both active and passive softness exploration. It can also induce a sliding effect under the finger-pad. A given stiffness profile can be obtained by modulating the stretching state of the fabric through two motors. Furthermore, a lifting mechanism allows the fabric to be placed in contact with the user's finger-pad, enabling passive softness rendering. In this paper, we describe the architecture of the W-FYD and provide a thorough characterization of its stiffness workspace, frequency response, and softness rendering capabilities. We also computed the device's Just Noticeable Difference (JND) in both active and passive exploratory conditions, for linear and non-linear stiffness rendering as well as for sliding direction perception. The effect of device weight was also considered. Furthermore, the performance of participants and their subjective quantitative evaluation in sliding direction detection and softness discrimination tasks are reported. Finally, applications of the W-FYD in tactile augmented reality for open palpation are discussed, opening interesting perspectives in many fields of human-machine interaction.
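    Rendering a stiffness profile by modulating the fabric's stretching state implies a calibration from motor position to fabric stiffness. The sketch below illustrates this with hypothetical calibration values and a simple interpolation; none of the numbers come from the paper.

```python
# Minimal sketch (hypothetical calibration values): mapping a desired rendered
# stiffness to the stretching state of the fabric, expressed here as a
# symmetric angle command for the two stretching motors.

import numpy as np

# Assumed calibration: fabric stiffness (N/mm) measured at several motor angles (deg).
MOTOR_ANGLE_DEG = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
FABRIC_STIFFNESS = np.array([0.2, 0.5, 1.1, 2.0, 3.2])

def motor_command_for_stiffness(target_stiffness):
    """Interpolate the motor angle that realizes the requested stiffness."""
    target = np.clip(target_stiffness, FABRIC_STIFFNESS[0], FABRIC_STIFFNESS[-1])
    return float(np.interp(target, FABRIC_STIFFNESS, MOTOR_ANGLE_DEG))

# Render a non-linear (softening) stiffness profile as a function of indentation.
for indentation_mm in (0.5, 1.0, 2.0):
    k = 1.8 * np.exp(-0.4 * indentation_mm)   # example profile, not from the paper
    print(indentation_mm, round(motor_command_for_stiffness(k), 1))
```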

    Tactile Echoes: Multisensory Augmented Reality for the Hand


    Haptic Simulation of Breast Cancer Palpation: A Case Study of Haptic Augmented Reality

    Haptic augmented reality (AR) makes it possible to modulate the haptic properties of a real object by providing virtual haptic feedback. We previously developed a haptic AR system wherein the stiffness of a real object can be augmented with the aid of a haptic interface. To demonstrate its potential, this paper presents a case study for medical training of breast cancer palpation. A real breast model made of soft silicone is augmented with a virtual tumor rendered inside it. Haptic stimuli for the virtual tumor are generated based on a contact dynamics model identified from real measurements, without the need for geometric information about the breast. A subjective evaluation confirmed the realism and fidelity of our palpation system.
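    The core idea is to superimpose a virtual tumor force, derived from an identified contact dynamics model, onto the passive response of the real silicone model. The sketch below uses an assumed nonlinear spring-damper form purely for illustration; the paper identifies its model from real measurements rather than prescribing one.

```python
# Minimal sketch of the stiffness-augmentation idea. The contact-dynamics model
# below (a Hunt-Crossley-style nonlinear spring-damper) and all parameters are
# assumptions for illustration, not the model identified in the paper.

def virtual_tumor_force(probe_depth_mm, probe_velocity_mm_s,
                        tumor_depth_mm=10.0, k=0.8, n=1.5, b=0.02):
    """Extra force (N) the haptic interface adds when the palpating finger
    presses deep enough to 'reach' the virtual tumor rendered inside the
    silicone breast model. Above that depth only the real silicone is felt."""
    penetration = probe_depth_mm - tumor_depth_mm
    if penetration <= 0.0:
        return 0.0
    # Nonlinear elasticity plus penetration-dependent damping.
    return k * penetration ** n + b * penetration ** n * probe_velocity_mm_s

for depth in (5.0, 11.0, 14.0):
    print(depth, round(virtual_tumor_force(depth, 3.0), 3))
```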

    Toward "Pseudo-Haptic Avatars": Modifying the Visual Animation of Self-Avatar Can Simulate the Perception of Weight Lifting

    In this paper we study how the visual animation of a self-avatar can be artificially modified in real time in order to generate different haptic perceptions. In our experimental setup, participants could watch their self-avatar in a virtual environment in mirror mode while performing a weight lifting task. Users could map their gestures onto the self-animated avatar in real time using a Kinect. We introduce three kinds of modification of the visual animation of the self-avatar according to the effort delivered by the virtual avatar: 1) changes in the spatial mapping between the user's gestures and the avatar, 2) different motion profiles of the animation, and 3) changes in the posture of the avatar (upper-body inclination). The experimental task consisted of a weight lifting task in which participants had to order four virtual dumbbells according to their virtual weight. The user had to lift each virtual dumbbell by means of a tangible stick, and the animation of the avatar was modulated according to the virtual weight of the dumbbell. The results showed that altering the spatial mapping delivered the best performance. Nevertheless, participants globally appreciated all the different visual effects. Our results pave the way for the exploitation of such novel techniques in various VR applications such as sport training, exercise games, or industrial training scenarios in single or collaborative mode.
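    The first visual modification, altering the spatial mapping, can be read as a weight-dependent control-display gain between the tracked hand and the avatar's hand. The sketch below illustrates that reading with assumed gain values; it is not the authors' implementation.

```python
# Minimal sketch (assumed gains, not the paper's) of a weight-dependent spatial
# mapping between the user's tracked hand and the self-avatar's hand: heavier
# virtual dumbbells make the avatar lift visibly less for the same gesture.

def avatar_hand_height(user_hand_height_m, rest_height_m, virtual_weight_kg,
                       max_weight_kg=10.0, min_gain=0.4):
    """Map the tracked hand height onto the avatar with a control-display gain
    that shrinks as the virtual dumbbell gets heavier."""
    gain = 1.0 - (1.0 - min_gain) * min(virtual_weight_kg / max_weight_kg, 1.0)
    return rest_height_m + gain * (user_hand_height_m - rest_height_m)

# The user raises the tangible stick by 0.3 m; the avatar's lift depends on weight.
for weight in (1.0, 5.0, 10.0):
    print(weight, round(avatar_hand_height(1.3, 1.0, weight), 2))
```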

    Creative Haptic Interface Design for the Aging Population

    Audiovisual human-computer interfaces still make up the majority of content delivered to the public; however, haptic interfaces offer a unique advantage over the dominant information infrastructure, particularly for users with a disability or with diminishing cognitive and physical skills, such as the elderly. The tactile sense allows users to integrate new, unobtrusive channels for digital information into their sensorium, one that is less likely to be overwhelmed compared to vision and audition. Haptics research focuses on the development of hardware, improving the resolution, modality, and fidelity of the actuators. Despite the technological limitations, haptic interfaces have been shown to reinforce physical skill acquisition, therapy, and communication. This chapter presents key characteristics that intuitive tactile interfaces should capture for elderly end-users; sample projects showcase unique applications and designs that identify the limitations of the UI.

    Novel Bidirectional Body-Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthetic user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of its abandonment through the continuous use of the device. To achieve such a result, different aspects must be considered for making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving amputees' quality of life using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has been recently introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and a prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis work, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree of Freedom (DoF) system and to fit all users' needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce a more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system to implement haptic feedback to restore proprioception and create a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore a tactile sensation able to connect the user with the external world. This closed-loop control between EMG and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface that strongly impacts amputees' daily life. For each of these three activities: (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study where data from healthy subjects and amputees were collected, in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1Score parameter (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. Among the several tested methods for Pattern Recognition, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1Score (99%, robustness), while the minimum number of electrodes needed for its functioning was determined to be four in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency). Finally, the online implementation allowed subjects to simultaneously control the Hannes prosthesis DoFs in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). Such results demonstrated an improvement in the controllability of the system, with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the robustness and efficiency of prosthetic control can be improved thanks to the implemented closed-loop approach. The bidirectional communication between the user and the prosthesis is capable of restoring the lost sensory functionality, with promising implications for direct translation into clinical practice.
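    The control pipeline described above (EMG windows, feature extraction, pattern recognition, F1Score evaluation) can be sketched as follows. The example uses synthetic four-channel EMG and an sklearn multilayer perceptron as a stand-in for the thesis' Non-Linear Logistic Regression; all data, names, and parameters are assumptions for illustration.

```python
# Minimal, self-contained sketch of an EMG pattern-recognition pipeline:
# windowed EMG -> time-domain features -> nonlinear classifier -> F1-score.
# Synthetic data; 4 channels, matching the minimum electrode count found.

import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
N_CHANNELS, WIN, N_CLASSES, N_WINDOWS = 4, 90, 3, 300  # 90 samples ~ 300 ms at 300 Hz

def mav(window):
    """Mean absolute value per channel: a classic time-domain EMG feature."""
    return np.mean(np.abs(window), axis=1)

# Synthetic EMG: each gesture class activates the channels with a different pattern.
class_gains = rng.uniform(0.2, 1.0, size=(N_CLASSES, N_CHANNELS))
X, y = [], []
for _ in range(N_WINDOWS):
    label = rng.integers(N_CLASSES)
    emg = class_gains[label][:, None] * rng.standard_normal((N_CHANNELS, WIN))
    X.append(mav(emg))
    y.append(label)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("F1-score:", round(f1_score(y_test, clf.predict(X_test), average="macro"), 3))
```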

    Tactile Echoes: A Wearable System for Tactile Augmentation of Objects
