
    Drifting perceptual patterns suggest prediction errors fusion rather than hypothesis selection: replicating the rubber-hand illusion on a robot

    Humans can experience fake body parts as theirs just by simple visuo-tactile synchronous stimulation. This body illusion is accompanied by a drift in the perception of the real limb towards the fake limb, suggesting an update of body estimation resulting from stimulation. This work compares body limb drifting patterns of human participants, in a rubber hand illusion experiment, with the end-effector estimation displacement of a multisensory robotic arm enabled with predictive processing perception. Results show similar drifting patterns in both human and robot experiments, and they also suggest that the perceptual drift is due to prediction error fusion, rather than hypothesis selection. We present body inference through prediction error minimization as a single process that unites predictive coding and causal inference and that is responsible for the effects on perception when we are subjected to intermodal sensory perturbations. Comment: Proceedings of the 2018 IEEE International Conference on Development and Learning and Epigenetic Robotics
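
    A minimal sketch of the kind of prediction-error fusion described above, assuming a free-energy-style update in which a latent hand-position estimate is corrected by precision-weighted proprioceptive and visual errors rather than by selecting one winning hypothesis; the 1-D setup, all names, and all values are illustrative assumptions, not the authors' implementation.

        def fuse_prediction_errors(mu, s_prop, s_vis, g_vis, dg_vis,
                                   pi_prop=1.0, pi_vis=1.0, lr=0.1):
            """One gradient step on a latent hand-position estimate mu.

            s_prop: proprioceptive observation (assumed to predict mu directly)
            s_vis:  visual observation of the (fake) hand
            g_vis:  generative mapping from mu to visual space
            dg_vis: derivative of g_vis with respect to mu
            pi_*:   precisions (inverse variances) of the two modalities
            """
            e_prop = s_prop - mu           # proprioceptive prediction error
            e_vis = s_vis - g_vis(mu)      # visual prediction error
            # Both errors are fused into one update, so the estimate drifts
            # toward the visual (fake-hand) evidence instead of a single
            # hypothesis being selected outright.
            dmu = pi_prop * e_prop + pi_vis * e_vis * dg_vis(mu)
            return mu + lr * dmu

        # Illustrative run: identity visual mapping, fake hand seen 5 cm away.
        mu = 0.0
        for _ in range(50):
            mu = fuse_prediction_errors(mu, s_prop=0.0, s_vis=5.0,
                                        g_vis=lambda m: m, dg_vis=lambda m: 1.0)
        print(round(mu, 2))  # settles between 0 and 5: a proprioceptive drift

    With equal precisions the estimate settles midway between proprioception and vision; raising pi_vis pushes it further toward the fake hand, which is the qualitative drift pattern described above.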

    Towards a Framework for Embodying Any-Body through Sensory Translation and Proprioceptive Remapping: A Pilot Study

    We address the problem of physical avatar embodiment and investigate the most general factors that may allow a person to “wear” another body, different from her own. A general approach is required to exploit the fact that an avatar can have any kind of body. With this pilot study we introduce a conceptual framework for the design of non-anthropomorphic embodiment, to foster immersion and user engagement. The person is interfaced with the avatar, a robot, through a system that induces a divergent internal sensorimotor mapping while controlling the avatar, to create an immersive experience. Together with the conceptual framework, we present two implementations: a prototype tested in the lab and an interactive installation exhibited to the general public. These implementations consist of a wheeled robot, and control and sensory feedback systems. The control system includes mechanisms that both detect and resist the user’s movement, increasing the sense of connection with the avatar; the feedback system is a virtual reality (VR) environment representing the avatar’s unique perception, combining sensor and control information to generate visual cues. Data gathered from users indicate that the systems implemented following the proposed framework create a challenging and engaging experience, thus providing solid ground for further developments
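
    Purely as an illustration of the "detect and resist" coupling mentioned above (the abstract gives no algorithm, and every name and constant here is an assumption): a hypothetical velocity loop in which the wheeled avatar follows the user's command while a resistance term damps it, and the residual effort is exposed to the VR feedback layer as a visual cue.

        def control_step(user_velocity, robot_velocity,
                         gain=1.0, resistance=0.6, dt=0.1):
            """One tick of a hypothetical 'detect and resist' velocity loop."""
            # Detect the user's movement and drive the wheeled avatar toward it,
            # while the resistance term damps the response so the user has to
            # push through the avatar.
            command = gain * (user_velocity - robot_velocity) - resistance * robot_velocity
            new_robot_velocity = robot_velocity + dt * command
            # Residual effort, which the VR feedback could render as a visual cue.
            effort_cue = abs(user_velocity - new_robot_velocity)
            return new_robot_velocity, effort_cue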

    Visual enhancement of touch and the bodily self

    We experience our own body through both touch and vision. We further see that others’ bodies are similar to our own body, but we have no direct experience of touch on others’ bodies. Therefore, relations between vision and touch are important for the sense of self and for mental representation of one’s own body. For example, seeing the hand improves tactile acuity on the hand, compared to seeing a non-hand object. While several studies have demonstrated this visual enhancement of touch (VET) effect, its relation to the ‘bodily self’, or mental representation of one’s own body remains unclear. We examined whether VET is an effect of seeing a hand, or of seeing my hand, using the rubber hand illusion. In this illusion, a prosthetic hand which is brushed synchronously—but not asynchronously—with one’s own hand is felt to actually be one’s hand. Thus, we manipulated whether or not participants felt like they were looking directly at their hand, while holding the actual stimulus they viewed constant. Tactile acuity was measured by having participants judge the orientation of square-wave gratings. Two characteristic effects of VET were observed: (1) cross-modal enhancement from seeing the hand was inversely related to overall tactile acuity, and (2) participants near sensory threshold showed significant improvement following synchronous stroking, compared to asynchronous stroking or no stroking at all. These results demonstrate a clear functional relation between the bodily self and basic tactile perception

    Multi-destination beaming: apparently being in three places at once through robotic and virtual embodiment

    It has been shown that an illusion of ownership over an artificial limb or even an entire body can be induced in people through multisensory stimulation, providing evidence that the surrogate body is experienced as the person’s actual body. Such body ownership illusions (BOIs) have been shown to occur with virtual bodies, mannequins, and humanoid robots. In this study, we show the possibility of eliciting a full BOI over not one, but multiple artificial bodies concurrently. We demonstrate this by describing a system that allowed a participant to inhabit and fully control two different humanoid robots located in two distinct places and a virtual body in immersive virtual reality, using real-time full-body tracking and two-way audio communication, thereby giving them the illusion of ownership over each of them. We implemented this by allowing the participant to be embodied in any one surrogate body at a given moment and letting them instantaneously switch between them. While the participant was embodied in one of the bodies, a proxy system would track the locations currently unoccupied and would control their remote representation in order to continue performing the tasks in those locations in a logical fashion. To test the efficacy of this system, an exploratory study was carried out with a fully functioning setup with three destinations and a simplified version of the proxy for use in a social interaction. The results indicate that the system was physically and psychologically comfortable and was rated highly by participants in terms of usability. Additionally, feelings of BOI and agency were reported, which were not influenced by the type of body representation. The results provide us with clues regarding BOI with humanoid robots of different dimensions, along with insight about self-localization and multilocation
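
    A hedged sketch of the switching logic implied above, assuming one body is driven by the participant's tracking stream while a proxy policy animates the unoccupied ones; class and function names are invented for illustration and are not the authors' implementation.

        class MultiDestinationSession:
            """Illustrative controller for one embodied body and several proxies."""

            def __init__(self, destinations):
                self.destinations = list(destinations)  # e.g. ["robot_A", "robot_B", "virtual_body"]
                self.embodied = self.destinations[0]    # body currently driven by the user

            def switch_to(self, destination):
                # Instantaneous switch: the selected body takes over the user's tracking
                # and audio streams; the body just left is handed to the proxy.
                assert destination in self.destinations
                self.embodied = destination

            def step(self, tracked_pose, proxy_policy):
                commands = {}
                for body in self.destinations:
                    if body == self.embodied:
                        commands[body] = tracked_pose        # full-body tracking drives this body
                    else:
                        commands[body] = proxy_policy(body)  # proxy keeps the local task going
                return commands

    In use, switch_to would be bound to whatever command triggers the jump between destinations, and proxy_policy would encode the simple task-continuation behavior the abstract attributes to the proxy system.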

    Augmenting Immersive Telepresence Experience with a Virtual Body

    We propose augmenting immersive telepresence by adding a virtual body, representing the user's own arm motions, as realized through a head-mounted display and a 360-degree camera. Previous research has shown the effectiveness of having a virtual body in simulated environments; however, research on whether seeing one's own virtual arms increases presence or preference for the user in an immersive telepresence setup is limited. We conducted a study where a host introduced a research lab while participants wore a head-mounted display which allowed them to be telepresent at the host's physical location via a 360-degree camera, either with or without a virtual body. We first conducted a pilot study of 20 participants, followed by a pre-registered 62-participant confirmatory study. Whereas the pilot study showed greater presence and preference when the virtual body was present, the confirmatory study failed to replicate these results, with only behavioral measures suggesting an increase in presence. After analyzing the qualitative data and modeling interactions, we suspect that the quality and style of the virtual arms, and the contrast between animation and video, led to individual differences in reactions to the virtual body which subsequently moderated feelings of presence. Comment: Accepted for publication in IEEE Transactions on Visualization and Computer Graphics (TVCG), to be presented at IEEE VR 202

    The embodied user: corporeal awareness & media technology

    Human beings are proficient users of tools and technology. At times, our interactions with a technological artifact appear so effortless that the distinction between the artifact and the body starts to fade. When operating anthropomorphically designed teleoperation systems, for example, some people develop the vivid experience that they are physically there at the remote site (i.e., telepresence). Others might even come to sense the slave robot’s arms and hands as their own. The process in which the central nervous system categorizes an object as a part of the body, and in which a discrimination is made between what is contained within and outside the bodily boundaries, is called self-attribution. The aim of this thesis is twofold: (a) to determine the personal factors (e.g., the characteristics of an individual’s psychological makeup) and situational factors (e.g., the appearance of objects) that constrain or facilitate self-attribution, and (b) to determine the degree to which these factors affect people’s experiences with media technology.
    In Chapter 2, we describe the theoretical framework of our research, which is centered on a conception of the user of technology as an embodied agent. In this chapter we distinguish two important, but often confused, aspects of embodiment: the body schema and the body image. The body schema is defined as a dynamic distributed network of procedures aimed at guiding behavior. In contrast, we define the body image as a part of the process of consciousness and, thus, as consisting of those higher-order discriminations (or qualia) that pertain to the body, and one’s self-perception thereof.
    To investigate the individual and situational factors that constrain or facilitate self-attribution (i.e., incorporation into the body image), we employ the experimental paradigm of the rubber-hand illusion (Botvinick & Cohen, 1998). In this illusion, which is induced by stroking a person’s concealed hand together with a visible fake one, some people start to sense the fake hand as an actual part of their body. In Chapter 3, we investigate the rubber-hand illusion under two mediated conditions: (1) a virtual reality condition, where both the fake hand and its stimulation were projected on the table in front of the participant, and (2) a mixed reality condition, where the fake hand was projected, but its stimulation was unmediated. Our experiment reveals that people can develop the rubber-hand illusion under mediated conditions, but the resulting illusion may, depending on the technology used, be less vivid than in the traditional unmediated setup. In Chapter 4, we investigate the extent to which visual discrepancies between the foreign object and a human hand affect people in developing a vivid rubber-hand illusion. We found that people experience a more vivid illusion when the foreign object resembles the human hand in terms of both shape and texture. Taken together, the experiments in Chapters 3 and 4 support the view that the rubber-hand illusion is not merely governed by a bottom-up process (i.e., based on visuotactile integration), but is affected, top-down, by a cognitive representation of what the human body is like (e.g., Tsakiris & Haggard, 2005).
    In the rubber-hand illusion, people commonly misperceive the location of their concealed hand toward the direction of the fake hand (Tsakiris & Haggard, 2005). As such, this so-called proprioceptive drift is often used as an alternative to self-reports in assessing the vividness of the illusion (e.g., Tsakiris & Haggard, 2005). In Chapter 5, we investigate the extent to which the observed shift in the felt position of the concealed hand can be attributed to experiencing the illusion. For this purpose, we test how various features of the experimental setup of the rubber-hand illusion, which in themselves are not sufficient to elicit the illusion, affect proprioceptive drift. We corroborate existing research which demonstrates that looking at a fake hand or a tabletop for five minutes, in the absence of visuotactile stimulation, is sufficient to induce a change in the felt position of an unseen hand (e.g., Gross et al., 1974). Moreover, our experiments indicate that the use of proprioceptive drift as a measure of the strength of the rubber-hand illusion yields different conclusions than an assessment by means of self-reports. Based on these results, we question the validity of proprioceptive drift as an alternative measure of the vividness of the rubber-hand illusion.
    In Chapter 6, we propose and test a model of the vividness of the rubber-hand illusion. In two experiments, we successfully modeled people’s self-reported experiences related to the illusion (e.g., "the fake hand felt as my own") based on three estimates: (a) a person’s susceptibility to the rubber-hand illusion, (b) the processing demand that is required for a particular experience, and (c) the suppression/constraints imposed by the situation (one possible formalization is sketched after this summary). We demonstrate that the impressions related to the rubber-hand illusion, and by inference the processes behind them, are comparable for different persons. This is a non-trivial finding, as such invariance is required for an objective scaling of individual susceptibility and situational impediment on the basis of self-reported experiences. Regarding the validity of our vividness model, we confirm that asynchrony (e.g., Botvinick & Cohen, 1998) and information-poor stimulation (e.g., Armel & Ramachandran, 2003) constrain the development of a vivid rubber-hand illusion. Moreover, we demonstrate that the correlation between a person’s susceptibility to the rubber-hand illusion and the extent of his or her proprioceptive drift is only moderate, thereby confirming our conclusions from Chapter 5 regarding the limited validity of proprioceptive drift as a measure of the vividness of the rubber-hand illusion.
    In Chapter 7, we investigate the extent to which the large individual differences in people’s susceptibility to the illusion can be explained by body image instability and the ability to engage in motor imagery of the hand (i.e., in mental own-hand transformations). In addition, we investigate whether the vividness of the illusion depends on the anatomical implausibility of the fake hand’s orientation. With respect to body image instability, we corroborate a small, but significant, correlation between susceptibility and body image aberration scores: as expected, people with a more unstable body image are also more susceptible to the rubber-hand illusion (cf. Burrack & Brugger, 2005). With respect to the position and orientation of the fake hand on the table, we demonstrate that people experience a less vivid rubber-hand illusion when the fake hand is orientated in an anatomically impossible, as compared to an anatomically possible, manner. This finding suggests that the attribution of foreign objects to the self is constrained by the morphological capabilities of the human body. With respect to motor imagery, our results indicate a small, but significant, correlation between susceptibility and response times on a speeded left- and right-hand identification task. In other words, people who are more attuned to mental own-hand transformations are also better equipped to develop vivid rubber-hand illusions.
    In Chapter 8, we examine the role of self-attribution in the experience of telepresence. For this purpose, we introduce the technological domain of mediated social touch (i.e., interpersonal touching over a distance). We anticipated that, compared to a morphologically incongruent input medium, a morphologically congruent medium would be more easily attributed to the self. As a result, we expected our participants to develop a stronger sense of telepresence when they could see their interaction partner performing the touches on a sensor-equipped mannequin as opposed to a touch screen. Our participants, as expected, reported higher levels of telepresence and demonstrated more physiological arousal with the mannequin input medium. At the same time, our experiment revealed that these effects might not have resulted from self-attribution, and thus that other psychological mechanisms of identification might play a role in telepresence experiences.
    In Chapter 9, the epilogue, we discuss the main contributions and limitations of this thesis, while taking a broader perspective on the field of research on media technologies and corporeal awareness
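
    The Chapter 6 vividness model is only described verbally above; one plausible reading, offered strictly as an assumption-laden sketch rather than the thesis's actual formulation, is an item-response-style model in which the probability of endorsing an illusion-related impression rises with a person's susceptibility and falls with the impression's processing demand and the suppression imposed by the situation.

        import math

        def p_endorse(susceptibility, demand, suppression):
            """Hypothetical Rasch-style reading of the Chapter 6 vividness model:
            probability that a participant endorses a given illusion-related
            impression (e.g., "the fake hand felt as my own")."""
            logit = susceptibility - demand - suppression
            return 1.0 / (1.0 + math.exp(-logit))

        # Example: a fairly susceptible person, a demanding impression, under
        # asynchronous stroking (treated here as a strongly suppressive situation).
        print(round(p_endorse(susceptibility=1.5, demand=1.0, suppression=2.0), 2))  # ~0.18

    Under this reading, the invariance result reported above corresponds to the demand parameters being shared across persons, which is what would allow susceptibility and situational suppression to be scaled separately from self-reports.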

    The sense of embodiment in virtual reality

    What does it feel like to own, to control, and to be inside a body? The multidimensional nature of this experience, together with the continuous presence of one's biological body, renders both theoretical and experimental approaches problematic. Nevertheless, exploitation of immersive virtual reality has allowed a reframing of this question to whether it is possible to experience the same sensations toward a virtual body inside an immersive virtual environment as toward the biological body, and if so, to what extent. The current paper addresses these issues by referring to the Sense of Embodiment (SoE). Due to the conceptual confusion around this sense, we provide a working definition, which states that SoE consists of three subcomponents: the sense of self-location, the sense of agency, and the sense of body ownership. Under this proposed structure, measures and experimental manipulations reported in the literature are reviewed and related challenges are outlined. Finally, future experimental studies are proposed to overcome those challenges, toward deepening the concept of SoE and enhancing it in virtual applications