    The influence of the viewpoint in a self-avatar on body part and self-localization

    The goal of this study is to determine how a self-avatar in virtual reality, experienced from different viewpoints on the body (at eye- or chest-height), might influence body part localization, as well as self-localization within the body. Previous literature shows that people do not locate themselves in only one location, but rather primarily in the face and the upper torso. Therefore, we aimed to determine if manipulating the viewpoint to either the height of the eyes or to the height of the chest would influence self-location estimates towards these commonly identified locations of self. In a virtual reality (VR) headset, participants were asked to point at several of their body parts (body part localization) as well as "directly at you" (self-localization) with a virtual pointer. Both pointing tasks were performed before and after a self-avatar adaptation phase where participants explored a co-located, scaled, gender-matched, and animated self-avatar. We hypothesized that experiencing a self-avatar might reduce inaccuracies in body part localization, and that viewpoint would influence pointing responses for both body part and self-localization. Participants overall pointed relatively accurately to some of their body parts (shoulders, chin, and eyes), but very inaccurately to others, with large undershooting for the hips, knees, and feet, and large overshooting for the top of the head. Self-localization was spread across the body (as well as above the head) with the following distribution: the upper face (25%), the upper torso (25%), above the head (15%), and below the torso (12%). We only found an influence of viewpoint (eye- vs chest-height) during the self-avatar adaptation phase for body part localization and not for self-localization.
The overall change in error distance for body part localization for the viewpoint at eye-height was small (M = –2.8 cm), while the overall change in error distance for the viewpoint at chest-height was significantly larger, and in the upwards direction relative to the body parts (M = 21.1 cm). In a post-questionnaire, there was no significant difference in embodiment scores between the viewpoint conditions. Most interestingly, having a self-avatar did not change the results on the self-localization pointing task, even with a novel viewpoint (chest-height). Possibly, body-based cues, or memory, ground the self when in VR. However, the present results caution against the use of altered viewpoints in applications where veridical position sense of body parts is required.

    Embodying the mind and representing the body

    Does the existence of body representations undermine the explanatory role of the body? Or do certain types of representation depend so closely upon the body that their involvement in a cognitive task implicates the body itself? In the introduction to this special issue we explore lines of tension and complement that might hold between the notions of embodiment and body representations, which remain too often neglected or obscure. To do so, we distinguish two conceptions of embodiment that put weight either on the explanatory role of the body itself or on body representations. We further analyse how and to what extent body representations can be said to be embodied. Finally, we give an overview of the full volume, articulated around foundational issues (How should we define the notion of embodiment? To what extent and in what sense is embodiment compatible with representationalism? To what extent and in what sense are sensorimotor approaches similar to behaviourism?) and their applications in several cognitive domains (perception, concepts, selfhood, social cognition).

    Trait phenomenological control predicts experience of mirror synaesthesia and the rubber hand illusion

    In hypnotic responding, expectancies arising from imaginative suggestion drive striking experiential changes (e.g., hallucinations), which are experienced as involuntary, according to a normally distributed and stable trait ability (hypnotisability). Such experiences can be triggered by implicit suggestion and occur outside the hypnotic context. In large-sample studies (of 156, 404 and 353 participants), we report substantial relationships between hypnotisability and experimental measures of experiential change in mirror-sensory synaesthesia and the rubber hand illusion, comparable to relationships between hypnotisability and individual hypnosis scale items. The control of phenomenology to meet expectancies arising from perceived task requirements can account for experiential change in psychological experiments.

    The vestibular system modulates the contributions of head and torso to egocentric spatial judgements

    Egocentric representations allow us to describe the external world as experienced from an individual’s bodily location. We recently developed a novel method of quantifying the weight given to different body parts in egocentric judgements (the Misalignment Paradigm). We found that both head and torso contribute to simple alter-egocentric spatial judgements. We hypothesised that artificial stimulation of the vestibular system would provide a head-related signal, which might affect the weighting given to the head in egocentric spatial judgements. Bipolar Galvanic Vestibular Stimulation (GVS) was applied during the Misalignment Paradigm. A sham stimulation condition was also included to control for non-specific effects. Our data show that the weight given to the head was increased during left-anodal and right-cathodal GVS, compared to the opposite GVS polarity and sham stimulation. That is, the polarity of GVS which preferentially activates vestibular areas in the right cerebral hemisphere influenced the relative weightings of head and torso in spatial judgements.

    Spatial content of painful sensations

    Philosophical considerations regarding experiential spatial content have focused on exteroceptive sensations presenting external entities, and not on interoceptive experiences that present states of our own body. A notable example is studies on interoceptive touch, in which it is argued that interoceptive tactile experiences have rich spatial content such that tactile sensations are experienced as located in a spatial field. This article investigates whether a similarly rich spatial content can be attributed to experiences of acute, cutaneous pain. It is argued that such experiences of pain do not have field-like content, as they do not present distance relations between painful sensations.

    Where exactly am I? Self-location judgements distribute between head and torso

    I am clearly located where my body is located. But is there one particular place inside my body where I am? Recent results have provided apparently contradictory findings about this question. Here, we addressed this issue using a more direct approach than has been used in previous studies. Using a simple pointing task, we asked participants to point directly at themselves, either by manual manipulation of the pointer whilst blindfolded or by visually discerning when the pointer was in the correct position. Self-location judgements in the haptic and visual modalities were highly similar, and were clearly modulated by the starting location of the pointer. Participants most frequently chose to point to one of two likely regions, the upper face or the upper torso, according to which they reached first. These results suggest that the experienced self is neither spread out homogeneously across the entire body nor localised in any single point. Rather, two distinct regions, the upper face and upper torso, appear to be judged as where “I” am.

    Tool use

    Tool use is a defining feature of the human species. Tool-in-hand, humans can manipulate their environment in novel ways as well as process the sensory information arising during mechanical contact. This fact has led cognitive neuroscience researchers over the last few decades to explore how tool-based body augmentation impacts the behavior and neural processing of their users. In this chapter, we explore the findings of this research and draw some tentative conclusions about its implications for the sensorimotor system. We first survey research, conducted over the past decade, that has found that using a tool modulates the body representations underlying action and somatosensory perception. This research includes both behavioral and neural approaches. We then focus on the sensory and functional underpinnings driving tool-induced plasticity in body representations. Finally, we describe a novel means to evaluate the incorporation of tools into the sensorimotor system: the ability of humans to sense the locations of external objects with a tool. We discuss how this research has already gained powerful insight into the relationship between tools and the body, as well as challenged our conception of the boundaries of the somatosensory system.

    Where am I in virtual reality?

    It is currently not well understood whether people experience themselves to be located in one or more specific part(s) of their body. Virtual reality (VR) is increasingly used as a tool to study aspects of bodily perception and self-consciousness, due to its strong experimental control and the ease with which it manipulates multi-sensory aspects of bodily experience. To investigate where people self-locate in their body within virtual reality, we asked participants to point directly at themselves with a virtual pointer, in a VR headset. In previous work employing a physical pointer, participants mainly located themselves in the upper face and upper torso. In this study, using a VR headset, participants mainly located themselves in the upper face. In an additional body-template task, where participants pointed at themselves on a picture of a simple body outline, participants pointed most often to the upper torso, followed by the (upper) face. These results raise the question as to whether head-mounted virtual reality might alter where people locate themselves, making them more “head-centred”.