
    Combining proprioception and touch to compute spatial information

    Localising a tactile stimulus in egocentric space involves integrating information from skin receptors with proprioceptive inputs about body posture. We investigated whether body posture automatically influences tactile spatial judgements, even when it is irrelevant to the task. In Experiment 1, participants received two successive tactile stimuli on the forearm and were asked to indicate whether the first or second touch of the pair was closer to an anatomical body landmark, either the wrist or the elbow. The task was administered in three experimental conditions involving different body postures: a canonical body posture with the forearm extended and the hand pointing distally; a non-canonical body posture with the forearm and hand pointing vertically up at 90°; and a ‘reversed’ body posture with the elbow fully flexed at 180°, so that the hand pointed proximally. Thus, our task required localising touch on the skin and then relating skin locations to anatomical body landmarks. Critically, both functions are independent of the posture of the body in space. We nevertheless found reliable effects of body posture: judgement errors increased when the canonical forearm posture was rotated through 180°. These results were confirmed in Experiment 2, in which stimuli were delivered to the finger. However, additionally reversing the canonical posture of the finger, as well as that of the forearm, so that the finger was restored to its canonical orientation in egocentric space, restored performance to normal levels. Our results confirm an automatic process of localising the body in external space underlying tactile perception. This process appears to involve a combination of proprioceptive and tactile information.

    Tool-use reshapes the boundaries of body and peripersonal space representations

    Interaction with objects in the environment typically requires integrating information concerning the object location with the position and size of body parts. The former information is coded in a multisensory representation of the space around the body, a representation of peripersonal space (PPS), whereas the latter is enabled by an online, constantly updated, action-orientated multisensory representation of the body (BR). Using a tool to act upon relatively distant objects extends PPS representation. This effect has been interpreted as indicating that tools can be incorporated into the BR. However, empirical data showing that tool-use simultaneously affects PPS representation and the BR are lacking. To study this issue, we assessed the extent of PPS representation by means of an audio-tactile interaction task, and the BR by means of a tactile distance perception task and a body-landmarks localisation task, before and after using a 1-m-long tool to reach far objects. Tool-use extended the representation of PPS along the tool axis and concurrently shaped the BR: after tool-use, subjects perceived their forearm as narrower and longer than before tool-use, a shape more similar to that of the tool. Tool-use was necessary to induce these effects, since a pointing task affected neither PPS nor the BR. These results show that brief training with a tool induces plastic changes both in the perceived dimensions of the body part acting upon the tool and in the space around it, suggesting a strong overlap between peripersonal space and body representation.

    Amputation and prosthesis implantation shape body and peripersonal space representations

    Little is known about whether and how multimodal representations of the body (BRs) and of the space around the body (peripersonal space, PPS) adapt to amputation and prosthesis implantation. To investigate this issue, we tested BRs in a group of upper limb amputees by means of a tactile distance perception task, and PPS by means of an audio-tactile interaction task. Subjects performed the tasks with stimulation either on the healthy limb or on the stump of the amputated limb, while wearing or not wearing their prosthesis. When patients performed the tasks on the amputated limb without the prosthesis, the perceived length of the arm shrank, with a concurrent shift of the PPS boundary towards the stump. Conversely, wearing the prosthesis increased the perceived length of the stump and extended the PPS boundary so as to include the prosthetic hand, such that the prosthesis partially replaced the missing limb.

    The wheelchair as a full-body tool extending the peripersonal space

    Dedicated multisensory mechanisms in the brain represent peripersonal space (PPS), a limited portion of space immediately surrounding the body. Previous studies have illustrated the malleability of PPS representation through hand–object interaction, showing that tool use extends the limits of the hand-centered PPS. In the present study we investigated the effects of a special tool, the wheelchair, in extending the action possibilities of the whole body. We used a behavioral measure to quantify the extension of the PPS around the body before and after active (Experiment 1) and passive (Experiment 2) training with a wheelchair, and when participants were blindfolded (Experiment 3). Results suggest that wheelchair-mediated passive exploration of far space extended PPS representation. This effect was specifically related to the possibility of receiving information from the environment through vision, since no extension effect was found when participants were blindfolded. Surprisingly, the active motor training did not induce any modification in PPS representation, probably because the wheelchair maneuver was demanding for non-expert users, who may therefore have prioritized processing of information close to the wheelchair rather than at far spatial locations. Our results suggest that plasticity in PPS representation after tool use does not strictly depend on active use of the tool itself, but is triggered by simultaneous processing of information from the body and from the space where the body acts, which is more extended in the case of wheelchair use. These results contribute to our understanding of the mechanisms underlying body–environment interaction, with implications for developing and improving assistive technological devices in different clinical populations.

    Virtual zero gravity impact on internal gravity model

    This project investigates the impact of a virtual zero-gravity experience on the human internal gravity model. In the planned experiment, subjects are immersed, via a head-mounted display (HMD) and full-body motion capture, in a virtual world exhibiting either normal gravity or the apparent absence of gravity (i.e. body and objects floating in space). The study evaluates changes in the subjects' gravity model by observing changes in the motor planning of actions that depend on gravity. Our goal is to demonstrate that virtual reality exposure can induce modifications to the human internal gravity model analogous to those resulting from real exposure (e.g. parabolic flights), even though users remain under normal gravity conditions in reality.

    Peripersonal Space: An Index of Multisensory Body–Environment Interactions in Real, Virtual, and Mixed Realities

    Human–environment interactions normally occur in the physical milieu and thus by medium of the body and within the space immediately adjacent to and surrounding the body, the peripersonal space (PPS). However, human interactions increasingly occur with or within virtual environments, and hence novel approaches and metrics must be developed to index human–environment interactions in virtual reality (VR). Here, we present a multisensory task that measures the spatial extent of human PPS in real, virtual, and augmented realities. We validated it in a mixed reality (MR) ecosystem in which real environment and virtual objects are blended together in order to administer and control visual, auditory, and tactile stimuli in ecologically valid conditions. Within this mixed-reality environment, participants are asked to respond as fast as possible to tactile stimuli on their body, while task-irrelevant visual or audiovisual stimuli approach their body. Results demonstrate that, in analogy with observations derived from monkey electrophysiology and in real environmental surroundings, tactile detection is enhanced when visual or auditory stimuli are close to the body, and not when far from it. We then calculate the location where this multisensory facilitation occurs as a proxy of the boundary of PPS. We observe that mapping of PPS via audiovisual, as opposed to visual alone, looming stimuli results in sigmoidal fits—allowing for the bifurcation between near and far space—with greater goodness of fit. In sum, our approach is able to capture the boundaries of PPS on a spatial continuum, at the individual-subject level, and within a fully controlled and previously laboratory-validated setup, while maintaining the richness and ecological validity of real-life events. The task can therefore be applied to study the properties of PPS in humans and to index the features governing human–environment interactions in virtual or MR. 
We propose PPS as an ecologically valid and neurophysiologically established metric for studying the impact of VR and related technologies on society and individuals.
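The boundary-estimation step described above (locating where multisensory facilitation occurs by fitting a sigmoid to tactile reaction times as a function of looming-stimulus distance) can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the distances, reaction times, and parameter names are invented for the example, and the inflection point of the fitted sigmoid is taken as the proxy of the PPS boundary.

```python
# Hypothetical sketch: estimate the PPS boundary by fitting a sigmoid
# to tactile reaction times (RTs) as a function of stimulus distance.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(d, ymin, ymax, x0, k):
    """RT as a function of stimulus distance d.
    x0 (the inflection point) serves as the PPS-boundary proxy."""
    return ymin + (ymax - ymin) / (1.0 + np.exp(-k * (d - x0)))

# Illustrative mean RTs (ms) at six distances from the body (cm):
# tactile detection is faster (facilitated) when the sound is near.
distances = np.array([10.0, 30.0, 50.0, 70.0, 90.0, 110.0])
rts = np.array([410.0, 420.0, 455.0, 495.0, 505.0, 508.0])

# Initial guesses: floor/ceiling RTs, midpoint distance, gentle slope.
p0 = [rts.min(), rts.max(), np.median(distances), 0.1]
(ymin, ymax, x0, k), _ = curve_fit(sigmoid, distances, rts, p0=p0, maxfev=10000)
print(f"Estimated PPS boundary (sigmoid inflection): {x0:.1f} cm")
```

A steeper fitted slope `k` corresponds to a sharper near/far transition, which is why the audiovisual mapping reported above (better sigmoidal goodness of fit) allows a cleaner bifurcation between near and far space.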

    Body part-centered and full body-centered peripersonal space representations

    Dedicated neural systems represent the space surrounding the body, termed peripersonal space (PPS), by integrating visual or auditory stimuli occurring near the body with somatosensory information. As a behavioral proxy of PPS, we measured participants' reaction time to tactile stimulation while task-irrelevant auditory or visual stimuli were presented at different distances from their body. In seven experiments we delineated the critical distance at which auditory or visual stimuli boosted tactile processing on the hand, face, and trunk, as a proxy of the PPS extension. Three main findings emerged. First, the size of PPS varied with the stimulated body part, being smallest for the hand, intermediate for the face, and largest for the trunk. Second, while approaching stimuli always modulated tactile processing in a space-dependent manner, receding stimuli did so only for the hand. Finally, the extension of PPS around the hand and the face varied according to their relative positioning and stimulus congruency, whereas the trunk PPS was constant. These results suggest that at least three body-part-specific PPS representations exist, differing in extension and directional tuning. These distinct PPS representations are not fully independent of each other, however, but are referenced to the common reference frame of the trunk.

    AMYPAD Diagnostic and Patient Management Study: Rationale and design

    Introduction: Reimbursement of amyloid positron emission tomography (PET) is lagging due to the lack of definitive evidence on its clinical utility and cost-effectiveness. The Amyloid Imaging to Prevent Alzheimer's Disease–Diagnostic and Patient Management Study (AMYPAD-DPMS) is designed to fill this gap. Methods: AMYPAD-DPMS is a phase 4, multicenter, prospective, randomized controlled study. Nine hundred patients with subjective cognitive decline plus, mild cognitive impairment, or dementia possibly due to Alzheimer's disease will be randomized to ARM1, amyloid-PET performed early in the diagnostic workup; ARM2, amyloid-PET performed after 8 months; or ARM3, amyloid-PET performed whenever the physician chooses to do so. Endpoints: The primary endpoint is the difference between ARM1 and ARM2 in the proportion of patients receiving a very-high-confidence etiologic diagnosis after 3 months. Secondary endpoints address diagnosis and diagnostic confidence, diagnostic/therapeutic management, health economics, patient-related outcomes, and methods for image quantitation. Expected Impacts: AMYPAD-DPMS will supply physicians and health care payers with real-world data to inform management decisions.

    Plasticity in body and peripersonal space representations

    Successful interaction with objects in the environment requires integrating information concerning object location with the shape, dimension and position of body parts in space. The former information is coded in a multisensory representation of the space around the body, i.e. peripersonal space (PPS), whereas the latter is enabled by an online, constantly updated, action-orientated multisensory representation of the body (BR) that is critical for action. A critical feature of these representations is that neither PPS nor BR is fixed; both change dynamically depending on different types of experience. In a series of experiments, I studied the plastic properties of PPS and BR in humans. I developed a series of methods to measure the boundaries of PPS representation (Chapter 4), to study its neural correlates (Chapter 3) and to assess BRs. These tasks were used to study changes in PPS and BR following tool-use (Chapter 5), multisensory stimulation (Chapter 6), amputation and prosthesis implantation (Chapter 7) or social interaction (Chapter 8). I found that changes in the function (tool-use) and the structure (amputation and prosthesis implantation) of the physical body elongate or shrink both PPS and BR. Social context and social interaction also shape PPS representation. Such a high degree of plasticity suggests that our sense of the body in space is not given once and for all, but is constantly constructed and adapted through experience.