37 research outputs found

    Towards explaining spatial touch perception: Weighted integration of multiple location codes

    Badde S, Heed T. Towards explaining spatial touch perception: Weighted integration of multiple location codes. Cognitive Neuropsychology. 2016;33(1-2):26-47.

    Feeling a Touch to the Hand on the Foot

    Badde S, Röder B, Heed T. Feeling a Touch to the Hand on the Foot. Current Biology. 2019;29(9):1-7.

    Where we perceive a touch putatively depends on topographic maps that code the touch’s location on the skin [1] as well as its position in external space [2, 3, 4, 5]. However, neither somatotopic nor external-spatial representations can account for atypical tactile percepts in some neurological patients and amputees; referral of touch to an absent or anaesthetized hand after stimulation of a foot [6, 7] or the contralateral hand [8, 9, 10] challenges the role of topographic representations when attributing touch to the limbs. Here, we show that even healthy adults systematically misattribute touch to other limbs. Participants received two tactile stimuli, each to a different limb—hand or foot—and reported which of all four limbs had been stimulated first. Hands and feet were either uncrossed or crossed to dissociate body-based and external-spatial representations [11, 12, 13, 14]. Remarkably, participants regularly attributed the first touch to a limb that had received neither of the two stimuli. The erroneously reported, non-stimulated limb typically matched the correct limb with respect to limb type or body side. Touch was misattributed to non-stimulated limbs of the other limb type and body side only if they were placed at the correct limb’s canonical (default) side of space. The touch’s actual location in external space was irrelevant. These errors replicated across several contexts, and modeling linked them to incoming sensory evidence rather than to decision strategies. The results highlight the importance of the touched body part’s identity and canonical location but challenge the role of external-spatial tactile representations when attributing touch to a limb.

    Disentangling the External Reference Frames Relevant to Tactile Localization

    Heed T, Backhaus J, Röder B, Badde S. Disentangling the External Reference Frames Relevant to Tactile Localization. PLOS ONE. 2016;11(7): e0158829.

    Task demands affect spatial reference frame weighting during tactile localization in sighted and congenitally blind adults

    Schubert JTW, Badde S, Röder B, Heed T. Task demands affect spatial reference frame weighting during tactile localization in sighted and congenitally blind adults. PLOS ONE. 2017;12(12): e0189067.

    Task demands modulate tactile localization in sighted humans, presumably through weight adjustments in the spatial integration of anatomical (skin-based) and external (posture-based) information. In contrast, previous studies have suggested that congenitally blind humans, by default, refrain from automatic spatial integration and localize touch using only skin-based information. Here, sighted and congenitally blind participants localized tactile targets on the palm or back of one hand while ignoring simultaneous tactile distractors at congruent or incongruent locations on the other hand. We probed the interplay of anatomical and external location codes for spatial congruency effects by varying hand posture: the palms either both faced down, or one faced down and one up. In the latter posture, externally congruent target and distractor locations were anatomically incongruent, and vice versa. Target locations had to be reported either anatomically (“palm” or “back” of the hand) or externally (“up” or “down” in space). Under anatomical instructions, performance was more accurate for anatomically congruent than incongruent target-distractor pairs; under external instructions, performance was more accurate for externally congruent than incongruent pairs. These modulations were evident in both sighted and blind individuals. Notably, distractor effects were overall far smaller in blind than in sighted participants, despite comparable target-distractor identification performance. Thus, the absence of developmental vision seems to be associated with an increased ability to focus tactile attention on a non-spatially defined target. Nevertheless, the modulation of blind individuals’ congruency effects by hand posture and task instruction suggests that, like the sighted, they automatically integrate anatomical and external information during tactile localization. Spatial integration in tactile processing is thus flexibly adapted by top-down information (here, task instruction) even in the absence of developmental vision.
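
    The weighted-integration account described above can be made concrete with a minimal Python sketch. It assumes a simple linear weighting of two location codes; the function, the weight w, and the example probabilities are illustrative assumptions, not the authors' fitted model.

        # Minimal sketch of task-weighted integration of two tactile location
        # codes (hypothetical; not the authors' actual model).

        def integrate(p_anat: float, p_ext: float, w: float) -> float:
            """Linear combination of anatomical and external evidence for one
            response category; w is the task-dependent weight on the anatomical
            code."""
            return w * p_anat + (1.0 - w) * p_ext

        # With both palms down the two codes agree; turning one hand palm-up
        # puts them in conflict, and the task instruction shifts the weight w.
        p_anat, p_ext = 0.9, 0.2                 # conflicting codes (one palm up)
        print(integrate(p_anat, p_ext, w=0.8))   # anatomical instructions -> 0.76
        print(integrate(p_anat, p_ext, w=0.2))   # external instructions   -> 0.34

    Under this toy model, the same sensory conflict yields opposite congruency effects depending only on the weight, mirroring the instruction-dependent pattern reported for both sighted and blind participants.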

    A touch of hierarchy: population receptive fields reveal fingertip integration in Brodmann areas in human primary somatosensory cortex

    Several neuroimaging studies have shown the somatotopy of body part representations in primary somatosensory cortex (S1), but the functional hierarchy of distinct subregions in human S1 has not been adequately addressed. The current study investigates the functional hierarchy of the cytoarchitectonically distinct regions Brodmann areas BA3, BA1, and BA2 in human S1. During functional MRI experiments, we presented participants with vibrotactile stimulation of the fingertips at three different vibration frequencies. Using population receptive field (pRF) modeling of the fMRI BOLD activity, we identified the hand region in S1 and the somatotopy of the fingertips. For each voxel, the pRF center indicates the finger that most effectively drives the BOLD signal, and the pRF size measures the spatial somatic pooling of fingertips. We find a systematic increase in pRF size from lower-order to higher-order areas: pRF sizes are smallest in BA3, increase slightly towards BA1, and are largest in BA2, paralleling the increase in visual receptive field size as one ascends the visual hierarchy. Additionally, the time-to-peak of the hemodynamic response in BA3 is roughly 0.5 s earlier than in BA1 and BA2, further supporting the notion of a functional hierarchy of subregions in S1. These results were obtained during stimulation of different mechanoreceptors, suggesting that different afferent fibers leading up to S1 feed into the same cortical hierarchy.
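
    To illustrate the pRF logic described above, here is a minimal Python sketch, assuming a one-dimensional Gaussian pRF over the five fingertips fit by grid search; the simulated voxel, parameter ranges, and function names are assumptions for illustration, not the study's analysis pipeline.

        import numpy as np

        # Fingertips D1 (thumb) .. D5 (little finger) as positions on a 1D axis.
        fingers = np.arange(1, 6)

        def prf_response(center, size):
            """Predicted voxel response to each fingertip: an (unnormalized)
            Gaussian tuning curve; size models spatial pooling of fingertips."""
            return np.exp(-0.5 * ((fingers - center) / size) ** 2)

        def fit_prf(observed):
            """Grid-search fit of pRF center and size by maximizing the
            correlation between predicted and observed responses."""
            best = (np.nan, np.nan, -np.inf)
            for center in np.linspace(1, 5, 41):
                for size in np.linspace(0.3, 4.0, 38):
                    r = np.corrcoef(prf_response(center, size), observed)[0, 1]
                    if r > best[2]:
                        best = (center, size, r)
            return best

        # Simulated voxel tuned to the middle finger (D3) with moderate pooling.
        rng = np.random.default_rng(0)
        voxel = prf_response(3.0, 1.0) + rng.normal(scale=0.05, size=fingers.shape)

        center, size, r = fit_prf(voxel)
        print(f"pRF center ~ D{center:.1f}, size ~ {size:.2f} fingers (r = {r:.2f})")

    In this toy fit, a larger recovered size corresponds to broader pooling across neighboring fingertips, which is the quantity reported to grow from BA3 through BA1 to BA2.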

    Exposure to 16 h of normobaric hypoxia induces ionic edema in the healthy brain

    Following prolonged exposure to hypoxic conditions, for example, due to ascent to high altitude, stroke, or traumatic brain injury, cerebral edema can develop. The exact nature and genesis of hypoxia-induced edema in healthy individuals remain unresolved. We examined the effects of prolonged normobaric hypoxia, induced by 16 h of exposure to simulated high altitude, on healthy brains using proton, dynamic contrast-enhanced, and sodium MRI. This combined approach allowed us to directly measure two key factors in the development of hypoxia-induced brain edema: (1) sodium signals as a surrogate of the distribution of electrolytes within the cerebral tissue and (2) Ktrans as a marker of blood–brain barrier integrity. The measurements point toward an accumulation of sodium ions in the extracellular but not the intracellular space, in combination with an intact endothelium. Together, these findings are indicative of ionic extracellular edema, a subtype of cerebral edema that was only recently specified as an intermediate yet distinct stage between cytotoxic and vasogenic edema. In sum, a combination of imaging techniques demonstrates the development of ionic edema following prolonged normobaric hypoxia, in agreement with cascade models of edema formation.

    Oculomotor freezing and tactile temporal expectation

    No full text

    Ionic Edema in the Healthy Brain after Hypoxic Exposure

    No full text