23 research outputs found

    Event-related fMRI at 7T reveals overlapping cortical representations for adjacent fingertips in S1 of individual subjects

    Recent fMRI studies of the human primary somatosensory cortex have been able to differentiate the cortical representations of different fingertips at a single-subject level. These studies did not, however, investigate the expected overlap in cortical activation due to the stimulation of different fingers. Here, we used an event-related design in six subjects at 7 Tesla to explore the overlap in cortical responses elicited in S1 by vibrotactile stimulation of the five fingertips. We found that all parts of S1 show some degree of spatial overlap between the cortical representations of adjacent or even nonadjacent fingertips. In S1, the posterior bank of the central sulcus showed less overlap than regions in the post-central gyrus, which responded to up to five fingertips. The functional properties of these two areas are consistent with the known layout of cytoarchitectonically defined subareas, and we speculate that they correspond to subarea 3b (S1 proper) and subarea 1, respectively. In contrast with previous fMRI studies, however, we did not observe discrete activation clusters that could unequivocally be attributed to different subareas of S1. Venous maps based on T2*-weighted structural images suggest that the observed overlap is not driven by extra-vascular contributions from large veins.

    A probabilistic atlas of finger dominance in the primary somatosensory cortex

    With the advent of ultra-high field (7T), high spatial resolution functional MRI (fMRI) has allowed the differentiation of the cortical representations of each of the digits at an individual-subject level in human primary somatosensory cortex (S1). Here we generate a probabilistic atlas of the contralateral S1 representations of the digits of both the left and right hand in a group of 22 right-handed individuals. The atlas is generated in both volume and surface standardised spaces from somatotopic maps obtained by delivering vibrotactile stimulation to each distal phalangeal digit using a travelling wave paradigm. Metrics quantify the likelihood of a given position being assigned to a digit (full probability map) and the most probable digit for a given spatial location (maximum probability map). The atlas is validated using a leave-one-out cross-validation procedure. Anatomical variance across the somatotopic map is also assessed to investigate whether the functional variability across subjects is coupled to structural differences. This probabilistic atlas quantifies the variability in digit representations in healthy subjects, finding some quantifiable separability between digits 2, 3 and 4, a complex overlapping relationship between digits 1 and 2, and little agreement of digit 5 across subjects. The atlas and constituent subject maps are available online for use as a reference in future neuroimaging studies.
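    The two atlas metrics described above can be illustrated with a short sketch. The function below is hypothetical (the names and toy label maps are not from the paper): starting from per-subject winner-take-all digit label maps, it computes a full probability map (the fraction of subjects assigning each digit to each voxel) and a maximum probability map (the most probable digit at each voxel, 0 where none).

```python
import numpy as np

def probability_maps(subject_digit_maps, n_digits=5):
    """Atlas metrics from per-subject digit label maps.

    subject_digit_maps: int array (n_subjects, n_voxels) with values
    1..n_digits for the winning digit at each voxel, 0 where no digit won.
    Returns the full probability map (n_digits, n_voxels) and the maximum
    probability map (n_voxels,), with 0 marking voxels no subject assigned.
    """
    # fraction of subjects labelling each voxel with each digit
    full = np.stack([(subject_digit_maps == d).mean(axis=0)
                     for d in range(1, n_digits + 1)])
    # most probable digit per voxel; keep 0 where no digit was ever assigned
    max_prob = np.where(full.max(axis=0) > 0, full.argmax(axis=0) + 1, 0)
    return full, max_prob

# toy example: 3 subjects, 4 voxels
maps = np.array([[1, 2, 2, 0],
                 [1, 2, 3, 0],
                 [1, 3, 3, 0]])
full, mpm = probability_maps(maps)
```

    A leave-one-out validation of the kind the paper describes would simply rebuild these maps from all subjects but one and compare the held-out subject's labels against the group maximum probability map.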

    Fast Event-Related Mapping of Population Fingertip Tuning Properties in Human Sensorimotor Cortex at 7T

    fMRI studies that investigate somatotopic tactile representations in the human cortex typically use either block or phase-encoded stimulation designs. Event-related (ER) designs allow for more flexible and unpredictable stimulation sequences than the other methods, but they are less efficient. Here we compared an efficiency-optimized fast ER design (2.8s average intertrial interval, ITI) to a conventional slow ER design (8s average ITI) for mapping voxelwise fingertip tactile tuning properties in the sensorimotor cortex of 6 participants at 7 Tesla. The fast ER design yielded more reliable responses than the slow ER design, but with otherwise similar tuning properties. Concatenating the fast and slow ER data, we demonstrate in each individual brain the existence of two separate somatotopically-organized tactile representations of the fingertips, one in the primary somatosensory cortex (S1) on the post-central gyrus, and the other shared across the motor and pre-motor cortices on the pre-central gyrus. In both the S1 and motor representations, fingertip selectivity decreased progressively, from narrowly-tuned Brodmann areas 3b and 4a, respectively, towards associative parietal and frontal regions that responded equally to all fingertips, suggesting increasing information integration along these two pathways. In addition, fingertip selectivity in S1 decreased from the cortical representation of the thumb to that of the pinky.
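    The voxelwise fingertip selectivity described above can be quantified in several ways. The sketch below uses a simple winner-fraction index rather than the population tuning-model fits reported in the study; the function name, index definition, and toy data are all assumptions for illustration.

```python
import numpy as np

def fingertip_selectivity(betas):
    """Illustrative voxelwise selectivity from response amplitudes.

    betas: array (n_fingertips, n_voxels) of response amplitudes.
    Returns the preferred fingertip per voxel and an index scaled so that
    0 = equal response to all fingertips and 1 = response to only one.
    """
    betas = np.clip(betas, 0, None)          # ignore negative responses
    preferred = betas.argmax(axis=0)
    peak = betas.max(axis=0)
    total = betas.sum(axis=0)
    # fraction of the total response carried by the preferred fingertip
    frac = np.divide(peak, total, out=np.zeros_like(total), where=total > 0)
    n = betas.shape[0]
    selectivity = (frac - 1 / n) / (1 - 1 / n)   # rescale chance level to 0
    return preferred, np.clip(selectivity, 0.0, 1.0)

# toy example: 5 fingertips, 2 voxels (one narrowly, one broadly tuned)
b = np.array([[5, 1], [0, 1], [0, 1], [0, 1], [0, 1]], float)
pref, sel = fingertip_selectivity(b)
```

    Under this index, the gradients the abstract reports would appear as `sel` decreasing from area 3b towards associative cortex, and from the thumb towards the pinky representation.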

    Is human auditory cortex organization compatible with the monkey model? Contrary evidence from ultra-high-field functional and structural MRI

    It is commonly assumed that the human auditory cortex is organized similarly to that of macaque monkeys, where the primary region, or "core," is elongated parallel to the tonotopic axis (the main direction of the tonotopic gradients) and subdivided across this axis into up to 3 distinct areas (A1, R, and RT) with separate, mirror-symmetric tonotopic gradients. This assumption, however, has not been tested until now. Here, we used high-resolution ultra-high-field (7 T) magnetic resonance imaging (MRI) to delineate the human core and map tonotopy in 24 individual hemispheres. In each hemisphere, we assessed tonotopic gradients using principled, quantitative analysis methods, and delineated the core using 2 independent (functional and structural) MRI criteria. Our results indicate that, contrary to macaques, the human core is elongated perpendicular rather than parallel to the main tonotopic axis, and that this axis contains no more than 2 mirror-reversed gradients within the core region. Previously suggested homologies between these gradients and areas A1 and R in macaques were not supported. Our findings suggest fundamental differences in auditory cortex organization between humans and macaques.

    Neuroanatomical Alterations in Tinnitus Assessed with Magnetic Resonance Imaging

    Previous studies of anatomical changes associated with tinnitus have provided inconsistent results, with some showing significant cortical and subcortical changes, while others have found effects due to hearing loss, but not tinnitus. In this study, we examined changes in brain anatomy associated with tinnitus using anatomical scans from 128 participants with tinnitus and hearing loss, tinnitus with clinically normal hearing, and non-tinnitus controls with clinically normal hearing. The groups were matched for hearing loss, age and gender. We employed voxel-based morphometry (VBM) and surface-based morphometry (SBM) to investigate gray and white matter volume and thickness within regions of interest (ROIs) that were based on the results of previous studies. The largest overall effects were found for age, gender, and hearing loss. With regard to tinnitus, the ROI analysis revealed numerous small increases and decreases in gray matter and thickness between tinnitus and non-tinnitus controls, in both cortical and subcortical structures. For the whole-brain analysis, the main tinnitus-related significant clusters were found outside sensory auditory structures. These include a decrease in cortical thickness for the tinnitus group compared to controls in the left superior frontal gyrus (SFG), and a decrease in cortical volume with hearing loss in left Heschl's gyrus (HG). For the masked analysis, we found a decrease in gray matter volume in right Heschl's gyrus for the tinnitus group compared to the controls. We found no changes in the subcallosal region as reported in some previous studies. Overall, while some of the morphological differences observed in this study are similar to previously published findings, others are entirely different or even contradict previous results. We highlight other discrepancies among previous results and the increasing need for a more precise subtyping of the condition.

    The effect of long-term unilateral deafness on the activation pattern in the auditory cortices of French-native speakers: influence of deafness side

    Background: In normal-hearing subjects, monaural stimulation produces a normal pattern of asynchrony and asymmetry over the auditory cortices in favour of the contralateral temporal lobe. While late-onset unilateral deafness has been reported to change this pattern, the exact influence of the side of deafness on central auditory plasticity remains unclear. The present study aimed to assess whether left-sided and right-sided deafness had differential effects on the characteristics of neurophysiological responses over auditory areas. Eighteen unilaterally deaf and 16 normal-hearing right-handed subjects participated. All unilaterally deaf subjects had post-lingual deafness. Long-latency auditory evoked potentials (late AEPs) were elicited by two types of stimuli, non-speech (1 kHz tone burst) and speech sounds (the voiceless syllable /pa/), delivered to the intact ear at 50 dB SL. The latencies and amplitudes of the early exogenous components (N100 and P150) were measured using temporal scalp electrodes. Results: Subjects with left-sided deafness showed major neurophysiological changes, in the form of a more symmetrical activation pattern over auditory areas in response to the non-speech sound and even a significant reversal of the activation pattern in favour of the cortex ipsilateral to the stimulation in response to the speech sound. This was observed not only for the AEP amplitudes but also for the AEP time course. In contrast, no significant changes were found for the late-AEP responses in subjects with right-sided deafness. Conclusion: The results show that cortical reorganization induced by unilateral deafness mainly occurs in subjects with left-sided deafness. This suggests that anatomical and functional plastic changes are more likely to occur in the right than in the left auditory cortex. The possible perceptual correlates of such neurophysiological changes are discussed.

    Audiovisual interactions in the human auditory cortex: electrophysiological and behavioural approaches.

    Audiovisual interactions taking place during the perception of audiovisual events were explored in two mainly auditory phenomena, speech perception and representation in auditory sensory memory (ASM), by means of behavioural and electrophysiological measures. Concerning speech perception, we have shown that seeing lip movements can speed up auditory speech processing in a phonological discrimination task under normal hearing conditions. This behavioural facilitation was associated with both an activation of the (mainly secondary) auditory cortices by lip movements, as shown in intracranial ERPs in epileptic patients, and a decrease of auditory activity within 50-200 ms after sound onset, as shown both in intracranial ERPs in patients and in surface ERPs in normal subjects. We have also shown that a behavioural facilitation can be observed even if the visual speech cues carry temporal, but not phonetic, information, although only in noisy environments. Concerning ASM, we have shown that the detection of a rare audiovisual event delivered in a sequence of standard events is faster than that of either visual or auditory events. This facilitation was associated with interactions between the auditory and visual detection processes, based on the existence of auditory and visual sensory memory traces, as indexed by the auditory and visual MMNs of the scalp ERPs. We also showed that the representation of an audiovisual event in ASM, as indexed by the auditory MMN, differs from that of its purely auditory component, but only if the unimodal components of this event are regularly associated. However, we failed to show that a representation of such a regularity can trigger an MMN to its violation.

    Is the auditory sensory memory sensitive to visual information?

    The mismatch negativity (MMN) component of auditory event-related brain potentials can be used as a probe to study the representation of sounds in auditory sensory memory (ASM). Yet it has been shown that an auditory MMN can also be elicited by an illusory auditory deviance induced by visual changes. This suggests that some visual information may be encoded in ASM and is accessible to the auditory MMN process. It is not known, however, whether visual information affects the ASM representation of any audiovisual event or whether this phenomenon is limited to specific domains in which strong audiovisual illusions occur. To address this issue, we compared the topographies of MMNs elicited by non-speech audiovisual stimuli deviating from audiovisual standards on the visual, the auditory, or both dimensions. Contrary to what occurs with audiovisual illusions, each unimodal deviant elicited sensory-specific MMNs, and the MMN to audiovisual deviants included both sensory components. The visual MMN was, however, different from a genuine visual MMN obtained in a visual-only control oddball paradigm, suggesting that auditory and visual information interact before the MMN process occurs. Furthermore, the MMN to audiovisual deviants was significantly different from the sum of the two sensory-specific MMNs, showing that the processes of visual and auditory change detection are not completely independent.
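    The MMN measures discussed in these abstracts all derive from the same basic operation: averaging epochs by condition and subtracting the standard-evoked response from the deviant-evoked one. A minimal sketch of that difference-wave computation (hypothetical array shapes and toy data, not the authors' analysis pipeline):

```python
import numpy as np

def mismatch_negativity(epochs, labels):
    """Deviant-minus-standard difference wave per channel.

    epochs: array (n_trials, n_channels, n_samples) of baseline-corrected
    single-trial ERPs; labels: sequence of "standard"/"deviant" per trial.
    """
    labels = np.asarray(labels)
    standard = epochs[labels == "standard"].mean(axis=0)  # average standard ERP
    deviant = epochs[labels == "deviant"].mean(axis=0)    # average deviant ERP
    # the MMN is the negative deflection of this difference, ~100-250 ms
    return deviant - standard

# toy example: 4 trials, 1 channel, 3 samples
ep = np.array([[[0., 1., 0.]], [[0., 1., 0.]],     # standards
               [[0., -1., 0.]], [[0., -1., 0.]]])  # deviants
mmn = mismatch_negativity(ep, ["standard", "standard", "deviant", "deviant"])
```

    Comparing MMN topographies across conditions, as in the study above, then amounts to comparing these difference waves across channels, e.g. an audiovisual-deviant MMN against the sum of the auditory-only and visual-only MMNs.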