A brief period of postnatal visual deprivation permanently alters visual motion processing in early visual regions.
How does early visual experience shape the development of the visual motion network? To address this question, we used functional magnetic resonance imaging to characterize the brain response elicited by visual motion in adults born with dense bilateral cataracts that were treated early in life. Compared to matched controls with typical visual development, early cataract patients showed reduced recruitment of early visual areas while processing motion information. Interestingly, no alterations were observed in the higher-order visual motion area hMT+/V5. Psychophysiological interaction analyses demonstrated reduced interhemispheric connectivity in V1, and reduced connectivity between bilateral hMT+/V5 and V1, during visual motion processing. The altered connectivity profile of V1, but not hMT+/V5, in cataract-reversal patients was confirmed using independent data collected while the subjects were not engaged in any specific task (resting state). Altogether, these results suggest that a brief and transient period of visual deprivation early in life has a region-specific impact on the visual motion network, with V1 being permanently affected while hMT+/V5 shows resilience to deprivation.
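For readers unfamiliar with psychophysiological interaction (PPI) analysis, the sketch below illustrates the core idea under simplifying assumptions (it omits the deconvolution and nuisance regressors of a full fMRI pipeline, and all variable names and data are hypothetical): the product of a seed region's time course and the task regressor is added to a regression model, and its coefficient indexes task-dependent connectivity.

```python
# Minimal PPI sketch (assumed simplification): the interaction between a seed
# region's time course and the task regressor is added to a GLM; a non-zero
# interaction beta indicates task-dependent connectivity with the target region.
import numpy as np

def ppi_interaction_beta(seed_ts, task_reg, target_ts):
    """Fit target = b0 + b1*seed + b2*task + b3*(seed*task) and return b3."""
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    task = task_reg - task_reg.mean()
    interaction = seed * task                      # the PPI regressor
    X = np.column_stack([np.ones_like(seed), seed, task, interaction])
    betas, *_ = np.linalg.lstsq(X, target_ts, rcond=None)
    return betas[3]

# Hypothetical example: 200 volumes, a V1 seed and a contralateral-V1 target
rng = np.random.default_rng(0)
task = np.repeat([0.0, 1.0], 100)                  # motion-off / motion-on blocks
seed = rng.standard_normal(200)
target = 0.5 * seed * task + rng.standard_normal(200)
print(ppi_interaction_beta(seed, task, target))
```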
High-level neural categorization of human voices as revealed by fast periodic auditory stimulation
Voices are arguably among the most relevant sounds in humans' everyday life, and several studies have demonstrated the existence of voice-selective regions in the human brain. However, whether this preference is merely driven by physical (i.e., acoustic) properties specific to voices, or whether it reflects a higher-level categorical response, is still under debate. Here, we address this fundamental issue with Fast Periodic Auditory Stimulation combined with electroencephalography (EEG) to measure objective, direct, fast and automatic voice-selective responses in the human brain. Participants were tested with stimulation sequences containing heterogeneous non-vocal sounds from different categories presented at 4 Hz (i.e., 4 stimuli/second), with vocal sounds appearing every 3 stimuli (1.33 Hz). A few minutes of stimulation are sufficient to elicit robust 1.33 Hz voice-selective focal brain responses over superior temporal regions of individual participants. This response is virtually absent for sequences using frequency-scrambled sounds, but is clearly observed when voices are inserted among sounds from musical instruments matched in pitch and harmonicity-to-noise ratio. Overall, our Fast Periodic Auditory Stimulation paradigm demonstrates high-level categorization of human voices, and could be a powerful and versatile tool to understand human auditory categorization in general.
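As an illustration of the frequency-tagging logic behind Fast Periodic Auditory Stimulation (a minimal sketch, not the authors' analysis code; the sampling rate, bin counts and data are assumed), the voice-selective response can be quantified as the spectral amplitude at the 1.33 Hz presentation frequency relative to neighbouring frequency bins:

```python
# Illustrative frequency-tagging readout: amplitude at the 1.33 Hz voice-
# presentation frequency divided by the mean of neighbouring frequency bins
# gives a signal-to-noise ratio for the voice-selective response.
import numpy as np

def tag_snr(eeg, sfreq, tag_freq=1.333, n_neighbors=10, skip=1):
    spectrum = np.abs(np.fft.rfft(eeg)) / len(eeg)
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / sfreq)
    idx = np.argmin(np.abs(freqs - tag_freq))          # bin closest to 1.33 Hz
    neighbors = np.r_[idx - skip - n_neighbors: idx - skip,
                      idx + skip + 1: idx + skip + 1 + n_neighbors]
    return spectrum[idx] / spectrum[neighbors].mean()

# Hypothetical 60-s single-channel recording at 512 Hz with a 1.33 Hz component
sfreq = 512
t = np.arange(0, 60, 1.0 / sfreq)
eeg = 0.3 * np.sin(2 * np.pi * 1.333 * t) + np.random.randn(t.size)
print(tag_snr(eeg, sfreq))
```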
An eye-tracking exploration of the alcohol-related attentional bias in severe alcohol use disorder: Influence of subjective craving and cognitive load
Introduction: Dual-process models consider that the attentional bias towards alcohol-related stimuli plays a key role in the persistence of alcohol use disorder. They postulate that this bias is stable and reflects the over-activation of the reflexive/impulsive system, independently of the activity of the reflective/control system. Our study aims to test these assumptions by investigating (1) the influence that the activity of the reflective system might have on the alcohol-related attentional bias, and (2) the stability of the bias as subjective craving varies. Method: We recruited 60 patients with severe alcohol use disorder (30 with craving for alcohol at testing time, and 30 without craving) and 30 matched healthy controls. Participants performed a free-viewing task with images of alcoholic and non-alcoholic beverages. The task was then combined with an auditory selective attention task with two levels of difficulty, mobilizing cognitive resources and thus taxing the reflective system. Eye movements were recorded using an eye-tracker. Results: Our results show an attentional bias towards alcohol-related stimuli in patients with craving and, conversely, an avoidance bias for alcohol in patients without craving. Healthy controls did not show any bias towards alcohol. The bias remained stable regardless of the difficulty level of the concurrent cognitive task. Discussion: Attentional bias does not appear to be influenced by cognitive load, confirming the independence of the reflexive and reflective systems. Nevertheless, the direction of the bias appears to be strongly influenced by patients' subjective craving, calling into question the stability of the bias predicted by dual-process models.
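As a rough illustration of how a gaze-based attentional bias can be quantified (an assumed dwell-time metric shown for illustration only, not necessarily the index used in this study), one can compare total fixation time on alcohol versus non-alcohol areas of interest:

```python
# Sketch of a dwell-time attentional-bias score (assumed metric): positive
# values indicate longer gaze on alcohol-related images, negative values an
# avoidance bias, and zero indicates no bias.
import numpy as np

def dwell_time_bias(fixation_durations, aoi_labels):
    """Bias = (dwell on alcohol - dwell on non-alcohol) / total dwell time."""
    durations = np.asarray(fixation_durations, dtype=float)
    labels = np.asarray(aoi_labels)
    alcohol = durations[labels == "alcohol"].sum()
    control = durations[labels == "non_alcohol"].sum()
    return (alcohol - control) / (alcohol + control)

# Hypothetical trial: fixation durations in ms and the area of interest fixated
print(dwell_time_bias([220, 180, 400, 150],
                      ["alcohol", "non_alcohol", "alcohol", "non_alcohol"]))
```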
Multi-level reorganization in the temporal dynamics of sound processing in early blind people
Early blindness triggers reorganization in the brain networks that code for sound processing. However, how visual deprivation impacts the temporal dynamics of different stages of auditory discrimination (from acoustic to categorical coding) remains mostly unexplored. We characterized the time course of brain activity using electroencephalography (EEG) while congenitally blind (CB) and sighted control (SC) individuals listened to 1-second-long sounds belonging to eight categories. Multivariate decoding analyses revealed enhanced sound decoding in the brain of the CB compared to the SC from 167 to 1007 ms after sound onset. Furthermore, the classifier weights, transformed and projected onto the sensors, were enhanced in the CB, with the topography evolving along a frontal-to-posterior axis as the sound unfolded in time. To investigate which formats of sound processing were enhanced in the CB, we used representational similarity analysis (RSA) with three classes of models of sound representation: (i) the Modulation Transfer Function (MTF), simulating early stages of acoustic processing; (ii) the layers of a deep neural network (YAMNet); and (iii) a high-level model based on the categorical membership of the sounds and participant-specific similarity ratings of each pair of sounds. The MTF emerged early in the EEG of each group, peaking at around 200 ms; however, no differences were found between the two populations. In contrast, the correlations with the DNN layers differed between groups at around 200 ms and after stimulus offset, specifically in a few layers representing intermediate acoustic processing. Categorical representation of sounds emerged at around 250 ms in both populations, but the CB showed an enhanced categorical representation peaking at 550 ms. Furthermore, regression analysis indicated that the DNN layers uniquely explained the most variance in the EEG of both populations early in time, while the unique contribution of the categorical models peaked at 550 ms. The DNN-layer representations in the brain were mediated by enhanced theta oscillations in the CB. Altogether, these results suggest that early blindness triggers a multi-level reorganization of the brain networks coding for sounds, with enhanced intermediate-level acoustic discrimination earlier in time, followed by increased categorical coding later.
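The representational similarity analysis described above can be sketched as follows (a minimal illustration under assumed data shapes; the model RDM shown stands in for the categorical model, and the MTF and DNN-layer model RDMs would be plugged in the same way): a time-resolved neural dissimilarity matrix is rank-correlated with each model dissimilarity matrix.

```python
# Minimal RSA sketch (illustrative only): for each time point, build a neural
# representational dissimilarity matrix (RDM) from sensor patterns and rank-
# correlate its upper triangle with a model RDM.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

def neural_rdm(patterns):
    """patterns: (n_sounds, n_sensors) EEG patterns at one time point."""
    return squareform(pdist(patterns, metric="correlation"))

def rsa_timecourse(eeg_patterns, model_rdm):
    """eeg_patterns: (n_times, n_sounds, n_sensors). Returns rho per time point."""
    triu = np.triu_indices(model_rdm.shape[0], k=1)
    return np.array([
        spearmanr(neural_rdm(p)[triu], model_rdm[triu]).correlation
        for p in eeg_patterns
    ])

# Hypothetical data: 50 time points, 16 sounds (8 categories x 2 exemplars), 64 sensors
rng = np.random.default_rng(1)
eeg = rng.standard_normal((50, 16, 64))
categories = np.repeat(np.arange(8), 2)
categorical_rdm = (categories[:, None] != categories[None, :]).astype(float)
print(rsa_timecourse(eeg, categorical_rdm)[:5])
```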