
    The Potential of Frequency-Specific Neuromodulation for Investigating and Improving Speech Processing in the Brain

    The acoustic speech signal cannot be unambiguously segmented into individual words and instead resembles a continuous stream of sounds. Nevertheless, it contains temporal regularities that may be exploited during perception. The duration of speech elements such as phonemes, syllables, and prosodic phrases is reflected in brain activity during speech perception. It remains unclear, however, whether there is in fact a causal relationship between the neural response and the quality of speech perception. This review first presents the relevant components of the acoustic speech signal, describes the neural response to speech stimuli in the brain, and outlines which neurophysiological changes accompany disorders of speech function. Various methods of non-invasive neuromodulation are introduced, together with initial results suggesting that targeted neuromodulation can improve speech functions.

    The Predictive Value of Individual Electric Field Modeling for Transcranial Alternating Current Stimulation Induced Brain Modulation

    There is considerable individual variability in the reported effectiveness of non-invasive brain stimulation. This variability has often been ascribed to differences in neuroanatomy and the resulting differences in the induced electric field inside the brain. In this study, we addressed the question of whether individual differences in the induced electric field can predict the neurophysiological and behavioral consequences of gamma-band tACS. In a within-subject experiment, bi-hemispheric gamma-band tACS and sham stimulation were applied in alternating blocks to the participants’ superior temporal lobe, while task-evoked auditory brain activity was measured with concurrent functional magnetic resonance imaging (fMRI) during a dichotic listening task. Gamma tACS was applied with different interhemispheric phase lags. In a recent study, we showed that anti-phase tACS (180° interhemispheric phase lag), but not in-phase tACS (0° interhemispheric phase lag), selectively modulates interhemispheric brain connectivity. Using a T1 structural image of each participant’s brain, an individual simulation of the induced electric field was computed. From these simulations, we derived two predictor variables: maximal strength (average of the 10,000 voxels with the largest electric field values) and precision of the electric field (spatial correlation between the electric field and the task-evoked brain activity during sham stimulation). We found considerable variability in the individual strength and precision of the electric fields. Importantly, the strength of the electric field over the right hemisphere predicted individual differences in tACS-induced brain connectivity changes. Moreover, we found in both hemispheres a statistical trend for an effect of electric field strength on tACS-induced BOLD signal changes. In contrast, the precision of the electric field did not predict any neurophysiological measure. Further, neither strength nor precision predicted interhemispheric integration. In conclusion, we found evidence for a dose-response relationship between individual differences in electric fields and tACS-induced activity and connectivity changes in concurrent fMRI. However, the fact that this relationship was stronger in the right hemisphere suggests that the relationship between electric field parameters, neurophysiology, and behavior may be more complex for bi-hemispheric tACS.
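
    As a minimal illustration (not part of the original study's pipeline), the two predictor variables described above could be derived from a voxel-wise electric field simulation roughly as follows; the array shapes, the placeholder data, and the use of a Pearson coefficient as the "spatial correlation" measure are assumptions:

        import numpy as np

        def efield_strength(efield, n_voxels=10_000):
            """Maximal strength: mean of the n_voxels voxels with the largest
            simulated electric field magnitude."""
            flat = np.sort(efield[np.isfinite(efield)].ravel())
            return flat[-n_voxels:].mean()

        def efield_precision(efield, sham_activity):
            """Precision: spatial correlation between the electric field map and
            task-evoked activity measured during sham stimulation."""
            mask = np.isfinite(efield) & np.isfinite(sham_activity)
            return np.corrcoef(efield[mask], sham_activity[mask])[0, 1]

        # Hypothetical inputs: field magnitude (V/m) and a sham-run activation map
        # (e.g. t-values), both resampled to the same voxel grid.
        rng = np.random.default_rng(0)
        efield = rng.gamma(2.0, 0.05, size=(64, 64, 48))
        sham_activity = rng.normal(size=(64, 64, 48))
        print(efield_strength(efield), efield_precision(efield, sham_activity))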

    Hemispheric specializations affect interhemispheric speech sound integration during duplex perception

    The present study investigated whether speech-related spectral information benefits from initially predominant right- or left-hemisphere processing. Normal-hearing individuals categorized speech sounds composed of an ambiguous base (perceptually intermediate between /ga/ and /da/) presented to one ear and a disambiguating low or high F3 chirp presented to the other ear. Response times were shorter when the chirp was presented to the left ear than to the right ear (inducing initially right-hemisphere chirp processing), but there were no between-ear differences in the overall strength of integration. The results are in line with the assumption of a right-hemispheric dominance for spectral processing.

    The modulation of reading strategies by language opacity in early bilinguals: an eye movement study

    Converging evidence from eye movement experiments indicates that linguistic contexts influence reading strategies. However, the question of whether different linguistic contexts modulate eye movements during reading in the same bilingual individuals remains unresolved. We examined reading strategies in a transparent (German) and an opaque (French) language in early, highly proficient French–German bilinguals: participants read aloud isolated French and German words and pseudo-words while the First Fixation Location (FFL), its duration, and its latency were measured. Since transparent linguistic contexts and pseudo-words would favour a direct grapheme/phoneme conversion, the reading strategy should be more local for German than for French words (FFL closer to the word beginning), and no difference in pseudo-word FFL is expected between contexts. Our results confirm these hypotheses, providing the first evidence that the same individuals engage different reading strategies depending on language opacity and suggesting that a given brain process can be modulated by a given context.
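
    As a minimal illustration (an assumption about the analysis, not taken from the study), a First Fixation Location can be expressed relative to word length from the fixation's horizontal coordinate and the word's on-screen extent; the pixel geometry below is hypothetical:

        def first_fixation_location(fix_x_px, word_left_px, word_right_px):
            """Normalise the first fixation's x coordinate to a 0-1 position within
            the word (0 = word beginning, 1 = word end)."""
            pos = (fix_x_px - word_left_px) / (word_right_px - word_left_px)
            return min(max(pos, 0.0), 1.0)  # clamp fixations landing just outside

        # Hypothetical example: a word spanning pixels 400-560, first fixation at x = 430.
        print(first_fixation_location(430, 400, 560))  # ~0.19, i.e. close to the word onset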

    Perception of co-speech gestures in aphasic patients: A visual exploration study during the observation of dyadic conversations

    Co-speech gestures are part of nonverbal communication during conversations. They either support the verbal message or provide the interlocutor with additional information; furthermore, as nonverbal cues they prompt the cooperative process of turn taking. In the present study, we investigated the influence of co-speech gestures on the perception of dyadic dialogue in aphasic patients. In particular, we analysed the impact of co-speech gestures on gaze direction (towards speaker or listener) and on the fixation of body parts. We hypothesized that aphasic patients, who are restricted in verbal comprehension, adapt their visual exploration strategies. Methods: Sixteen aphasic patients and 23 healthy control subjects participated in the study. Visual exploration behaviour was measured by means of a contact-free infrared eye-tracker while subjects were watching videos depicting spontaneous dialogues between two individuals. Cumulative fixation duration and mean fixation duration were calculated for the factors co-speech gesture (present and absent), gaze direction (towards the speaker or towards the listener), and region of interest (ROI), including hands, face, and body. Results: Both aphasic patients and healthy controls mainly fixated the speaker's face. We found a significant co-speech gesture × ROI interaction, indicating that the presence of a co-speech gesture encouraged subjects to look at the speaker. Further, there was a significant gaze direction × ROI × group interaction, revealing that aphasic patients showed reduced cumulative fixation duration on the speaker's face compared to healthy controls. Conclusion: Co-speech gestures guide the observer's attention towards the speaker, the source of semantic input. We discuss whether an underlying semantic processing deficit or a deficit in integrating audio-visual information may cause aphasic patients to explore the speaker's face less.
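
    As a minimal illustration (the authors' actual analysis software is not specified in the abstract), cumulative and mean fixation durations per region of interest could be aggregated from an eye-tracker's fixation list as follows; the ROI labels and the input format are assumptions:

        from collections import defaultdict

        def fixation_durations_by_roi(fixations):
            """Aggregate fixation durations (ms) per ROI.

            `fixations` is an iterable of (roi, duration_ms) pairs, e.g. each fixation
            already mapped onto one of the ROIs 'face', 'hands', or 'body'.
            Returns {roi: (cumulative_ms, mean_ms, n_fixations)}.
            """
            buckets = defaultdict(list)
            for roi, duration in fixations:
                buckets[roi].append(duration)
            return {roi: (sum(d), sum(d) / len(d), len(d)) for roi, d in buckets.items()}

        # Hypothetical fixation list for one video segment.
        segment = [("face", 310), ("face", 220), ("hands", 180), ("body", 90)]
        print(fixation_durations_by_roi(segment))
        # {'face': (530, 265.0, 2), 'hands': (180, 180.0, 1), 'body': (90, 90.0, 1)}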

    Multimodal communication in aphasia: perception and production of co-speech gestures during face-to-face conversation

    The role of nonverbal communication in patients with post-stroke language impairment (aphasia) is not yet fully understood. This study investigated how aphasic patients perceive and produce co-speech gestures during face-to-face interaction, and whether distinct brain lesions would predict the frequency of spontaneous co-speech gesturing. For this purpose, we recorded samples of conversations in patients with aphasia and healthy participants. Gesture perception was assessed by means of a head-mounted eye-tracking system, and the produced co-speech gestures were coded according to a linguistic classification system. The main results are that meaning-laden gestures (e.g., iconic gestures representing object shapes) are more likely to attract visual attention than meaningless hand movements, and that patients with aphasia are more likely to fixate co-speech gestures overall than healthy participants. This implies that patients with aphasia may benefit from the multimodal information provided by co-speech gestures. On the level of co-speech gesture production, we found that patients with damage to the anterior part of the arcuate fasciculus showed a higher frequency of meaning-laden gestures. This area lies in close vicinity to the premotor cortex and is considered to be important for speech production. This may suggest that the use of meaning-laden gestures depends on the integrity of patients’ speech production abilities.

    Comprehension of co-speech gestures in aphasic patients: an eye movement study

    Co-speech gestures are omnipresent and, by facilitating language comprehension, a crucial element of human interaction. However, it is unclear whether gestures also support language comprehension in aphasic patients. Using visual exploration behavior analysis, the present study aimed to investigate the influence of congruence between speech and co-speech gestures on comprehension, measured as accuracy in a decision task. Method: Twenty aphasic patients and 30 healthy controls watched videos in which speech was combined with meaningless (baseline condition), congruent, or incongruent gestures. Comprehension was assessed with a decision task, while remote eye-tracking allowed analysis of visual exploration. Results: In aphasic patients, the incongruent condition resulted in a significant decrease in accuracy, while the congruent condition led to a significant increase in accuracy compared to baseline. In the control group, the incongruent condition resulted in a decrease in accuracy, while the congruent condition did not significantly increase accuracy. Visual exploration analysis showed that patients fixated significantly less on the face and tended to fixate more on the gesturing hands compared to controls. Conclusion: Co-speech gestures play an important role for aphasic patients, as they modulate comprehension. Incongruent gestures evoke significant interference and deteriorate patients’ comprehension. In contrast, congruent gestures enhance comprehension in aphasic patients, which might be valuable for clinical and therapeutic purposes.

    Bilateral Gamma/Delta Transcranial Alternating Current Stimulation Affects Interhemispheric Speech Sound Integration

    Perceiving speech requires the integration of different speech cues, that is, formants. When the speech signal is split so that different cues are presented to the right and left ear (dichotic listening), comprehension requires the integration of binaural information. Based on prior electrophysiological evidence, we hypothesized that the integration of dichotically presented speech cues is enabled by interhemispheric phase synchronization between primary and secondary auditory cortex in the gamma frequency band. We tested this hypothesis by applying transcranial alternating current stimulation (TACS) bilaterally above the superior temporal lobe to induce or disrupt interhemispheric gamma-phase coupling. In contrast to initial predictions, we found that gamma TACS applied in-phase above the two hemispheres (interhemispheric lag 0°) perturbs interhemispheric integration of speech cues, possibly because the applied stimulation perturbs an inherent phase lag between the left and right auditory cortex. We also observed this disruptive effect when applying antiphasic delta TACS (interhemispheric lag 180°). We conclude that interhemispheric phase coupling plays a functional role in interhemispheric speech integration. The direction of this effect may depend on the stimulation frequency.
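
    As a minimal illustration of the stimulation logic (the carrier frequencies of 40 Hz for gamma and 2 Hz for delta, the amplitude, and the sampling rate are assumptions, not values reported in the abstract), bilateral waveforms with a 0° (in-phase) or 180° (anti-phase) interhemispheric lag could be generated as follows:

        import numpy as np

        def bilateral_tacs(freq_hz, phase_lag_deg, amplitude_ma=1.0,
                           duration_s=1.0, fs=10_000):
            """Return (t, left, right): sinusoidal stimulation waveforms for the two
            hemispheres, the right one shifted by the interhemispheric phase lag."""
            t = np.arange(0, duration_s, 1 / fs)
            left = amplitude_ma * np.sin(2 * np.pi * freq_hz * t)
            right = amplitude_ma * np.sin(2 * np.pi * freq_hz * t + np.deg2rad(phase_lag_deg))
            return t, left, right

        # In-phase gamma (0° lag) vs. anti-phase delta (180° lag), assumed frequencies.
        t, left_gamma, right_gamma = bilateral_tacs(40, 0)
        t, left_delta, right_delta = bilateral_tacs(2, 180)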