1,646 research outputs found

    Visual cortex responses reflect temporal structure of continuous quasi-rhythmic sensory stimulation

    Neural processing of dynamic continuous visual input, and cognitive influences thereon, are frequently studied in paradigms employing strictly rhythmic stimulation. However, the temporal structure of natural stimuli is hardly ever fully rhythmic but possesses certain spectral bandwidths (e.g. lip movements in speech, gestures). Examining periodic brain responses elicited by strictly rhythmic stimulation might thus represent ideal, yet isolated cases. Here, we tested how the visual system reflects quasi-rhythmic stimulation with frequencies continuously varying within ranges of classical theta (4–7 Hz), alpha (8–13 Hz) and beta (14–20 Hz) bands using EEG. Our findings substantiate a systematic and sustained neural phase-locking to stimulation in all three frequency ranges. Further, we found that allocation of spatial attention enhances EEG-stimulus locking to theta- and alpha-band stimulation. Our results bridge recent findings regarding phase locking (“entrainment”) to quasi-rhythmic visual input and “frequency-tagging” experiments employing strictly rhythmic stimulation. We propose that sustained EEG-stimulus locking can be considered a continuous neural signature of processing dynamic sensory input in early visual cortices. Accordingly, EEG-stimulus locking serves to trace the temporal evolution of rhythmic as well as quasi-rhythmic visual input and is subject to attentional bias.
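Sustained EEG-stimulus locking of this kind is often quantified with a phase-locking value (PLV) between the instantaneous phases of the recorded signal and the driving stimulus. The sketch below is a minimal, NumPy-only illustration of that measure, not the authors' analysis pipeline; the signal parameters and the FFT-based Hilbert transform are illustrative assumptions.

```python
import numpy as np

def analytic(x):
    """FFT-based analytic signal (equivalent to scipy.signal.hilbert for even N)."""
    X = np.fft.fft(x)
    h = np.zeros(x.size)
    h[0] = h[x.size // 2] = 1
    h[1:x.size // 2] = 2
    return np.fft.ifft(X * h)

def plv(a, b):
    """Phase-locking value (0..1) between two equally sampled 1-D signals."""
    dphi = np.angle(analytic(a)) - np.angle(analytic(b))
    return np.abs(np.mean(np.exp(1j * dphi)))

fs = 250
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
stim = np.sin(2 * np.pi * 5 * t)                                    # 5 Hz drive (theta range)
eeg = np.sin(2 * np.pi * 5 * t + 0.8) + 0.5 * rng.standard_normal(t.size)  # lagged, noisy response
print(plv(eeg, stim) > plv(rng.standard_normal(t.size), stim))     # locked >> unlocked
```

A fixed phase lag (here 0.8 rad) does not reduce the PLV; only trial-to-trial phase jitter does, which is what makes the measure suitable for tracking quasi-rhythmic input.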

    A new unifying account of the roles of neuronal entrainment

    Rhythms are a fundamental and defining feature of neuronal activity in animals including humans. This rhythmic brain activity interacts in complex ways with rhythms in the internal and external environment through the phenomenon of ‘neuronal entrainment’, which is attracting increasing attention due to its suggested role in a multitude of sensory and cognitive processes. Some senses, such as touch and vision, sample the environment rhythmically, while others, like audition, are faced with mostly rhythmic inputs. Entrainment couples rhythmic brain activity to external and internal rhythmic events, serving fine-grained routing and modulation of external and internal signals across multiple spatial and temporal hierarchies. This interaction between a brain and its environment can be experimentally investigated and even modified by rhythmic sensory stimuli or invasive and non-invasive neuromodulation techniques. We provide a comprehensive overview of the topic and propose a theoretical framework of how neuronal entrainment dynamically structures information from incoming neuronal, bodily and environmental sources. We discuss the different types of neuronal entrainment, the conceptual advances in the field, and converging evidence for general principles.

    Rhythmically modulating neural entrainment during exposure to regularities influences statistical learning

    The ability to discover regularities in the environment, such as syllable patterns in speech, is known as statistical learning. Previous studies have shown that statistical learning is accompanied by neural entrainment, in which neural activity temporally aligns with repeating patterns over time. However, it is unclear whether these rhythmic neural dynamics play a functional role in statistical learning, or whether they largely reflect the downstream consequences of learning, such as the enhanced perception of learned words in speech. To better understand this issue, we manipulated participants’ neural entrainment during statistical learning using continuous rhythmic visual stimulation. Participants were exposed to a speech stream of repeating nonsense words while viewing either (1) a visual stimulus with a “congruent” rhythm that aligned with the word structure, (2) a visual stimulus with an incongruent rhythm, or (3) a static visual stimulus. Statistical learning was subsequently measured using both an explicit and implicit test. Participants in the congruent condition showed a significant increase in neural entrainment over auditory regions at the relevant word frequency, over and above effects of passive volume conduction, indicating that visual stimulation successfully altered neural entrainment within relevant neural substrates. Critically, during the subsequent implicit test, participants in the congruent condition showed an enhanced ability to predict upcoming syllables and stronger neural phase synchronization to component words, suggesting that they had gained greater sensitivity to the statistical structure of the speech stream relative to the incongruent and static groups. This learning benefit could not be attributed to strategic processes, as participants were largely unaware of the contingencies between the visual stimulation and embedded words. These results indicate that manipulating neural entrainment during exposure to regularities influences statistical learning outcomes, suggesting that neural entrainment may functionally contribute to statistical learning. Our findings encourage future studies using non-invasive brain stimulation methods to further understand the role of entrainment in statistical learning.
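Entrainment increases “at the relevant word frequency” are typically assessed with a frequency-tagging measure such as inter-trial phase coherence (ITC) at that frequency. The following is a self-contained sketch with simulated trials; the 1 Hz word rate and all signal parameters are invented for illustration and do not come from the study.

```python
import numpy as np

def itc(trials, fs, freq):
    """Inter-trial coherence at one frequency, via the FFT bin nearest `freq`.
    trials: (n_trials, n_samples) array, all trials time-locked to stream onset."""
    spec = np.fft.rfft(trials, axis=1)
    bin_ = int(round(freq * trials.shape[1] / fs))
    phases = np.angle(spec[:, bin_])
    return np.abs(np.mean(np.exp(1j * phases)))

fs, dur, n_trials = 100, 4.0, 30
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(2)
word_rate = 1.0  # hypothetical word frequency of the repeating nonsense words
# Trials with a phase-consistent 1 Hz component buried in noise
locked = np.array([np.sin(2 * np.pi * word_rate * t + 0.3) + rng.standard_normal(t.size)
                   for _ in range(n_trials)])
noise = rng.standard_normal(locked.shape)          # no consistent word-rate component
print(itc(locked, fs, word_rate) > itc(noise, fs, word_rate))
```

High ITC at the word rate indicates that the phase of the response repeats consistently across exposures, which is the signature interpreted as entrainment to the word structure.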

    On natural attunement: Shared rhythms between the brain and the environment

    Rhythms exist both in the embodied brain and the built environment. Becoming attuned to the rhythms of the environment, such as repetitive columns, can greatly affect perception. Here, we explore how the built environment affects human cognition and behavior through the concept of natural attunement, often resulting from the coordination of a person's sensory and motor systems with the rhythmic elements of the environment. We argue that the built environment should not be reduced to mere states, representations, and single variables but instead be considered a bundle of highly related continuous signals with which we can resonate. Resonance and entrainment are dynamic processes observed when intrinsic frequencies of the oscillatory brain are influenced by the oscillations of an external signal. This allows visual rhythmic stimulations of the environment to affect the brain and body through neural entrainment, cross-frequency coupling, and phase resetting. We review how real-world architectural settings can affect neural dynamics, cognitive processes, and behavior in people, suggesting the crucial role of everyday rhythms in the brain-body-environment relationship.

    Spatial attention enhances cortical tracking of quasi-rhythmic visual stimuli

    Successfully interpreting and navigating our natural visual environment requires us to track its dynamics constantly. Additionally, we focus our attention on behaviorally relevant stimuli to enhance their neural processing. Little is known, however, about how sustained attention affects the ongoing tracking of stimuli with rich natural temporal dynamics. Here, we used MRI-informed source reconstructions of magnetoencephalography (MEG) data to map to what extent various cortical areas track concurrent continuous quasi-rhythmic visual stimulation. Further, we tested how top-down visuo-spatial attention influences this tracking process. Our bilaterally presented quasi-rhythmic stimuli covered a dynamic range of 4–20 Hz, subdivided into three distinct bands. As an experimental control, we also included strictly rhythmic stimulation (10 vs 12 Hz). Using a spectral measure of brain-stimulus coupling, we were able to track the neural processing of left vs. right stimuli independently, even when both fluctuated within the same frequency range. The fidelity of neural tracking depended on the stimulation frequencies, decreasing for higher frequency bands. Both attended and non-attended stimuli were tracked beyond early visual cortices, in ventral and dorsal streams depending on the stimulus frequency. In general, tracking improved with the deployment of visuo-spatial attention to the stimulus location. Our results provide new insights into how human visual cortices process concurrent dynamic stimuli and provide a potential mechanism – namely increasing the temporal precision of tracking – for boosting the neural representation of attended input.

    Perceptually relevant speech tracking in auditory and motor cortex reflects distinct linguistic features

    During online speech processing, our brain tracks the acoustic fluctuations in speech at different timescales. Previous research has focused on generic timescales (for example, delta or theta bands) that are assumed to map onto linguistic features such as prosody or syllables. However, given the high intersubject variability in speaking patterns, such a generic association between the timescales of brain activity and speech properties can be ambiguous. Here, we analyse speech tracking in source-localised magnetoencephalographic data by directly focusing on timescales extracted from statistical regularities in our speech material. This revealed widespread significant tracking at the timescales of phrases (0.6–1.3 Hz), words (1.8–3 Hz), syllables (2.8–4.8 Hz), and phonemes (8–12.4 Hz). Importantly, when examining its perceptual relevance, we found stronger tracking for correctly comprehended trials in the left premotor (PM) cortex at the phrasal scale as well as in left middle temporal cortex at the word scale. Control analyses using generic bands confirmed that these effects were specific to the speech regularities in our stimuli. Furthermore, we found that the phase at the phrasal timescale coupled to power at beta frequency (13–30 Hz) in motor areas. This cross-frequency coupling presumably reflects top-down temporal prediction in ongoing speech perception. Together, our results reveal specific functional and perceptually relevant roles of distinct tracking and cross-frequency processes along the auditory–motor pathway.
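Cross-frequency coupling of the kind described (phrasal phase modulating beta power) is commonly quantified with a phase-amplitude coupling index such as the mean vector length. The NumPy-only sketch below uses simulated signals; the rates, amplitudes, and helper names are illustrative assumptions, not the study's data or code.

```python
import numpy as np

def analytic(x):
    """FFT-based analytic signal (Hilbert transform) for even-length input."""
    X = np.fft.fft(x)
    h = np.zeros(x.size)
    h[0] = h[x.size // 2] = 1
    h[1:x.size // 2] = 2
    return np.fft.ifft(X * h)

def mvl(phase_sig, amp_sig):
    """Mean vector length: couples the phase of a slow signal to the
    amplitude envelope of a fast one; larger values mean stronger coupling."""
    phase = np.angle(analytic(phase_sig))
    amp = np.abs(analytic(amp_sig))
    return np.abs(np.mean(amp * np.exp(1j * phase)))

fs = 200
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(1)
slow = np.sin(2 * np.pi * 1.0 * t)               # ~1 Hz "phrasal" rhythm (illustrative)
beta = (1 + slow) * np.sin(2 * np.pi * 20 * t)   # 20 Hz carrier, power tied to slow phase
coupled = mvl(slow, beta + 0.1 * rng.standard_normal(t.size))
uncoupled = mvl(slow, np.sin(2 * np.pi * 20 * t))  # constant-amplitude beta
print(coupled > uncoupled)
```

In real MEG analyses the two inputs would be band-pass filtered source time courses (phrasal band and 13–30 Hz), and the index would be tested against surrogate data, but the core computation has this shape.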

    From locomotion to dance and back: exploring rhythmic sensorimotor synchronization

    Rhythms are central to human behaviours ranging from locomotion to music performance. In dance, self-sustaining and dynamically adapting neural oscillations entrain to the regular auditory input that is the musical beat. This entrainment leads to anticipation of forthcoming sensory events, which in turn allows synchronization of movements to the perceived environment. This dissertation develops novel technical approaches to investigate neural rhythms that are not strictly periodic, such as those underlying naturally tempo-varying locomotion and the perception of musical rhythms. It studies neural responses that reflect the discordance between what the nervous system anticipates and the actual timing of events, responses that are critical for synchronizing movements to a changing environment. It also shows how the neural activity elicited by a complex musical rhythm is reinforced by movements synchronized to it. Finally, it investigates these neural rhythms in patients with gait or consciousness disorders.

    Lower Beta: A Central Coordinator of Temporal Prediction in Multimodal Speech

    How the brain decomposes and integrates information in multimodal speech perception is linked to oscillatory dynamics. However, how speech takes advantage of redundancy between different sensory modalities, and how this translates into specific oscillatory patterns, remains unclear. We address the role of lower beta activity (~20 Hz), generally associated with motor functions, as an amodal central coordinator that receives bottom-up delta-theta copies from specific sensory areas and generates top-down temporal predictions for auditory entrainment. Dissociating temporal prediction from entrainment may explain how and why visual input benefits speech processing rather than adding cognitive load in multimodal speech perception. On the one hand, body movements convey prosodic and syllabic features at delta and theta rates (i.e., 1–3 Hz and 4–7 Hz). On the other hand, the natural precedence of visual input before auditory onsets may prepare the brain to anticipate and facilitate the integration of auditory delta-theta copies of the prosodic-syllabic structure. Here, we identify three fundamental criteria based on recent evidence and hypotheses, which support the notion that lower motor beta frequency may play a central and generic role in temporal prediction during speech perception. First, beta activity must respond to rhythmic stimulation across modalities. Second, beta power must respond to biological motion and speech-related movements conveying temporal information in multimodal speech processing. Third, temporal prediction may recruit a communication loop between motor and primary auditory cortices (PACs) via delta-to-beta cross-frequency coupling. We discuss evidence related to each criterion and extend these concepts to a beta-motivated framework of multimodal speech processing.

    Evoked responses to rhythmic visual stimulation vary across sources of intrinsic alpha activity in humans

    Rhythmic flickering visual stimulation produces steady-state visually evoked potentials (SSVEPs) in electroencephalogram (EEG) recordings. Based on electrode-level analyses, two dichotomous models of the underpinning mechanisms leading to SSVEP generation have been proposed: entrainment or superposition, i.e., phase-alignment or independence of endogenous brain oscillations from flicker-induced oscillations, respectively. Electrode-level analyses, however, represent an averaged view of underlying ‘source-level’ activity, at which variability in SSVEPs may lie, possibly suggesting the co-existence of multiple mechanisms. To probe this idea, we investigated the variability of SSVEPs derived from the sources underpinning scalp EEG responses during presentation of a flickering radial checkerboard. Flicker was presented between 6 and 12 Hz in 1 Hz steps, and at individual alpha frequency (IAF; i.e., the dominant frequency of endogenous alpha oscillatory activity). We tested whether sources of endogenous alpha activity could be dissociated according to evoked responses to different flicker frequencies relative to IAF. Occipitoparietal sources were identified by temporal independent component analysis, maximal resting-state alpha power at IAF, and source localisation. The pattern of SSVEPs to rhythmic flicker relative to IAF was estimated by correlation coefficients, describing the correlation between the peak-to-peak amplitude of the SSVEP and the absolute distance of the flicker frequency from IAF across flicker conditions. We observed extreme variability in correlation coefficients across sources, ranging from −0.84 to 0.93, with sources showing largely different coefficients co-existing within subjects. This result demonstrates variation in evoked responses to flicker across sources of endogenous alpha oscillatory activity. Data support the idea of multiple SSVEP mechanisms.
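The per-source correlation described above can be reproduced in outline: correlate a source's SSVEP peak-to-peak amplitudes across flicker conditions with the absolute distance of each flicker frequency from IAF. All numbers below are invented to illustrate the computation for a single hypothetical source whose response grows near IAF; they are not the study's data.

```python
import numpy as np

iaf = 10.4  # hypothetical individual alpha frequency (Hz)

# Flicker conditions: 6-12 Hz in 1 Hz steps, plus an IAF condition
flicker = np.array([6, 7, 8, 9, 10, 11, 12, iaf], dtype=float)

# Hypothetical peak-to-peak SSVEP amplitudes (a.u.) for one source,
# peaking when the flicker frequency approaches IAF
amp = np.array([0.8, 1.0, 1.4, 1.9, 2.6, 2.3, 1.5, 2.7])

dist = np.abs(flicker - iaf)            # absolute distance from IAF per condition
r = np.corrcoef(amp, dist)[0, 1]        # one coefficient per source
print(r < 0)                             # amplitude rises as flicker nears IAF
```

A strongly negative coefficient like this one would suggest an entrainment-like source; a source with a coefficient near zero or positive, as the reported −0.84 to 0.93 range implies exists, would instead point toward superposition.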
