    High-frequency neural oscillations and visual processing deficits in schizophrenia

    Visual information is fundamental to how we understand our environment, make predictions, and interact with others. Recent research has underscored the importance of visuo-perceptual dysfunctions for cognitive deficits and pathophysiological processes in schizophrenia. In the current paper, we review evidence for the relevance of high-frequency (beta/gamma) oscillations to visuo-perceptual dysfunctions in schizophrenia. In the first part of the paper, we examine the relationship between beta/gamma-band oscillations and visual processing during normal brain functioning. We then summarize EEG/MEG studies which demonstrate reduced amplitude and synchrony of high-frequency activity during visual stimulation in schizophrenia. In the final part of the paper, we identify neurobiological correlates and offer perspectives for future research to stimulate further inquiry into the role of high-frequency oscillations in visual processing impairments in the disorder.
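
    The reduced amplitude and synchrony referred to above are typically quantified from band-passed, epoched EEG/MEG data. The following is a minimal sketch of such a read-out (gamma-band amplitude and inter-trial phase coherence) on surrogate data; it is not the pipeline of any reviewed study, and the sampling rate, band limits, and array shapes are assumptions.

```python
# Minimal sketch: gamma-band amplitude and inter-trial phase coherence (ITC)
# from epoched EEG/MEG data.  `epochs`, `fs`, and the 30-80 Hz band are
# illustrative placeholders, not values from the reviewed studies.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0                                  # sampling rate (Hz), assumed
rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, 1000))    # 40 trials x 2 s from one sensor (surrogate)

# Band-pass the epochs in the gamma range.
b, a = butter(4, [30, 80], btype="bandpass", fs=fs)
gamma = filtfilt(b, a, epochs, axis=1)

# The analytic signal gives instantaneous amplitude and phase per trial.
analytic = hilbert(gamma, axis=1)
mean_amplitude = np.abs(analytic).mean(axis=0)      # gamma amplitude envelope
phase = np.angle(analytic)

# Inter-trial phase coherence: length of the mean phase vector across trials
# (1 = perfectly synchronous across trials, 0 = random phases).
itc = np.abs(np.exp(1j * phase).mean(axis=0))
print(mean_amplitude.shape, itc.max())
```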

    Computational models of auditory perception from feature extraction to stream segregation and behavior

    Audition is by nature dynamic, from brainstem processing on sub-millisecond time scales, to segregating and tracking sound sources with changing features, to the pleasure of listening to music and the satisfaction of getting the beat. We review recent advances from computational models of sound localization, of auditory stream segregation and of beat perception/generation. A wealth of behavioral, electrophysiological and imaging studies shed light on these processes, typically with synthesized sounds having regular temporal structure. Computational models integrate knowledge from different experimental fields and at different levels of description. We advocate a neuromechanistic modeling approach that incorporates knowledge of the auditory system from various fields, that utilizes plausible neural mechanisms, and that bridges our understanding across disciplines.
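
    As one concrete illustration of the kind of neuromechanistic modeling the review advocates, and not any specific published model, the sketch below entrains a single phase oscillator to a periodic stimulus via Adler-type dynamics; every parameter is assumed for illustration.

```python
# Illustrative sketch, not any specific published model: a single phase
# oscillator with Adler-type dynamics entrained by a periodic stimulus, a
# minimal picture of beat perception/generation.  All parameters are assumed.
import numpy as np

fs = 1000                          # simulation rate (Hz)
t = np.arange(0, 20.0, 1 / fs)     # 20 s of simulated time

f_stim = 2.0                       # stimulus rate: 2 Hz (120 BPM)
f0 = 1.8                           # oscillator's intrinsic frequency (Hz)
k = 4.0                            # coupling strength (rad/s), above the locking threshold

phase = np.zeros_like(t)
for i in range(1, len(t)):
    # Intrinsic drift plus a pull toward the stimulus phase (Adler equation).
    dphi = 2 * np.pi * f0 + k * np.sin(2 * np.pi * f_stim * t[i] - phase[i - 1])
    phase[i] = phase[i - 1] + dphi / fs

# Once locked, the oscillator runs at the stimulus rate despite f0 != f_stim.
half = len(t) // 2
est_freq = (phase[-1] - phase[half]) / (2 * np.pi * (t[-1] - t[half]))
print(f"frequency over the second half: {est_freq:.2f} Hz")
```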

    Independent effects of bottom-up temporal expectancy and top-down spatial attention. An audiovisual study using rhythmic cueing

    Selective attention to a spatial location has been shown to enhance perception and facilitate behavior for events at attended locations. However, selection relies not only on where but also on when an event occurs. Recently, interest has turned to how intrinsic neural oscillations in the brain entrain to rhythms in our environment, and stimuli appearing in or out of sync with a rhythm have been shown to modulate perception and performance. Temporal expectations created by rhythms and spatial attention are two processes that have independently been shown to affect stimulus processing, but it remains largely unknown how, and if, they interact. In four separate tasks, this study investigated the effects of voluntary spatial attention and bottom-up temporal expectations created by rhythms in both unimodal and crossmodal conditions. In each task, participants used an informative cue, either color or pitch, to direct their covert spatial attention to the left or right and to respond as quickly as possible to a target. The lateralized target (visual or auditory) was then presented at the attended or unattended side. Importantly, although not task relevant, the cue was a rhythm of either flashes or beeps, and the target was presented in or out of sync (early or late) with this rhythmic cue. Results showed that participants responded faster to spatially attended than to unattended targets in all tasks. Moreover, rhythmic cueing affected response times in both unimodal and crossmodal conditions: responses were faster to targets presented in sync with the rhythm than to targets appearing too early in both crossmodal tasks. That is, rhythmic stimuli in one modality influenced temporal expectancy in the other modality, suggesting that temporal expectations created by rhythms are crossmodal. Interestingly, there was no interaction between top-down spatial attention and rhythmic cueing in any task, suggesting these two processes largely influence behavior independently.
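
    For concreteness, the following hedged sketch lays out the 2 (spatial attention) x 3 (rhythmic timing) cell structure implied by this design, with simulated reaction times standing in for the real data; the effect sizes and trial counts are made up.

```python
# Hedged sketch of the 2 x 3 cell structure implied by this design
# (attention: attended/unattended x timing: in-sync/early/late), with
# simulated reaction times; effect sizes and trial counts are made up.
import numpy as np

rng = np.random.default_rng(1)
attention_levels = ["attended", "unattended"]
timing_levels = ["in_sync", "early", "late"]

rts = {}
for att in attention_levels:
    for timing in timing_levels:
        mu = 400.0                                   # baseline mean RT (ms), assumed
        mu += 0.0 if att == "attended" else 25.0     # assumed cost of unattended side
        mu += 0.0 if timing == "in_sync" else 15.0   # assumed cost of out-of-sync targets
        rts[(att, timing)] = rng.normal(mu, 20.0, size=30)   # 30 trials per cell

# Cell means, analogous to comparing attended vs. unattended targets and
# in-sync vs. early/late targets.
for (att, timing), values in rts.items():
    print(f"{att:10s} {timing:8s} mean RT = {values.mean():6.1f} ms")
```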

    Auditory streaming and bistability paradigm extended to a dynamic environment

    We explore stream segregation with temporally modulated acoustic features using behavioral experiments and modelling. The auditory streaming paradigm, in which alternating high-frequency (A) and low-frequency (B) tones appear in a repeating ABA-pattern, has been shown to be perceptually bistable for extended presentations (on the order of minutes). For a fixed, repeating stimulus, perception spontaneously changes (switches) at random times, every 2–15 s, between an integrated interpretation with a galloping rhythm and segregated streams. Streaming in a natural auditory environment requires segregation of auditory objects with features that evolve over time. With the relatively idealized ABA-triplet paradigm, we explore perceptual switching in a non-static environment by considering slowly and periodically varying stimulus features. Our previously published model captures the dynamics of auditory bistability and predicts here how perceptual switches are entrained, tightly locked to the rising and falling phase of modulation. In psychoacoustic experiments we find that entrainment depends on both the period of modulation and the intrinsic switch characteristics of individual listeners. The extended auditory streaming paradigm with slowly modulated stimulus features presented here will be of significant interest for future imaging and neurophysiology experiments by reducing the need for subjective perceptual reports of ongoing perception. All experimental data and model code are available at https://github.com/james-rankin/auditory-streaming.
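
    To make the stimulus concrete, here is a minimal sketch of an ABA- triplet sequence whose A-B frequency separation is slowly and periodically modulated, in the spirit of the paradigm described above; the tone frequencies, durations, and modulation period are illustrative rather than the paper's values, and the authors' actual model code is in the linked repository.

```python
# Minimal sketch of an ABA- triplet sequence whose A-B frequency separation
# is slowly and periodically modulated.  Frequencies, durations, and the
# modulation period are illustrative, not the paper's values.
import numpy as np

fs = 44100                       # audio sample rate (Hz)
tone_dur = 0.125                 # each tone lasts 125 ms
f_a = 500.0                      # A-tone frequency (Hz)
mod_period = 60.0                # period of the separation modulation (s)
n_triplets = 240                 # about two minutes of stimulus

def tone(freq, dur, fs):
    t = np.arange(int(dur * fs)) / fs
    ramp = np.minimum(1.0, t / 0.01)        # 10 ms onset ramp
    env = ramp * ramp[::-1]                 # ...and matching offset ramp
    return env * np.sin(2 * np.pi * freq * t)

silence = np.zeros(int(tone_dur * fs))
segments = []
for k in range(n_triplets):
    t_now = k * 4 * tone_dur
    # Separation swings slowly between 1 and 9 semitones over mod_period.
    df_semitones = 5 + 4 * np.sin(2 * np.pi * t_now / mod_period)
    f_b = f_a * 2 ** (df_semitones / 12)
    # ABA- pattern: A, B, A, then a silent gap of one tone duration.
    segments += [tone(f_a, tone_dur, fs), tone(f_b, tone_dur, fs),
                 tone(f_a, tone_dur, fs), silence]

stimulus = np.concatenate(segments)   # write to a .wav file to listen
```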

    Music, Language, and Rhythmic Timing

    Neural, perceptual, and cognitive oscillations synchronize with rhythmic events in both speech (Luo & Poeppel, 2007) and music (Snyder & Large, 2005). This synchronization decreases perceptual thresholds to temporally predictable events (Lawrance et al., 2014), improves task performance (Ellis & Jones, 2010), and enables speech intelligibility (Peelle & Davis, 2012). Despite implications of music-language transfer effects for improving language outcomes (Gordon et al., 2015), proposals that shared neural and cognitive resources underlie music and speech rhythm perception (e.g., Tierney & Kraus, 2014) are not yet substantiated. The present research aimed to explore this potential overlap by testing whether music-induced oscillations affect metric speech tempo perception, and vice versa. In each of 432 trials, we presented a prime sequence (seven repetitions of either a metric speech utterance or an analogous musical phrase) followed by a standard-comparison pair (either two identical speech utterances or two identical musical phrases). Twenty-two participants judged whether the comparison was slower than, faster than, or the same tempo as the standard. We manipulated whether the prime was slower than, faster than, or the same tempo as the standard. Tempo discrimination accuracy was higher when the standard tempo was the same as, compared to slower or faster than, the prime tempo. These findings support the shared-resources view more than the independent-resources view, and they have implications for music-language transfer effects showing improvements in verbal memory (Chan et al., 1998), speech-in-noise perception (Strait et al., 2012), and reading ability in children and adults (Tierney & Kraus, 2013).
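
    A hedged sketch of the trial structure (three prime tempi crossed with three comparison tempi, 48 repetitions each, giving 432 trials) with simulated responses is given below; the accuracy levels are illustrative assumptions, not the study's data.

```python
# Hedged sketch of the trial structure: 3 prime tempi x 3 comparison tempi
# x 48 repetitions = 432 trials, with simulated responses.  The accuracy
# levels are illustrative assumptions, not the study's data.
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
tempo_levels = ["slower", "same", "faster"]     # relative to the standard tempo

trials = list(product(tempo_levels, tempo_levels)) * 48   # (prime, comparison) pairs

correct = {prime: [] for prime in tempo_levels}
for prime, comparison in trials:
    # Simulated judgement of the comparison vs. the standard, assuming higher
    # accuracy when the prime tempo matches the standard (the pattern above).
    p_correct = 0.85 if prime == "same" else 0.75
    correct[prime].append(rng.random() < p_correct)

for prime, responses in correct.items():
    print(f"prime {prime:6s}: accuracy = {np.mean(responses):.2f}")
```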

    Synchronous, but not entrained: Exogenous and endogenous cortical rhythms of speech and language processing

    Research into speech processing is often focused on a phenomenon termed ‘entrainment’, whereby the cortex shadows rhythmic acoustic information with oscillatory activity. Entrainment has been observed to a range of rhythms present in speech; in addition, synchronicity with abstract information (e.g., syntactic structures) has been observed. Entrainment accounts face two challenges: first, speech is not exactly rhythmic; second, synchronicity with representations that lack a clear acoustic counterpart has been described. We propose that apparent entrainment does not always result from acoustic information. Rather, internal rhythms may have functionalities in the generation of abstract representations and predictions. While acoustics may often provide punctate opportunities for entrainment, internal rhythms may also live a life of their own to infer and predict information, leading to intrinsic synchronicity, not to be counted as entrainment. This possibility may open up new research avenues in the psycho- and neurolinguistic study of language processing and language development.

    Neural entrainment to the beat in multiple frequency bands in 6-7-year-old children

    Entrainment to periodic acoustic stimuli has been found to relate to both the auditory and motor cortices, and it could be influenced by the maturity of these brain regions. However, existing research on this topic provides data about different oscillatory brain activities in different age groups with different musical backgrounds. To obtain a more coherent picture and examine early manifestations of entrainment, we assessed brain oscillations at multiple time scales (beta: 15-25 Hz, gamma: 28-48 Hz) and steady-state evoked potentials (SS-EPs) in 6-7-year-old children with no musical background, right at the start of primary school and before they had learnt to read. Our goal was to exclude the effects of music training and reading, since previous studies have shown that sensorimotor entrainment (movement synchronization to the beat) is related to musical and reading abilities. We found evidence for endogenous anticipatory processing in the gamma band related to meter perception, and for stimulus-related frequency-specific responses. However, we did not find evidence for an interaction between auditory and motor networks, which suggests that endogenous mechanisms related to auditory processing may mature earlier than those that underlie motor actions, such as sensorimotor synchronization.
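
    The SS-EP read-out mentioned above is, in essence, an amplitude measurement at the stimulation frequency in the spectrum of trial-averaged EEG. Below is a minimal sketch of that idea on surrogate data; it is not the study's exact pipeline, and the sampling rate, beat frequency, and noise-correction choices are assumptions.

```python
# Minimal sketch of an SS-EP style read-out: average epochs, take the
# amplitude spectrum, and measure noise-corrected amplitude at the beat
# frequency and its harmonic.  Data, rates, and frequencies are placeholders.
import numpy as np

fs = 500.0                                    # EEG sampling rate (Hz), assumed
rng = np.random.default_rng(3)
n_trials, n_samples = 60, int(fs * 8)         # sixty 8-second epochs

t = np.arange(n_samples) / fs
beat_freq = 2.5                               # assumed beat/stimulation frequency (Hz)
# Surrogate data: a small response at the beat frequency buried in noise.
epochs = 0.3 * np.sin(2 * np.pi * beat_freq * t) + rng.standard_normal((n_trials, n_samples))

# Averaging across trials before the FFT suppresses non-phase-locked noise.
evoked = epochs.mean(axis=0)
spectrum = np.abs(np.fft.rfft(evoked)) / n_samples
freqs = np.fft.rfftfreq(n_samples, 1 / fs)

for f in (beat_freq, 2 * beat_freq):
    idx = np.argmin(np.abs(freqs - f))
    # Noise correction: subtract the mean amplitude of neighbouring bins.
    neighbours = np.r_[spectrum[idx - 5:idx - 1], spectrum[idx + 2:idx + 6]]
    print(f"{f:.1f} Hz: corrected amplitude = {spectrum[idx] - neighbours.mean():.3f} (a.u.)")
```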

    Formation of visual memories controlled by gamma power phase-locked to alpha oscillations

    Neuronal oscillations provide a window for understanding the brain dynamics that organize the flow of information from sensory to memory areas. While it has been suggested that gamma power reflects feedforward processing and alpha oscillations feedback control, it remains unknown how these oscillations dynamically interact. Magnetoencephalography (MEG) data were acquired from healthy subjects who were cued to either remember or not remember presented pictures. Our analysis revealed that, in anticipation of a picture to be remembered, alpha power decreased while the cross-frequency coupling between gamma power and alpha phase increased. A measure of directionality between alpha phase and gamma power predicted individual ability to encode memory: stronger control of alpha phase over gamma power was associated with better memory. These findings demonstrate that encoding of visual information is reflected by a state determined by the interaction between alpha and gamma activity.
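
    The cross-frequency coupling between gamma power and alpha phase described above is commonly estimated with a phase-amplitude modulation index. The sketch below computes one standard, Canolty-style variant on a surrogate signal with coupling built in; it is not the study's exact analysis or its directionality measure, and the sampling rate and frequency bands are assumptions.

```python
# Hedged sketch of one common alpha-gamma phase-amplitude coupling estimate
# (a normalized Canolty-style modulation index), not the study's exact
# pipeline or its directionality measure.  Signal, rate, and bands are assumed.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 600.0                                      # MEG sampling rate (Hz), assumed
rng = np.random.default_rng(4)
n = int(fs * 60)                                # one minute of surrogate data
t = np.arange(n) / fs

alpha = np.sin(2 * np.pi * 10 * t)              # 10 Hz "alpha" component
gamma = 0.5 * (1 + alpha) * np.sin(2 * np.pi * 65 * t)   # gamma bursts tied to alpha phase
signal = alpha + gamma + 0.5 * rng.standard_normal(n)

def bandpass(x, lo, hi):
    b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, x)

alpha_phase = np.angle(hilbert(bandpass(signal, 8, 12)))
gamma_power = np.abs(hilbert(bandpass(signal, 55, 75)))

# Modulation index: length of the mean vector of gamma power over alpha phase,
# normalized by mean gamma power (0 = no coupling).
mi = np.abs(np.mean(gamma_power * np.exp(1j * alpha_phase))) / gamma_power.mean()
print(f"alpha-gamma modulation index: {mi:.3f}")
```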

    How musical rhythms entrain the human brain: clarifying the neural mechanisms of sensory-motor entrainment to rhythms

    When listening to music, people across cultures tend to spontaneously perceive and move the body along a periodic pulse-like meter. Increasing evidence suggests that this ability is supported by neural mechanisms that selectively amplify periodicities corresponding to the perceived metric pulses. However, the nature of these neural mechanisms, i.e., the endogenous or exogenous factors that may selectively enhance meter periodicities in brain responses to rhythm, remains largely unknown. This question was investigated in a series of studies in which the electroencephalogram (EEG) of healthy participants was recorded while they listened to musical rhythm. From this EEG, selective contrast at meter periodicities in the elicited neural activity was captured using frequency-tagging, a method allowing direct comparison of this contrast across the sensory input, the EEG response, biologically plausible models of auditory subcortical processing, and behavioral output. The results show that the selective amplification of meter periodicities is shaped by a continuously updated combination of factors including sound spectral content, long-term training and recent context, irrespective of attentional focus and beyond auditory subcortical nonlinear processing. Together, these observations demonstrate that perception of rhythm involves a number of processes that transform the sensory input via fixed low-level nonlinearities, but also through flexible mappings shaped by prior experience at different timescales. These higher-level neural mechanisms could represent a neurobiological basis for the remarkable flexibility and stability of meter perception relative to the acoustic input, which is commonly observed within and across individuals. Fundamentally, the current results add to the evidence that evolution has endowed the human brain with an extraordinary capacity to organize, transform, and interact with rhythmic signals, to achieve adaptive behavior in a complex dynamic environment.
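
    To make the frequency-tagging contrast concrete, the sketch below computes the share of spectral amplitude falling at assumed meter-related frequencies in the onset envelope of a repeating rhythmic pattern; in the studies summarized above the same read-out is applied to the EEG response, and the pattern, grid spacing, and choice of meter-related frequencies here are illustrative.

```python
# Hedged sketch of a frequency-tagging contrast: the share of spectral
# amplitude at assumed meter-related frequencies in the onset envelope of a
# repeating rhythm.  In the studies above the same read-out is applied to the
# EEG; the pattern, grid spacing, and meter-related set here are illustrative.
import numpy as np

fs = 500.0                                       # envelope sampling rate (Hz)
pattern = [1, 1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 0]   # 12-interval rhythm (1 = tone onset)
grid = 0.2                                       # grid spacing (s); pattern lasts 2.4 s
n_repeats = 25

# Build the onset envelope of the repeated pattern (50 ms pulses).
env = np.zeros(int(fs * grid * len(pattern) * n_repeats))
for r in range(n_repeats):
    for i, onset in enumerate(pattern):
        if onset:
            start = int(fs * grid * (r * len(pattern) + i))
            env[start:start + int(0.05 * fs)] = 1.0

spectrum = np.abs(np.fft.rfft(env)) / len(env)
freqs = np.fft.rfftfreq(len(env), 1 / fs)

pattern_freqs = np.arange(1, 13) / 2.4           # all harmonics of the pattern rate
meter_freqs = [1.25, 2.5, 5.0]                   # assumed meter-related subset

def amp_at(f):
    return spectrum[np.argmin(np.abs(freqs - f))]

meter_sum = sum(amp_at(f) for f in meter_freqs)
total_sum = sum(amp_at(f) for f in pattern_freqs)
print(f"meter-related share of pattern-frequency amplitude: {meter_sum / total_sum:.2f}")
```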