
    Evidence for spatiotemporally distinct effects of image repetition and perceptual expectations as measured by event-related potentials

    Repeated stimulus presentation leads to reductions in the responses of cortical neurons, known as repetition suppression or stimulus-specific adaptation. Circuit-based models of repetition suppression provide a framework for investigating patterns of repetition effects that propagate through cortical hierarchies. To further develop such models, it is critical to determine whether (and if so, when) repetition effects are modulated by factors such as expectation and attention. We investigated whether repetition effects are influenced by perceptual expectations, and whether the time courses of each effect are similar or distinct, by presenting pairs of repeated and alternating face images and orthogonally manipulating expectations regarding the likelihood of stimulus repetition. Event-related potentials (ERPs) were recorded from n = 39 healthy adults to map the spatiotemporal progression of stimulus repetition and stimulus expectation effects, and interactions between these, using mass univariate analyses. We also tested for another expectation effect that may contribute to repetition effects in many previous experiments: repeated stimulus identities are predictable after seeing the first stimulus in a trial, whereas unrepeated stimulus identities cannot be predicted. Separate blocks were presented with predictable and unpredictable alternating face identities. Multiple repetition and expectation effects were identified between 99 and 800 ms from stimulus onset; these did not statistically interact at any point and exhibited distinct spatiotemporal patterns. Repetition effects in blocks with predictable alternating faces were smaller than in blocks with unpredictable alternating faces between 117 and 179 ms and between 506 and 652 ms, and larger between 246 and 428 ms. The distinct spatiotemporal patterns of repetition and expectation effects support separable mechanisms underlying these phenomena. However, previous studies of repetition effects, in which the repeated (but not the unrepeated) stimulus was predictable, are likely to have conflated repetition and stimulus predictability effects.
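    The 2x2 design described above (repetition x expectation) lends itself to a mass univariate test at every channel-by-time point. The sketch below is a minimal, simulated illustration of that idea using a paired t-test and Benjamini-Hochberg correction; the array dimensions, condition labels, and correction choice are assumptions for illustration, not the authors' pipeline.

```python
# Minimal sketch of a mass univariate repetition x expectation ERP analysis on
# simulated data. Dimensions, condition labels and the FDR correction are
# illustrative assumptions, not the authors' pipeline.
import numpy as np
from scipy import stats

n_subj, n_chan, n_times = 39, 64, 400          # assumed data dimensions
rng = np.random.default_rng(0)

# Subject-average ERPs per condition: subjects x channels x time points
conds = ["rep_expected", "rep_unexpected", "alt_expected", "alt_unexpected"]
erp = {c: rng.normal(size=(n_subj, n_chan, n_times)) for c in conds}

# Main effect of repetition: repeated minus alternating, averaged over expectation
rep_effect = ((erp["rep_expected"] + erp["rep_unexpected"])
              - (erp["alt_expected"] + erp["alt_unexpected"])) / 2.0

# Paired t-test against zero at every channel x time point
t_vals, p_vals = stats.ttest_1samp(rep_effect, 0.0, axis=0)

# Benjamini-Hochberg FDR across all channel x time tests
p_flat = p_vals.ravel()
order = np.argsort(p_flat)
m = p_flat.size
passed = p_flat[order] <= 0.05 * np.arange(1, m + 1) / m
k = np.nonzero(passed)[0].max() if passed.any() else -1
sig = np.zeros(m, dtype=bool)
sig[order[:k + 1]] = True                       # stays empty if k == -1
sig_mask = sig.reshape(p_vals.shape)            # channels x times significance map
print("significant channel-time points:", int(sig_mask.sum()))
```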

    Prediction-related neural response alterations in the ventral visual stream

    Theories of predictive coding (PC; Rao & Ballard, 1999) have dominated neurocognitive research in explaining thought and perception processes across various domains. The basic principle is that perception relies not only on bottom-up processing of sensory input but also on top-down predictions. The current thesis describes several neuronal response alterations in cortical visual areas measured with neuroimaging methods. The so-called repetition suppression (RS) effect has been connected to predictive coding because repetitions make stimuli more expected, which results in a smaller prediction error (PE) and therefore attenuated neuronal activity. Still, it remains debated whether RS reflects the PE or is a local process within neuronal populations that occurs without top-down influences (Grill-Spector et al., 2006). Another frequently investigated effect is the reduced neuronal response to expected or predicted visual input, called expectation suppression (ES). A considerable body of research on contextual response changes, such as RS and ES, relates to the visual system and the face-processing network in particular. Overall, we demonstrate the importance of stimulus predictability for studies using RS to uncover expectancy-related effects. Furthermore, we suggest that the influence of sensory precision on measures of RS and ES needs more attention in future research. Concerning the stimulus material in the presented studies (unfamiliar, visually familiar, and famous familiar faces), we also emphasize the importance of thoroughly considering the characteristics of faces in terms of prior beliefs and the precision and predictability of sensory input when using them to test prediction-related effects.
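    A toy numerical example may help make the prediction-error principle concrete. The sketch below, with invented values, shows how a response modelled as the mismatch between the input and a gradually updated top-down prediction shrinks across repeated presentations, the pattern interpreted as repetition or expectation suppression.

```python
# Toy illustration of the prediction-error principle: the measured response is
# modelled as the mismatch between the input and a top-down prediction that is
# updated after each presentation. All values are invented.
input_drive = 1.0        # constant stimulus input
prediction = 0.0         # initial top-down prediction
learning_rate = 0.5      # assumed update rate

for presentation in range(1, 6):
    error = input_drive - prediction        # activity carried by error units
    prediction += learning_rate * error     # prediction improves with repetition
    print(f"presentation {presentation}: prediction error = {error:.3f}")
# The error shrinks across repetitions, mimicking repetition/expectation suppression.
```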

    Babies and Brains: Habituation in Infant Cognition and Functional Neuroimaging

    Many prominent studies of infant cognition over the past two decades have relied on the fact that infants habituate to repeated stimuli, i.e., their looking times tend to decline with repeated stimulus presentations. This phenomenon has been exploited to reveal a great deal about the minds of preverbal infants. Many prominent studies of the neural bases of adult cognition over the past decade have relied on the fact that brain regions habituate to repeated stimuli, i.e., the hemodynamic responses observed in fMRI tend to decline with repeated stimulus presentations. This phenomenon has been exploited to reveal a great deal about the neural mechanisms of perception and cognition. Similarities in the mechanics of these two forms of habituation suggest that it may be useful to relate them to each other. Here we outline this analogy, explore its nuances, and highlight some ways in which the study of habituation in functional neuroimaging could yield novel insights into the nature of habituation in infant cognition, and vice versa.
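    As a rough illustration of the shared logic, both looking times and hemodynamic responses are often summarised as a decline over repetitions. The minimal sketch below uses an assumed exponential-decay form with invented parameters; it is not a model from the article.

```python
# Minimal sketch of the shared description: a response that declines exponentially
# over repeated presentations. Baseline, floor and decay rate are invented and could
# stand for seconds of looking time or percent signal change alike.
import numpy as np

presentations = np.arange(1, 9)
baseline, floor, rate = 10.0, 3.0, 0.5
response = floor + (baseline - floor) * np.exp(-rate * (presentations - 1))
for n, y in zip(presentations, response):
    print(f"presentation {n}: response = {y:.2f}")
```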

    Brain Responses Track Patterns in Sound

    This thesis uses specifically structured sound sequences, together with electroencephalography (EEG) recording and behavioural tasks, to understand how the brain forms and updates a model of the auditory world. Experimental Chapters 3-7 address different effects arising from statistical predictability, stimulus repetition and surprise. Stimuli comprised tone sequences, with frequencies varying in regular or random patterns. In Chapter 3, EEG data demonstrate fast recognition of predictable patterns, shown by an increase in responses to regular relative to random sequences. Behavioural experiments investigate attentional capture by stimulus structure, suggesting that regular sequences are easier to ignore. Responses to repetitive stimulation generally exhibit suppression, thought to form a building block of regularity learning. However, the patterns used in this thesis produce the opposite effect: predictable patterns evoke a strongly enhanced brain response compared to frequency-matched random sequences. Chapter 4 presents a study which reconciles auditory sequence predictability and repetition in a single paradigm. Results indicate a system for automatic predictability monitoring which is distinct from, but concurrent with, repetition suppression. The brain's internal model can be investigated via the response to rule violations. Chapters 5 and 6 present behavioural and EEG experiments where violations are inserted in the sequences. Outlier tones within regular sequences evoked a larger response than matched outliers in random sequences. However, this effect was not present when the violation comprised a silent gap. Chapter 7 concerns the ability of the brain to update an existing model. Regular patterns transitioned to a different rule, keeping the frequency content constant. Responses show a period of adjustment to the rule change, followed by a return to tracking the predictability of the sequence. These findings are consistent with the notion that the brain continually maintains a detailed representation of ongoing sensory input and that this representation shapes the processing of incoming information.
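    The core stimulus manipulation, regular versus frequency-matched random tone sequences, can be sketched as follows. The frequency pool, cycle length and sequence length are illustrative assumptions rather than the thesis's exact parameters.

```python
# Minimal sketch of the stimulus logic: a regular sequence cycles through a fixed
# set of tone frequencies, while a random sequence draws from the same pool without
# a repeating pattern. Pool, cycle length and sequence length are assumptions.
import numpy as np

rng = np.random.default_rng(1)
freq_pool = np.array([440, 494, 554, 622, 698, 784, 880, 988])  # Hz, illustrative

cycle = rng.choice(freq_pool, size=4, replace=False)   # the repeating regular pattern
regular = np.tile(cycle, 15)                           # 60-tone regular sequence
random_seq = rng.choice(freq_pool, size=60)            # 60-tone random sequence

# Both sequences draw on the same frequencies, so any difference in the EEG response
# reflects the predictability of the pattern rather than the tones themselves.
print("regular:", regular[:12])
print("random :", random_seq[:12])
```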

    Perceptual Expectations of Object Stimuli Modulate Repetition Suppression in a Delayed Repetition Design

    Several fMRI and EEG/MEG studies show that repetition suppression (RS) effects are stronger when a stimulus repetition is expected compared to when it is less expected. To date, the prevalent way to assess the influence of expectations on RS is via immediate stimulus repetition designs, that is, no intervening stimuli appear between the initial and repeated presentation of a stimulus. Since there is evidence that repetition lag may alter RS effects in a qualitative manner, the current study investigated how perceptual expectations modify RS effects for object stimuli when the repetition lag is relatively long. Region of interest analyses in the left occipital cortex revealed an activation pattern similar to that identified in previous studies with immediate lags: RS effects were strongest when repetitions were expected and decreased when repetitions were less expected. The current study therefore expands previous research in two ways. First, we replicate prior studies showing that perceptual expectation effects can be observed in object-sensitive occipital areas. Second, the observation that expectation effects emerge even with lags of several minutes suggests that Bayesian inference processes are a relatively robust component of visual stimulus processing.
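    The ROI comparison described above amounts to contrasting repetition suppression (initial minus repeated response) between expected- and unexpected-repetition conditions. The sketch below illustrates that contrast on simulated beta estimates; the sample size and condition names are assumptions, not the study's data.

```python
# Minimal sketch of the ROI comparison: repetition suppression (RS) is the initial
# minus the repeated response, contrasted between expected- and unexpected-repetition
# blocks. Beta estimates are simulated; sample size and labels are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_subj = 24
beta = {  # mean ROI beta per condition and subject
    "initial_expected":    rng.normal(1.0, 0.3, n_subj),
    "repeated_expected":   rng.normal(0.5, 0.3, n_subj),
    "initial_unexpected":  rng.normal(1.0, 0.3, n_subj),
    "repeated_unexpected": rng.normal(0.8, 0.3, n_subj),
}

rs_expected = beta["initial_expected"] - beta["repeated_expected"]
rs_unexpected = beta["initial_unexpected"] - beta["repeated_unexpected"]

# Paired t-test: is RS larger when repetitions are expected?
t, p = stats.ttest_rel(rs_expected, rs_unexpected)
print(f"RS(expected) vs RS(unexpected): t = {t:.2f}, p = {p:.3f}")
```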

    Unexpected sound omissions are signaled in human posterior superior temporal gyrus: An intracranial study

    Context modulates sensory neural activations, enhancing perceptual and behavioral performance and reducing prediction errors. However, when and where these high-level expectations act on sensory processing remains unclear. Here, we isolate the effect of expectation in the absence of any auditory evoked activity by assessing the response to omitted expected sounds. Electrocorticographic signals were recorded directly from subdural electrode grids placed over the superior temporal gyrus (STG). Subjects listened to a predictable sequence of syllables, with some infrequently omitted. We found high-frequency band activity (HFA, 70-170 Hz) in response to omissions, which overlapped with a posterior subset of auditory-active electrodes in STG. The identity of heard syllables could be reliably distinguished from STG activity, but the identity of omitted stimuli could not. Both omission- and target-detection responses were also observed in the prefrontal cortex. We propose that the posterior STG is central for implementing predictions in the auditory environment. HFA omission responses in this region appear to index mismatch-signaling or salience detection processes.
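    The high-frequency band activity (HFA) measure referred to above is commonly obtained by band-pass filtering and taking the analytic amplitude. The sketch below shows one standard way to compute a 70-170 Hz envelope on a simulated single-channel trace; the sampling rate and signal are assumptions, not the study's recordings.

```python
# Minimal sketch of computing high-frequency band activity (HFA, 70-170 Hz) on a
# single simulated channel: band-pass filter, then take the analytic amplitude.
# The sampling rate and signal are assumptions, not the study's recordings.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0                                    # assumed sampling rate (Hz)
rng = np.random.default_rng(3)
signal = rng.normal(size=int(2 * fs))          # 2 s stand-in ECoG trace

# 4th-order Butterworth band-pass for 70-170 Hz, applied forward and backward
b, a = butter(4, [70.0, 170.0], btype="bandpass", fs=fs)
narrowband = filtfilt(b, a, signal)

hfa = np.abs(hilbert(narrowband))              # envelope = HFA time course
print("mean HFA amplitude:", hfa.mean())
```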

    Neural mechanisms underlying the influence of sequential predictions on scene gist recognition

    Doctor of Philosophy, Department of Psychological Sciences, Lester C. Loschky. Rapid scene categorization is typically argued to be a purely feed-forward process. Yet, when navigating in our environment, we usually see predictable sequences of scene categories (e.g., offices followed by hallways, parking lots followed by sidewalks, etc.). Previous work showed that scenes are easier to categorize when they are shown in ecologically valid, predictable sequences compared to when they are shown in randomized sequences (Smith & Loschky, 2019). Given the number of stages involved in constructing a scene representation, we asked a novel research question: when in the time course of scene processing do sequential predictions begin to facilitate scene categorization? We addressed this question by measuring the temporal dynamics of scene categorization with electroencephalography. Participants saw scenes in either spatiotemporally coherent sequences (a first-person viewpoint of navigating from, say, an office to a classroom) or their randomized versions. Participants saw 10 scenes, presented in rapid serial visual presentation (RSVP), on each trial, while we recorded their visual event-related potentials (vERPs). They categorized 1 of the 10 scenes from an 8-alternative forced-choice (AFC) array of scene category labels. We first compared event-related potentials evoked by scenes in coherent and randomized sequences. In a subsequent, more detailed analysis, we constructed scene category decoders based on the temporally resolved neural activity. Using confusion matrices, we tracked how well the pattern of errors from the neural decoders explains the behavioral responses over time, and compared this ability between scenes shown in coherent and randomized sequences. We found reduced vERP amplitudes for targets in coherent sequences roughly 150 milliseconds after scene onset, when vERPs first index rapid scene categorization, and during the N400 component, suggesting a reduced semantic integration cost in coherent sequences. Critically, we also found that confusions made by neural decoders and human responses correlate more strongly in coherent sequences, beginning around 100 milliseconds. Taken together, these results suggest that predictions of upcoming scene categories influence even the earliest stages of scene processing, affecting both the extraction of visual properties and meaning.
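    The decoding analysis can be sketched as follows: a classifier is trained per time point on channel patterns, a confusion matrix is built from its cross-validated predictions, and the off-diagonal confusions are correlated with behavioural confusions. Everything below is simulated, and the shapes, classifier and correlation measure are illustrative assumptions rather than the dissertation's exact pipeline.

```python
# Minimal sketch of time-resolved decoding with confusion-matrix comparison: a
# classifier is trained per time point on channel patterns, and the off-diagonal
# entries of its confusion matrix are correlated with behavioural confusions.
# All data are simulated; shapes, classifier and correlation measure are assumptions.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(4)
n_trials, n_chan, n_times, n_classes = 400, 32, 20, 8
X = rng.normal(size=(n_trials, n_chan, n_times))    # EEG: trials x channels x time
y = rng.integers(0, n_classes, size=n_trials)        # scene-category labels
behav_conf = rng.random((n_classes, n_classes))      # stand-in behavioural confusions
off_diag = ~np.eye(n_classes, dtype=bool)

corr_over_time = []
for t in range(n_times):
    pred = cross_val_predict(LogisticRegression(max_iter=1000), X[:, :, t], y, cv=5)
    neural_conf = confusion_matrix(y, pred, labels=np.arange(n_classes))
    rho, _ = spearmanr(neural_conf[off_diag], behav_conf[off_diag])
    corr_over_time.append(rho)

print("peak neural-behaviour confusion correlation:", max(corr_over_time))
```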

    Predictive feedback to the primary visual cortex during saccades

    Perception of our sensory environment is actively constructed from sensory input and prior expectations. These expectations are created from knowledge of the world through semantic memories, spatial and temporal contexts, and learning. Multiple frameworks have been created to conceptualise this active perception; these frameworks are referred to hereafter as inference models. Three elements of inference models have prevailed across these frameworks: first, internal generative models of the visual environment; second, feedback connections that project the model's prediction signals to lower cortical processing areas, where they interact with sensory input; and third, prediction errors, which are produced when the sensory input is not predicted by the feedback signals. The prediction errors are thought to be fed forward to update the generative models. These elements enable hypothesis-driven testing of active perception. In vision, error signals have been found in the primary visual cortex (V1). V1 is organised retinotopically: the spatial structure of the sensory stimulus entering through the retina is retained within V1. A semblance of that structure exists in predictive feedback signals and in error signal production. The feedback predictions interact with the retinotopically specific sensory input, which can result in error signal production within that region. Due to the nature of vision, we rapidly sample our visual environment using ballistic eye movements called saccades; input to V1 is therefore updated about three times per second. One assumption of active perception frameworks is that predictive signals can update to new retinotopic locations of V1 along with the sensory input. This thesis investigates the ability of active perception to redirect predictive signals to new retinotopic locations with saccades. The aim of the thesis is to provide evidence of the relevance of generative models in a more naturalistic viewing paradigm (i.e. across saccades). An introduction to active visual perception is provided in Chapter 1. Structural connections and functional feedback to V1 are described at a global level and at the level of cortical layers. The role of feedback connections to V1 is then discussed in the light of current models, homing in on inference models of perception. The elements of inference models are introduced, including internal generative models, predictive feedback, and error signal production. The assumption that predictive feedback relocates in V1 with saccades is highlighted alongside the effects of saccades within the early visual system, which leads to the motivation for and introduction of the research chapters. A psychophysical study is presented in Chapter 2 which provides evidence for the transfer of predictive signals across saccades. An internal model of spatiotemporal motion was created using an illusion of motion. The perception of illusory motion signifies the engagement of an internal model, as a moving token is internally constructed from the sensory input. The model was tested by presenting in-time (predictable) and out-of-time (unpredictable) targets on the trace of perceived motion. Saccades were initiated across the illusion every three seconds to cause a relocation of predictive feedback. Predictable in-time targets were detected better than unpredictable out-of-time targets. Importantly, the detection advantage for in-time targets was found 50-100 ms after the saccade, indicating transfer of predictive signals across the saccade.
Evidence for the transfer of spatiotemporally predictive feedback across saccades is supported by the fMRI study presented in Chapter 3. Previous studies have demonstrated increased activity in V1 when processing unpredicted visual stimulation. This activity increase has been related to error signal production, because the input was not predicted by feedback signals. In Chapter 3, the motion illusion paradigm used in Chapter 2 was redesigned to be compatible with brain activation analysis. The internal model of motion was created prior to the saccade and tested at a post-saccadic retinotopic region of V1. Increased activation was found for spatiotemporally unpredictable stimuli directly after the eye movement, indicating that predictive feedback was projected to the new retinotopic region with the saccade. An fMRI experiment was conducted in Chapter 4 to demonstrate that predictive feedback relocation is not limited to motion processing in the dorsal stream. This was achieved by using natural scene images, which are known to engage ventral stream processing. Multivariate analysis was performed to determine whether feedback signals pertaining to natural scenes could relocate to new retinotopic locations with saccades. The predictive character of the feedback was also tested by changing the image content across eye movements to determine whether an error signal was produced by the unexpected post-saccadic sensory input. Predictive feedback was found to interact with the images presented post-saccade, indicating that feedback relocated with the saccade. The predictive feedback was thought to contain contextual information related to the image processed prior to the saccade. These three chapters provide evidence for inference models contributing to visual perception during more naturalistic viewing conditions (i.e. across saccades). These findings are summarised in Chapter 5 in relation to inference model frameworks, trans-saccadic perception, and attention. The discussion focuses on the interaction of internal generative models and trans-saccadic perception, with the aim of highlighting several consistencies between the two cognitive processes.
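    The multivariate test in Chapter 4 can be caricatured as decoding which natural scene was viewed before the saccade from voxel patterns in the retinotopic region that is only stimulated after the saccade. The sketch below runs that decoding on simulated data; the ROI definition, classifier and trial counts are assumptions, not the thesis's analysis.

```python
# Minimal sketch of the cross-saccade decoding idea: decode which natural scene was
# viewed before the saccade from voxel patterns in the retinotopic region that is
# only stimulated after the saccade. Above-chance accuracy would be consistent with
# predictive feedback relocating with the eye movement. Data are simulated and the
# ROI, classifier and trial counts are assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(5)
n_trials, n_voxels = 120, 200
scene_label = rng.integers(0, 2, size=n_trials)            # which scene preceded the saccade
post_saccadic_roi = rng.normal(size=(n_trials, n_voxels))  # feedback-only V1 patterns

acc = cross_val_score(LinearSVC(dual=False), post_saccadic_roi, scene_label, cv=5)
print(f"mean decoding accuracy: {acc.mean():.2f} (chance = 0.50)")
```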

    From locomotion to dance and back: exploring rhythmic sensorimotor synchronization

    Rhythms are central to human behaviours spanning from locomotion to music performance. In dance, self-sustaining and dynamically adapting neural oscillations entrain to the regular auditory input that is the musical beat. This entrainment leads to anticipation of forthcoming sensory events, which in turn allows movements to be synchronized with the perceived environment. This dissertation develops novel technical approaches to investigate neural rhythms that are not strictly periodic, such as those underlying naturally tempo-varying locomotion and the rhythms of music. It studies neural responses that reflect the discordance between what the nervous system anticipates and the actual timing of events, and that are critical for synchronizing movements to a changing environment. It also shows how the neural activity elicited by a musical rhythm is shaped by how we move. Finally, it investigates such neural rhythms in patients with gait or consciousness disorders.
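    One simple way to handle rhythms that are not strictly periodic, in the spirit of the approaches described above, is to resample the signal onto a beat-based time axis so that each variable-length beat interval spans the same number of samples. The sketch below illustrates this on simulated data; the sampling rate, beat times and signal are invented, and this is not the dissertation's specific method.

```python
# Minimal sketch of one way to analyse rhythms that are not strictly periodic:
# resample the signal onto a beat-based time axis so that every (variable-length)
# beat interval spans the same number of samples. Sampling rate, beat times and the
# signal are all invented for illustration.
import numpy as np

fs = 500.0                                                # assumed sampling rate (Hz)
rng = np.random.default_rng(6)
signal = rng.normal(size=int(10 * fs))                    # 10 s stand-in EEG trace
t = np.arange(signal.size) / fs

beat_times = np.cumsum(0.5 + 0.05 * rng.normal(size=18))  # tempo-varying beat times (s)
samples_per_beat = 100

# Equally many samples between each pair of consecutive beats
warped_t = np.concatenate([
    np.linspace(beat_times[i], beat_times[i + 1], samples_per_beat, endpoint=False)
    for i in range(len(beat_times) - 1)
])
beat_locked = np.interp(warped_t, t, signal)              # signal on the beat-based axis
print("beat-locked signal shape:", beat_locked.shape)
```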