
    Top-down effects on early visual processing in humans: a predictive coding framework

    An increasing number of human electroencephalography (EEG) studies examining the earliest component of the visual evoked potential, the so-called C1, have cast doubt on the previously prevalent notion that this component is impermeable to top-down effects. This article reviews the original studies that (i) described the C1, (ii) linked it to primary visual cortex (V1) activity, and (iii) suggested that its electrophysiological characteristics are exclusively determined by low-level stimulus attributes, particularly the spatial position of the stimulus within the visual field. We then describe conflicting evidence from animal studies and human neuroimaging experiments and provide an overview of recent EEG and magnetoencephalography (MEG) work showing that initial V1 activity in humans may be strongly modulated by higher-level cognitive factors. Finally, we formulate a theoretical framework for understanding top-down effects on early visual processing in terms of predictive coding.

    Contextual modulation of primary visual cortex by auditory signals

    Early visual cortex receives non-feedforward input from lateral and top-down connections (Muckli & Petro 2013 Curr. Opin. Neurobiol. 23, 195–201. (doi:10.1016/j.conb.2013.01.020)), including long-range projections from auditory areas. Early visual cortex can code for high-level auditory information, with neural patterns representing natural sound stimulation (Vetter et al. 2014 Curr. Biol. 24, 1256–1262. (doi:10.1016/j.cub.2014.04.020)). We discuss a number of questions arising from these findings. What is the adaptive function of bimodal representations in visual cortex? What type of information projects from auditory to visual cortex? What are the anatomical constraints on auditory information in V1, for example, periphery versus fovea, superficial versus deep cortical layers? Is there a putative neural mechanism we can infer from human neuroimaging data and recent theoretical accounts of cortex? We also present data showing that we can read out high-level auditory information from the activation patterns of early visual cortex even when visual cortex receives simple visual stimulation, suggesting independent channels for visual and auditory signals in V1. We speculate on which cellular mechanisms allow V1 to be contextually modulated by auditory input to facilitate perception, cognition and behaviour. Beyond cortical feedback that facilitates perception, we argue that there is also feedback serving counterfactual processing during imagery, dreaming and mind wandering, which is not relevant for immediate perception but for behaviour and cognition over a longer time frame. This article is part of the themed issue ‘Auditory and visual scene analysis’.

    Encoding of temporal probabilities in the human brain

    Anticipating the timing of future events is a necessary precursor to preparing actions and allocating resources to sensory processing. This requires elapsed time to be represented in the brain and used to predict the temporal probability of upcoming events. While neuropsychological, imaging, and magnetic stimulation studies, as well as single-unit recordings, implicate higher parietal and motor-related areas in temporal estimation, the role of earlier, purely sensory structures remains more controversial. Here we demonstrate that the temporal probability of expected visual events is encoded not by a single area but by a wide network that importantly includes neuronal populations at the very earliest cortical stages of visual processing. Moreover, we show that activity in those areas changes dynamically in a manner that closely accords with temporal expectations.

    Distinct causal influences of parietal versus frontal areas on human visual cortex: evidence from concurrent TMS-fMRI

    It has often been proposed that regions of the human parietal and/or frontal lobe may modulate activity in visual cortex, for example, during selective attention or saccade preparation. However, direct evidence for such causal claims is largely missing in human studies, and it remains unclear to what degree the putative roles of parietal and frontal regions in modulating visual cortex may differ. Here we used transcranial magnetic stimulation (TMS) and functional magnetic resonance imaging (fMRI) concurrently to show that stimulating the right human intraparietal sulcus (IPS, at a site previously implicated in attention) elicits a pattern of activity changes in visual cortex that strongly depends on current visual context. Increased intensity of IPS TMS affected the blood oxygen level–dependent (BOLD) signal in V5/MT+ only when moving stimuli were present to drive this visual region, whereas TMS-elicited BOLD signal changes were observed in areas V1–V4 only during the absence of visual input. These influences of IPS TMS upon remote visual cortex differed significantly from corresponding effects of frontal (eye field) TMS, in terms of how they related to current visual input and their spatial topography for retinotopic areas V1–V4. Our results show directly that parietal and frontal regions can indeed have distinct patterns of causal influence upon functional activity in human visual cortex. Key words: attention, frontal cortex, functional magnetic resonance imaging, parietal cortex, top-down, transcranial magnetic stimulation.

    Timing of visual stimuli in V1 and V5/MT: fMRI and TMS

    Time perception is used in our day-to-day activities. While we understand quite well how our brain processes vision, touch or taste, the brain mechanisms subserving time perception remain largely understudied. In this study, we extended an experiment from a previous master's thesis by Tatiana Kenel-Pierre, focusing on time perception in the range of milliseconds. Previous studies have demonstrated the involvement of visual areas V1 and V5/MT in the encoding of temporal information of visual stimuli. Based on these previous findings, the aim of the present study was to understand whether temporal information is encoded in V1 and extrastriate area V5/MT in different spatial frames, i.e., head-centered versus eye-centered. To this end, we asked eleven healthy volunteers to perform a temporal discrimination task on visual stimuli. Stimuli were presented at 4 different spatial positions (i.e., different combinations of retinotopic and spatiotopic position). While participants were engaged in this task, we interfered with the activity of the right dorsal V1 and the right V5/MT using transcranial magnetic stimulation (TMS). Our preliminary results showed that TMS over both V1 and V5/MT impaired temporal discrimination of visual stimuli presented at specific spatial coordinates: whereas TMS over V1 impaired temporal discrimination of stimuli presented in the lower left quadrant, TMS over V5/MT affected temporal discrimination of stimuli presented in the upper left quadrant. Although it is always difficult to draw conclusions from preliminary results, our data tentatively suggest that both V1 and V5/MT encode visual temporal information in specific spatial frames.

    Contributions of cortical feedback to sensory processing in primary visual cortex

    Closing the structure-function divide is more challenging in the brain than in any other organ (Lichtman and Denk, 2011). For example, in early visual cortex, feedback projections to V1 can be quantified (e.g., Budd, 1998), but the understanding of feedback function is comparatively rudimentary (Muckli and Petro, 2013). Focusing on the function of feedback, we discuss how textbook descriptions mask the complexity of V1 responses, and how feedback and local activity reflect not only sensory processing but internal brain states.

    Brain areas associated with visual spatial attention display topographic organization during auditory spatial attention

    Spatially selective modulation of alpha power (8–14 Hz) is a robust finding in electrophysiological studies of visual attention, and has recently been generalized to auditory spatial attention. This modulation pattern is interpreted as reflecting a top-down mechanism for suppressing distracting input from unattended directions of sound origin. The present study on auditory spatial attention extends this interpretation by demonstrating that alpha power modulation is closely linked to oculomotor action. We designed an auditory paradigm in which participants were required to attend to upcoming sounds from one of 24 loudspeakers arranged in a circular array around the head. Maintaining the location of an auditory cue was associated with a topographically modulated distribution of posterior alpha power resembling the findings known from visual attention. Multivariate analyses allowed the prediction of the sound location in the horizontal plane. Importantly, this prediction was also possible when derived from signals capturing saccadic activity. A control experiment on auditory spatial attention confirmed that, in the absence of any visual/auditory input, lateralization of alpha power is linked to the lateralized direction of gaze. Attending to an auditory target engages oculomotor and visual cortical areas in a topographic manner akin to the retinotopic organization associated with visual attention.

    Change blindness: eradication of gestalt strategies

    Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task in which there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval, and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation (Landman et al., 2003, Vision Research 43, 149–164). Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this we changed the spatial position of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point. There was no significant difference in performance between this and the standard task [F(1,4)=2.565, p=0.185]. This may suggest that Gestalt grouping is not used as a strategy in these tasks, and it gives further weight to the argument that objects may be stored in, and retrieved from, a pre-attentional store during this task.