
    High frequency oscillations as a correlate of visual perception

    “NOTICE: this is the author’s version of a work that was accepted for publication in the International Journal of Psychophysiology. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in International Journal of Psychophysiology, 79(1), 2011, DOI 10.1016/j.ijpsycho.2010.07.004.” Peer reviewed. Postprint.

    Change blindness: eradication of gestalt strategies

    Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task in which there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval, and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation (Landman et al., 2003, Vision Research 43, 149–164). Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this we changed the spatial position of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point. There was no significant difference in performance between this and the standard task [F(1,4)=2.565, p=0.185]. This suggests two things: (i) Gestalt grouping is not used as a strategy in these tasks, and (ii) it lends further weight to the argument that objects may be stored in and retrieved from a pre-attentional store during this task.
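    As a quick numerical check, the reported p-value follows directly from the quoted F statistic and its 1 and 4 degrees of freedom; the short sketch below (using SciPy, not part of the original study) reproduces it.

```python
# Recover the reported p-value from the F statistic quoted in the abstract,
# assuming an F distribution with (1, 4) degrees of freedom.
from scipy import stats

f_value, df_num, df_den = 2.565, 1, 4
p_value = stats.f.sf(f_value, df_num, df_den)  # survival function = 1 - CDF
print(f"F({df_num},{df_den}) = {f_value}, p = {p_value:.3f}")  # approx. 0.185
```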

    Contribution of behavioural variability to representational drift

    Neuronal responses to similar stimuli change dynamically over time, raising the question of how internal representations can provide a stable substrate for neural coding. Recent work has suggested a large degree of drift in neural representations even in sensory cortices, which are believed to store stable representations of the external world. While the drift of these representations is mostly characterized in relation to external stimuli, the behavioural state of the animal (for instance, the level of arousal) is also known to strongly modulate neural activity. We therefore asked how the variability of such modulatory mechanisms can contribute to representational changes. We analysed large-scale recordings of neural activity from the Allen Brain Observatory, which have previously been used to document representational drift in the mouse visual cortex. We found that, within these datasets, behavioural variability contributes significantly to representational changes. This effect was present across various cortical areas in the mouse, including the primary visual cortex, higher-order visual areas, and even regions not primarily linked to vision, such as the hippocampus. Our computational modelling suggests that these results are consistent with independent modulation of neural activity by behaviour over slower timescales. Importantly, our analysis suggests that reliable but variable modulation of neural representations by behaviour can be misinterpreted as representational drift if neuronal representations are characterized only in the stimulus space and marginalised over behavioural parameters.
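    To make the last point concrete, the toy simulation below (not the authors' analysis; the gain model and all parameters are assumptions) shows how fixed stimulus tuning, modulated neuron-by-neuron by a behavioural variable, can look like drift when sessions recorded under different behavioural states are compared after marginalising over that variable.

```python
# Toy simulation: stable stimulus tuning with a per-neuron behavioural gain.
# Comparing sessions recorded at different arousal levels, while ignoring
# behaviour, lowers across-session similarity even though tuning never changes.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_stimuli = 100, 8
tuning = rng.gamma(2.0, 1.0, size=(n_neurons, n_stimuli))  # fixed tuning curves
coupling = rng.uniform(-1.0, 2.0, size=n_neurons)          # behavioural coupling per neuron

def population_response(arousal):
    """Trial-averaged responses at a given arousal level (noiseless for clarity)."""
    gain = 1.0 + coupling * arousal
    return tuning * gain[:, None]

low, high = population_response(0.2), population_response(0.9)

# Population-vector similarity per stimulus across the two "sessions"
across = [np.corrcoef(low[:, s], high[:, s])[0, 1] for s in range(n_stimuli)]
matched = [np.corrcoef(low[:, s], population_response(0.2)[:, s])[0, 1] for s in range(n_stimuli)]
print(f"different-behaviour sessions: r = {np.mean(across):.2f}")  # < 1, looks like drift
print(f"matched-behaviour sessions:   r = {np.mean(matched):.2f}")  # = 1, tuning is stable
```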

    Neural representation of complex motion in the primate cortex

    This dissertation is concerned with how information about the environment is represented by neural activity in the primate brain. More specifically, it contains several studies that explore the representation of visual motion in the brains of humans and nonhuman primates through behavioral and physiological measures. The majority of this work is focused on the activity of individual neurons in the medial superior temporal area (MST), a high-level, extrastriate area of the primate visual cortex. The first two studies provide an extensive review of the scientific literature on area MST. The area's prominent role at the intersection of low-level, bottom-up sensory processing and high-level, top-down mechanisms is highlighted. Furthermore, a specific article on how information about self-motion and object motion can be decoded from a population of MSTd neurons is reviewed in more detail. The third study describes a published and annotated dataset of MST neurons' responses to a series of different motion stimuli. This dataset is analyzed using a variety of approaches in the fourth study. Classical tuning-curve approaches confirm that MST neurons have large but well-defined spatial receptive fields and are independently tuned for linear and spiral motion, as well as speed. We also confirm that the tuning for spiral motion is position invariant in a majority of MST neurons. A bias-free characterization of receptive field profiles, based on a new stimulus that generates smooth, complex motion patterns, turned out to be predictive of some of the tuning properties of MST neurons, but was generally less informative than similar approaches have been in earlier visual areas. The fifth study introduces a new motion stimulus that consists of hexagonal segments and presents an optimization algorithm for an adaptive online analysis of neurophysiological recordings. Preliminary physiological data and simulations show these tools to have strong potential for characterizing the response functions of MST neurons. The final study describes a behavioral experiment with human subjects that explores how different stimulus features, such as size and contrast, affect motion perception, and discusses what conclusions can be drawn from that about the representation of visual motion in the human brain. Together these studies highlight the visual motion processing pathway of the primate brain as an excellent model system for studying more complex relations between neural activity and external stimuli. Area MST in particular emerges as a gateway between perception, cognition, and action planning.
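    As a rough illustration of the classical tuning-curve approach mentioned above (not the dissertation's code; the circular-Gaussian model, the stimulus directions, and the firing rates are hypothetical), a fit of this kind might look as follows.

```python
# Fit a circular-Gaussian (von Mises-style) tuning curve to hypothetical mean
# firing rates measured at several spiral-motion directions.
import numpy as np
from scipy.optimize import curve_fit

def von_mises_tuning(theta_deg, baseline, amplitude, pref_deg, kappa):
    """Firing rate as a function of motion direction (degrees)."""
    delta = np.deg2rad(theta_deg - pref_deg)
    return baseline + amplitude * np.exp(kappa * (np.cos(delta) - 1.0))

directions = np.arange(0, 360, 45)                              # 8 directions in spiral space
rates = np.array([6.1, 9.8, 21.5, 34.0, 27.3, 12.4, 7.0, 5.5])  # spikes/s (hypothetical)

params, _ = curve_fit(von_mises_tuning, directions, rates, p0=[5.0, 30.0, 135.0, 2.0])
baseline, amplitude, pref_deg, kappa = params
print(f"preferred direction ~{pref_deg:.0f} deg, concentration kappa = {kappa:.2f}")
```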

    Spatio-temporal representations during eye movements and their neuronal correlates

    During fast ballistic eye movements, so-called saccades, our visual perception undergoes a range of distinct changes. Sensitivity to luminance contrast is reduced (saccadic suppression), and the localization of stimuli can be shifted in the direction of a saccade or compressed around the saccade target. The temporal order of two stimuli can be perceived as inverted, and the duration between them can be underestimated. The duration of a target change close to the saccade target can be overestimated when the change occurs during the saccade (chronostasis). In my thesis I investigated the spatial and temporal profiles of peri-saccadic changes in human visual perception and explored how these might result from changes in neural activity of the macaque middle temporal area (MT). I found that peri-saccadic contrast sensitivity was reduced by a constant factor across space only when the data were analyzed in retinal coordinates (as opposed to screen coordinates), indicating that saccadic suppression occurs in an eye-centered frame of reference. I demonstrated that the observed variation of saccadic suppression with stimulus location appears to cause variations in the spatio-temporal pattern of another peri-saccadic misperception: chronostasis. I was able to show that, contrary to previous assumptions, the saccadic overestimation of time is not a spatially localized disturbance of time perception but instead spans the whole visual field. I further determined that chronostasis does not depend on the eye movement itself, but is rather a consequence of the visual stimulation induced by it. This result clearly segregates chronostasis from other peri-saccadic perceptual changes such as saccadic suppression and the compression of space. To relate these findings to a potential neuronal basis of saccadic suppression and time perception, I measured neuronal responses of single cells in MT of an awake, behaving macaque. The results provide relevant insight into the processing of stationary stimuli and pairs of stimuli during fixation and saccades in MT. Responses to the second of a pair of stimuli were strongly suppressed, and response latencies increased even at onset asynchronies of about 100 ms. The increase in latency is an important difference from the temporal dynamics previously reported in other brain areas such as the frontal eye field in the frontal cortex and the superior colliculus in the midbrain. During saccades, response latencies to single high-luminance stimuli remained unchanged. For stimuli shown during the second half of the saccade, the average responses were reduced. By comparison with responses to single stimuli at different luminance levels during fixation, I was able to show that the peri-saccadic response reduction found in MT quantitatively fit what would be expected from known psychophysical measurements of peri-saccadic contrast sensitivity. Responses that were already reduced by a preceding stimulus were, however, not subject to further reduction, indicating a possible interaction of these two response modulations. In summary, saccadic suppression occurs in an eye-centered frame of reference, with changes in perception compatible with changes in single-cell activity in macaque MT. The peri-saccadic overestimation of time is influenced by saccadic suppression and saccade-induced visual changes, but does not depend on eye-movement-related signals.
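    The retinal-versus-screen distinction above boils down to a simple change of reference frame; the sketch below (not the thesis code; positions and gaze samples are hypothetical) illustrates the convention.

```python
# Re-express a stimulus position relative to the current gaze position, so that
# peri-saccadic data can be pooled in an eye-centered (retinal) frame of reference.
import numpy as np

def to_retinal_coordinates(stimulus_xy_deg, gaze_xy_deg):
    """Stimulus position relative to the fovea, in degrees of visual angle."""
    return np.asarray(stimulus_xy_deg) - np.asarray(gaze_xy_deg)

stimulus_on_screen = (10.0, 0.0)                  # fixed location on the screen
gaze_before, gaze_mid_saccade = (0.0, 0.0), (12.0, 0.0)

print(to_retinal_coordinates(stimulus_on_screen, gaze_before))       # [10.  0.] right of fovea
print(to_retinal_coordinates(stimulus_on_screen, gaze_mid_saccade))  # [-2.  0.] left of fovea
```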

    Deep Neural Networks Rival the Representation of Primate IT Cortex for Core Visual Object Recognition

    The primate visual system achieves remarkable visual object recognition performance even in brief presentations and under changes to object exemplar, geometric transformations, and background variation (a.k.a. core visual object recognition). This remarkable performance is mediated by the representation formed in inferior temporal (IT) cortex. In parallel, recent advances in machine learning have led to ever higher performing models of object recognition using artificial deep neural networks (DNNs). It remains unclear, however, whether the representational performance of DNNs rivals that of the brain. A major difficulty in producing such a comparison accurately has been the lack of a unifying metric that accounts for experimental limitations, such as the amount of noise, the number of neural recording sites, and the number of trials, and computational limitations, such as the complexity of the decoding classifier and the number of classifier training examples. In this work we perform a direct comparison that corrects for these experimental limitations and computational considerations. As part of our methodology, we propose an extension of "kernel analysis" that measures generalization accuracy as a function of representational complexity. Our evaluations show that, unlike previous bio-inspired models, the latest DNNs rival the representational performance of IT cortex on this visual object recognition task. Furthermore, we show that models that perform well on measures of representational performance also perform well on measures of representational similarity to IT and on measures of predicting individual IT multi-unit responses. Whether these DNNs rely on computational mechanisms similar to those of the primate visual system is yet to be determined, but, unlike all previous bio-inspired models, that possibility cannot be ruled out merely on representational performance grounds. Comment: 35 pages, 12 figures; extends and expands upon arXiv:1301.353
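    As a generic illustration of the decoding side of such comparisons (this is not the paper's "kernel analysis"; the synthetic features, class structure, and classifier settings are assumptions), a cross-validated linear readout can be scored as a function of the number of classifier training examples.

```python
# Train a linear readout on a synthetic feature representation and track its
# generalization accuracy as the number of classifier training examples grows.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features, n_classes = 1200, 100, 8
centers = rng.standard_normal((n_classes, n_features)) * 0.5                # class "templates"
labels = rng.integers(0, n_classes, size=n_samples)
features = centers[labels] + rng.standard_normal((n_samples, n_features))   # noisy samples

X_train, X_test, y_train, y_test = train_test_split(features, labels,
                                                    test_size=0.25, random_state=0)
for n_train in (50, 200, 800):
    clf = LogisticRegression(max_iter=2000).fit(X_train[:n_train], y_train[:n_train])
    print(f"{n_train:4d} training examples -> test accuracy {clf.score(X_test, y_test):.2f}")
```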

    The Timing of Vision – How Neural Processing Links to Different Temporal Dynamics

    In this review, we describe our recent attempts to model the neural correlates of visual perception with biologically inspired networks of spiking neurons, emphasizing the dynamical aspects. Experimental evidence suggests distinct processing modes depending on the type of task the visual system is engaged in. A first mode, crucial for object recognition, deals with rapidly extracting the gist of a visual scene within the first 100 ms after its presentation. The promptness of this process points to mainly feedforward processing, which relies on latency coding and may be shaped by spike-timing-dependent plasticity (STDP). Our simulations confirm the plausibility and efficiency of such a scheme. A second mode can be engaged whenever one needs to perform finer perceptual discrimination through evidence accumulation on the order of 400 ms and above. Here, our simulations, together with theoretical considerations, show how predominantly local recurrent connections and long neural time constants enable the integration and build-up of firing rates on this timescale. In particular, we review how a non-linear model with attractor states induced by strong recurrent connectivity provides straightforward explanations for several recent experimental observations. A third mode, involving additional top-down attentional signals, is relevant for more complex visual scene processing. In the model, as in the brain, these top-down attentional signals shape visual processing by biasing the competition between different pools of neurons. The winning pools may not only have a higher firing rate but also more synchronous oscillatory activity. This fourth mode, oscillatory activity, leads to faster reaction times and enhanced information transfer in the model, as has indeed been observed experimentally. Moreover, oscillatory activity can format spike times and encode information in the spike phases with respect to the oscillatory cycle. This phenomenon is referred to as “phase-of-firing coding,” and experimental evidence for it is accumulating in the visual system. Simulations show that this code can again be efficiently decoded by STDP. Future work should focus on continuous natural vision, bio-inspired hardware vision systems, and novel experimental paradigms to further distinguish current modeling approaches.
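    For reference, the pair-based STDP rule mentioned above can be written as an exponential weight update; the parameter values in the sketch below are illustrative and not taken from the review.

```python
# Pair-based STDP: potentiate when the presynaptic spike precedes the
# postsynaptic spike, depress otherwise, with exponentially decaying dependence
# on the spike-time difference (illustrative parameter values).
import numpy as np

A_PLUS, A_MINUS = 0.01, 0.012      # learning-rate amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # decay time constants (ms)

def stdp_delta_w(t_pre_ms, t_post_ms):
    """Weight change for a single pre/post spike pair."""
    dt = t_post_ms - t_pre_ms
    if dt >= 0:                                    # pre before post -> potentiation
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    return -A_MINUS * np.exp(dt / TAU_MINUS)       # post before pre -> depression

for dt in (-40, -10, 0, 10, 40):
    print(f"dt = {dt:+4d} ms -> delta_w = {stdp_delta_w(0.0, dt):+.4f}")
```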

    Temporal dynamics, sensitivity and form discrimination in blindsight
