    Meta-analyses support a taxonomic model for representations of different categories of audio-visual interaction events in the human brain

    Our ability to perceive meaningful action events involving objects, people and other animate agents is characterized in part by an interplay of visual and auditory sensory processing and their cross-modal interactions. However, this multisensory ability can be altered or dysfunctional in some hearing and sighted individuals, as well as in some clinical populations. The present meta-analysis sought to test current hypotheses regarding the neurobiological architectures that may mediate audio-visual multisensory processing. Reported coordinates from 82 neuroimaging studies (137 experiments) that revealed some form of audio-visual interaction in discrete brain regions were compiled, converted to a common coordinate space, and then organized along specific categorical dimensions to generate activation likelihood estimate (ALE) brain maps and various contrasts of those derived maps. The results revealed brain regions (cortical “hubs”) preferentially involved in multisensory processing along different stimulus category dimensions, including (1) living versus non-living audio-visual events, (2) audio-visual events involving vocalizations versus actions by living sources, (3) emotionally valent events, and (4) dynamic-visual versus static-visual audio-visual stimuli. These meta-analysis results are discussed in the context of neurocomputational theories of semantic knowledge representations and perception, and the brain volumes of interest are available for download to facilitate data interpretation for future neuroimaging studies.
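
    For readers unfamiliar with the method, the following minimal sketch illustrates the core ALE computation: each reported focus is modelled as a 3D Gaussian blob, and per-experiment maps are combined voxel-wise as a probabilistic union. The grid size, voxel size, smoothing kernel, and coordinates are illustrative placeholders, not the parameters or data of this meta-analysis.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def modeled_activation_map(foci_mm, shape, voxel_mm=2.0, fwhm_mm=10.0):
            # Place each reported focus on the grid, then blur it with a 3D
            # Gaussian; normalizing by the peak gives per-voxel activation
            # probabilities (a simplification of the true ALE kernel).
            ma = np.zeros(shape)
            for x, y, z in foci_mm:
                i, j, k = (int(round(c / voxel_mm)) for c in (x, y, z))
                if 0 <= i < shape[0] and 0 <= j < shape[1] and 0 <= k < shape[2]:
                    ma[i, j, k] = 1.0
            sigma_vox = fwhm_mm / (2.355 * voxel_mm)  # FWHM -> sigma, in voxels
            ma = gaussian_filter(ma, sigma=sigma_vox)
            return ma / ma.max() if ma.max() > 0 else ma

        def ale_map(experiments, shape):
            # ALE at each voxel is the probabilistic union of the modeled
            # activation maps: 1 minus the product of non-activation probabilities.
            prob_none = np.ones(shape)
            for foci in experiments:
                prob_none *= 1.0 - modeled_activation_map(foci, shape)
            return 1.0 - prob_none

        # Toy input: two "experiments", each a list of (x, y, z) foci in mm.
        experiments = [[(40.0, 40.0, 40.0), (60.0, 50.0, 40.0)], [(42.0, 38.0, 40.0)]]
        print(ale_map(experiments, shape=(50, 50, 50)).max())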

    The dorsal visual stream revisited: Stable circuits or dynamic pathways?

    In both the macaque and human brain, information regarding visual motion flows from the extrastriate area V6 along two different paths: a dorsolateral one towards areas MT/V5, MST, and V3A, and a dorsomedial one towards the visuomotor areas of the superior parietal lobule (V6A, MIP, VIP). The dorsolateral visual stream is involved in many aspects of visual motion analysis, including the recognition of object motion and self-motion. The dorsomedial stream uses visual motion information to continuously monitor the spatial location of objects while we are looking and/or moving around, allowing skilled reaching for and grasping of objects in structured, dynamically changing environments. Grasping activity is present in two areas of the dorsal stream, AIP and V6A: area AIP is more involved than V6A in object recognition, whereas V6A is more involved in encoding vision for action. We suggest that V6A is involved in the fast control of prehension and plays a critical role in biomechanically selecting appropriate postures during reach-to-grasp behaviors. In everyday life, numerous functional networks, often involving the same cortical areas, are continuously in action in the dorsal visual stream, with each network dynamically activated or inhibited according to the context. The dorsolateral and dorsomedial streams represent only two examples of these networks. Many other streams have been described in the literature, but it is worth noting that the same cortical area, and even the same neurons within an area, are not specific for just one functional property, being part of networks that encode multiple functional aspects. Our proposal is to conceive of the cortical streams not as fixed series of interconnected cortical areas, in which each area belongs exclusively to one stream and is strictly involved in only one function, but as interconnected neuronal networks, often involving the same neurons, that are involved in a number of functional processes and whose activation changes dynamically according to the context.

    The effects of stereo disparity on the behavioural and electrophysiological correlates of audio-visual motion in depth.

    Motion is represented by low-level signals, such as size expansion in vision or loudness changes in the auditory modality. Visual and auditory signals from the same object or event may be integrated, facilitating detection. We explored behavioural and electrophysiological correlates of congruent and incongruent audio-visual depth motion in conditions where auditory level changes, visual expansion, and visual disparity cues were manipulated. In Experiment 1, participants discriminated auditory motion direction whilst viewing looming or receding, 2D or 3D, visual stimuli. Responses were faster and more accurate for congruent than for incongruent audio-visual cues, and the congruency effect (i.e., the difference between incongruent and congruent conditions) was larger for visual 3D cues than for 2D cues. In Experiment 2, event-related potentials (ERPs) were collected during presentation of the 2D and 3D, looming and receding, audio-visual stimuli, while participants detected an infrequent deviant sound. Our main finding was that audio-visual congruity was affected by retinal disparity at an early processing stage (135–160 ms) over occipito-parietal scalp. Topographic analyses suggested that similar brain networks were activated for the 2D and 3D congruity effects, but that cortical responses were stronger in the 3D condition. Differences between congruent and incongruent conditions were observed between 140–200 ms, 220–280 ms, and 350–500 ms after stimulus onset.
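
    As a concrete illustration of the behavioural measure, the congruency effect is simply the incongruent-minus-congruent difference computed per participant within each disparity condition. The sketch below runs on simulated trials with made-up column names; it is not the study's data or analysis code.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)

        # Simulated single-trial data (illustrative only, not the study's data):
        # 12 participants, 2D/3D disparity cues, congruent/incongruent trials.
        trials = pd.DataFrame({
            "participant": rng.integers(1, 13, 1200),
            "disparity": rng.choice(["2D", "3D"], 1200),
            "congruency": rng.choice(["congruent", "incongruent"], 1200),
            "rt_ms": rng.normal(500.0, 60.0, 1200),
        })

        # Mean RT per participant x disparity x congruency cell.
        cells = (trials
                 .groupby(["participant", "disparity", "congruency"])["rt_ms"]
                 .mean()
                 .unstack("congruency"))

        # Congruency effect: incongruent minus congruent, per participant and cue.
        cells["congruency_effect"] = cells["incongruent"] - cells["congruent"]
        print(cells.groupby("disparity")["congruency_effect"].mean())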

    Assessing the impact of emotion in dual pathway models of sensory processing.

    In our daily environment, we constantly encounter an endless stream of information that we must sort and prioritize. Among the features that influence this prioritization are the emotional nature of stimuli and the emotional context of events. Emotional information is often given preferential access to neurocognitive resources, including within sensory processing systems. Interestingly, both the auditory and visual systems are divided into dual processing streams: a ventral object identity/perception stream and a dorsal object location/action stream. While the effects of emotion on the ventral streams are relatively well defined, its effect on dorsal stream processes remains unclear. The present thesis aimed to investigate the impact of emotion on sensory systems within a dual pathway framework of sensory processing. Study I investigated the role of emotion during auditory localization. While undergoing fMRI, participants indicated the location of an emotional or non-emotional sound within an auditory virtual environment. This revealed that the neurocognitive structures displaying activation modulated by emotion were not the same as those modulated by sound location: emotion was represented in regions associated with the putative auditory ‘what’ but not ‘where’ stream. Study II examined the impact of emotion on ostensibly similar localization behaviours mediated differentially by the dorsal versus ventral visual processing stream. Ventrally mediated behaviours were impacted by the emotional context of a trial, while dorsally mediated behaviours were not. For Study III, a motion-aftereffect paradigm was used to investigate the impact of emotion on visual area V5/MT+. This area, traditionally believed to be involved in dorsal stream processing, has a number of characteristics similar to those of a ventral stream structure. V5/MT+ activity was modulated both by the presence of perceptual motion and by the emotional content of an image. In addition, this region displayed patterns of functional connectivity with the amygdala that were significantly modulated by emotion. Together, these results suggest that emotional information modulates neural processing within ventral sensory processing streams, but not dorsal processing streams. These findings are discussed with respect to current models of emotional and sensory processing, including amygdala connections to sensory cortices and emotional effects on cognition and behaviour.
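
    The abstract does not state how the emotion-modulated connectivity was assessed; one common approach to this kind of question is a psychophysiological interaction (PPI) style regression, in which the product of a seed region's time course and the psychological condition is tested alongside both main effects. The sketch below is an assumed illustration on synthetic data, not the thesis's analysis.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200

        # Synthetic time courses (illustrative): an amygdala seed, a block
        # regressor coding neutral (0) vs emotional (1) trials, and a target
        # V5/MT+ signal whose coupling to the seed grows during emotion blocks.
        seed = rng.normal(size=n)
        emotion = np.tile([0.0] * 10 + [1.0] * 10, n // 20)
        target = 0.2 * seed + 0.4 * seed * emotion + rng.normal(size=n)

        # PPI design: seed, condition, their interaction, and an intercept.
        X = np.column_stack([seed, emotion, seed * emotion, np.ones(n)])
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)

        # A positive interaction weight indicates stronger seed-target coupling
        # during emotional blocks, i.e. connectivity modulated by emotion.
        print(f"interaction beta = {beta[2]:.2f}")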

    Feature-Specific Patterns of Attention and Functional Connectivity in Human Visual Cortex

    The ability to successfully allocate attention to a particular location or feature in the visual world is vital for successful day-to-day functioning. Attention refers to a narrowing of focus, with increased processing of an attended attribute at the expense of non-attended dimensions. This attentional mechanism can modulate activity in the visual cortex and beyond. However, the full range of spatial scales at which attentional effects are evident in the visual cortex as a function of task remains relatively poorly understood. This thesis aimed to investigate the effects of attentional modulation across the visual cortex at several spatial scales: examining activation at the level of mean activity in individual regions of interest (ROIs), comparing patterns of voxel-level activity, and employing connectivity-style approaches to examine communication between multiple visual areas simultaneously.
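
    The three spatial scales mentioned (ROI means, voxel-level patterns, and inter-area connectivity) each correspond to a simple computation, sketched below on synthetic data; the area names, block structure, and dimensions are illustrative assumptions rather than details taken from the thesis.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic data (illustrative): 200 volumes x 100 voxels for two areas,
        # with a boolean regressor marking attend-motion vs attend-colour blocks.
        v1 = rng.normal(size=(200, 100))
        mt = rng.normal(size=(200, 100))
        attend_motion = np.tile([False] * 10 + [True] * 10, 10)

        # Scale 1, ROI means: average over voxels and compare conditions.
        roi_effect = v1[attend_motion].mean() - v1[~attend_motion].mean()

        # Scale 2, voxel-level patterns: correlate the mean spatial pattern
        # across voxels between the two attention conditions.
        pattern_r = np.corrcoef(v1[attend_motion].mean(axis=0),
                                v1[~attend_motion].mean(axis=0))[0, 1]

        # Scale 3, connectivity: correlate ROI-mean time courses between areas.
        connectivity_r = np.corrcoef(v1.mean(axis=1), mt.mean(axis=1))[0, 1]

        print(roi_effect, pattern_r, connectivity_r)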

    On the Physiology of Bistable Percepts

    Binocular rivalry refers to the alternating perceptions experienced when two dissimilar patterns are stereoscopically viewed. To study the neural mechanism that underlies such competitive interactions, single cells were recorded in visual areas V1, V2, and V4 while monkeys reported the perceived orientation of rivaling sinusoidal grating patterns. A number of neurons in all areas showed alternating periods of excitation and inhibition that correlated with the perceptual dominance and suppression of the cell's preferred orientation. The remaining population of cells was not influenced by whether or not the optimal stimulus orientation was perceptually suppressed. Response modulation during rivalry was not correlated with cell attributes such as monocularity, binocularity, or disparity tuning. These results suggest that the awareness of a visual pattern during binocular rivalry arises through interactions between neurons at different levels of the visual pathways, and that the site of suppression is unlikely to correspond to a particular visual area, as often hypothesized on the basis of psychophysical observations. The cell types of modulating neurons and their overwhelming preponderance in higher rather than early visual areas also suggest, together with earlier psychophysical evidence, the possibility of a common mechanism underlying rivalry as well as other bistable percepts, such as those experienced with ambiguous figures.
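
    One simple way to quantify the modulation described here is an index contrasting a cell's firing during perceptual dominance versus suppression of its preferred orientation. The sketch below uses synthetic spike counts, and the index definition is a common convention rather than necessarily the one used in this study.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic spike counts (illustrative): one neuron's firing rate in
        # 1-s bins, labelled by whether its preferred orientation was reported
        # as perceptually dominant or suppressed during each bin.
        dominant = rng.poisson(20, 50)    # spikes/s while preferred pattern dominates
        suppressed = rng.poisson(12, 50)  # spikes/s while it is suppressed

        # Contrast firing between the two perceptual states; values near 0 mean
        # the cell ignores the percept, values near 1 mean strong modulation.
        md, ms = dominant.mean(), suppressed.mean()
        print(f"modulation index = {(md - ms) / (md + ms):.2f}")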

    Science of Facial Attractiveness


    Varieties of Attractiveness and their Brain Responses


    Sounds facilitate visual motion discrimination via the enhancement of late occipital visual representations

    Kayser S, Philiastides MG, Kayser C. Sounds facilitate visual motion discrimination via the enhancement of late occipital visual representations. NeuroImage. 2017;148:31-41.

    Change blindness: eradication of gestalt strategies

    Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task in which there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval, and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation [Landman et al., 2003, Vision Research 43, 149–164]. Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this, we changed the spatial position of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point. There was no significant difference in performance between this and the standard task [F(1,4) = 2.565, p = 0.185]. This suggests two things: (i) that Gestalt grouping is not used as a strategy in these tasks; and (ii) that it lends further weight to the argument that objects may be stored in and retrieved from a pre-attentional store during this task.
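
    Geometrically, the spoke manipulation amounts to shifting each rectangle's centre radially, along the line from fixation through that centre, by one degree of visual angle. The sketch below illustrates this with an assumed ring layout; the eccentricity and the random choice of shift sign are illustrative.

        import numpy as np

        def shift_along_spoke(centre, shift_deg):
            # Move a point radially along the line from fixation (the origin)
            # through the point, by shift_deg degrees of visual angle.
            x, y = centre
            r = np.hypot(x, y)        # eccentricity of the rectangle centre
            theta = np.arctan2(y, x)  # angular position of its spoke
            return ((r + shift_deg) * np.cos(theta), (r + shift_deg) * np.sin(theta))

        # Eight rectangles on a ring 5 deg from fixation (eccentricity assumed).
        angles = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
        centres = [(5.0 * np.cos(a), 5.0 * np.sin(a)) for a in angles]

        # Second presentation: shift each rectangle by +/-1 deg along its spoke.
        rng = np.random.default_rng(0)
        shifted = [shift_along_spoke(c, rng.choice([-1.0, 1.0])) for c in centres]
        print(shifted[0])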