
    Multisensory Congruency as a Mechanism for Attentional Control over Perceptual Selection

    The neural mechanisms underlying attentional selection of competing neural signals for awareness remain an unresolved issue. We studied attentional selection, using perceptually ambiguous stimuli in a novel multisensory paradigm that combined competing auditory and competing visual stimuli. We demonstrate that the ability to select, and attentively hold, one of the competing alternatives in either sensory modality is greatly enhanced when there is a matching cross-modal stimulus. Intriguingly, this multimodal enhancement of attentional selection seems to require a conscious act of attention, as passively experiencing the multisensory stimuli did not enhance control over the stimulus. We also demonstrate that congruent auditory or tactile information, and combined auditory–tactile information, aids attentional control over competing visual stimuli and vice versa. Our data suggest a functional role for recently found neurons that combine voluntarily initiated attentional functions across sensory modalities. We argue that these units provide a mechanism for structuring multisensory inputs that are then used to selectively modulate early (unimodal) cortical processing, boosting the gain of task-relevant features for willful control over perceptual awareness.

    Spatially valid proprioceptive cues improve the detection of a visual stimulus

    Vision and proprioception are the main sensory modalities that convey hand location and direction of movement. Fusion of these sensory signals into a single robust percept is now well documented. However, it is not known whether these modalities also interact in the spatial allocation of attention, which has been demonstrated for other modality pairings. The aim of this study was to test whether proprioceptive signals can spatially cue a visual target to improve its detection. Participants were instructed to use a planar manipulandum in a forward reaching action and determine during this movement whether a near-threshold visual target appeared at either of two lateral positions. The target presentation was followed by a masking stimulus, which made its possible location unambiguous, but not its presence. Proprioceptive cues were given by applying a brief lateral force to the participant’s arm, either in the same direction (validly cued) or in the opposite direction (invalidly cued) to the on-screen location of the mask. The d′ detection rate of the target increased when the direction of the proprioceptive stimulus was compatible with the location of the visual target compared to when it was incompatible. These results suggest that proprioception influences the allocation of attention in visual space.
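The d′ measure used in the abstract above is standard signal detection theory: d′ = z(hit rate) − z(false-alarm rate), which separates detection sensitivity from response bias. As a minimal sketch (the trial counts below are invented for illustration, not the study's data):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (adding 0.5 to each count) keeps the rates
    away from 0 and 1, where the inverse normal CDF diverges.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # standard-normal quantile function
    return z(hit_rate) - z(fa_rate)

# Invented counts for the two cue conditions (hypothetical, not the study's data).
valid = d_prime(hits=40, misses=10, false_alarms=8, correct_rejections=42)
invalid = d_prime(hits=30, misses=20, false_alarms=12, correct_rejections=38)
```

A higher d′ for validly cued trials, as reported in the abstract, indicates genuinely better detection sensitivity rather than a mere shift in response criterion.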

    Out of focus – brain attention control deficits in adult ADHD

    Modern environments are full of information and place high demands on the attention control mechanisms that allow us to select information from one source (focused attention) or multiple sources (divided attention), react to changes in a given situation (stimulus-driven attention), and allocate effort according to demands (task-positive and task-negative activity). We aimed to reveal how attention deficit hyperactivity disorder (ADHD) affects the brain functions associated with these attention control processes in constantly demanding tasks. Sixteen adults with ADHD and 17 controls performed adaptive visual and auditory discrimination tasks during functional magnetic resonance imaging (fMRI). Overlapping brain activity in frontoparietal saliency and default-mode networks, as well as in the somato-motor, cerebellar, and striatal areas, was observed in all participants. In the ADHD participants, we observed exclusive activity enhancement in the brain areas typically considered to be primarily involved in other attention control functions: during auditory-focused attention, we observed higher activation in the sensory cortical areas of the irrelevant modality and the default-mode network (DMN). DMN activity also increased during divided attention in the ADHD group, in turn decreasing during a simple button-press task. Adding irrelevant stimulation resulted in enhanced activity in the salience network. Finally, the irrelevant distractors that capture attention in a stimulus-driven manner activated dorsal attention networks and the cerebellum. Our findings suggest that attention control deficits involve the activation of the irrelevant sensory modality, reflect problems in regulating the level of attention on demand, and may encumber top-down processing in cases of irrelevant information. (C) 2018 Elsevier B.V. All rights reserved. Peer reviewed.

    The role of the right temporoparietal junction in perceptual conflict: detection or resolution?

    The right temporoparietal junction (rTPJ) is a polysensory cortical area that plays a key role in perception and awareness. Neuroimaging evidence shows activation of rTPJ in intersensory and sensorimotor conflict situations, but it remains unclear whether this activity reflects detection or resolution of such conflicts. To address this question, we manipulated the relationship between touch and vision using the so-called mirror-box illusion. Participants' hands lay on either side of a mirror, which occluded their left hand and reflected their right hand, but created the illusion that they were looking directly at their left hand. The experimenter simultaneously touched either the middle (D3) or the ring finger (D4) of each hand. Participants judged which finger was touched on their occluded left hand. The visual stimulus corresponding to the touch on the right hand was therefore either congruent (same finger as touch) or incongruent (different finger from touch) with the task-relevant touch on the left hand. Single-pulse transcranial magnetic stimulation (TMS) was delivered to the rTPJ immediately after touch. Accuracy in localizing the left touch was worse for D4 than for D3, particularly when visual stimulation was incongruent. However, following TMS, accuracy improved selectively for D4 in incongruent trials, suggesting that the effects of the conflicting visual information were reduced. These findings suggest a role of rTPJ in detecting, rather than resolving, intersensory conflict.

    Early cross-modal interactions and adult human visual cortical plasticity revealed by binocular rivalry

    In this research, binocular rivalry is used as a tool to investigate different aspects of visual and multisensory perception. Several experiments presented here demonstrated that touch specifically interacts with vision during binocular rivalry and that the interaction likely occurs at early stages of visual processing, probably V1 or V2. Another line of research also presented here demonstrated that human adult visual cortex retains an unexpectedly high degree of experience-dependent plasticity, by showing that a brief period of monocular deprivation produced important perceptual consequences on the dynamics of binocular rivalry, reflecting a homeostatic plasticity. In summary, this work shows that binocular rivalry is a powerful tool to investigate different aspects of visual perception and can be used to reveal unexpected properties of early visual cortex.

    Are Neuronal Mechanisms of Attentional Modulation Universal Across Human Sensory and Motor Brain Maps?

    One's experience of shifting attention from the color to the smell to the act of picking a flower seems like a unitary process applied, at will, to one modality after another. Yet, the unique experience of sight vs smell vs movement might suggest that the neural mechanisms of attention have been selectively optimized to employ each modality to greatest advantage. Relevant experimental data can be difficult to compare across modalities due to design and methodological heterogeneity. Here we outline some of the issues related to this problem and suggest how experimental data can be obtained across modalities using more uniform methods and measurements. The ultimate goal is to spur efforts across disciplines to provide a large and varied database of empirical observations that will either support the notion of a universal neural substrate for attention or more clearly identify to what degree attentional mechanisms are specialized for each modality.

    Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans

    Speech is inherently multisensory, containing auditory information from the voice and visual information from the mouth movements of the talker. Hearing the voice is usually sufficient to understand speech; however, in noisy environments or when audition is impaired due to aging or disability, seeing mouth movements greatly improves speech perception. Although behavioral studies have well established this perceptual benefit, it is still not clear how the brain processes visual information from mouth movements to improve speech perception. To clarify this issue, I studied the neural activity recorded from the brain surfaces of human subjects using intracranial electrodes, a technique known as electrocorticography (ECoG). First, I studied responses to noisy speech in the auditory cortex, specifically in the superior temporal gyrus (STG). Previous studies identified the anterior parts of the STG as unisensory, responding only to auditory stimuli. On the other hand, posterior parts of the STG are known to be multisensory, responding to both auditory and visual stimuli, which makes them a key region for audiovisual speech perception. I examined how these different parts of the STG respond to clear versus noisy speech. I found that noisy speech decreased the amplitude and increased the across-trial variability of the response in the anterior STG. However, possibly due to its multisensory composition, the posterior STG was not as sensitive to auditory noise as the anterior STG and responded similarly to clear and noisy speech. I also found that these two response patterns in the STG were separated by a sharp boundary demarcated by the posterior-most portion of Heschl’s gyrus. Second, I studied responses to silent speech in the visual cortex.
Previous studies demonstrated that visual cortex shows response enhancement when the auditory component of speech is noisy or absent; however, it was not clear which regions of the visual cortex specifically show this response enhancement and whether it results from top-down modulation by a higher region. To test this, I first mapped the receptive fields of different regions in the visual cortex and then measured their responses to visual (silent) and audiovisual speech stimuli. I found that visual regions that have central receptive fields show greater response enhancement to visual speech, possibly because these regions receive more visual information from mouth movements. I found similar response enhancement to visual speech in frontal cortex, specifically in the inferior frontal gyrus, premotor and dorsolateral prefrontal cortices, which have been implicated in speech reading in previous studies. I showed that these frontal regions display strong functional connectivity with visual regions that have central receptive fields during speech perception.
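The anterior-STG finding above rests on two per-condition summary statistics: mean response amplitude and across-trial variability. A toy illustration of that computation (the single-trial amplitudes below are invented, not actual recordings):

```python
import statistics

# Hypothetical single-trial response amplitudes (e.g. peak high-gamma power)
# for one anterior-STG electrode; values are invented for illustration.
clear_trials = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8]
noisy_trials = [3.0, 2.1, 3.8, 1.9, 3.4, 2.4]

def summarize(amplitudes):
    """Return mean amplitude and across-trial standard deviation."""
    return statistics.mean(amplitudes), statistics.stdev(amplitudes)

clear_mean, clear_sd = summarize(clear_trials)
noisy_mean, noisy_sd = summarize(noisy_trials)
# The reported anterior-STG pattern corresponds to noisy speech showing a
# lower mean amplitude and a higher across-trial standard deviation.
```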